Ask questions, get answers, and engage in discussions about B2B integration solutions from CData.
Arc HTTPGet Operation - How to pass Authentication: Bearer header
I’m just trying to authenticate with an API endpoint that expects a fixed token to be passed as the Authorization HTTP header. Does this look like it should work? I need to pull the token/API key from the Vault.

```
<arc:set attr="http.URL" value="https://api.XXXX.com/carrier/api/shipment-listing/[data.shipmentNumber]" />
<arc:set attr="http.header:Authorization" value="Bearer [Vault(My.API.Token)]"/>
<arc:set attr="http.contenttype" value="application/json"/>
<arc:call op="httpGet" in="http">
  <arc:set attr="output.data" value="[http:content]" />
</arc:call>
```

I’m getting a 401. Thanks
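For reference, here is a minimal plain-Python sketch (not CData-specific; the token value is a placeholder) of what the header built by an `Authorization: Bearer` expression should look like on the wire. A common cause of 401s is a malformed header value, such as a missing space after the scheme or a token that already contains the word "Bearer":

```python
token = "abc123"  # placeholder -- in Arc this would come from the Vault

headers = {
    # Exactly one space between the scheme and the token. If the stored
    # secret already includes "Bearer ", the result becomes
    # "Bearer Bearer <token>" and the API rejects it with a 401.
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json",
}

print(headers["Authorization"])  # Bearer abc123
```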
I am unable to bind my server on port XXXX (which was previously used for the embedded web server) on Windows
One issue that comes up infrequently but can be inscrutable to troubleshoot is being unable to start the web server hosting CData Arc on a port that you have previously used for the embedded web server provided with the application. For the purposes of this thread, we will refer to port 8001, since this is the default port assigned to the embedded web server distributed with CData Arc. Most commonly, issues with binding to a specific port are due to an existing process that is already listening on that port, but you may find that even after you check for ports in the listening state with `netstat -a`, you do not see anything listening on that port. One possibility is that the URL for the port you are selecting has already been reserved on the system and cannot be assigned to another server. To check for this, run `netsh http show urlacl`. This will list all of the reserved entries on the machine. If the URL that you are trying to bind shows up like this: Reserved U
Custom Signature Authentication in the Webhook Connector
Overview

Sometimes a requirement arises where you may need to receive data from a trading partner via the Webhook connector, but that partner might require a bit more security than the basic auth username/password that is used to authenticate inbound webhook requests - or maybe you just want to add an additional layer of security on inbound webhook messages. Using the custom response feature of the webhook connector, you can create your own HTTP signature authentication logic to perform additional authentication on inbound webhook requests, via the use of an HMAC signature value assigned to a header on the request. GitHub has a good writeup about this, and it is what was used to build the example below within CData Arc: https://docs.github.com/en/webhooks-and-events/webhooks/securing-your-webhooks

The Script

This script is one that I wrote within the Response event of the Webhook Connector attached to this post: <!-- setting the secret key value to be avail
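The Arc script itself is truncated in this excerpt, but the underlying check it performs can be sketched in plain Python. This is the GitHub-style scheme from the linked writeup: the sender computes an HMAC-SHA256 of the raw request body with a shared secret and sends it in a header as `sha256=<hexdigest>`; the receiver recomputes it and compares. The secret and body below are hypothetical:

```python
import hashlib
import hmac

def verify_signature(secret: bytes, body: bytes, signature_header: str) -> bool:
    """Recompute the HMAC-SHA256 of the raw request body and compare it to
    the signature the partner sent (GitHub-style 'sha256=<hexdigest>')."""
    expected = "sha256=" + hmac.new(secret, body, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(expected, signature_header)

secret = b"my-webhook-secret"    # hypothetical shared secret
body = b'{"shipment": "1234"}'   # raw inbound request body

# What a well-behaved sender would put in the signature header:
sig = "sha256=" + hmac.new(secret, body, hashlib.sha256).hexdigest()

print(verify_signature(secret, body, sig))        # True
print(verify_signature(secret, b"tampered", sig)) # False
```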
Connector Backoff Behavior in CData Arc
In CData Arc 2022, a new feature was added to the application which has been coined “backoff”. You may or may not have seen this mentioned in the application log of your instance, but if you haven’t, an entry in the application log would look something like this:

~/AutomationService Warning The [ConnectorID] connector has failed more than 3 times and has entered a backoff state. The automation service will skip this connector until [date/time]. The connector will remain in the backoff state until a message is successfully sent manually, the connector settings change, or all messages are sent successfully during an automated send attempt.

This is a feature where the automation for the connector will be temporarily paused if the connector fails to send a file successfully with automation three times in a row. If this happens, a 60 minute delay will be added to the automation of that connector before the connector will try to send any files via automation again. If the connector fails
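The rule described above can be modeled with a small sketch. The threshold of three consecutive failures and the 60-minute delay come from the text; everything else (class and method names) is illustrative, not Arc's actual implementation:

```python
from datetime import datetime, timedelta

FAILURE_THRESHOLD = 3                    # consecutive failures before backoff
BACKOFF_DELAY = timedelta(minutes=60)    # delay added once backoff is entered

class ConnectorAutomation:
    """Toy model of the backoff behavior described above -- not Arc's code."""

    def __init__(self):
        self.failures = 0
        self.backoff_until = None

    def record_send(self, succeeded: bool, now: datetime):
        if succeeded:
            # A successful send clears the failure count and the backoff.
            self.failures = 0
            self.backoff_until = None
        else:
            self.failures += 1
            if self.failures >= FAILURE_THRESHOLD:
                self.backoff_until = now + BACKOFF_DELAY

    def should_skip(self, now: datetime) -> bool:
        """The automation service skips the connector while in backoff."""
        return self.backoff_until is not None and now < self.backoff_until

now = datetime(2024, 1, 1, 12, 0)
c = ConnectorAutomation()
for _ in range(3):
    c.record_send(False, now)
print(c.should_skip(now))  # True: skipped until 13:00
```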
How to automate a Sql Server Connector
I have a stored procedure that I want called once per day, with the data pulled sent to a Send Email Connector. When I manually click the Receive button on the Output table of the Sql Server Connector, everything works as desired. However, I set up the Receive Interval on the Automation Settings to receive the data once per day, but it doesn’t run. Am I doing something wrong? Or is there another way to automate the execution of this stored procedure? Thanks, Chris
Using OAuth 2.0 in the Email Send and Receive Connectors
OAuth 2.0 Overview

OAuth 2.0 is becoming increasingly popular as a means to authorize users who are attempting to access a resource, be it a mail server, API endpoint, or database. A common example of this shift is the deprecation of basic authentication in Exchange Online (Microsoft’s mail server for Outlook), which can be read about in more detail in Microsoft’s article, here. OAuth 2.0 is an authorization protocol, not an authentication protocol: a means of granting users access to a set of resources - in the case discussed here, that resource would be a mail server. At a very high level, OAuth 2.0 can be summarized in the following steps:

1. The connecting client acquires both a client ID and a client secret from the authorization server.
2. The client requests to be authorized by the authorization server by providing its client ID and client secret. Additionally, any necessary scopes and a redirect URI are also provided.
3. If the supplied credentials
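The authorization request in step 2 above can be sketched as follows. All values (authorization server URL, client ID, scope, redirect URI) are hypothetical placeholders; the parameter names are the standard OAuth 2.0 authorization-code ones:

```python
from urllib.parse import urlencode

# Hypothetical values -- in practice these come from registering your
# application with the authorization server (e.g. an Azure AD app
# registration when connecting to Exchange Online).
client_id = "my-client-id"
redirect_uri = "https://localhost:8001/callback"
scopes = "mail.read"

# Step 2: the client asks to be authorized, supplying its client ID, the
# scopes it needs, and the redirect URI the authorization code is sent to.
authorize_url = "https://login.example.com/oauth2/authorize?" + urlencode({
    "response_type": "code",
    "client_id": client_id,
    "redirect_uri": redirect_uri,
    "scope": scopes,
})
print(authorize_url)
```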
Execute final stored procedure after multiple tables import.
I have a flow that processes multiple files into multiple Sql Server tables. Once all those database tables are updated with the imported data, I want to run a stored procedure that uses that data to populate other tables. I was thinking I would just add another Sql Server connector after the import_pricing connector, but can I be sure that all the other import connectors have successfully completed first? Is there a proper way to do this in Arc, so that when I run the final stored procedure all the needed import tables are populated appropriately first? Thanks!
Get a filename part while mapping XML
Hello, I need to get a specific filename part value while mapping XML to the desired structure in the XMLMap connector. For example, I need to extract the ‘1234’ part from a file name 1234_abcd.xml. I have tried to write an expression in the Expression Editor: [getfilename(substring(0, indexof(filename(), "_")))] It does not work and I am not sure what is wrong. Any ideas? Maybe there are even better solutions?
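Arc's formatter syntax aside, the intended logic is simply "take everything before the first underscore". In plain Python it looks like this (the filename is the example from the question):

```python
filename = "1234_abcd.xml"

# Everything up to (but not including) the first underscore.
part = filename[: filename.index("_")]
print(part)  # 1234

# Equivalent, and tolerant of filenames with several underscores:
part2 = filename.split("_", 1)[0]
```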
Setting sFTP server thumbprint to public key
Hello - I am having trouble establishing an sFTP SSH key pair connection. I want to set the server thumbprint to the server’s public key, which I have saved to a certificate, but I just keep getting a “thumbprint format” error. Does anyone know exactly how to accomplish this?
Dynamic Bearer Token Authorization in the REST Connector
Overview

Bearer token authentication can be one of the more finicky authentication mechanisms to implement within Arc. This is because the token lifetime is very short, and automatic token refresh isn’t something that is built into that mechanism like it is with OAuth 2.0. Thus, a new token needs to be generated and manually re-entered every time the old one expires. This isn’t really practical in Arc, since many flows rely on automation to continually process files or make requests to APIs. Luckily, in Arc it is possible to create a custom script that gets a new token for each request that you send, with the token then dynamically evaluated within the REST connector.

The Script

The script used here is going to be responsible for two things:

1) Obtaining a refresh token from the token endpoint of your API
2) Adding the token as a header onto the message within Arc

This will fire every time your input file passes through the script and will get a new token for each
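The two responsibilities above can be sketched in plain Python. The token endpoint, request payload, and response field name are hypothetical (every API shapes these differently); the point is the split between fetching a fresh token and attaching it to the outgoing message:

```python
import json
import urllib.request

def fetch_token(token_url: str, api_key: str) -> str:
    """Step 1 (hypothetical API shape): POST to the token endpoint and pull
    the short-lived bearer token out of the JSON response."""
    req = urllib.request.Request(
        token_url,
        data=json.dumps({"apiKey": api_key}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]  # field name varies by API

def build_headers(token: str) -> dict:
    """Step 2: attach the fresh token to the outgoing request's headers."""
    return {"Authorization": f"Bearer {token}"}

# A flow would call fetch_token(...) for every file, then build_headers(...)
print(build_headers("fresh-token")["Authorization"])  # Bearer fresh-token
```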
Customized reports & audits in CData Arc
CData Arc’s enterprise edition includes a built-in tool to generate custom reports tracking EDI messages, file exchanges with partners, and much more. To learn how you can start building multi-dimensional reports out of CData Arc’s transactions in mere minutes, simply watch our 4-minute video tutorial. As always, feel free to ask any questions or provide any feedback as you generate your own reports! -- Matt S
Calling connectors from other workspaces
Hello, thought I’d post this question here to help build content for the community. I have a situation where I have a workspace dedicated to complete document flows for all EDI trading partners (832, 850, 855, 810, 856). I have a single X12 connector where all documents either send or receive files. I have another workspace with specific one-off flows where my application interacts with Arc via API. An ongoing issue (in my industry) has always been customers asking us to re-send 810 Invoices. My main workspace is set up to send 810 files in a batch at the end of the day. So I have a flow I created in my “API” workspace where my application allows the user to generate a single 810 file and makes an HTTP POST to the workspaceid and connectorid using /api.rsc/receiveFile through the Admin API. My question is this: since I already have these trading partners’ X12 connectors (and corresponding AS2 or SFTP connectors) set up in another workspace, is it possible to use the Branch c
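For context, the Admin API call described in the question can be sketched like this. Only the `/api.rsc/receiveFile` path comes from the question itself; the host, workspace/connector names, and payload field names are illustrative placeholders, so check the Arc Admin API documentation for the exact schema your version expects:

```python
import base64
import json

arc_base = "https://arc.example.com"      # hypothetical Arc host
url = f"{arc_base}/api.rsc/receiveFile"   # endpoint named in the question

# Illustrative payload: push one generated 810 into a connector in the
# "API" workspace. Field names are assumptions, not the documented schema.
payload = {
    "WorkspaceId": "API",
    "ConnectorId": "X12_Partner1",
    "Filename": "invoice_810.x12",
    "Data": base64.b64encode(b"ISA*00*...").decode(),
}
body = json.dumps(payload)
print(url)
```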
Troubleshooting Common Errors in Arc
While our talented Arc support team is always happy to help resolve errors that you are seeing, it can certainly be nice to handle simpler issues on your own. To that end, we’ve compiled a list of common Arc errors and their solutions to arm you against those pesky red bars. If your error isn’t included in the list and you think it belongs, feel free to post it below! We’re always looking to expand our self-service resources and would love to hear your feedback. -- Matt S
Custom Code Snippets
Hi, I was wondering if it is possible to write and save custom code snippets which would be available for use with any script block. I have a lot of custom scripts that I re-use often and was wondering if there was a way to build upon the default snippet list, either through the Arc interface or by adding my own code files within one of the Arc directories. Thanks!
Mapping a document with parent-child relationships to CSV (larger files)
This is a continuation of the discussion in Mapping a document with parent-child relationships to CSV (small files). In the previous discussion, relative XPaths were used to map header elements to line values in the XML Map, but this can become impractical if the input XML is very large, as is often the case with large inventory reports and health care claim reports. Loading the XML document model for large files in memory can often require more memory than is available, or at least significantly degrade performance. Fortunately, there is a feature of the XML Map that allows for efficient processing of XML files, available in the Advanced tab, called XML Streaming. When XML Streaming is enabled, only the subtree for a particular element is available while the XML document is processed, allowing the connector to load only the relevant chunk of XML at a time. When doing this, relative XPaths as in our previous example are not available, so how can we access elements that are available at
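The streaming idea described above (hold only one element's subtree in memory at a time) is the same one behind `xml.etree.ElementTree.iterparse` in Python. A minimal sketch with a hypothetical document:

```python
import io
import xml.etree.ElementTree as ET

# Hypothetical input; imagine this being far too large to load whole.
xml = io.BytesIO(b"""<Orders>
  <Order><PONumber>PO1</PONumber></Order>
  <Order><PONumber>PO2</PONumber></Order>
</Orders>""")

numbers = []
# Streaming parse: we only ever see the subtree of the element that just
# finished, mirroring how XML Streaming scopes the XML Map to one chunk.
for event, elem in ET.iterparse(xml, events=("end",)):
    if elem.tag == "Order":
        numbers.append(elem.findtext("PONumber"))
        elem.clear()  # free the finished subtree before moving on

print(numbers)  # ['PO1', 'PO2']
```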
Mapping a document with parent-child relationships to CSV
A frequent case that comes up in CData Arc is a need to map a more complex data structure (such as an order) into a flat structure, such as a CSV. For this example, we’re going to use a source XML with two orders:

```
<Items>
  <Orders>
    <CustomerName>Teddy Blasingame</CustomerName>
    <PONumber>PO042619</PONumber>
    <Items>
      <Name>Number Puzzle</Name>
      <C
```

In the destination map, we need to access elements from two different tiers in the source XML hierarchy - transaction-level elements like the name and order number come from the Order level, and line-level elements like the item name and cost come from the Order Line level. For smaller source documents, the simplest way to create such a mapping is to begin by creating your Foreach mapping to the destination at the line level that you are mapping to, creating a Foreach from the Orders/Items element in this case to the OrderLines in the destination:
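The flattening this mapping performs (header values repeated onto every line-level row) can be sketched in plain Python. The sample below is a hypothetical completion of the structure shown in the excerpt, since the original snippet is truncated:

```python
import xml.etree.ElementTree as ET

# Hypothetical document mirroring the excerpt's shape: header fields on
# Orders, line fields on the nested Items elements.
xml = """<Items>
  <Orders>
    <CustomerName>Teddy Blasingame</CustomerName>
    <PONumber>PO042619</PONumber>
    <Items><Name>Number Puzzle</Name><Cost>9.99</Cost></Items>
    <Items><Name>Jigsaw</Name><Cost>14.99</Cost></Items>
  </Orders>
</Items>"""

rows = []
root = ET.fromstring(xml)
for order in root.findall("Orders"):
    # Header-level values are read once per order...
    customer = order.findtext("CustomerName")
    po = order.findtext("PONumber")
    # ...then repeated on every line-level row, flattening the hierarchy,
    # just like a Foreach over Orders/Items in the XML Map.
    for line in order.findall("Items"):
        rows.append([po, customer, line.findtext("Name"), line.findtext("Cost")])

print(rows[0])  # ['PO042619', 'Teddy Blasingame', 'Number Puzzle', '9.99']
```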
Pagination within the REST Connector
Overview

Sometimes APIs return data to the connecting client in a page-based format. This can happen when the requested dataset is too large for the API to return in one response, or it could simply be the way that the API administrator has built the API to return responses. For example, if an API request is sent to GET the total number of items in an order, and the total number of items is 500, the API might return the results in a series of 10 pages with 50 items per page. The client requesting the information would then have to issue 10 separate API calls to get each page. This behavior is often coined “pagination”.

The REST Connector and the Flow

Pagination isn’t something that is natively supported within the REST connector; however, it is typically possible for Arc users to develop a solution to accomplish this by use of some custom script within a script connector and the utilization of custom headers to hold the page numbers (or page URLs) and build the URL within
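The loop such a script implements can be sketched as follows. `fetch_page` is a stand-in for one REST call (a real implementation would issue an HTTP GET such as `.../items?page=<n>`); the page counter plays the role of the custom header Arc would carry between requests:

```python
def fetch_page(page: int):
    """Stand-in for one API call. Returns (items, has_more).
    The canned data simulates a 3-page response."""
    data = {1: ["a", "b"], 2: ["c", "d"], 3: ["e"]}
    return data.get(page, []), page < 3

items, page = [], 1
while True:
    chunk, has_more = fetch_page(page)
    items.extend(chunk)
    if not has_more:
        break
    page += 1  # carried between requests, like Arc's custom header

print(items)  # ['a', 'b', 'c', 'd', 'e']
```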
Updating the Source in an XML Map after a mapping has started
Once you have begun a mapping using XML Map, you will find that if you attempt to update the source document, the XML Map connector will prompt you to restart your mapping. If you’ve already begun a really complicated XML Map, you don’t want to lose your progress, so how do you introduce a minor change, like a few additional elements? First, it’s not required that the source of your XML Map exactly match all of the possible nodes for mapping, so if you just need to access a new field in the same loop - for example, Name - you don’t need to use the designer to drag and drop the element onto the destination; you can just start a new expression and manually type in the XPath of the element you want to grab. Still, it is possible to replace the source XML in the XML Map without starting your mapping over, by uploading the new XML file and then overriding the source in the XML Map with the new file without using the designer, by doing the following: Create a copy of the updated source temp
Grouping together records in a flat format (CSV) based on a shared key
A common challenge that can be encountered in mapping projects occurs when a data source contains elements that can be grouped into multiple transactions, but the format of the data is a flattened data model. A simple example of this can be seen in many CSV files:

OrderNumber  Customer          Date     Item           Qty
12345        James Blasingame  3/17/23  Corned Beef    1
12345        James Blasingame  3/17/23  Colcannon      1
12346        Teddy Blasingame  3/17/23  Peanut Butter  1
12346        Teddy Blasingame  3/17/23  Apples         2

With the human eye, it is easy to tell that this table represents two separate transactions (Orders 12345 and 12346), each of which contains 2 line items - but there is nothing in the CSV itself that indicates that this relationship is in place; this information is understood by the user that is managing the file. The CSV Connector in CData Arc can convert this into an XML structure for mapping. When doing this, it is best to configure the CSV connector so that the Con
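The grouping itself (adjacent rows sharing an OrderNumber become one transaction) can be sketched with `itertools.groupby`, using the exact rows from the table above. Note that `groupby` only groups adjacent keys, so the file must already be sorted by the grouping key, as this one is:

```python
import csv
import io
from itertools import groupby

data = """OrderNumber,Customer,Date,Item,Qty
12345,James Blasingame,3/17/23,Corned Beef,1
12345,James Blasingame,3/17/23,Colcannon,1
12346,Teddy Blasingame,3/17/23,Peanut Butter,1
12346,Teddy Blasingame,3/17/23,Apples,2
"""

rows = list(csv.DictReader(io.StringIO(data)))

# groupby collapses runs of rows that share the key -- the relationship the
# human eye sees but the flat CSV never states explicitly.
orders = {
    key: [(r["Item"], r["Qty"]) for r in group]
    for key, group in groupby(rows, key=lambda r: r["OrderNumber"])
}

print(sorted(orders))   # ['12345', '12346']
print(orders["12346"])  # [('Peanut Butter', '1'), ('Apples', '2')]
```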
Supporting Multiple EDI relationships in a single flow (X12)
There’s a question that arises in configuring Arc that comes up often: I have a transport connector (like AS2 or SFTP) that receives multiple EDI files addressed to different interchange relationships. How can I support this in my flow if each X12 connector expects one EDI relationship (Interchange Sender ID and Interchange Receiver ID)?

By default, the X12 connector is meant to be used in a 1:1 relationship with the interchange relationship in the EDI document, so if you had a sender that was using different interchange IDs for different document types - 123A and 123P, for instance - you would set up X12 connectors where the Sender Identifier is configured in each one, and EDI files would need to be routed to the correct connector accordingly.

Support multiple EDI relationships in a single connector (only supported in unlimited EDI licenses)

If you are using a license that supports unlimited use of the EDI connectors, you will find an option in the Advanced tab to relax the valida
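Routing files to the correct X12 connector means reading the interchange IDs out of the ISA header. As a plain-Python sketch (not an Arc script): the Interchange Sender ID is element ISA06 and the Receiver ID is ISA08, both fixed-width and padded with spaces. The sample interchange below is hypothetical apart from the `123A` sender ID mentioned in the text, and it assumes `*` as the element separator and `~` as the segment terminator:

```python
def interchange_ids(edi: str):
    """Pull the Interchange Sender/Receiver IDs (ISA06 / ISA08) from an
    X12 interchange header. Assumes '*' elements and '~' terminators."""
    elements = edi.split("~")[0].split("*")
    # Index 0 is the segment tag "ISA", so ISA06 is at index 6, ISA08 at 8.
    return elements[6].strip(), elements[8].strip()

isa = ("ISA*00*          *00*          *ZZ*123A           "
       "*ZZ*RECEIVER       *230317*1200*U*00401*000000001*0*P*>~")
sender, receiver = interchange_ids(isa)
print(sender)  # 123A
```

A routing script would then use the pair to pick the matching connector's input folder.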
Supporting Multiple EDI relationships in a single flow (EDIFACT)
There’s a question that arises in configuring Arc that comes up often: I have a transport connector (like AS2 or SFTP) that receives multiple EDI files addressed to different interchange relationships. How can I support this in my flow if each EDIFACT connector expects one EDI relationship (Interchange Sender ID and Interchange Receiver ID)?

By default, the EDIFACT connector is meant to be used in a 1:1 relationship with the interchange relationship in the EDI document, so if you had a sender that was using different interchange IDs for different document types - 123A and 123P, for instance - you would set up EDIFACT connectors where the Sender Identifier is configured in each one, and EDI files would need to be routed to the correct connector accordingly.

Support multiple EDI relationships in a single connector (only supported in unlimited EDI licenses)

If you are using a license that supports unlimited use of the EDI connectors, you will find an option in the Advanced tab to rela