Ask questions, get answers, and engage in discussions about B2B integration solutions from CData.
I am setting up a new trading partner to send EDI X12 850 documents, and they keep coming back to me saying "we received your file but it is not processed with octet-stream. Please send as EDI-X12". My X12 connector is set up and working just like the dozens of other trading partners. This is a screenshot of what they are sending me: I checked the Advanced settings on my X12 connector and didn't see where the application/octet-stream content type is being set. I tried adding payload=application/edi-x12 on my AS2 Extension map to no avail. Any ideas would be appreciated.
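One thing that may be worth testing, assuming the AS2 connector derives the outgoing Content-Type from the file extension (an assumption worth confirming against the documentation for your version): rename the payload to an .x12 extension in a script step placed just before the AS2 connector. A minimal sketch built from the fileRead pattern used elsewhere in Arc scripts:

<arc:set attr="file.file" value="[FilePath]" />
<arc:call op="fileRead" in="file" out="fileo">
  <arc:set attr="output.data" value="[fileo.file:data]" />
</arc:call>
<!-- give the payload an .x12 extension; if content-type detection is extension-based,
     this should produce application/edi-x12 instead of application/octet-stream -->
<arc:set attr="output.filename" value="[filename].x12" />
<arc:push item="output" />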
OAuth 2.0 Overview
OAuth 2.0 has become increasingly popular as a means to authorize users who are attempting to access a resource, be it a mail server, API endpoint, or database. A common example of this shift is the deprecation of basic authentication in Exchange Online (Microsoft's mail server for Outlook), which can be read about in more detail in Microsoft's article, here. OAuth 2.0 is an authorization protocol, not an authentication protocol: a means of granting users access to a set of resources. In the case discussed here, that resource would be a mail server. At a very high level, OAuth 2.0 can be summarized in the following steps:
1. The connecting client acquires both a client ID and a client secret from the authorization server.
2. The client requests to be authorized by the authorization server by providing its client ID and client secret. Additionally, any necessary scopes and a redirect URI are also provided.
3. If the supplied credentials
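As a concrete illustration of step 2 onward: in the authorization code flow, once the user has approved access, the client exchanges the resulting code for a token by POSTing to the authorization server's token endpoint. A sketch with placeholder values (the host, code, and credentials here are hypothetical):

POST /token HTTP/1.1
Host: auth.example.com
Content-Type: application/x-www-form-urlencoded

grant_type=authorization_code&code=AUTH_CODE_FROM_REDIRECT&redirect_uri=https%3A%2F%2Fclient.example.com%2Fcallback&client_id=CLIENT_ID&client_secret=CLIENT_SECRET

A successful response returns a JSON body containing an access_token (and typically a refresh_token), which the client then presents to the resource server, the mail server in this scenario.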
Hi, I have a complex MySQL query with several JOINs and need to fetch only new data from the tables. I have read https://cdn.cdata.com/help/AZH/mft/Database-AC.html#only-process-new-or-changed-records but I'm not sure how to apply this to a Custom Query. Can custom queries use data from the message header? If so I may be able to use the method James suggested at
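For reference, the "only process new or changed records" approach in the linked page hinges on filtering by a last-modified column. A custom query of roughly this shape expresses the same idea (table, column, and parameter names are hypothetical; whether a message-header value can be substituted in for the placeholder is exactly the open question here):

SELECT o.id, o.customer_name, l.item_code, l.cost
FROM orders o
JOIN order_lines l ON l.order_id = o.id
WHERE o.last_modified > @last_processed;  -- placeholder for the last successful run's timestamp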
Hi all, I'm trying to create a mapping to translate a CSV format with parent-child rows into a flat CSV format. My source file is structured like this:

Row Type  Order number  Customer name  Total Cost
Header    12345         John Smith     $15.00

Row Type  Item code  Item name   Cost
Line      40123      5mm cable   $5.00
Line      40124      10mm cable  $10.00

Row Type  Order number  Customer name  Total Cost
Header    12346         Jane Lane      $10.00

Row Type  Item code  Item name   Cost
Line      40124      10mm cable  $10.00

Note that the actual file does not have headings, so it looks like this:

Header 12345 John Smith $15.00
Line 40123 5mm cable $5.00
Line 40124 10mm cable $10.00
Header 12346 Jane Lane $10.00
Line 40124 10mm cable $10.00

I am wanting to convert this to a flat CSV structure that combines data from the parent header and child line rows. So the above would be translated to:

Order number  Customer name  Item code  Item name  Cost
12345         John
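For clarity, combining each Line row with its parent Header row, the complete flattened output would presumably be:

Order number  Customer name  Item code  Item name   Cost
12345         John Smith     40123      5mm cable   $5.00
12345         John Smith     40124      10mm cable  $10.00
12346         Jane Lane      40124      10mm cable  $10.00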
What are Lookups?
Lookups describe a situation where you need to find a target piece of data, like an Account ID, but you only have a related public piece of data, like an email domain. You can reference some data store, like a database or SaaS platform, to "look up" the Account ID based on the email domain. If you've ever "looked up" the definition of a word in a dictionary, then you already understand the concept: the spelling of the word is your starting public data (equivalent to the email domain), the definition of the word is the target data (equivalent to the Account ID), and the dictionary itself is the data store.
Why are Lookups useful?
Lookups solve critical problems that may arise when integrating external data (like EDI documents) into internal systems (like CRM or ERP platforms). When you integrate data into an ERP system, you may need to reference internal identifiers or private values like an Account ID, a warehouse ID, or a contract number. Inbound EDI documents will co
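As a toy sketch of the concept in ArcScript (all values hypothetical, and keys kept dot-free for simplicity), a small in-memory map can stand in for the data store, with attribute indirection doing the "looking up":

<!-- build a tiny lookup table mapping email domain to Account ID -->
<arc:set item="accounts" attr="acmecorp" value="ACCT-1001" />
<arc:set item="accounts" attr="globex" value="ACCT-1002" />
<!-- resolve the Account ID for a domain extracted earlier in the flow -->
<arc:set attr="tmp.domain" value="acmecorp" />
<arc:set attr="_log.info">Account ID for [tmp.domain] is [accounts.[tmp.domain]]</arc:set>

In practice the data store is a real database or SaaS platform rather than an in-memory map; the Database connectors' Lookup action serves this purpose within a flow.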
We have a lot of manufacturers that we send X12 850 purchase orders to. We don't always receive a 997, and I am wondering if it's possible to read and parse the MDN and extract the response confirming delivery. My goal is to confirm AS2 delivery, add an Email Send connector to the flow, and notify our purchasing department that the PO was received. Thanks
I have the following script:

<arc:set attr="Files#1">[FilePath]</arc:set>
<arc:call op="xmlDOMSearch?uri=[FilePath]&xpath=/Items/File">
  <arc:set attr="Files#[_index | add(1)]">SOME_FILEPATH\[xpath('FileName')]</arc:set>
</arc:call>
<arc:enum attr="archiveFiles" expand="true">
  <arc:set attr="_log.info">([_index]) [_attr] -> [_value]</arc:set>
</arc:enum>

Whatever I try, the enumeration only shows the first item. If I add an <arc:set attr="_log.info"> inside the arc:call, each item is logged, but for some reason it is not added to the array. The reason I want to add these files to an array is that I want to compress them all into one ZIP file.
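One pattern worth trying, sketched here untested: the trailing-# append syntax (seen elsewhere in Arc scripts, e.g. tmp.keys#) adds to a list without managing an index by hand, and note that the snippet above populates Files while the enum reads archiveFiles, so those names need to agree:

<arc:set attr="Files#1">[FilePath]</arc:set>
<arc:call op="xmlDOMSearch?uri=[FilePath]&xpath=/Items/File">
  <!-- append each match; trailing # avoids relying on [_index] inside the call -->
  <arc:set attr="Files#">SOME_FILEPATH\[xpath('FileName')]</arc:set>
</arc:call>
<arc:enum attr="Files" expand="true">
  <arc:set attr="_log.info">([_index]) [_attr] -> [_value]</arc:set>
</arc:enum>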
One of the trickiest types of documents to map against is the EDI 856, which makes frequent use of a segment loop called HL (Hierarchical Loop). HL loops represent multiple elements that are part of a hierarchical structure at once. From the EDI standpoint, each HL loop is treated like a sibling element, but each HL loop is a representation of an element in a hierarchy, so two neighboring HL loops can mean different things based on the values of HL01, HL02, and HL03:

HL01 is the index of the loop from the top of the document
HL02 is the index of the immediate parent of this loop
HL03 is the type of the loop

Let's look at a simplified 856:

HL*1**S~   <-- the root shipment
HL*2*1*O~  <-- the first order, child of the shipment
HL*3*2*I~  <-- the first item, child of the first order (index 2)
HL*4*2*I~  <-- the second item, child of the first order (index 2)
HL*5*1*O~  <-- the second order, child of the shipment
HL*6*5*I~  <-- the first item of the second order (index 5)
HL*7*5*I~  <--
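Drawn as a tree, the hierarchy those seven segments encode looks like this:

Shipment (HL 1)
  Order (HL 2, parent 1)
    Item (HL 3, parent 2)
    Item (HL 4, parent 2)
  Order (HL 5, parent 1)
    Item (HL 6, parent 5)
    Item (HL 7, parent 5)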
I was wondering if anyone had a good solution for purging files from all flow connectors. Maybe a custom script where I could pass in a WorkspaceId and a ConnectorId that would delete files from the Send/Receive directories. My initial thought: is it possible to delete files across all connectors within any given flow? I have a few CSV connectors that are used to download 5K+ rows, split them into their own files, and pass the data into a SQL stored procedure. Because of the split, that single CSV is now a new XML file for each line. As it travels down the flow I end up with 5K+ XML files in my Send/Receive directories. Usually I get an Arc notification that the size of X directory has exceeded the recommended size. Right now, I use the interface to click Delete All on both the Input and Output tabs to keep the folders clean.
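A very rough sketch of the scripted approach, with heavy caveats: the operation names and their input/output attribute names below are assumptions that must be checked against the scripting operations reference for your Arc version, and the directory layout is hypothetical:

<!-- build the path to one connector's Send directory from the passed-in IDs -->
<arc:set attr="dir.path" value="C:\arc\data\[WorkspaceId]\[ConnectorId]\Send" />
<!-- list the directory and delete each file found (operation names assumed; verify before use) -->
<arc:call op="fileListDir" in="dir" out="found">
  <arc:set attr="del.file" value="[found.file]" />
  <arc:call op="fileDelete" in="del" />
</arc:call>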
I don't know if this is possible, but I have a standard 850 map that I am able to use with all of our trading partners. I am using the SQL connector to return the data of a simple Header/Detail table relationship. Up until now, the data from the Detail table has always been sufficient for all of our current connections, but I have a new connection I'm setting up and they require additional data. Currently, I am creating POLoop1 > PO1 and sending the following:

PO101 linenumber
PO102 Quantity
PO103 Identifier (EA)
PO104 Price
PO105 (empty)
PO106 Identifier (N4)
PO107 UniqueID

This new setup requires PID (which I have added), but they only need PID05 (item description). My question is: is it possible to add some arc script within my loop where I can call a SQL stored proc, pass it the value of PO107, and map the result to PID05? Thanks
Just received an error on the Input of my AS2 connector:

The Message Integrity Check returned by the server is incorrect. Expected: iPPnthXZ7ofwn1v8b6lzhhPhE+2ZYHp+aHDfRgJsI8c=; Received: hrROR/jpYl1Q7PKbG2tY4AHImdCALRS37NPEDiH+5go=

This is the first time I've seen this error. What I don't know is: is this on my side or the trading partner's? Anyone familiar? Thanks.
Setting the Batch Input Size in a database connection can increase the performance of that connection (provided the data source itself supports batches), because instead of sending requests one at a time, a batch insert or batch update can commit multiple records in a single request. Ordinarily, this cannot be used with the default UPSERT action for database connections. An UPSERT involves a query that first determines whether the record exists, so that new records can be INSERTed and existing records can be UPDATEd; but UPSERTs alternate SELECTs with INSERTs/UPDATEs and cannot be batched. If you could separate the requests into INSERTs and UPDATEs, however, you could batch each type, and it so happens that there is a Lookup action for most database connectors. https://cdn.cdata.com/help/AZJ/mft/Database-Lookup.html If you execute this Lookup as a separate step in your flow, you can determine which records are inserts and which are updates, and separate them to batch them: A
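To make the batching benefit concrete: a batched insert commits many records in one round trip. A sketch in SQL (table and columns hypothetical):

-- one request for three records, instead of three separate requests
INSERT INTO orders (id, customer, total) VALUES
  (1, 'Acme', 15.00),
  (2, 'Globex', 10.00),
  (3, 'Initech', 12.50);

An UPSERT, by contrast, must interleave a SELECT (does this record exist?) with each INSERT or UPDATE, which is why it defeats batching until the lookup is split out into its own step.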
Hi, I have a case where I am receiving XML files with base64-encoded files (e.g. PDFs) inside, which I want to send as attachments in an email with the Email Send connector (Template mode). Maybe someone knows what script could be used? XML example:

<?xml version="1.0" encoding="utf-8"?>
<ForwardingInstructions xmlns="urn:oasis:names:specification:ubl:schema:xsd:ForwardingInstructions-2" xmlns:dtd="http://dtd.riege.com/scope/shippingorder" xmlns:cbc="urn:oasis:names:specification:ubl:schema:xsd:CommonBasicComponents-2" xmlns:cac="urn:oasis:names:specification:ubl:schema:xsd:CommonAggregateComponents-2">
  <cac:DocumentReference>
    <cbc:ID>523651</cbc:ID>
    <cbc:DocumentType>comment</cbc:DocumentType>
    <cbc:DocumentDescription>Transporter: Agile Logistics AS - BIL - Bil (Vei transport)</cbc:DocumentDescription>
  </cac:DocumentReference>
  <cac:DocumentReference>
    <cbc:ID>126242</cbc:ID>
    <cbc:Status>Docum
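As a starting point, the xmlDOMSearch operation used elsewhere in Arc scripting can walk the DocumentReference nodes. An untested sketch (note the file's default UBL namespace may require adjusting how the xpath is expressed):

<arc:call op="xmlDOMSearch?uri=[FilePath]&xpath=/ForwardingInstructions/DocumentReference">
  <!-- log each embedded document reference; a base64 payload element would be read with xpath() the same way -->
  <arc:set attr="_log.info">Found document reference [xpath('ID')]</arc:set>
</arc:call>

From there, each base64 payload would need to be decoded and written out to a file the Email Send connector can attach; the decoding formatter and attachment mechanics are best confirmed in the formatter and operations references.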
One of the most common causes of AS2 connection errors is not having the correct certificates configured for your trading partner within Arc. In an AS2 connection, your partner's SSL server certificate is validated against your system's security settings to proceed with an SSL handshake, and their S/MIME certificate is used for the validation of signed messages and the encryption of outgoing messages. Ordinarily, these certificates are exchanged as part of the setup of a new AS2 connection, but certificates are commonly replaced, and you may be left with a connection that stops working after an update on your partner's side. The following methods are not foolproof, but they are possible ways of obtaining these certificates without contacting your partner directly.

Obtaining the SSL server certificate
This is straightforward: place your partner's receiving URL in a browser. If the URL resolves, click on the lock icon and locate the server certificate. The certificate viewer will giv
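If you prefer the command line to the browser, the standard openssl client can retrieve the same certificate chain (hostname and port here are placeholders):

openssl s_client -connect partner.example.com:443 -showcerts

Each certificate in the chain is printed in PEM form between BEGIN CERTIFICATE and END CERTIFICATE markers and can be saved to a .pem/.cer file for import into Arc's trusted certificates.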
I started using the new Branch connector with an existing AS2 connector. The Branch connector will be used to save specific messages, based on the filename, to a different folder. I've set up the Branch connector as shown in the image provided in the Help for the Branch connector:

IF Filename matches glob CUS*.*

But instead of the output being sent to the attached File connector to save the message in a different folder, the message falls through to the ELSE clause and is forwarded to that attached connector instead. The log file shows the following information (not really helpful at the moment):

[2023-08-25T15:36:47.403] [Info] CUSDEC325-0000000000062014 is routed to File_SHPP2MEP_Test (which is the ELSE clause).
[2023-08-25T15:36:47.403] [Info] Message CustomerAS2_TEST-20230825-153646575-eCIr is finalized. Name: CUSDEC325-0000000000062014, Type: Output, Status: Success, Message: , Processing Time: 16ms.
[2023-08-25T15:36:47.403] [Info] Arc Version: 23.2.8606.0
[2023-08-25T15:36:47.403] [Info] System Ver
It is often the case in a document mapping that a source file can contain special characters that cause problems later down the line, notably in EDI mappings where non-printable characters cannot be allowed. Arc will treat strings as UTF-8 when reading from files, but sometimes you may want a bit of code to replace the non-printable characters in a string. The following script:

<arc:set attr="file.file" value="[FilePath]" />
<arc:call op="fileRead" in="file" out="fileo">
  <arc:set attr="output.data" value="[fileo.file:data | regexreplace('\[^ -~\]','')]" />
</arc:call>
<arc:set attr="output.filename" value="[filename]" />
<arc:push item="output" />

can be used to replace any characters outside of the printable ASCII range (here, defined as anything not between the whitespace and tilde characters in the ASCII set) with an empty string. This script will replace characters across a whole file, but you can use the regexreplace statement on a single element
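For the single-element case, the same formatter can be applied inline in a mapping expression. For instance (element name hypothetical, and note the square brackets in the regex stay escaped because brackets are ArcScript syntax):

[xpath('ItemDescription') | regexreplace('\[^ -~\]','')]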
Is #EDI really a tough cookie to crack for you? Are you new to EDI and not sure where to start? If yes, then this one is for you. Join our CData Tech Evangelist @Matt S and me for a three-part webinar series that helps you understand #EDI automation from the ground up. Register now! You can find the details of the three-part webinar series here:
Hello all, I have a new requirement. A customer uses an API to send data, and I need to receive the data, convert it to XML (CDM customized data format), and send it to the SWAG system via AS2. In the same way, I will be receiving the XML file from the SWAG system via AS2, and need to convert it to the customer's JSON format and POST the data to the customer using their API. Please advise how I should use the API; I have never used an API before. All I am familiar with are connections like AS2, SFTP, and FTP(S), so I can't get my head around an API as a connection method. Thanks for your time and consideration. Thanks, Pradeep
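For orientation, an API in this context usually just means an HTTP request rather than a file transfer. For example, the outbound POST of the customer's JSON could look like this from the command line (URL, token, and file name are all placeholders):

curl -X POST https://api.customer.example.com/orders \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -d @order.json

Within Arc, the REST connector plays the same role for HTTP APIs that the AS2 and SFTP connectors play for those protocols.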
Hi CData Community, I am trying to add another layer of security for a webhook. I would like to do the following: 1. Pass the @authtoken query parameter for CData API auth. 2. Pass Authorization via headers. Unfortunately, I am getting an error: "Authentication is required for access to this resource." It seems like the webhook first checks the Authorization header for CData API auth before checking the query parameter @authtoken. But I need the Authorization header for my additional security. Is this implementation possible? Thanks!
The standard documentation at https://cdn.cdata.com/help/AZH/mft/op_HTTPReceive.html shows the elements needed to use the httpReceive operation in scripting. Is there documentation, or are there examples available, showing the structure or syntax of how to use this operation?
EDI acknowledgments provide an important aspect of the EDI standard: a non-repudiation guarantee. In other words, since you're alerted when your business partners receive your EDI documents, they can't later claim that they didn't. Arc simplifies the process of generating and processing EDI acknowledgements by handling the details behind the scenes. Here's all you need to understand to ensure that Arc is configured correctly.

EDI Connector pairs
Arc's EDI Connectors come in pairs: one for inbound EDI documents and one for outbound EDI documents. Each connector in this pair is configured for the same trading partner, but some settings are mirrored (e.g. the 'Receiver ID' from one is the 'Sender ID' for the other). Your EDI Connector pairs need to work together to automate EDI acknowledgements. When your inbound EDI connector receives a document, it needs to alert its outbound counterpart to send out an acknowledgement. Similarly, when you send an EDI document to a partner, that transac
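For context, the acknowledgement in X12 is itself an EDI document: the 997 Functional Acknowledgement. Stripped to its core (ISA/GS envelope omitted, control numbers hypothetical), a 997 accepting a functional group of 850s looks like this:

ST*997*0001~   <-- 997 transaction set, control number 0001
AK1*PO*1001~   <-- responding to functional group "PO" (850s), group control number 1001
AK9*A*1*1*1~   <-- A = accepted; 1 transaction set included, received, and accepted
SE*4*0001~     <-- 4 segments from ST through SE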
This script is co-authored in part by Russell with Jersey Post: https://community.cdata.com/members/russell-jerseypost-190 The following code will enumerate the lines of a multiline file, such as a TXT or CSV, and output a new file with only the unique lines returned:

<arc:set attr="file.file" value="[FilePath]" />
<arc:call op="fileReadLine" item="file">
  <!-- enumerate the file and add the rows to a collection -->
  <arc:check attr="newrows.[file.file:data|md5hash(false)]">
    <arc:else>
      <arc:set item="newrows" attr="[file.file:data|md5hash(false)]" value="[file.file:data]" />
      <arc:set attr="tmp.keys#" value="[file.file:data|md5hash(false)]" />
    </arc:else>
  </arc:check>
</arc:call>
<!-- repopulate the row data -->
<arc:set attr="output.data">
  <arc:enum attr="tmp.keys">[newrows.[_value]]\n</arc:enum>
</arc:set>
<arc:push item="output" />
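A quick worked example of what the script does. Given an input file containing:

apple
banana
apple
cherry

the output file should contain each line once, in order of first appearance:

apple
banana
cherry

Each line is keyed by its MD5 hash, so "unique" here means byte-for-byte identical lines.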