Hi Folks, please advise: what is the maximum number of files / maximum number of workers a connector will utilize at one time? Many times I see files staged up for a few hours without being processed. Manually resending the files also takes time to process, even though the server CPU utilization is at an optimal level. This leaves the business team unable to receive files on time and causes issues. In the example above from today, 9:27 AM MYT, files were queued up and not processed for nearly 1.5 hours; I am now processing them manually. This is a simple Copy connector.
Is there a way to establish a WebSocket connection and exchange messages in Arc? Maybe something that can be done in ArcScript?
Hello! I know that I can define an out environment variable in a pre-job event and use it in the task replication query (documented here and here, excellently). I'm probably stretching here, but can I define that out variable to read from a file or any other source and then run the sync task for ALL values of that variable? A kind of loop? Thanks in advance for your answers!
Hi Ethem, thanks for your reply. I have updated to the latest drivers on my system, and after that I can see the WQL extension objects. I can see all the objects, but I am now seeing another problem with this query; I am attaching an error image for your reference. One more question: please suggest how to get all the objects listed in Workday. In SQL Server, the syntax SELECT * FROM information_schema.tables is usually used to get all objects. Is a similar kind of syntax available in Workday? Kindly advise.
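If it helps, CData drivers generally expose their own metadata through system tables rather than information_schema. Assuming the Workday driver follows this common CData convention (not verified against this specific driver version), something like the following should list the available objects and their columns:

```sql
-- List all tables/objects the driver exposes (CData system-table convention,
-- assumed to apply to the Workday driver).
SELECT * FROM sys_tables;

-- List the columns of a specific object (table name here is just an example).
SELECT * FROM sys_tablecolumns WHERE TableName = 'creditcardtransactions';
```

These queries are handled by the driver itself, so they avoid WQL syntax entirely.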
Hi Team, while connecting with the Workday CData SSIS connector, the error below occurs. Kindly advise.

"Invalid FROM clause starting with character: 6405. Invalid WQL syntax."

I am using the below query against Workday:

SELECT * FROM creditcardtransactions
WHERE "company_Prompt" = 'LE020 Redwood Software, Inc.'
Learn why a new approach to data virtualization is needed to remove the biggest bottleneck faced by modern data teams.

IT leaders: Watch to learn how to think about this new approach to data virtualization within your data management stack.
Data analysts: Hear from fellow end users on the benefits of accessing live data directly from a variety of source systems.
CIOs & CDOs: See how self-service data initiatives can be accelerated with a governed approach to live data access across teams.

Data Virtualization, Reimagined - A Look Back and a Look Ahead with Amit Sharma: CData's co-founder and CEO shares his perspective on the data virtualization market and what led the company to introduce a new approach specifically designed for modern data stacks and teams.

Data Virtualization, Reimagined - A Customer's Perspective with Nathan Thompson (Scorpion): Learn how Nathan Thompson's financial planning and analysis team at Scorpion dramatically accelerated their reporting process in enabling live
I am reaching out to propose an enhancement to our data sync product, specifically the introduction of validation and reconciliation functionality. As our product caters to various data synchronization needs, it is imperative to offer users the tools necessary for ensuring data integrity and peace of mind during synchronization.

Run Validation: Introduce a validation feature that allows users to compare the data between the source and destination systems. Provide a simple comparison of the number of records in both the source and destination, enabling users to quickly identify any discrepancies. This basic validation serves as a valuable troubleshooting tool and gives users immediate insight into the synchronization status.

Advanced Validation Mode: Offer an advanced validation mode that provides detailed insights into differences between the source and destination data. Highlight discrepancies in terms of Inserts, Deletes,
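In the meantime, the basic record-count validation proposed above can be approximated with a small external script. This is only a sketch of the idea, using stdlib sqlite3 to stand in for the source and destination connections; the table name and connections are placeholders, not anything from the product:

```python
import sqlite3


def count_rows(conn, table):
    """Return the row count of a table (table name assumed trusted/known)."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]


def validate_counts(source_conn, dest_conn, table):
    """Basic validation: compare record counts between source and destination."""
    src = count_rows(source_conn, table)
    dst = count_rows(dest_conn, table)
    return {"table": table, "source": src, "destination": dst, "match": src == dst}


if __name__ == "__main__":
    # Demo with two in-memory databases standing in for source and destination.
    source = sqlite3.connect(":memory:")
    dest = sqlite3.connect(":memory:")
    for conn, n in ((source, 3), (dest, 2)):
        conn.execute("CREATE TABLE orders (id INTEGER)")
        conn.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(n)])
    print(validate_counts(source, dest, "orders"))
```

A mismatch in the returned `match` flag is exactly the quick discrepancy signal the proposed Run Validation feature would surface in the UI.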
Exception happened:

Failed to start database 'C:\ProgramData\CData\sync\db\cdata_sync' with class loader WebAppClassLoader{CData Sync}@20ce78ec, see the next exception for details.

ServletPath: /login.rst ContextPath:
Hi everyone! I would like to know if it is possible to retrieve data from the SAP ECC 6.0 function READ_TEXT using the CData SSIS Components for SAP ERP 2023. Thanks.
I am having an issue with my Sync installation. I am able to successfully start LogMiner on the Oracle instance and query the results of that LogMiner session. However, when I run a CDC job from Sync, it completes the initial data load successfully but then fails on the incremental loads with the error below:

logminer failed to start

If I have it delete the table or truncate the data, then it reloads just fine. It just won't start LogMiner to run the CDC loads. Has anyone run into this issue or found a way to resolve it, so I can get CDC loads from Oracle to SQL Server working correctly?
Hi, is support for the new objects in Salesforce's Nonprofit Cloud (NPC) built into the latest version of DBAmp? This would be for objects like GiftTransaction, etc. If not, is there a timeline to provide support? Many thanks.
I’m evaluating the QuickBooks Destination SSIS connector and have a specific use case of inserting checks into QuickBooks from an OLEDB source query. When mapping the fields from the query, I noticed that the QuickBooks Destination depicts the Check Amount field as read only. Is there a workaround that would allow me to insert new check information into QuickBooks Desktop Premier 2020 to include the check amount?
Hey! I have written a query in CData Sync to load rows from TransactionAccountingLine only for a date range, Feb 2024:

REPLICATE [TransactionAccountingLine]
SELECT tranacc.*
FROM [TransactionAccountingLine] tranacc
LEFT JOIN [transaction] tran ON tranacc.transaction = tran.id
WHERE tran.trandate BETWEEN '2024-02-01' AND '2024-02-29'

Upon running this query, I am getting the error: [0] Column 'lastmodifieddate' is ambiguous. But I should not be getting this error, since I am selecting all columns only from the TransactionAccountingLine table, where there is only one lastmodifieddate. Please help me with this issue.
Hello everyone, I am trying to retrieve a Bearer token with a GET Bearer Token script on the ArcScript connector. Below is my script:

<arc:set attr="http.URL" value="https://prodautok.authentication.eu10.hana.ondemand.com/oauth/token?grant_type=client_credentials&token_format=jwt" />
<arc:set attr="http.postdata" value='{"password":"XXXXXXX=","username":"sb-21f210cb-9a40-44f8-a78f-c23d13ceec27!b185659|it-rt-prodautok!b117912"}' />
<arc:call op="httpPost" in="http" out="response">
  <!-- Pass in the JSON response that contains the token to jsonOpen. -->
  <arc:set attr="token.text" value="[http:content]" />
  <arc:call op="jsonOpen" in="token" out="handle">
    <!-- Call jsonDOMGet with the handle that was obtained from jsonOpen. -->
    <arc:set attr="json.handle" value="[handle.handle]" />
    <arc:set attr="json.map:token" value="/json/access_token" />
    <arc:call op="jsonDOMGet" in="json" out="result">
      <!-- Se
Hi, is there a direct path available to migrate data from Cassandra to MySQL RDS on AWS?