Hi, following the instructions in the help I have set the property "SSLServerCert" to "*" to allow all certificates. The documentation states: "If not specified, any certificate trusted by the machine is accepted. Use '*' to signify to accept all certificates. Note that this is not recommended due to security concerns." I do get authorized on the Zoho CRM server; however, on the connector side of the CData component I get the error message shown in the last image below. Error Message
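For context, a connection string using that setting might look like the fragment below. Only SSLServerCert comes from the help text quoted above; the other property names are illustrative assumptions, not the poster's actual configuration:

```
# Illustrative only; SSLServerCert=* disables certificate validation,
# which the documentation itself flags as a security concern.
AuthScheme=OAuth;InitiateOAuth=GETANDREFRESH;SSLServerCert=*;
```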
Using the CData JDBC Driver for Snowflake in AWS Glue, you can easily create ETL jobs for Snowflake data, whether writing the data to an S3 bucket or loading it into any other AWS data store. In this article, we walk through uploading the CData JDBC Driver for Snowflake into an Amazon S3 bucket, setting the necessary IAM role permissions, storing the connection properties in AWS Secrets Manager, creating a custom connector, and creating a connection for that connector.

Step 1: Upload the CData JDBC Driver for Snowflake to an Amazon S3 Bucket

To work with the CData JDBC Driver for Snowflake in AWS Glue, follow these steps to upload it (and any relevant license files) to an Amazon S3 bucket:

- Open the Amazon S3 Console.
- Select an existing bucket (or create a new one).
- Click Upload.
- Select the JAR file (cdata.jdbc.snowflake.jar) found in the lib directory in the installation location for the driver.

Step 2: IAM Role

You should create a new role or update permissions on an existing role.
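The connection properties stored in Secrets Manager are ultimately assembled into a JDBC URL for the driver. The sketch below shows the general shape of that assembly; the specific property names (User, Url, Warehouse, and so on) are assumptions based on typical CData connection strings, not a definitive list:

```python
# Sketch: turning a Secrets Manager-style property dict into the JDBC URL
# that AWS Glue would hand to the cdata.jdbc.snowflake driver.
def build_jdbc_url(props):
    """Join key=value pairs into a cdata Snowflake JDBC URL."""
    pairs = ";".join(f"{k}={v}" for k, v in props.items())
    return f"jdbc:snowflake:{pairs};"

# Values like these would normally be retrieved from AWS Secrets Manager.
secret = {
    "AuthScheme": "Password",
    "User": "glue_user",
    "Password": "example",
    "Url": "https://myaccount.snowflakecomputing.com",
    "Warehouse": "COMPUTE_WH",
}

url = build_jdbc_url(secret)
```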
The API Connector in CData Connect Cloud lets you connect to any cloud-based API that produces XML or JSON data. This means that you can work with data from custom or proprietary APIs from your preferred tools and applications, just like you can with data from any of the 250+ natively supported sources. This article guides you through the process of creating a connection to an API and creating tables for various API endpoints. In this article, we will use the TripPin RW OData v4 reference service, but the principles apply to any API.

Create the connection and configure global settings

Navigate to the Connections page of Connect Cloud and click "+ Add Connection." Search for "API" and choose the API connector. Next, name the connection and configure the Global Settings.

Authentication

By default, an API connector is set to No Auth for its Authentication Type. The sample OData API doesn't require authentication, so we will leave the Type as "No Auth."
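Under the hood, turning a JSON endpoint into a table amounts to projecting each object in the response onto a fixed set of columns. A minimal sketch of that idea, using a payload that imitates the TripPin "People" endpoint (the field names are illustrative assumptions, not the connector's actual schema):

```python
import json

# A fake OData-style response body; real TripPin responses wrap rows
# in a "value" array like this.
sample = json.loads("""
{"value": [
  {"UserName": "russellwhyte", "FirstName": "Russell", "LastName": "Whyte"},
  {"UserName": "scottketchum", "FirstName": "Scott", "LastName": "Ketchum"}
]}
""")

def to_rows(payload, columns):
    """Project each JSON object in 'value' onto the requested columns."""
    return [tuple(item.get(c) for c in columns) for item in payload["value"]]

rows = to_rows(sample, ["UserName", "LastName"])
```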
Hi, I'm using the CData ODBC Driver for NetSuite (version 21.0.7867.0, release) to access a customer's NetSuite data. Within the ODBC driver's connection properties, there are no options available to explicitly specify whether to connect to Netsuite.com or Netsuite2.com. Instead, I can provide the Account Id and Version (default 2021_1) for the connection. Which data source is the CData ODBC Driver for NetSuite using, Netsuite.com or Netsuite2.com, when connecting to the customer's NetSuite? Where can I get this information? Thanks.
🗓 Event Dates: September 22-29, 2023
📍 Location: Right here in this threaded discussion on the CData Community Site
👥 Experts on Deck: Jerod Johnson, Riley James, Jon Tye, and rock stars from the CData Champions Program

👋 Hello CData Community,

The day we've all been waiting for is finally here! Welcome to our exclusive Ask Me Anything (AMA) session focused on Workday and its integrations with CData.

Why You Should Participate:
- Get expert insights from Jerod Johnson, Riley James, Jon Tye, and other rock stars from our CData Champions Program.
- Learn best practices for integrating Workday with CData.
- Discover tips and tricks to optimize your Workday experience.

How to Participate:
- Ask Away: Use this thread to post your questions about Workday and CData integrations.
- Share Your Knowledge: If you have tips or experiences to share, we'd love to hear them!
- Engage: Feel free to comment on other community members' questions and answers.

To get the ball rolling, here are some questions for the experts:
Change Data Capture (CDC) is a crucial technique in modern data management that allows you to track and capture changes made to a database. It provides real-time insight into data modifications, enabling businesses to react swiftly to evolving trends and requirements. In this article, we'll explore what CDC is and how to implement it using CData Sync.

Understanding Change Data Capture (CDC)

CDC revolves around capturing data changes at the source using a log file, eliminating the need for constant querying. Instead, CData Sync reads the log file for events (Insert, Update, or Delete) and extracts these changes in near real time, ensuring accurate replication for future use.

Supported CDC sources:
- MySQL: Utilizes binary logs.
- Oracle: Leverages Oracle Flashback.
- PostgreSQL: Employs logical replication.
- SQL Server: Uses either change tracking or change data capture.

Creating a CDC Job in CData Sync

To initiate a job in CData Sync, ensure you have pre-configured source and destination connections.
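The core idea, reading a change log instead of re-querying the source, can be reduced to a toy sketch. The event shape below is invented for illustration; real CDC logs (binary logs, logical replication streams, and so on) carry richer metadata:

```python
# Sketch: replaying a change log onto a replica keyed by primary key.
# Each event records the operation and the latest row image.
log = [
    {"op": "insert", "id": 1, "row": {"name": "Ada"}},
    {"op": "insert", "id": 2, "row": {"name": "Grace"}},
    {"op": "update", "id": 1, "row": {"name": "Ada L."}},
    {"op": "delete", "id": 2},
]

def apply_events(replica, events):
    """Apply insert/update/delete events to the replica, in log order."""
    for e in events:
        if e["op"] == "delete":
            replica.pop(e["id"], None)
        else:
            # Insert and update both upsert the latest row image.
            replica[e["id"]] = e["row"]
    return replica

replica = apply_events({}, log)
```

Because the log preserves ordering, replaying it yields the same end state as the source table without ever querying that table directly.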
Hello Team, as you are aware, connector authentication in a restricted environment requires URL whitelisting. For Facebook, I have whitelisted "facebook.com", but it still doesn't work. Am I missing anything? Any quick response would be appreciated. BTW, we are working with the Facebook, Instagram, LinkedIn, and YouTube CData connectors. Regards, NS
The SAS Studio environment can use any JDBC data source thanks to its JDBC interface. In previous issues we have observed certain query modifications made by the tool or the SAS/ACCESS interface, resulting in badly formatted queries that throw both server and driver errors. On these occasions, the following code can be used to connect and query in a "passthrough mode" that protects the query from any modification by the SAS platform. Example:

proc sql;
  connect to jdbc as conn
    (classpath="/Public/Drivers/cdata.jdbc.databricks.jar"
     class="cdata.jdbc.databricks.DatabricksDriver"
     schema=<databricks_schema>
     URL="jdbc:Databricks:Server=<your_server>;HTTPPath=<yourhttppath>;AuthScheme=PersonalAccessToken;Token=<your_token>");
  select * from connection to conn
    (Create table test1 (Name Varchar(10)));
  disconnect from conn;
quit;

Additional details can be found in the following documentation: SAS Help Center: SQL Pass-Through Facility Specifics for JDBC
When publishing a workbook on Tableau Server, the need to configure the User and Password properties may arise. When using the cdata.excel.legacy.taco file in the "C:\ProgramData\Tableau\Tableau Server\data\tabsvc\vizqlserver\Connectors" directory, the following window appears in the Tableau Server interface: The User and Password properties need to be configured for successful data retrieval. However, even when these values are set in their respective fields (Username and Password), the driver fails to return the data. To ensure proper functionality, these properties must be set in the Connection String field. This approach introduces a security concern, though, as the Password remains unmasked: any user with access to the published workbook can potentially view the exposed Password. To address this issue, you should take the following steps: Remove the "legacy.taco" file from the directory located at "C:\ProgramData\Tableau\Tableau Server\data\
In order to enable Connect Server to communicate with the Facebook API during the sign-in process, the authorization server (OAuth) will redirect the user to the Connect Server redirect URL, which by default is: https://oauth.cdata.com/oauth/. This URL must correspond to the callback URL specified in the application configuration, as illustrated in the accompanying screenshot. You will have the option to enter this value in the Meta for Developers section of your Facebook login -> Settings page.
I have an Angular project that is built with the command `npm run start`, which runs the script:

"start": "set NODE_OPTIONS=--openssl-legacy-provider && ng serve --proxy-config proxy.conf.json --prod"

When the app is opened at "http://localhost:4200" it seems to load OK, but as soon as I change the user using the command:

_setUser(window._user.sally_superAdmin);

the user info is returned, but I get an error message in the console:

main.js:1 ERROR Malformed UTF-8 data
    rc @ main.js:1
    handleError @ main.js:1
    next @ main.js:1
    __tryOrUnsub @ main.js:1
    next @ main.js:1
    _next @ main.js:1
    next @ main.js:1
    next @ main.js:1
    emit @ main.js:1
main.js:1 ERROR TypeError: Cannot read properties of undefined (reading 'filter')
    at a.project (main.js:1:4756133)
    at a._next (main.js:1:3551188)
    at a.next (main.js:1:3529941)
    at a._next (main.js:1:3551286)
    at a.next (main.js:1:3529941)
    at h._next (main.js:1:3557774)