On the Server side, we’ve added a new REGCLASS function and integrated two new CData JDBC drivers: SharePoint and Email. We’ve also updated the CData JDBC Driver for Jira to v25.0.9389.0 and the Neo4J JDBC driver to version 6.6.1.
The Server now also has a system option for user session timeout, as well as email sending retries and a response timeout. Two other improvements extend model properties: importer.schemaPattern now supports fully-qualified schema names, and importer.tableNamePattern, fully-qualified table names.
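To illustrate where these model properties are typically supplied, here is a minimal sketch of creating a data source with fully-qualified patterns. The data source name, translator, and pattern values are invented for the example, and the exact parameters of SYSADMIN.createDatasource may differ in your installation:

    -- Illustrative only: "sales_db" stands for an already-created connection,
    -- and the fully-qualified patterns below are fictional.
    CALL SYSADMIN.createDatasource(
        name => 'sales_db',
        translator => 'postgresql',
        modelProperties => 'importer.schemaPattern=salescatalog.reporting,importer.tableNamePattern=salescatalog.reporting.orders',
        translatorProperties => ''
    );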
For Git Integration, we’ve added HTTPS support for remote operations and fixed a bug preventing user limits from being tracked in Git.
We’ve also worked on our clustering implementation: clustering is now supported on Azure, including deployment on Azure Kubernetes Service, and we’ve fixed three bugs: one causing new nodes to fail to start and join the cluster due to serialization issues with jobs in state RUNNING, another where virtual view and procedure definitions were not distributed on ALTER and REPLACE, and yet another where Google BigQuery data source creation was not distributed properly due to missing key file on other nodes.
For Snowflake, we’ve fixed four bugs: one causing multi-catalog data source creation to fail if importer.loadMetadataWithJdbc=false and DB was omitted in the connection settings, another with missing column size for binary types and missing fractional seconds scale for time and timestamp columns, a third one resulting in the inability to recreate a data source with Microsoft Entra ID authentication, and a fourth one causing the TRIM function to produce incorrect results.
We’ve also resolved a bug where failures in sending email notifications for a single job could block the execution of other jobs, a bug where a stopped refresh query operation (refreshTables, refreshDataSource, copyOver with refreshTarget=true) blocked subsequent refreshes of the affected data source, and a bug where a UNION of a query with a GROUP BY clause and a query with an ORDER BY clause resulted in an IndexOutOfBoundsException error.
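For context, the UNION fix concerns statements shaped roughly like the one below; the view and column names are invented for illustration. Queries of this shape previously could fail with an IndexOutOfBoundsException and now execute as expected:

    -- Illustrative shape only: one branch aggregates with GROUP BY,
    -- and the statement ends with an ORDER BY (fictional views/columns).
    SELECT region, COUNT(*) AS order_count
    FROM views.orders
    GROUP BY region
    UNION
    SELECT region, 0 AS order_count
    FROM views.returns
    ORDER BY region;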
As for the Studio, we’ve enabled the "Test connection" button and test connection step in the data source and Analytical Storage wizards for Oracle ADWC, updated the user key file path field in the data source and Analytical Storage creation and editing wizards to support Base64 encoding of files, and updated the Snowflake data source wizards to support key-pair authentication. Also, we’ve implemented a solution to handle an HAProxy idle connection issue and a related maxIdleTime configuration problem that caused performance degradation on the server.
For the Exporter, we’ve created an interface to return server export to the caller as a script.
As for the Connectors, we’ve worked extensively on our Walmart connector. We’ve added functionality to get WFS inventory, added extra fields to the Report_Recon_JSON procedure, resolved a bug causing this procedure to crash with HTTP Error 520, and added columns for three reports: orders, inboundshipment, and inboundshipmentitems.
For orders, the following changes have been introduced:
- field shippingInfo_carrierMethodName was added to the Orders procedure;
- fields originalCarrierMethod and item_condition were added to the additional table _Lines;
- field subSellerId was added to the additional table _Lines_status.
For inboundshipment and inboundshipmentitems, the changes are as follows:
- fields shipmentStatus, shipmentType, itemsSubmitted, receivedUnitsAtFC, poType, shipmentCarrierType, isExceptionOccurred, isPOBoxEnabled, carrierName, receiptStartDate, and receiptEndDate were added to the Inboundshipment procedure;
- fields fillRateAtFc, chargeDetails_chargeType, chargeDetails_netChargeAmount, receivedUnitsAtFc, and damagedUnitsAtFc were added to the InboundshipmentItems procedure.
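To show where these connector procedure fields surface in SQL, here is a rough query sketch. The data source alias "walmart" is an assumption, and any required procedure parameters (for example, a date range) are omitted; check the connector documentation for the actual signature:

    -- Illustrative only: read some of the newly added fields from the
    -- Inboundshipment procedure of an assumed "walmart" data source.
    SELECT shipmentStatus, shipmentType, carrierName,
           receiptStartDate, receiptEndDate
    FROM (CALL "walmart.Inboundshipment"()) AS s;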
We've also fixed two bugs affecting the Walmart connector: one causing the InboundShipmentItems procedure to fail with the error "arraycopy: length -1 is negative" and another causing the InboundShipments procedure to fail due to an incorrect date format in the request.
For Facebook, we’ve updated the connector to v23.
For Amazon Ads, we’ve made the Amazon Brand Stores data retrievable through the Amazon Ads connector, and for Amazon Selling Partner, we’ve updated report_FBALongTermStorageFeeCharges to reflect the latest changes, which involved removing some columns, adding one new column, and changing the data type of some other columns. Here’s the list of the changes:
The following columns were removed:
- long_time_range_long_term_storage_fee
- qty_charged_long_time_range_long_term_storage_fee
- qty_charged_short_time_range_long_term_storage_fee
- short_time_range_long_term_storage_fee
One new column was added:
- qty_charged of type decimal
The following columns changed data type:
- asin - from string to string(10)
- sku - from string to string(40)
- condition - from string to string(100)
- country - from string to string(2)
- currency - from string to string(3)
- fnsku - from string to string(40)
- surcharge_age_tier - from string to string(50)
Last but not least, for Google Analytics Data, we’ve resolved the issue with incorrect parsing for custom events. Now all is well.
Here are all issues in this release:
Server
- DVCORE-9013 (New Feature): Add REGCLASS function
- DVCORE-9098 (Improvement): Update CData JDBC Driver for Jira to v25.0.9389.0
- DVCORE-9089 (Improvement): Integrate CData SharePoint JDBC driver
- DVCORE-9070 (Improvement): Clustering: add support for setting up a cluster on Azure
- DVCORE-9061 (Improvement): Integrate CData Email JDBC driver
- DVCORE-9015 (Improvement): Add HTTPS support for Git Integration remote operations
- DVCORE-9011 (Improvement): Neo4J: update JDBC driver to version 6.6.1
- DVCORE-8991 (Improvement): Add system option for user session timeout
- DVCORE-8984 (Improvement): Introduce email sending retries and response timeout
- DVCORE-8916 (Improvement): Extend "importer.schemaPattern" model property to support fully-qualified schema names
- DVCORE-8915 (Improvement): Extend "importer.tableNamePattern" model property to support fully-qualified table names
- DVCORE-9108 (Bug Fix): Clustering: new nodes fail to start and join the cluster due to serialization issues with jobs in state RUNNING
- DVCORE-9101 (Bug Fix): Clustering: virtual view and procedure definitions are not distributed on ALTER and REPLACE
- DVCORE-8835 (Bug Fix): Clustering: Google BigQuery data source creation is not distributed properly due to missing key file on other nodes
- DVCORE-9058 (Bug Fix): Failures in sending email notifications for a single job can block the execution of other jobs
- DVCORE-9051 (Bug Fix): Snowflake: multi-catalog data source creation fails if importer.loadMetadataWithJdbc=false and DB is omitted in the connection settings
- DVCORE-9048 (Bug Fix): Snowflake: missing column size for binary types and missing fractional seconds scale for time and timestamp columns
- DVCORE-9025 (Bug Fix): Snowflake: unable to recreate data source with Microsoft Entra ID authentication
- DVCORE-8799 (Bug Fix): Snowflake: TRIM function produces incorrect results
- DVCORE-9045 (Bug Fix): Web Business Data Shop: items published in MAINTENANCE mode get incorrect names and states in "SYSADMIN.WebBusinessDataShopPublished" table after disabling it
- DVCORE-9044 (Bug Fix): Web Business Data Shop: it is possible to publish the same item a second time after recreating it with the same name, but different casing
- DVCORE-9021 (Bug Fix): Checking role permissions on the configuration database causes a delay on procedure execution
- DVCORE-8912 (Bug Fix): Stopped refresh query blocks subsequent refreshes of affected data source
- DVCORE-8779 (Bug Fix): Problems connecting to the configuration database cause jobs to hang
- DVCORE-8665 (Bug Fix): LDAP Authentication: gathering permissions from the configuration database causes a delay at server startup
- DVCORE-6434 (Bug Fix): Some numeric data types are mapped incorrectly in CData Virtuality connector
- DVCORE-8139 (Bug Fix): ONCE schedules use the server restart time instead of the schedule creation time to calculate the delay after the server restart
- DVCORE-8113 (Bug Fix): Oracle ADWC: "SYSADMIN.testConnection" procedure fails for Oracle ADWC connections
- DVCORE-8559 (Bug Fix): Git Integration: user limits are not tracked in Git
- DVCORE-8251 (Bug Fix): ROW DELIMITER is escaped incorrectly in TEXTTABLE function
- DVCORE-8093 (Bug Fix): UNION of a query with a GROUP BY and a query with an ORDER BY parts results in IndexOutOfBoundsException error
- DVCORE-8532 (Bug Fix): History update job requires an Analytical Storage even if the target is a regular data source
Studio
- DVCORE-9072 (Improvement): Enable the "Test connection" button and test connection step in the data source and Analytical Storage wizards for Oracle ADWC
- DVCORE-9064 (Improvement): Update the user key file path field in data source and Analytical Storage creation and editing wizards to support Base64 encoding of files
- DVCORE-9023 (Improvement): Update Snowflake data source wizards to support key-pair authentication
- DVCORE-9093 (Bug Fix): Handle HAProxy idle connection issue without introducing maxIdleTime re-login problem
- DVCORE-9092 (Bug Fix): maxIdleTime configuration causes performance issues on the server
Exporter
- DVCORE-8897 (Improvement): Create an interface to return server export to the caller
Connectors
- SQL-1110 (Improvement): Walmart: add columns for orders report
- SQL-1107 (Improvement): Walmart: Report_Recon_JSON procedure crashes with HTTP Error 520
- SQL-1104 (Improvement): Walmart: add extra fields to the Report_Recon_JSON procedure
- SQL-1101 (Improvement): Walmart: add missing columns for inboundshipment and inboundshipmentitems
- SQL-1084 (Improvement): Walmart: add functionality to get WFS inventory (new)
- SQL-1087 (Improvement): Facebook: update connector to v23
- SQL-1083 (Improvement): Amazon Ads: make the Amazon Brand Stores data retrievable through the Amazon Ads connector
- SQL-1078 (Improvement): Amazon Selling Partner: update the Report_FBALongTermStorageFeeCharges to the latest changes
- SQL-1057 (Improvement): Braze connector: add POST functionality to update customer attributes
- SQL-1105 (Bug Fix): Walmart: InboundShipmentItems proc fails with error "arraycopy: length -1 is negative"
- SQL-1099 (Bug Fix): Walmart: InboundShipments procedure fails due to incorrect date format in request
- SQL-1085 (Bug Fix): Google Analytics Data: incorrect parsing for custom events
- SQL-1080 (Bug Fix): Awin: trackedCurrencyAmount variable in the Awin internal_Transactions.sql is type decimal but returns a JSON
- SQL-982 (Bug Fix): Amazon SP: wrong job status when report_fbamanageinventory is FATAL

