
TimeXtender Data Enrichment 25.2

Today, we've started the rollout of the next release of TimeXtender Data Enrichment, v. 25.2, which contains the changes listed below.

New

- TimeXtender Master Data Management has been renamed to TimeXtender Data Enrichment.
- Web URLs have been updated to use "timextender" instead of "exmon". To access the web application, users will now go to {customer}.de.timextender.com instead of {customer}.exmon.com. Our login page login.exmon.com has been replaced with login.timextender.com, and even though login.exmon.com will continue to work, we encourage you to use login.timextender.com from now on.

Improved

- The overview in User Configuration now shows the users' email addresses.
- Better error message when the app registration for SSO lacks the 'Group.Read.All' permission and a user tries to search for Entra Groups in User Configuration.
- Added support for more data types when importing a table from a database.
- When clicking save after opening Database User Configuration, all users get Select permission on the 'exmondm' schema and Execute permission on the 'exmondm.GetHierarchyValue' function (a minimal T-SQL sketch of equivalent grants follows these lists).
- Emails use 'TimeXtender Data Enrichment' instead of 'Master Data Management'.
- The install file has been renamed to 'TimeXtenderDESetup'.
- A const string variable is now used for "[None]" and "[Fixed Value]" instead of manually written strings around the codebase.
- New icons and graphics.

Fixed

- Fixed a bug where the Portal skipped the "Is Required" check when doing validation.
- Tables with Lookup columns that had a custom setup could not be displayed in the web application.
- Users could not import into a table with a Large Number column in the web application.
- Fixed a bug where importing data into a table with a "Large Number" data type did not work.
- More support for Availability Group connections.
- Small fixes to the New User UI for on-premise users.
- A Saved View did not work in the web application if it was filtering on a blank item list value.
- During import, Primary Key columns now get the Unique Key action, Read Only columns get the Ignore action, and all other columns get the Update action.
- During import, the Action column is no longer text-editable and works as a drop-down selector.
- While Data Enrichment is open and the network connection is lost (e.g. during Windows suspension), the SQL connection error is now logged with log4net in install-folder\Log4NetDataEnrichment.txt.
- Fixed a few bugs affecting percentage-formatted decimal columns: the Min/Max value for constraints was not always saved properly, importing data into a column did not always work, and comparisons sometimes used the wrong values.
- On table import, the column action selection works the same way in web and desktop, and on web it is no longer possible to edit the combobox value.
- All previously failing unit tests.
- For lookup columns, switching from Popup Window – 'Custom Visible Columns' to 'Use Display Value' re-adds 'Custom Display Value' to the visible columns so that the combobox can display the values (and not the data type).
- Fixed a bug where the Project, Lookup Table and Key Column values were not always saved correctly for Hierarchy Lookup Attributes.
- Fixed a bug where the scrollbar was not visible when giving permissions/privileges to a user.
- When logging in, the Microsoft and Google icons were cropped.
- Users were unable to authorize a Fivetran connection.
- Better error message when there is an error finding Entra Groups.
- Adding a new user/group in User Configuration while a filter is applied no longer causes errors.
- Changing a column name and then changing the column type from normal to item list in the table designer no longer causes issues.
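The Database User Configuration change above grants specific permissions on the 'exmondm' schema and the 'GetHierarchyValue' function. As a minimal T-SQL sketch of equivalent grants (the database user name is hypothetical, and the exact statements issued by Data Enrichment may differ):

```sql
-- Hypothetical database user; the permissions mirror those described above.
GRANT SELECT ON SCHEMA::exmondm TO [DataEnrichmentUser];
GRANT EXECUTE ON OBJECT::exmondm.GetHierarchyValue TO [DataEnrichmentUser];
```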

Related products: TimeXtender Data Enrichment

TimeXtender Data Integration 7142.1

It's almost like you cannot say fall without Fabric. At least this October release of TimeXtender Data Integration (Desktop v. 7142.1) has Microsoft Fabric features as the headliner, along with a new logging system for Ingest and a bunch of big and small fixes and improvements. Dive in below!

New and improved

Fabric Lakehouse on Prepare gets views, shortcut tables and much more

With this release, we're taking a major step towards full support for Microsoft Fabric Lakehouse as Prepare instance storage. Validations, history tables, table inserts, custom data, custom hash fields and junk dimensions are now supported on Fabric. On the programmability side of things, you can now add Notebook views, a new type of custom view that is deployed in a Fabric notebook and can be referenced and used in custom scripts. Regular custom views are also supported through the Fabric storage's SQL endpoint.

Utilizing the OneLake shortcut feature, the Fabric flavor of the enable/disable physical valid table option is now available. Disabling the valid table creates a shortcut table where data is loaded directly from the Ingest instance. This saves execution time and storage space, but on the flip side, the table in Prepare must be 100% identical to the source table in Ingest - no transformations allowed. As a final improvement, Fabric instances now support service principal authentication, so a non-MFA user is no longer required.

Redesigned Ingest logging

We've taken a fresh look at logging in the Ingest instance to make it more useful and more robust. It includes more thorough logs, more options, an improved UI and, last but not least, file-based logging.

Logs are now stored in the local file system and not in the Ingest instance, which is handy when you're investigating, e.g., issues with connecting to the instance in the cloud. It also makes them immediately available if you want to use them for analysis purposes. To aid in that, we're using the open W3C Extended Log File Format for the logs.

When setting up the Ingest service, you can configure the log retention in days, as well as the maximum file size of the logs, to keep the logs from hoarding too much storage space. You can also select the minimum severity level for logging to the log file as well as to the Windows event log.

The old logging system has been deprecated, but is still available for the time being when Show Deprecated Features is enabled in the View menu.

Power BI endpoints now support Direct Lake

The Power BI endpoint in Deliver now supports Direct Lake on SQL and Direct Lake on OneLake, which facilitates much faster data access. This is especially useful when you have really large amounts of data.

Control access to instance blueprints

Based on partner feedback, we've updated the access controls for instance blueprints. Partners can now control which customers are granted access to specific shared blueprints. The initial implementation worked more like a "shared folder" where all blueprints were available to all customers.

Note: As a partner, you must update your existing blueprints to grant access to the customers that should have access, since the new default is no access.

And much more

In addition to the headliners, the release also includes a bunch of smaller improvements:

- Snowflake storage now supports key-pair authentication in preparation for Snowflake's requirement for multi-factor authentication for all users, which kicks in by November.
- In the Ingest instance, we've changed the name of the default transfer task to "Load Data" and added a second default task called "Load Data (Full Load)". Previously, the default task was called "Full Load", but would actually use incremental load if it was available.
- The Ingest Service Configuration can now import the configuration of a previously installed version, making upgrading a bit easier.
- In the Portal, we've made deleting data source connections easier. You'll now see what instances the data source is mapped to, and those mappings will be deleted along with the connection. Previously, you'd have to delete all mappings before you could delete the data source connection.
- We've tuned the data lineage queries to get you better performance when you want to view data lineage for an object.
- REST data sources now make URL encoding of query parameters optional.
- ODATA discovery now includes certificates (PFX/PEM) for authentication.

Fixed

Desktop

- Fixed freezing UI and missing feedback when generating End-to-End Tasks and Packages.
- Fixed an issue in the Deploy and Execute dialog that could cause an out-of-memory exception when it contained deployment and/or execution steps.
- Fixed an issue causing an 'Unauthorized' error.
- Fixed an issue where very large error messages could cause TDI to not communicate correctly with Orchestration.
- Zoom in/out should now behave correctly in the data lineage window.
- Fixed an issue where data lineage would find old relations and potentially make the search very large.
- Changed some misleading labels in the Primary Keys and Synchronize windows.
- Added an option to disable "windowed incremental load" for ADO and OleDB data sources to make incremental load work for specific ODBC sources.
- Fixed an issue in the remapping dialog in the Metadata Manager where sorting the columns would cause an "Index out of range" error and close the dialog.
- Fixed an issue with installing new data source versions in a multi-threaded environment.
- Fixed an issue where the ODBC/ANSI syntax for ADO/OleDB ignored the incremental subtract value.
- Search & replace in REST now saves empty replace values as empty strings instead of null.
- Fixed errors with data type conversions on data area transfers using Fabric Prepare.
- Fixed an issue with custom transformations and transformations on lookup fields returning null on Fabric Prepare.
- Fixed an issue with empty date tables when working with multiple schemas on Prepare instances with Fabric storage.
- Fixed an issue with selection rules not working after renaming a field on Prepare instances with Fabric storage.
- Fixed an issue where the stored procedure used for direct read between data areas could become too long.
- Fixed an issue where adding and deleting a table mapping in a data area could crash the application.
- Fixed an issue where the SQL endpoint in a Prepare instance could cause slow UI updates.
- Fixed an issue where renaming a table with a field used as a lookup field on a conditional lookup field would not mark the table that the conditional lookup field belongs to as modified.
- Fixed an issue where deploying a history table with a dot in the table name would fail.
- Custom selection rules in Deliver instances would lose variables when closing the project.
- Fixed an issue where the order of the columns from a Prepare instance would change.
- Made sure the Add Calculation Group and Edit Column Description dialogs have a minimum height and only vertical scrollbars.
- Fixed an issue where the semantic endpoint would resolve the Endpoint Name parameter incorrectly on execution.
- Fixed an issue where Power BI endpoint deployment with RLS would fail with the error "AD Service Principal Authentication is not supported with this SQL Server version".
- Fixed an issue where deploying a Prepare instance could crash the application.
- Fixed an issue with parsing SqlDecimal data in Parquet files in Ingest.

Portal

- Fixed a duplicate/incorrect log entry on user (contact person) deletion.
- Fixed an issue where activity logs required a refresh to show new entries.
- Fixed an issue where users were added despite invitation failure; failures now return proper error messages.
- Fixed the activity log order for data source creation with simultaneous Ingest mapping.
- Standardized table design across most of the Portal.
- Fixed an issue where Double value types were not correctly interpreted in data source connection forms, leading to a failure to save.
- Fixed an issue where the "Use Microsoft Entra members in Ingest instance security roles" checkbox would automatically reselect itself after being deselected and saved on Ingest instances with Azure SQL storage.

Related products: TimeXtender Data Integration, TimeXtender Data Integration Portal

Data source providers r. 2025-10-06

Today, we've released updated data source providers. See the changes below.

Azure Data Factory - Oracle
Version: 17.5.1.0 (TDI) / 10.4.5.0 (20.10 ODX)
- Switched to the v2 connector in ADF, since the previous one is being deprecated.

Azure Data Factory - PostgreSQL
Version: 17.2.0.0 (TDI) / 10.4.5.0 (20.10 ODX)
- Switched to the v2 connector in ADF, since the previous one is being deprecated.

CSV
Version: 23.15.4.0 (TDI) / 1.7.0 (20.10 BU) / 16.4.16.0 (20.10 ODX)
- Added support for handling multiple metadata URIs.
- Added trimming of long column names.
- Added culture settings when parsing numbers for metadata.
- Added support for FTPS locations.
- Changed pattern matching for files to be case insensitive.
- Fixed an issue with numbers parsed incorrectly with one decimal digit in some cases.
- Fixed an issue with metadata URI handling for SharePoint locations.

Dynamics 365 Business Central - Online
Version: 23.1.0.0 (TDI)
- Fixed an issue where the subtract value was not applied during incremental load.

Exact Online
Version: 11.2.0.0 (TDI)
- Added support for incremental loading (TDI only).
- Added better handling of JSON edge cases.
- Added support for Azure App Registration with certificate.
- Added a fallback strategy to handle invalid characters in XML if it fails to read it.
- Added an option to not URL encode query parameters.
- Added handling for non-standard Authorization header values.
- Fixed the error message for timeouts; it now shows text indicating that the request timed out.
- Fixed an issue when the data type is overridden in TDI.
- Fixed some scaling issues in the REST dialog for BU/ODX.
- Fixed an issue with the table builder when 'Only list flattened tables' is enabled where it would not return the correct schema.

Excel
Version: 23.16.1.0 (TDI) / 1.6.1 (20.10 BU) / 16.4.17.0 (20.10 ODX)
- Added support for handling multiple metadata URIs.
- Added trimming of long column names.
- Added support for FTPS locations.
- Changed pattern matching for files to be case insensitive.
- Improved file aggregation logic.
- Fixed the Excel engine to allow for more files in the metadata URI.
- Fixed an issue with 'Treat Empty as Null' for table definitions.
- Fixed an issue with metadata URI handling for SharePoint locations.
- Removed duplicate file aggregation pattern settings.

Hubspot
Version: 11.2.0.0 (TDI)
- Added support for incremental loading (TDI only).
- Added better handling of JSON edge cases.
- Added support for Azure App Registration with certificate.
- Added a fallback strategy to handle invalid characters in XML if it fails to read it.
- Added an option to not URL encode query parameters.
- Added handling for non-standard Authorization header values.
- Fixed the error message for timeouts; it now shows text indicating that the request timed out.
- Fixed an issue when the data type is overridden in TDI.
- Fixed some scaling issues in the REST dialog for BU/ODX.
- Fixed an issue with the table builder when 'Only list flattened tables' is enabled where it would not return the correct schema.

Infor SunSystems
Version: 23.2.0.0 (TDI)
- Added support for the force Unicode option.

ODATA
Version: 11.2.0.0 (TDI) / 1.3.0 (20.10 BU) / 16.4.7.0 (20.10 ODX)
- Added support for certificates in metadata discovery.
- Added support for incremental loading (TDI only).
- Added better handling of JSON edge cases.
- Added support for Azure App Registration with certificate.
- Added a fallback strategy to handle invalid characters in XML if it fails to read it.
- Added an option to not URL encode query parameters.
- Added handling for non-standard Authorization header values.
- Fixed the error message for timeouts; it now shows text indicating that the request timed out.
- Fixed an issue when the data type is overridden in TDI.
- Fixed some scaling issues in the REST dialog for BU/ODX.
- Fixed an issue with the table builder when 'Only list flattened tables' is enabled where it would not return the correct schema.

OneLake Delta Parquet
Version: 23.4.0.0 (TDI)
- Fixed an error ingesting metadata when the Lakehouse doesn't support schemas.

OneLake Finance & Operations
Version: 23.5.0.0 (TDI)
- Fixed an error ingesting metadata when the Lakehouse doesn't support schemas.

Parquet
Version: 23.13.1.0 (TDI) / 1.5.0 (20.10 BU) / 16.4.12.0 (20.10 ODX)
- Added logging of file names.
- Added support for handling multiple metadata URIs.
- Added trimming of long column names.
- Added support for FTPS locations.
- Changed pattern matching for files to be case insensitive.
- Fixed an issue with metadata URI handling for SharePoint locations.

REST
Version: 11.2.0.0 (TDI) / 1.9.0 (20.10 BU) / 16.4.21.0 (20.10 ODX)
- Added support for incremental loading (TDI only).
- Added better handling of JSON edge cases.
- Added support for Azure App Registration with certificate.
- Added a fallback strategy to handle invalid characters in XML if it fails to read it.
- Added an option to not URL encode query parameters.
- Added handling for non-standard Authorization header values.
- Fixed the error message for timeouts; it now shows text indicating that the request timed out.
- Fixed an issue when the data type is overridden in TDI.
- Fixed some scaling issues in the REST dialog for BU/ODX.
- Fixed an issue with the table builder when 'Only list flattened tables' is enabled where it would not return the correct schema.

SQL Server
Version: 23.1.0.0 (TDI) / 1.1.0 (20.10 BU) / 16.4.2.0 (20.10 ODX)
- Added support for the force Unicode option.

XML/JSON
Version: 23.12.0.0 (TDI) / 1.6.0 (20.10 BU) / 16.4.13.0 (20.10 ODX)
- Added support for FTPS locations.

Related products: Data source providers

TimeXtender Orchestration & Data Quality 25.2

Today, we've published a new release of TimeXtender Orchestration & Data Quality that contains the changes listed below.

New

- Added support for setting the number of times a failed package should retry.
- New schedule type, 'Continuous', that starts a new execution right after the previous execution finishes.

Improved

- Redesigned the start page and fixed a bug where buttons/links did not always work.
- Improved the user experience and execution for Data Transfer merge.
- Added clearer error messages for the execution of Azure package types.
- Improved an error message shown when running an invalid Data Factory package.
- Removed the Subscription field from the Power BI Refresh package type UI, as it is no longer required.
- Web URLs have been updated to use 'timextender' instead of 'exmon' (e.g. https://dev.odq.timextender.dev/cmdservice/ExpectusCommandService.svc).
- In the Gateway desktop client, some paths in the 'ExmonClientConfig' file are changed to 'timextender' (from 'exmon') during service startup.
- Increased the length of the data fields in Azure Data Providers so that Tenant IDs and App IDs will be fully visible.
- The flow for sending emails from ODQ has been simplified.
- The default timeout for the TDI package type is 6 hours and is now properly indicated in the TDI package UI in the Desktop client.

Fixed

- Fixed a bug where compare query previews sometimes showed the wrong number of variance errors.
- Fixed a bug in the Initializing database step in the on-premise install.
- Fixed a bug where upgrading the Command Service failed when upgrading on-premise.
- Fixed an issue where Schedule Groups would show an incorrect icon for the Databricks package type.
- Fixed an issue where Active Directory queries in ORC would not work if they returned 0 rows.
- Fixed an issue where Azure Function packages would not check the validity of credentials before starting execution.
- Fixed an issue with the duplication and renaming of packages.
- Fixed an issue with the duplication of Ingest packages.
- Fixed an issue with VM names disappearing and not being saved in Cloud Optimization packages.
- Fixed an issue where trying to create a Fabric package without any Fabric Data Providers present resulted in an "index out of range" error.
- Fixed an issue with saving Properties changes in packages.
- Improved error messages when Azure packages fail without any further information about the problem being sent from Azure.
- In the ODQ Portal, the ETA in the process popup now handles time zone differences between the local machine and the time zone setting in ODQ Desktop.
- Fixed an issue where images were not displayed correctly in emails.
- Fixed an issue where pressing sync for the TDI Data Provider in Turnkey would time out.
- Better error message when there is an issue finding Entra Groups.
- Fixed an issue where TDI packages would not sync correctly to object groups.
- Fixed an issue with creating Entra groups and new users in ODQ Desktop.
- Fixed an issue where execution of Databricks packages would not work.
- Fixed an issue with duplication of Azure Cloud Optimizer packages where the Capacity option would be disabled in the duplicated package.
- Fixed an issue where the Gateway tried to log control characters (such as ESC) via XML serialization.

Related products: TimeXtender Orchestration & Data Quality

Data source providers r. 2025-06-04

On 4 June, we made a hotfix release with the changes listed below.

CSV
Version: 23.5.3.0 (TDI) / 1.1.5 (20.10 BU) / 16.4.6.0 (20.10 ODX)
- Fixed an issue with SharePoint when reading more than one file ("Cannot access disposed object").

Exact Online
Version: 10.2.0.0 (TDI)
- Fixed an issue with the 'Set empty fields as null' feature where it was applying the null to the wrong dataset.
- Fixed an issue where datetime was parsed into local time format instead of UTC.

Excel
Version: 23.7.0.0 (TDI) / 1.1.5 (20.10 BU) / 16.4.7.0 (20.10 ODX)
- Fixed an issue with SharePoint when reading more than one file ("Cannot access disposed object").
- Fixed an issue where Excel was trying to process unrelated files and failing.

Hubspot
Version: 10.2.0.0 (TDI)
- Fixed an issue with the 'Set empty fields as null' feature where it was applying the null to the wrong dataset.
- Fixed an issue where datetime was parsed into local time format instead of UTC.

ODATA
Version: 10.2.0.0 (TDI)
- Fixed an issue with the 'Set empty fields as null' feature where it was applying the null to the wrong dataset.
- Fixed an issue where datetime was parsed into local time format instead of UTC.

Parquet
Version: 23.6.1.0 (TDI) / 1.0.5 (20.10 BU) / 16.4.5.0 (20.10 ODX)
- Fixed an issue with SharePoint when reading more than one file ("Cannot access disposed object").

REST
Version: 10.2.0.0 (TDI) / 1.2.4 (20.10 BU) / 16.4.8.0 (20.10 ODX)
- Fixed an issue with the 'Set empty fields as null' feature where it was applying the null to the wrong dataset.
- Fixed an issue where datetime was parsed into local time format instead of UTC.

XML/JSON
Version: 23.4.0.0 (TDI) / 1.0.5 (20.10 BU) / 16.4.5.0 (20.10 ODX)
- Fixed an issue with SharePoint when reading more than one file ("Cannot access disposed object").

Related products: Data source providers

TimeXtender Data Integration 7017.1

Spring has turned to summer and we're celebrating with a new release of TimeXtender Data Integration (desktop v. 7017.1). When you open the desktop application, you'll notice a new, refreshed look, but we've also implemented a ton of improvements under the hood. See all the news below.

New

Refreshed desktop UI

We've refreshed the design of the desktop UI with refined theme colors and a new, but less prominent, blue accent color. As another UI improvement, we've streamlined the names and order of the show/hide options - 'show data types', 'highlight descriptions', etc. - in the View menu. We've also saved many users a few regular clicks by enabling them all by default.

Choose where in the world your metadata is stored

You can now choose a metadata storage region for your organization that specifies where in the world your new instances will be created. Current options are West Europe (default), Central US, and South East Asia. Choose the region closest to you for the best TDI experience.

Improved

One-step Fabric execution

Introduced one-step executions for Prepare instances on Microsoft Fabric, where executions are now automatically added to an execution plan and processed in a single step.

Warning: If you're using Microsoft Fabric for Ingest or Prepare storage, make sure the Spark Runtime Version is set to 1.2 in your Fabric Workspace settings. We're working on support for Runtime version 1.3, which is the new default in Fabric.

Performance improvements for Ingest instances

We've improved the indexes on the tables in the Ingest instance repository database for better query performance, and improved and optimized various Ingest repository database queries to reduce data load and increase speed.

Updated data sources

Along with this release of TDI, we've released new, updated versions of our data source providers. Among the changes for the REST-based providers - Exact Online, HubSpot, etc. - are support for certificates as authentication and global table flattening. For the providers for static files - CSV, Excel, etc. - we've fixed a few bugs, including an issue with connecting to Azure Blob Storage. For more information, see the full release notes.

Copy scripts between Tabular and Power BI

You can now copy all custom measure scripts in a Deliver instance from Tabular to PowerBI and vice versa. For context, custom measures have a script for each endpoint type, but PowerBI and Tabular share the same DAX syntax with few exceptions. For that reason, migrating from Tabular to PowerBI endpoints could entail copy-pasting hundreds of scripts. Since that's no fun, we implemented this little shortcut.

Improved Ingest Service Configuration tool

The Ingest Service Configuration tool now automatically imports deprecated 'Managed ADO.NET' data source providers from the default component folder used in previous installations of the Ingest service (known as 'ODX SaaS' before v. 6744.1). This change eliminates manual steps in the upgrade process for users of these data source providers, as they are no longer available for download from our repository.

Fixed

TDI Portal

- In the Portal, the Instances page now loads significantly faster if you have a lot of instances.
- 'Send sign-in invitation' would sometimes fail due to password restrictions.
- Adding a Microsoft or Google account as a login option would fail.
- Fixed a bug on the Deliver Qlik endpoint causing the wrong settings to be shown in the Authentication section.

TDI Desktop

- When an execution task in an Ingest instance completes with no tables included, it will now set the state to 'Complete with Warnings'.
- Fixed an issue where executing an execution task in an Ingest instance while using a case-sensitive SQL Storage database would fail when listing existing SQL objects.
- Fixed an issue where executing an execution task in an Ingest instance while using a case-insensitive SQL Storage database would fail for a schema where an unused version of the schema with the same name in different casing exists.
- Fixed a memory leak in the Metadata Manager.
- Improved the UI performance of the Metadata Manager.
- Fixed an issue in REST data sources where a renamed table would not get mapped to the old name properly.
- Fixed an issue with synchronizing a Prepare instance with an Ingest instance where the loading animation would disappear before the synchronization logic had applied all the changes, causing the UI to freeze.
- Fixed various issues with Prepare instances on Fabric storage, including errors when the capacity is turned off, problems with conditional lookups, issues when transferring tables from a TimeXtender F&O data source, and notebook syntax errors with some aggregate functions.
- Fixed the Data Cleansing procedure's NULL checks on the underlying fields of a supernatural key, including no NULL check on Custom Hash fields.
- In some cases, the repository would block execution of an execution package because of a deadlock issue. This could happen when multiple execution packages were scheduled to run at the same time.
- Fixed an issue with the Integrate Existing Objects feature where the simple mode option on newly created data areas would have invalid settings.
- Fixed a misaligned information icon in the Table Settings window.
- Fixed a bug that prevented column values exceeding 43,679 characters from being displayed in the table preview, and that also caused the query tool to throw an exception when result values exceeded 32,000 characters.
- Fixed an issue where Synchronization with Remapping on Deliver instances would show an error.
- Resolved an issue with Qlik endpoints using certificate authentication, where specifying a non-existent certificate would result in a null reference exception.
- Upgraded the Qlik SDK to version 16.9 to ensure compatibility with the latest Qlik Sense release.
- Fixed an issue where adding fields to a table in the Deliver instance by dragging them from the Data Movement pane would position them below the Relations node.
- Fixed an issue where Generate End-to-End Tasks and Packages would fail if the flow included an Ingest instance and the Ingest instance was not open in TDI.
- Fixed an issue that would cause the error "Execution package [name] is already running" when executing a migrated package in multiple environments at the same time.

Related products: TimeXtender Data Integration, TimeXtender Data Integration Portal

Data source providers r. 2025-06-03

Today, we've released updated data source providers. See the changes below.

CSV
Version: 23.4.3.0 (TDI) / 1.1.4 (20.10 BU) / 16.4.5.0 (20.10 ODX)
- Fixed a bug where connecting to Azure Blob Storage did not work.
- Fixed a bug where Skip Top would not apply to all aggregated files.

Exact Online
Version: 10.0.0.0 + 9.5.0.0 (TDI)
- Added support for certificates.
- Added support for setting a culture when interpreting data types.
- Added support for global table flattening.
- Changed the override headers behavior: it will no longer remove all headers, but will instead replace the headers that are defined in the list. To remove a header, add it with an empty value.
- Fixed a bug where running in parallel could produce duplicate headers for authentication.

Excel
Version: 23.5.0.0 (TDI) / 1.1.3 (20.10 BU) / 16.4.5.0 (20.10 ODX)
- Improved logging when reading files, making it easier to track down problematic files.
- Fixed a bug where connecting to Azure Blob Storage did not work.
- Fixed a bug where having a '.' in a folder name would cause the provider to try to read the folder as a file.

Hubspot
Version: 10.0.0.0 + 9.5.0.0 (TDI)
- Added support for certificates.
- Added support for setting a culture when interpreting data types.
- Added support for global table flattening.
- Changed the override headers behavior: it will no longer remove all headers, but will instead replace the headers that are defined in the list. To remove a header, add it with an empty value.
- Fixed a bug where running in parallel could produce duplicate headers for authentication.

ODATA
Version: 10.0.0.0 + 9.5.0.0 (TDI)
- Added support for certificates.
- Added support for setting a culture when interpreting data types.
- Added support for global table flattening.
- Changed the override headers behavior: it will no longer remove all headers, but will instead replace the headers that are defined in the list. To remove a header, add it with an empty value.
- Fixed a bug where running in parallel could produce duplicate headers for authentication.

Parquet
Version: 23.5.1.0 (TDI) / 1.0.4 (20.10 BU) / 16.4.4.0 (20.10 ODX)
- Fixed a bug where connecting to Azure Blob Storage did not work.
- Fixed a bug where loading data from a file with multiple row groups would not work.

REST
Version: 10.0.0.0 + 9.5.0.0 (TDI) / 1.2.2.0 (20.10 BU) / 16.4.6.0 (20.10 ODX)
- Added support for certificates.
- Added support for setting a culture when interpreting data types.
- Added support for global table flattening.
- Changed the override headers behavior: it will no longer remove all headers, but will instead replace the headers that are defined in the list. To remove a header, add it with an empty value.
- Fixed a bug where running in parallel could produce duplicate headers for authentication.
- Fixed a bug with the preview table not working in Business Unit.

XML/JSON
Version: 23.3.0.0 (TDI) / 1.0.4 (20.10 BU) / 16.4.4.0 (20.10 ODX)
- Fixed a bug where connecting to Azure Blob Storage did not work.

Related products: Data source providers

Data source providers r. 2025-05-15

Today, we've released updated data source providers. See the changes below.

CSV
Version: 23.3.30 (TDI) / 1.1.3 (20.10 BU) / 16.4.4.0 (20.10 ODX)
- Added default file types.
- Fixed a bug where parallel execution could lock files.
- Fixed missing root path handling for test connection when using a SharePoint connection.
- Fixed a bug where the ordinals of the columns were not preserved when there are no headers.

Exact Online
Version: 9.2.0.0 (TDI)
- Fixed a bug where an empty header name could cause an issue.

Excel
Version: 23.3.0.0 (TDI) / 1.1.1 (20.10 BU) / 16.4.3.0 (20.10 ODX)
- Added default file types.
- Fixed a bug where parallel execution could lock files.
- Fixed missing root path handling for test connection when using a SharePoint connection.
- Fixed a bug where all columns had to be selected in order to load data into Azure Data Lake.

Hubspot
Version: 9.2.0.0 (TDI)
- Fixed a bug where an empty header name could cause an issue.

OData
Version: 9.2.0.0 (TDI)
- Fixed a bug where an empty header name could cause an issue.

Oracle
Version: 23.1.4.0 (TDI) / 17.1.0.0 (TDI ADF) / 1.0.1 (20.10 BU) / 16.4.1.0 (20.10 ODX) / 10.4.1.0 (20.10 ODX ADF)
- Added support for the 'RAW' and 'LONG RAW' data types. 'RAW' will be translated to 'varbinary(2000)', and 'LONG RAW' will be translated to 'varbinary(max)' (see the sketch after this list).
- Fixed an issue where data types not recognized by the Oracle data source would throw an exception instead of being marked as 'Unknown'.

Parquet
Version: 23.4.1.0 (TDI) / 1.0.3 (20.10 BU) / 16.4.3.0 (20.10 ODX)
- Added default file types.
- Fixed a bug where parallel execution could lock files.
- Fixed missing root path handling for test connection when using a SharePoint connection.
- Updated the Parquet library to support the latest Parquet metadata standards.

REST
Version: 9.2.0.0 (TDI) / 1.0.2 (20.10 BU) / 16.4.2.0 (20.10 ODX)
- Fixed a bug where an empty header name could cause an issue.

XML/JSON
Version: 23.2.0.0 (TDI) / 1.0.3 (20.10 BU) / 16.4.3.0 (20.10 ODX)
- Added default file types.
- Fixed a bug where parallel execution could lock files.
- Fixed missing root path handling for test connection when using a SharePoint connection.
- Fixed an issue where table flattening could not execute in the BU and ODX versions.
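For reference, here is a minimal sketch of what the Oracle type translation above amounts to on the SQL Server side. The table and column names are hypothetical and are not something the provider generates:

```sql
-- Hypothetical landing table illustrating the translation described above.
CREATE TABLE dbo.OracleBinarySample (
    RawColumn     varbinary(2000), -- from Oracle RAW
    LongRawColumn varbinary(max)   -- from Oracle LONG RAW
);
```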

Related products: Data source providers

TimeXtender Data Integration 6963.1

Today, we've published a minor release of TimeXtender Data Integration (v. 6963.1) that contains the changes listed below.

Improved

- Improved the Ingest logic that manages data source providers to no longer try to download deprecated providers, which caused confusing error messages.
- Renamed display names to no longer include "Semantic".
- Removed the limitation of reserved words for custom field validations and custom conditions.

Fixed

- Fixed a wrong icon for supernatural key fields in data lineage and Prepare table selection.
- Fixed various typos in the TDI application.
- Fixed an issue where the Metadata Manager in Ingest would produce a change notification for columns without the 'original data type' metadata.
- Fixed an issue with deploying Hierarchy tables with 'Null check approach' set to 'Record Based'.
- Fixed an issue in Ingest on Fabric Lakehouse storage where tables were not transferred correctly if they contained ancient dates.
- Fixed an issue where the 'StepODXDataFactoryExecute' step was not cleaned up when the last table was removed from an execution package that allowed that step to exist.
- Fixed an issue where Convert to Mapping Set was incorrect for fields with custom transformations.
- Fixed an issue where Integrate Existing Objects was affecting existing views.
- Fixed an issue (23785) where the primary key check was skipped when a table had no mappings. It affected tables with no mappings but with custom data, table inserts, and/or related records.

Related products: TimeXtender Data Integration

TimeXtender Orchestration & Data Quality and TimeXtender Data Enrichment 25.1

It's our pleasure to announce release 25.1.0 of TimeXtender Data Enrichment and TimeXtender Orchestration and Data Quality, featuring exciting new updates and enhancements.

Summary

This update introduces Azure Databricks integration, enabling job execution via orchestration packages. The Data Transfer package now supports SQL MERGE for updating destination tables, with new UI options for primary keys, custom values, and column collation. Data Enrichment Web enforces the desktop column restrictions with clear error indicators. Bug fixes enhance hierarchy limits, lookup columns, scheduling, data imports, and dataset publishing.

Our releases follow a phased rollout to ensure stability and performance. We begin by upgrading a select set of services for initial testing. After that, we gradually roll out the update to all customers, starting with low-risk environments and expanding systematically. Customers who prefer an earlier upgrade can request one at any time by sending a request to our support, and we will schedule their update on an agreed date.

General

Access to Previous Versions

Users can now easily download executable versions of O&DQ and Data Enrichment via the provided links to TimeXtender's SharePoint. These versions don't require installation, allowing for seamless switching between different versions as needed. No special permissions are needed to access the links, and further details are available here.

TimeXtender Orchestration & Data Quality

Azure Databricks Package

The option to connect to and run Azure Databricks jobs from TimeXtender Data Orchestration has been added. To use this option, the user must first create an Azure Connection Data Provider with the authentication information for the Databricks job to be executed by the package. Then, an Orchestration package can be created to connect to and run the Databricks job. Read more about Azure Databricks packages here.

Data Transfer Package Update

This release includes an update to the existing Data Transfer package. Enabling the 'merging' option will prompt the Data Transfer package to create a staging table, which is used as the source for an SQL MERGE query, allowing updates to existing entries in the destination table. The O&DQ UI also offers options to select the Primary Key, use a Custom Value to replace values in a destination column, and define collation at the column level when selecting data from the source. Read more about this new feature here. (A minimal sketch of the staging-table-plus-MERGE pattern follows the lists below.)

TimeXtender Data Enrichment

Required columns in web application

Data Enrichment Web now enforces the column restrictions set up in Data Enrichment Desktop. This version ensures that all column restrictions are applied in the web interface. The Web will prevent you from saving changes until all restriction violations are resolved. A small red 'X' indicates which cell requires attention, and hovering over it explains the rule being violated - just like in Desktop.

Bug fixes and smaller improvements

TimeXtender Data Enrichment

- The maximum height in Hierarchies has been increased to 10.
- Disabling users prompted them to update their subscription.
- Hierarchy attributes used as lookup columns did not work.
- Import failed for decimal columns.
- Saving an existing "Import from Database" action caused the action to break.
- Pre/post execution did not work when importing from the database.
- Importing from the database into a lookup column did not work.
- The schema drop-down was not ordered alphabetically.
- If the Embedded view was in "Recently Used", the start page showed an error.
- Users could not open tables with a lookup column if the lookup table had been deleted.
- Added support for availability group connection strings.

TimeXtender Orchestration and Data Quality (O&DQ)

- Compare query column names were updated when the column type was changed.
- New schedule groups were automatically assigned holidays.
- Running a schedule manually did not work.
- The SharePoint Data Provider did not work.
- The Data Transfer package could not create a table with two or more unique fields.
- The Help page was unusable.
- The Data Transfer package was not using the configured timeout.
- The schedule overview was shown as empty for TX-only customers.
- No holidays were displayed for TX-only customers.
- The sync process in the process map did not work for packages.
- The next run for a schedule did not update when changes were made to the schedule and saved.
- Creating a cloud optimizer package did not work.
- Active Directory queries did not work when they returned zero rows.
- The schedule for a package was not displayed in the package properties.
- When a package's Windows process died, the package could not be executed again.
- Users stopped being able to execute tasks after having the Desktop application open for a while.

Turnkey

- Not all drop-downs ordered their values alphabetically.
- Columns added to a data source were defaulted to hidden in datasets.
- Preview did not work when a dataset column had a comma in the name.
- The filter in a rule automatically changed from "is not blank" to "does not equal null".
- An existing rule stopped working if the dataset name was updated.
- Links in the Exception action did not work properly.
- A published rule did not run after being edited unless it was published again.
- Publishing a dataset was not possible from the column settings tab.
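To make the Data Transfer merge option described above more concrete, here is a minimal T-SQL sketch of the staging-table-plus-MERGE pattern it refers to. The table and column names are hypothetical, and the statements the package actually generates may differ:

```sql
-- Hypothetical destination and staging tables; Id stands in for the selected Primary Key.
MERGE INTO dbo.DestinationTable AS target
USING dbo.StagingTable AS source
    ON target.Id = source.Id
WHEN MATCHED THEN
    UPDATE SET target.Name = source.Name,
               target.Amount = source.Amount
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Name, Amount)
    VALUES (source.Id, source.Name, source.Amount);
```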

Related products: TimeXtender Data Enrichment, TimeXtender Orchestration & Data Quality

TimeXtender Data Integration 6926.1

Two months into 2025, we're ready with the second major release of the year. Even though it's only been a month since the last major release, we have a lot of good stuff for you, including access to instance metadata for analysis, new data source providers, and a couple of much-requested features for Deliver and Ingest instances. And especially for partners, the new blueprints feature can be a real timesaver.

When you upgrade to the new release, the Ingest service and data sources must be upgraded at the same time (i.e. you cannot upgrade the Ingest service without upgrading data sources, or vice versa). The reason is that we've redesigned the data source architecture to enable the TDI TimeXtender providers in both V20 and Classic. See Compatibility of Data Sources and TDI/TIS Versions for an overview.

New

Collect metadata and logs from instances for analysis (public preview)

If you'd like detailed statistics on execution times, or any other metadata created by TimeXtender, this release is good news for you. With the new meta-collection feature in the Portal, you can analyze TimeXtender metadata and logs - in TimeXtender! 24 hours' worth of metadata and logs from the instances you select are exported to a data lake hosted by TimeXtender once a day. Using a regular TimeXtender data source, configured for you with the click of a button, you can copy the data into TimeXtender just like any other data source. Note that you'll need to be on the latest version of TDI.

Three new data source providers

In our quest to provide high-quality first-party data source providers for basically everything, we've added three new providers:

- XML & JSON joins the CSV, Parquet and Excel providers for common data files.
- Azure Data Factory - SAP Table enables connection to SAP through Azure Data Factory.
- Infor SunSystems makes the existing business unit data source available in TDI in an updated form that supports SunSystems version 5 and up.

TimeXtender Enhanced data source providers replace CData

From this release, the TimeXtender Enhanced data source providers replace the third-party 'Managed ADO.net' providers from CData. As we're no longer distributing CData providers, they will not receive updates and no new providers are available for use. If you have data sources that use CData providers, we recommend that you begin migration to the TimeXtender Enhanced providers. For more information on how to change a data source provider, please see:

Data selection for Deliver endpoints

We've added support for data selection, instance variables and usage conditions in Deliver instances. These features have long been available in the Prepare instance and make data selection rules on tables much more versatile. Adding these features to the Deliver instance makes it possible to, for example, use the same Deliver instance to deploy endpoints with different data (e.g. departmental data) in each endpoint.

Add timestamp to tables in the Ingest instance

If you'd like to know when data has been copied from a specific source, you can now have the good old DW_Timestamp column added to tables in the Ingest instance. For now, this is supported when you use Azure Data Lake Storage, Fabric, or SQL as your Ingest instance storage. (A small query sketch using the column follows the lists below.)

Partners - share instance blueprints between customers (public preview)

As a partner working with many TimeXtender customers with roughly the same setup, you might feel a slight deja vu when you create the same data warehouse structure for the third time.
Because time matters, we've created the blueprints feature to save you from that repetitive work. A blueprint is an instance where anything remotely sensitive, such as logs and usernames, is removed. In the new version, you can, with the consent of the customers, share a blueprint of Customer A's instance with Customer B. Once a blueprint has been shared, Customer B can add a new instance based on that blueprint instead of starting from scratch.

Improved

Improved UI for setting up REST data source connections

We've improved the experience when setting up TimeXtender REST data source connections so that you can show and hide the sections that matter to you, and we've added additional validations for essential fields. In addition, based on feedback that the old name could be misleading, the "global values" setting has been renamed to "connection variables".

Edit deleted instances

You can now edit deleted instances. If this sounds like something you're not likely to do, you're right, but it can be useful in a few edge cases. For example, you can rename a deleted instance if you want to create a new instance with the same name.

Fixed

TDI Portal

- It wasn't possible to rename an environment to the same string with different capitalization (e.g. "Prod" -> "PROD").
- On the Instances page, fixed an issue with deleting environments containing only deleted instances.
- Fixed a bug that would allow a mapped data source connection to be deleted after upgrading it to the most recent version.
- Filters are now still applied after deleting a data source connection.
- Fixed a bug in the REST provider where the connection variables were not applied to the dynamic values from an endpoint query.
- We now take the value of 'Empty fields as null' into consideration when finding data types. This can help find the correct data types when the data is a mix of values and empty values/null.
- Updated the look of the 'Multi-factor sign-in' card on the 'Basic info' page to fix a visual inconsistency.
- When you migrate an Ingest instance from one environment to another, the error message is now more useful should the validation of the data source mappings fail.

TimeXtender Data Integration

- In the Create Index window, it was impossible to see all fields if you had a lot of fields on a table, since the list did not have a scroll bar.
- Using the Skip option when loading tables for the Ingest data source query tool failed with a null parameter exception.
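As a small illustration of how the DW_Timestamp column mentioned above could be used once it is available in Ingest storage with SQL access - a sketch only, with a hypothetical schema and table name, not code produced by TDI:

```sql
-- Rows copied from the source within the last 24 hours, filtered on the DW_Timestamp column.
SELECT *
FROM ingest.SalesOrders
WHERE DW_Timestamp >= DATEADD(HOUR, -24, SYSUTCDATETIME());
```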

Related products: TimeXtender Data Integration, TimeXtender Data Integration Portal