
TimeXtender Data Integration 6926.1

Thomas Hjermitslev
Community Manager

Two months into 2025 we're ready with the second major release of the year. Even though it's only been a month since the last major release, we have a lot of good stuff for you, including access to instance metadata for analysis, new data source providers, and a couple of much-requested features for Deliver and Ingest instances. And especially for partners, the new blueprints feature can be a real timesaver.

When you upgrade to the new release, the Ingest service and data sources must be upgraded at the same time, i.e. you cannot upgrade the Ingest service without upgrading data sources or vice versa. The reason is that we've redesigned the data source architecture to enable the TDI TimeXtender providers in both V20 and Classic. See Compatibility of Data Sources and TDI/TIS Versions for an overview.

New

Collect metadata and logs from instances for analysis (public preview)

If you'd like detailed statistics on execution times, or any other metadata created by TimeXtender, this release is good news for you. With the new meta-collection feature in the Portal, you can analyze TimeXtender metadata and logs - in TimeXtender! 24 hours' worth of metadata and logs from the instances you select are exported to a data lake hosted by TimeXtender once a day. Using a regular TimeXtender data source, configured for you with the click of a button, you can copy the data into TimeXtender just like any other data source. Note that you'll need to be on the latest version of TDI.
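Once the exported metadata has been copied into TDI like any other source, the execution statistics can be analyzed downstream. Purely as an illustration of the kind of analysis this enables (the field names below are assumptions, not the actual schema of the metadata export), averaging execution times per instance might look like:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical rows as they might arrive from the meta-collection
# data source; the field names are illustrative assumptions, not
# the documented schema of the TimeXtender metadata export.
execution_log = [
    {"instance": "Prepare-Prod", "object": "DimCustomer", "seconds": 42.0},
    {"instance": "Prepare-Prod", "object": "FactSales", "seconds": 310.5},
    {"instance": "Ingest-Prod", "object": "CRM source", "seconds": 128.0},
]

def avg_execution_by_instance(rows):
    """Average execution time (seconds) per instance."""
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["instance"]].append(row["seconds"])
    return {instance: mean(times) for instance, times in grouped.items()}

print(avg_execution_by_instance(execution_log))
# → {'Prepare-Prod': 176.25, 'Ingest-Prod': 128.0}
```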

Three new data source providers

In our quest to provide high-quality first-party data source providers for basically everything, we've added three new providers:

  • XML & JSON joins the CSV, Parquet and Excel providers for common data files. 
  • Azure Data Factory - SAP Table enables connection to SAP through Azure Data Factory.
  • Infor SunSystems makes the existing business unit data source available in TDI in an updated form that supports SunSystems version 5 and up.

TimeXtender Enhanced data source providers replace CData

From this release, the TimeXtender Enhanced data source providers replace the third-party 'Managed ADO.net' providers from CData. As we're no longer distributing the CData providers, they will not receive updates, and no new CData providers will be made available. If you have data sources that use CData providers, we recommend that you begin migrating to the TimeXtender Enhanced providers. For more information, please see the documentation on changing a data source provider.

Data selection for Deliver endpoints

We've added support for data selection, instance variables, and usage conditions in Deliver instances. These features have long been available in the Prepare instance, where they make data selection rules on tables much more versatile. Adding them to the Deliver instance makes it possible, for example, to use the same Deliver instance to deploy endpoints with different data (e.g. departmental data) in each endpoint.
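Data selection rules are configured in the TimeXtender UI, not in code, but the underlying idea can be sketched in plain Python: one shared table, with each endpoint's instance variable driving the selection rule. All names and values below are illustrative, not TimeXtender syntax.

```python
# Conceptual sketch of per-endpoint data selection: the same Deliver
# table, filtered differently per endpoint by an instance variable.
rows = [
    {"department": "Finance", "amount": 100},
    {"department": "Sales", "amount": 250},
    {"department": "Finance", "amount": 75},
]

def deliver(rows, endpoint_department):
    """Apply a selection rule like [Department] = <endpoint variable>."""
    return [r for r in rows if r["department"] == endpoint_department]

finance_endpoint = deliver(rows, "Finance")  # two Finance rows
sales_endpoint = deliver(rows, "Sales")      # one Sales row
```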

Add timestamp to tables in the Ingest instance

If you'd like to know when data has been copied from a specific source, you can now have the good old DW_Timestamp column added to tables in the Ingest instance. For now, this is supported when you use Azure Data Lake Storage, Fabric, or SQL as your Ingest instance storage.
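Conceptually, the new option stamps each ingested row with the time it was copied. As a rough sketch of the behavior (DW_Timestamp is the column name from the release note; everything else is an assumption for illustration):

```python
from datetime import datetime, timezone

def add_dw_timestamp(rows):
    """Stamp each row with the UTC load time, mimicking the
    DW_Timestamp column added to tables in the Ingest instance."""
    loaded_at = datetime.now(timezone.utc)
    return [{**row, "DW_Timestamp": loaded_at} for row in rows]

source_rows = [{"id": 1}, {"id": 2}]
stamped = add_dw_timestamp(source_rows)  # every row now carries DW_Timestamp
```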

Partners - share instance blueprints between customers  (public preview) 

As a partner working with many TimeXtender customers with roughly the same setup, you might feel a slight déjà vu when you create the same data warehouse structure for the third time. Because time matters, we've created the blueprints feature to save you from that repetitive work.

A blueprint is an instance where anything remotely sensitive, such as logs and usernames, is removed. In the new version, you can, with the consent of the customers, share a blueprint of Customer A's instance with Customer B. Once a blueprint has been shared, Customer B can add a new instance based on that blueprint instead of starting from scratch.

Improved

Improved UI for setting up REST data source connections

We've improved the experience of setting up TimeXtender REST data source connections: you can now show and hide the sections that matter to you, and we've added additional validations for essential fields. In addition, based on feedback that the old name could be misleading, the "global values" setting has been renamed to "connection variables".

Edit deleted instances

You can now edit deleted instances. If this sounds like something you’re not likely to do, you’re right, but it can be useful in a few edge cases. For example, you can rename a deleted instance if you want to create a new instance with the same name.

Fixed

TDI Portal

  • It wasn't possible to rename an environment to the same string with different capitalization (e.g. "Prod" -> "PROD")
  • On the Instances page, fixed an issue with deleting environments containing only deleted instances
  • Fixed a bug that would allow a mapped data source connection to be deleted after upgrading it to the most recent version
  • Filters are now still applied after deleting a data source connection.
  • Fixed a bug in the REST provider where connection variables were not applied to dynamic values from an endpoint query.
  • The value of 'Empty fields as null' is now taken into consideration when finding data types. This can help find the correct data types when the data is a mix of values and empty/null values.
  • Updated the look of the 'Multi-factor sign-in' card on the 'Basic info' page to fix a visual inconsistency.
  • When you migrate an Ingest instance from one environment to another, the error message shown when validation of the data source mappings fails is now more useful.

TimeXtender Data Integration

  • In the Create Index window, it was impossible to see all fields if you had a lot of fields on a table since the list did not have a scroll bar.
  • Using the Skip option when loading tables for the Ingest data source query tool failed with a null parameter exception.

9 replies

rvgfox
Problem Solver
March 7, 2025

Can you explain how to use this feature?

 


Christian Hauggaard
Community Manager

Hi @rvgfox, it works the same way as in the Prepare instance.

 


rvgfox
Problem Solver
March 10, 2025

Yes, I guessed so, but my doubt is about the phrase "use the same deliver instance".

 


rory.smith
TimeXtender Xpert
March 10, 2025

Hi @rvgfox,

You can use this to have one model supply different data to different endpoints. The variables are instance-specific as far as I am aware.


rvgfox
Problem Solver
March 10, 2025

Thanks @rory.smith, now it's clear:

We can use the same Deliver instance, using variables to redirect the data to different endpoints of the same Deliver instance.


Christian Hauggaard
Community Manager

@rvgfox I have updated the text above. This refers to the fact that you can add an endpoint name Deliver instance variable and then use it in the data selection rules for the Deliver instance tables. This means you can now have multiple endpoints for the same Deliver instance, each one delivering different data.


rvgfox
Problem Solver
March 10, 2025

@Christian Hauggaard Now it’s clear. Thanks


rvgfox
Problem Solver
March 12, 2025

@Christian Hauggaard I've installed the new version and tried to use the new Deliver instance variables in the selection for the endpoint, but I wasn't able to make it work.

Can you show an example?

 


Christian Hauggaard
Community Manager

@rvgfox please see the article below. Please note that we currently have a bug for this, so if you have upgraded an existing instance and are trying to implement this, you might receive an error similar to:

An error occurred while updating. SemanticInstanceVariables
Could not find stored procedure 'UpdateSemanticInstanceVariables'.

The product team is currently working on a fix for this.

 



