
Spring has sprung, and we're happy to announce the release of a new version of TimeXtender (Desktop v. 6221.1). See what we've been up to below.

Note: These Release Notes have been updated to reflect that the TimeXtender API is now live and no longer in closed BETA.

New

  • All semantic endpoints are now supported for Snowflake: If you have a data warehouse on Snowflake, you can now use it with all the semantic endpoints TimeXtender supports. The Power BI, Tableau, and Tabular endpoints join the Qlik and CSV file endpoints as supported options for this type of data warehouse storage.
  • SQL Server 2022 support: TimeXtender now supports the latest and greatest major release of Microsoft SQL Server for use as a data warehouse or ODX data storage.
  • Official support for Amazon RDS for SQL Server: Amazon's cloud SQL Server offering is now officially supported for use as a data warehouse or ODX data storage. Some of our enterprising customers have already paved the way by just doing it, and we're happy to put the "officially supported" stamp on their endeavor.
  • Easy data source provider updates: We've made it much simpler to update a data source provider to take advantage of new features or bug fixes. You'll now see an aptly named 'Update' button whenever an update is available. Previously, you'd have to add a new data source in the TimeXtender Portal and switch the connection in TimeXtender Desktop. 
  • TimeXtender API for integrating with external systems: As an important step in our march towards world domination, we've created an API that external systems can use to, among other things, trigger and monitor task executions. Now live and out of closed beta, this feature is comparable to the feature in TimeXtender 20.10 and older that lets you trigger an execution package from the command prompt. See the sketch below for what an integration could look like.
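
As a rough sketch of such an integration, the Python snippet below triggers a job execution and polls its status over HTTP. The base URL, endpoint paths, response fields, and authentication scheme are illustrative assumptions, not the documented API; consult the official TimeXtender API documentation for the real contract.

    import time
    import requests

    API_BASE = "https://api.timextender.com"  # hypothetical base URL
    API_KEY = "your-api-key"                  # hypothetical credential
    JOB_ID = "your-job-id"                    # hypothetical job identifier

    headers = {"Authorization": f"Bearer {API_KEY}"}

    # Trigger an execution of the job (hypothetical endpoint).
    response = requests.post(f"{API_BASE}/jobs/{JOB_ID}/executions", headers=headers)
    response.raise_for_status()
    execution_id = response.json()["id"]  # assumed response shape

    # Poll until the execution reaches a terminal state
    # (hypothetical endpoint and status fields).
    while True:
        status = requests.get(
            f"{API_BASE}/jobs/{JOB_ID}/executions/{execution_id}", headers=headers
        ).json()
        if status["state"] in ("Completed", "Failed"):
            print("Execution finished with state:", status["state"])
            break
        time.sleep(30)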

 

Changed

  • A Job can now have multiple independent schedules.
  • On the ODX, we've added support for data on demand for Managed ADO.NET data sources.
  • 'Show data types' has been implemented for semantic models.
  • Tabular endpoints now show more details when an error occurs during execution.

 

Fixed 

  • Managed ADO.NET data sources now support multi-line properties that automatically add the correct line endings ('CR LF' or '\r\n'). See the sketch below.
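
The snippet below illustrates the kind of normalization described above, converting any mix of line endings in a string to Windows-style CR LF. It is an illustrative sketch in Python, not TimeXtender's actual implementation.

    def normalize_line_endings(value: str) -> str:
        # Collapse CRLF and lone CR to LF first so existing pairs are not
        # doubled, then expand every LF to CRLF.
        return value.replace("\r\n", "\n").replace("\r", "\n").replace("\n", "\r\n")

    assert normalize_line_endings("a\nb\r\nc\rd") == "a\r\nb\r\nc\r\nd"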

Portal

  • When adding or editing a Qlik endpoint, you would get a "some fields have invalid values" validation error.
  • It was not possible to delete a data source if the name of the data source contained whitespace or special characters in a specific way.
  • Add/edit/clone data sources would not show a loading spinner when loading the form.
  • When cloning a data source, the 'Clone' submit button was not disabled if validation failed.
  • Users on the Free tier could clone a data source to exceed the limit of data sources.
  • Fixed various other issues with data source cloning.
  • Minor tweaks and adjustments to the styling of the Add/Edit Instance forms.
  • We fixed some technical debt relating to customer types left over from the implementation of the Free tier in our previous release.

Desktop

  • A few outdated or incorrect icons have been changed.
  • Data would be missing from the "valid" table on tables that had a specific setup with a mapping set, a primary key field, and a data selection rule.
  • Configuring the Execution server would, in some cases, not take the lock on an instance.
  • When a test notification failed, it would not give the user a useful error message.
  • Changing a snippet didn't always update the script.
  • Opening the Error view would result in an error in a specific setup involving the 'Keep field values up to date' option.
  • On rare occasions, Jobs would show execution packages from other instances if those instances had been created as copies of another instance.
  • In the Add Jobs wizard, some text was truncated at the end.
  • Fixed an issue with the "Execute ODX Data Factory Merge Transfer" step that caused data sources with 'Data on demand' enabled to fail or be skipped when transferring data from the ODX to the MDW using Azure Data Factory.
  • Fixed an issue on execution where excluding the "Execute ODX Data Factory Merge Transfer" step was ignored and executed anyway.
  • Fixed an issue where transfers with Azure Data Factory from the ODX to the data warehouse did not set the batch count.
  • Fixed issue with transfer from the ODX to a data warehouse on Snowflake when the table had incremental load with updates enabled in the ODX.
  • Resuming an execution would skip 'table insert' and 'related records' steps.
  • Fixed a misleading label in the Table Settings window.
  • For data warehouse storage, 'Additional connection properties' were not added to the connection string.
  • After changing storage on a data warehouse instance from on-prem SQL Server to an Azure SQL database, deployment would fail because extended properties were not created for functions and views.
  • For data warehouses on an Azure SQL database, 'custom table insert' requires the 'xact_abort' setting to be enabled, which it was not. (See the sketch after this list.)
  • When synchronizing a mapping set with many tables, the window could grow taller than the display, making it impossible to see and click the buttons at the bottom.
  • The CSV endpoint would always use UTF8-BOM encoding, ignoring the user's choice.
  • It was possible to add fields from different source tables to a semantic model, even though this should not be allowed.
  • In a semantic model, deleting a measure or a hierarchy that was included in a perspective would not clean up the perspective properly.
  • In a semantic model, deleting a field that was included in a perspective would throw an error during deployment.
  • In a semantic model, adding a field to a table that had a custom field would cause an error.
  • In a semantic model, dynamic role security setup values were not reselected on edit.
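
Regarding the 'xact_abort' fix above: the snippet below is a minimal sketch of enabling that setting for a session against an Azure SQL database using pyodbc. The connection string values are placeholders, and the snippet illustrates the T-SQL setting itself rather than TimeXtender's internal code.

    import pyodbc

    # Placeholder connection string for an Azure SQL database.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=your-server.database.windows.net;"
        "DATABASE=your-database;"
        "UID=your-user;PWD=your-password;"
    )
    cursor = conn.cursor()

    # With XACT_ABORT ON, a run-time error aborts and rolls back the whole
    # transaction instead of leaving it partially applied.
    cursor.execute("SET XACT_ABORT ON;")

    # @@OPTIONS bit 16384 reports whether XACT_ABORT is on for the session.
    cursor.execute("SELECT @@OPTIONS & 16384;")
    print("XACT_ABORT enabled:", cursor.fetchone()[0] != 0)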