New year, new features to make your TimeXtender life more enjoyable and productive! We're happy to announce the release of a new major version of TimeXtender (Desktop v. 6505.1) that includes all of the belated holiday gifts listed below.

 

New

  • Automatic data source creation in the ODX: When you map a data source connection to an ODX instance, a data source that uses the connection is automatically created in the ODX. In addition, when you add a new data source connection, you can now map it to an ODX instance right on the same page.
  • Test connection from the Portal: You can now test if the ODX can establish a connection to a data source when you add or edit a data source connection in the Portal.
  • Improved step-by-step Get Started guide: We've created a new and improved step-by-step Get Started guide in the Portal. You can access it right from the Home page where it has its very own card. As you check off the steps, your progress is saved - on a company basis - so you can see how far you've come. And if you're already a TimeXtender champion, the card can be dismissed so it doesn't clutter up your Home page.
  • New TimeXtender REST Provider: The brand new TimeXtender REST data source provider simplifies connection to REST-based data sources. Among other improvements, the new provider allows you to set up endpoints without fiddling with configuration files.
  • Instances grouped by environment in Desktop: As an improvement to the multiple environments feature we added in our previous major release, instances are now grouped by environment in TimeXtender Desktop. We hope this will bring some peace to people who like things well organized!
  • Generate end-to-end execution packages and tasks: To make it easier to set up a refresh of the data in a specific semantic model, you can now generate the data warehouse execution packages and ODX tasks that will update all data for a specific semantic model. When you make changes to the semantic model, you can regenerate the flow and the logic is smart enough to keep any customizations you made to the auto-generated objects.
  • Calculation groups in semantic models: You can now add calculation groups to semantic models and deploy them to Tabular and Power BI semantic endpoints. To make that work, we've added the 'discourage implicit measures' option to the endpoints. It defaults to 'automatic', which means 'true' when you've added calculation groups, and 'false' otherwise. (A short, generic example of a calculation group item follows this list.)
  • Snippets in semantic models: It's now possible to add DAX, Qlik, and Tableau snippets, and use them in semantic custom fields, custom measures, and calculation group items.
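
For those new to calculation groups: each item in a calculation group is a DAX expression that wraps whichever measure is currently in use. Below is a minimal, generic sketch - not TimeXtender-specific, and assuming a hypothetical 'Date'[Date] column in your model - of a "YTD" item you might add to a "Time Intelligence" calculation group:

    -- Hypothetical "YTD" calculation item; SELECTEDMEASURE() stands in for
    -- whichever measure the report is currently showing.
    CALCULATE ( SELECTEDMEASURE (), DATESYTD ( 'Date'[Date] ) )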

 

Changed

  • We've tightened up the design of the add/edit data source connection pages in the Portal. In addition to the general improvements, some connections now have nice-to-have fields and categories hidden in an 'Advanced' section by default so you can set up a new connection faster.
  • We've improved the Desktop logic to better handle instances that have been renamed in the Portal.
  • In custom scripts in semantic models, you can now use the 'Value' parameter.

 

Fixed

 

Portal

  • Fixed an issue where users could see - but not access - instances that they hadn't been granted access to.
  • Public job endpoints weren't able to handle unknown states.
  • Endpoints were added out of order in the SSL form.
  • Fixed issue with the "move customer" operation.
  • Storage types weren't always loaded on the MDW form.
  • Fixed floating info icon on SSL form.
  • Fixed an issue where the Portal would throw a "not signed in" error - usually because your token had expired - but then fail to route you back to sign in.
  • The deployment target option for Analysis Services 2022 was missing from the Tabular SSL endpoint.
  • Cloning a data source connection would route you to the original form, instead of the clone form.
  • Disabling automatic firewall rules didn't always get handled correctly when handing out connections.

Desktop

  • Fixed an issue where data lineage would sometimes fail when trying to aggregate the display values in SQL.
  • Fixed an issue where the ODX service would sometimes fail to validate the data source connection version of the TimeXtender enhanced and TimeXtender ADF transfer components, causing an error.
  • Updated some logic to better handle unsupported data sources instead of throwing an unclear error message.
  • Fixed an issue where an existing SQL database used as storage for an ODX instance would be rejected due to data storage version validation.
  • Fixed an issue where data lineage did not work with reverse sign transformations.
  • Fixed an issue where using a dot (.) as the last character of a table name would cause ODX task execution to fail when using a data lake. When a dot is the last character of a folder name in the data lake, it is now replaced by an underscore.
  • Fixed an issue where deployment would fail when a source table's DW_Id was mapped to a destination table's DW_Id.
  • Fixed an issue where the TimeXtender BC365 online data source was failing to validate before inserting system fields during transfer.
  • Fixed an issue where Synapse data warehouse would fail when adding a selection rule on a renamed field.
  • Fixed an issue when setting up an incremental rule with year subtraction.
  • Fixed an issue where generating documentation when only having an ODX open would throw an error.
  • Fixed an issue where mapping would fail for tables that used system fields as column names.
  • Fixed an issue where a table with multiple lookup fields would return incorrect results in Snowflake data warehouse.
  • TimeXtender SAP Table data source provider: 
    • Added support for DecimalFloatingPoint34 and DecimalFloatingPoint16 data types
    • Fixed issue where fields starting with '/' could not be added to incremental rules
    • Fixed an issue where the max. row setting was limiting the number of data rows to be transferred
    • Improved logging
  • Fixed an issue where the default relation was not set correctly when relating tables in a semantic model.
  • Optimized instance updates during task initialization.