TimeXtender 6618.1

It's been just one month since our last major release of TimeXtender, but we already have a new release ready for you. This release, however, is all about consolidation. It doesn't contain a lot of new features. Instead, we've been busy improving the existing features and fixing various bugs.

Improved

- Easier selection for data source connections: When you set up a new data source connection in the Portal, you can now choose the provider and the version of the provider from separate lists, which makes it much easier to get an overview of the provider list. With the occasional exception that proves the rule, the TimeXtender-branded providers will be the best choice when multiple providers are available for the same data source. For that reason, we've also created a Recommended category in the list for our "homemade" providers.
- More complete data lineage with improved UI: Data lineage now traces through aggregate tables, measures, and lookups. We've simplified the UI to give more space to the actual content, and a dearly missed feature from the 20.10 branch returns: You can now double-click an object to jump to that object in the main window. To facilitate that, the Data Lineage window is now non-modal, which means that it can be opened and used next to the main window.
- Links from Desktop to Portal for better integration of the two: In TimeXtender, you sometimes need to go back and forth between the Desktop and the Portal a lot, especially when you set up new data sources. To make the Desktop and the Portal feel more integrated, we've added links in Desktop that open the relevant page in the Portal, e.g. for adding a new data source connection, managing environments, adding a new instance, and managing semantic model endpoints.
- REST API table builder: Taking data from a REST API and transforming it into a table structure can be a bit of a hassle unless you really like writing XSLT scripts. Our REST API data source provider now includes a tool that can generate the script for you. You just need to drag-and-drop the structure you'd like.
- Data sources that use a connection string when connecting to a data source (e.g. ODBC, OLE DB, TimeXtender SQL, TimeXtender Oracle) now support adding additional connection string properties (see the sketch at the end of this section).
- You can now use the TimeXtender REST data source when you use Data Fabric as your ODX storage.
- You can now set the background job timeout in the TimeXtender SAP Table data source.
- The TimeXtender Oracle data source now supports the ODP.NET provider, which has improved performance.
- It's now possible to change the security filtering direction on semantic relations for Tabular/Power BI.
- The character limit on FormatString in a calculation group item on a semantic model has been increased to 1000 characters, and the field is now a multiline text box.
- We've improved the Tabular model generated by the Tabular semantic endpoint for better efficiency and a smaller model.

Fixed (Portal)

- Password fields were unmasking incorrectly.
- Fixed an error that would occur when accepting the license agreement.
- Fixed an issue where the connection cache was duplicated.
- Data source connections with a null category would fail to render.
- Error messages for transferring instances have been improved in case of timeout issues.
- Optimized calls to retrieve instance lists on the Portal frontend.
- Optimized calls to retrieve data source connections on the Portal frontend.
- Fixed the password field on the Add/Edit Semantic Instance page not showing the correct string when unmasked.

Fixed (Desktop)

- SSL: Previously, decimal fields were being deployed as double. This has been corrected.
- SSL: Removed the option to include tables and fields other than the table the field is being created on in a Custom Field script, as it didn't work and would just be empty anyway.
- MDW: Using an aggregate table on SQL Synapse would fail during execution due to a wrong script. This has been corrected.
- TimeXtender SAP Table data source: Fixed an issue where subtraction settings were not applied on incremental transfers.
- Fixed an issue where deployment would fail when disabling physical tables on a Snowflake data warehouse.
- Fixed an issue where ODX transfer tasks were blocking other tasks from running concurrently during bulk inserts.
- Desktop proxy settings were not passed from the Execution Service to TimeXtender.
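As a minimal sketch of the convention behind additional connection string properties, the snippet below appends extra key=value pairs to a semicolon-separated ODBC connection string. The driver name, server, and property values are illustrative only, and Python's pyodbc is used purely to demonstrate the effect; TimeXtender assembles the string for you from the form fields.

```python
import pyodbc  # third-party ODBC bridge, used here only for illustration

# Base connection string built from the usual form fields (illustrative values).
base = "Driver={ODBC Driver 17 for SQL Server};Server=myserver;Database=Sales;"

# "Additional connection string properties" are extra key=value pairs appended
# to the same string -- here enabling encryption, purely as an example.
extra = "Encrypt=yes;TrustServerCertificate=no;"

connection = pyodbc.connect(base + extra)
```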

Related products: TimeXtender Desktop, TimeXtender Portal

TimeXtender 6590.1

It's officially spring in the northern hemisphere and, incidentally, we have a bouquet of features and improvements ready for you in TimeXtender 6590.1.

New

- ODX on OneLake: We are excited to introduce Microsoft Fabric OneLake as ODX storage on our platform. This enhancement enables users to seamlessly harness OneLake for all ODX operations, from initial setup and configuration in the Portal to comprehensive integration within TimeXtender workflows. This is the first of many planned integrations with Microsoft Fabric, so stay tuned! Note, however, that you currently cannot use OneLake as your ODX storage if you use Snowflake as your data warehouse storage.
- New data source provider for OneLake: A new data source provider for seamless and efficient ingestion of delta parquet tables from Microsoft Fabric OneLake, directly into your preferred storage solution via the TimeXtender ODX. This provider is optimized for ingesting data from OneLake and is not intended for use with Fabric OneLake ODX storage.
- Publish data as a REST API endpoint: Added a new semantic endpoint, REST API, that works together with a server component installed on-premises or on a virtual machine in the cloud to publish data through REST API endpoints. As getting data through a REST API is a very common use case, the new endpoint type opens up a host of opportunities for integrating TimeXtender with other tools (see the sketch at the end of this section). In our previous major release, 6505, we introduced a new REST data source provider. This means that you can now both publish and ingest data from your TimeXtender solution through a REST API using first-party components.
- New and improved data source providers for Hubspot and Exact Online: The new providers dramatically improve upon the previous CData options with enhanced usability and performance. These new options allow you to easily add custom endpoints and flatten complex tables. To upgrade to the new connectors today, just search for "TimeXtender Hubspot" or "TimeXtender Exact" when adding a new data source connection. Then, in the ODX, you can edit an existing data source configuration and change to the new TimeXtender data source connection. Read more about editing data sources.

Improved

- You can now have multiple data warehouse instances open at the same time in Desktop.
- We've reshuffled the shortcut menu on Data Sources in ODX instances. "Add Data Source" now redirects to the 'Add data source connection' page in the Portal, while the previous "Add Data Source" functionality is now "Map Existing Connection". The intention is to make it clearer that adding a brand new data source happens in the Portal, while "adding" a data source in Desktop means using one of the data source connections mapped to the instance in the Portal.
- We've upgraded a lot of our underlying UI framework. You might notice a few changes and improvements to the Portal UI as a result.
- When adding firewall rules to instances, your own IP is now automatically suggested.

Fixed (Portal)

- Fixed an issue where data source connection categories would not be shown if the category was assigned a null value.
- Fixed an issue where MDW/SSL transfers could lead to errors.
- Cloning TimeXtender REST data sources could lead to incorrect data in password fields.
- The ODX connection timeout value did not get set correctly.
- Changes to users would not get propagated to the identity provider.
- Inputs did not get correctly disabled on the SSL Qlik endpoint form.
- Disabled fields would show up as required on the ODX form.
- Fixed an issue where the SSL form would sometimes incorrectly flag inputs as invalid, thereby making it impossible to save.
- Fixed an incorrect short name suggestion on data source mapping.

Fixed (Desktop)

- Default settings for key stores were not remembered in the key generation menu for data warehouse instances using SQL Synapse or Snowflake.
- The parameter '%Instance%' was not working for email notifications.
- The CSV endpoint was not escaping the text qualifier.
- On-demand execution from the ODX to the data warehouse would fail when having the same ODX table in a data area multiple times while using ADF to transfer from an ODX ADLS2 storage.
- ODX to data warehouse transfer would fail when using ADF and having the same table mapped multiple times, but using different columns in each table.
- When using a custom field as a parameter in a custom measure or calculation group item in a semantic model, the parameter would disappear after closing and opening the semantic model instance and editing the custom measure or calculation group item.
- Cleaning up mapped objects between an ODX and a data warehouse would never clean up the instance mapping, causing copy instance in the Portal to always request a remapping of an instance even though it was no longer used.
- Fixed an issue in the semantic model repository upgrade scripts where calculation group and custom script tables were not marked as tables to be included in instance copying.
- Fixed an issue where jobs completed with errors or warnings were incorrectly displayed as "Completed". They are now accurately labeled as "Completed With Errors or Warnings".
- Custom Data on a data warehouse table did not rename the custom data column when the corresponding field was renamed.
- The data warehouse field validation type 'Is Empty' would mark empty string values as invalid.
- Fixed an issue where raw-only fields were included in the Data Movement pane.
- Fixed an issue where the preview in Filter Rows in the ODX would always select the first table.
- Fixed an issue where incremental transfer from an ADLS2 ODX with the 'Limit memory use' setting failed for empty tables.
- Fixed an issue where transfers from an ADLS2 ODX to SQL Synapse would fail if the source table contained reserved system field names.
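As a minimal sketch of what the new REST API endpoint type enables, the snippet below reads published data over HTTP from an external tool. The host, route, and authentication scheme are hypothetical placeholders, not the documented contract of the endpoint; consult the endpoint documentation for the actual details.

```python
import requests

# Hypothetical placeholders -- the host, route, and token scheme are assumptions,
# not the documented contract of the REST API semantic endpoint.
BASE_URL = "https://tx-rest.example.com:8443"
TOKEN = "..."  # whatever credential the server component is configured to accept

response = requests.get(
    f"{BASE_URL}/api/tables/Customers",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
response.raise_for_status()

for row in response.json():  # assuming rows come back as a JSON array
    print(row)
```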

Related products: TimeXtender Desktop, TimeXtender Portal

TimeXtender 6505.1

New year, new features to make your TimeXtender life more enjoyable and productive! We're happy to announce the release of a new major version of TimeXtender (Desktop v. 6505.1) that includes all of the belated holiday gifts listed below.

New

- Automatic data source creation in the ODX: When you map a data source connection to an ODX instance, a data source using the data source connection will automatically be created in the ODX. In addition to that, when you add a new data source connection, you can now map it to an ODX instance right on the same page.
- Test connection from the Portal: You can now test if the ODX can establish a connection to a data source when you add or edit a data source connection in the Portal.
- Improved step-by-step Get Started guide: We've created a new and improved step-by-step Get Started guide in the Portal. You can access it right from the Home page, where it has its very own card. As you check off the steps, your progress is saved - on a company basis - so you can see how far you've come. And if you're already a TimeXtender champion, the card can be dismissed so it doesn't clutter up your Home page.
- New TimeXtender REST provider: The brand new TimeXtender REST data source provider simplifies connecting to REST-based data sources. Among other improvements, the new provider allows you to set up endpoints without fiddling with configuration files.
- Instances grouped by environment in Desktop: As an improvement to the multiple environments feature we added in our previous major release, instances are now grouped by environment in TimeXtender Desktop. We hope this will bring some peace to people who like things well organized!
- Generate end-to-end execution packages and tasks: To make it easier to set up a refresh of the data in a specific semantic model, you can now generate the data warehouse execution packages and ODX tasks that will update all data for that semantic model. When you make changes to the semantic model, you can regenerate the flow, and the logic is smart enough to keep any customizations you made to the auto-generated objects.
- Calculation groups in semantic models: You can now add calculation groups to semantic models and deploy them to Tabular and Power BI semantic endpoints. To make that work, we've added the 'discourage implicit measures' option to the endpoints. It defaults to 'automatic', which means 'true' when you've added calculation groups and 'false' otherwise (the rule is sketched at the end of this section).
- Snippets in semantic models: It's now possible to add DAX, Qlik, and Tableau snippets and use them in semantic custom fields, custom measures, and calculation group items.

Changed

- We've tightened up the design of the add/edit data source connection pages in the Portal. In addition to the general improvements, some connections now have nice-to-have fields and categories hidden in an 'Advanced' section per default so you can set up a new connection faster.
- We've improved the Desktop logic to handle renaming instances in the Portal more gracefully.
- In custom scripts in semantic models, you can now use the 'Value' parameter.

Fixed (Portal)

- Fixed an issue where users could see - but not access - instances that they hadn't been granted access to.
- Public job endpoints weren't able to handle unknown states.
- Endpoints were added out of order in the SSL form.
- Fixed an issue with the "move customer" operation.
- Storage types weren't always loaded on the MDW form.
- Fixed a floating info icon on the SSL form.
- Fixed an issue where the Portal throws a "not signed in" error - usually due to your token having expired - but then fails to route you back to sign in.
- The deployment target option for Analysis Services 2022 was missing from the Tabular SSL endpoint.
- Cloning a data source connection would route you to the original form instead of the clone form.
- Disabling automatic firewall rules didn't always get handled correctly when handing out connections.

Fixed (Desktop)

- Fixed an issue with data lineage sometimes failing when trying to aggregate the display values in SQL.
- Fixed an issue where the ODX service would sometimes fail to validate the data source connection version of TimeXtender enhanced and TimeXtender ADF transfer components, causing an error.
- Updated some logic to better handle unsupported data sources instead of throwing an unclear error message.
- Fixed an issue where using an already created SQL database as storage for an ODX instance would reject the database due to the validation of the data storage version.
- Fixed an issue with data lineage and reverse sign transformations not working.
- Fixed an issue where using a dot (.) as the last character of a table name would cause executing a task in an ODX using a data lake to fail. The dot character will be replaced by an underscore when the dot is the last character of a folder name in the data lake.
- Fixed an issue where deployment would fail when a source table DW_Id is mapped to a destination table DW_Id.
- Fixed an issue where the TimeXtender BC365 Online data source was failing to validate before inserting system fields during transfer.
- Fixed an issue where a Synapse data warehouse would fail when adding a selection rule on a renamed field.
- Fixed an issue with setting up an incremental rule with year subtraction.
- Fixed an issue where generating documentation while only having an ODX open would throw an error.
- Fixed an issue where mapping would fail for tables that used system fields as column names.
- Fixed an issue where a table with multiple lookup fields would return incorrect results in a Snowflake data warehouse.
- TimeXtender SAP Table data source provider: Added support for DecimalFloatingPoint34 and DecimalFloatingPoint16 data types. Fixed an issue where fields starting with '/' could not be added to incremental rules. Fixed an issue where the max. row setting was limiting the number of data rows to be transferred. Improved logging.
- Fixed an issue where the default relation was not set correctly when relating tables in a semantic model.
- Optimized instance updates during task initialization.
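The 'automatic' resolution of 'discourage implicit measures' described above can be summarized in a short sketch. The function below is purely illustrative and mirrors the rule as stated; it is not TimeXtender's actual code.

```python
def discourage_implicit_measures(setting: str, has_calculation_groups: bool) -> bool:
    """Resolve the 'discourage implicit measures' endpoint option.

    Illustrative only: 'automatic' resolves to True exactly when the model
    contains calculation groups; explicit 'true'/'false' settings win.
    """
    if setting == "automatic":
        return has_calculation_groups
    return setting == "true"

# Example: a model with calculation groups, left on the default setting.
print(discourage_implicit_measures("automatic", has_calculation_groups=True))  # True
```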

Related products: TimeXtender Desktop, TimeXtender Portal

TimeXtender 6429.1

And it's about time for a new release of TimeXtender! The new version (Desktop v. 6429.1) includes a bunch of much-requested and, dare we say, exciting features that we hope will improve your day-to-day. It doesn't have to be crazy at work.

Note: As with any innovative feature release, there may be some quirks along the way. Though we've done extensive initial testing, we encourage you to report any bugs you may find so we may release further improvements as rapidly as possible.

New

- Multiple environments and instance transfers in the Portal: You can now group instances in environments to keep everything nice and ordered. In addition to that, you can transfer the contents of one instance to another, enabling a true DEV -> TEST -> PROD workflow right in the Portal.
- Data source "adapters" for selected ERP systems: We've added new data sources for Microsoft Dynamics 365 Finance & Operations ("AX") and Microsoft Dynamics 365 Business Central ("NAV") that make it easier for you to handle accounts, along with other functionality that makes these systems easier to work with. In the Portal, you'll find them in the data sources list as "TimeXtender Dynamics 365 Business Central" and "TimeXtender Dynamics 365 Finance" with "- SQL Server" or "- Online" appended.
- Improved support for Snowflake as a data warehouse: We've taken a big step towards supporting each and every TimeXtender feature when you use Snowflake as data warehouse storage. The newly supported features include incremental load, conditional lookup fields, field transformations, field validations, history tables, supernatural keys, and custom views. Aggregate tables, custom data, custom hash fields, junk dimensions, pre- & post-scripts, related records, and table inserts are not supported yet.
- XPilot integrated in the Desktop: You'll now find a handy link to XPilot, our data integration chatbot, right in the toolbar.
- Try all the features for 14 days: You can now try all TimeXtender features for free for 14 days before you decide if you're ready to sign up for a paid subscription. The feature- and resource-limited Free tier has been retired.
- Automated migration from 20.10 to the newest version of TimeXtender: If you are still on the 20.10 branch of TimeXtender, you can now upgrade to the newest version without starting from scratch. The 20.10.45 release of TimeXtender can convert existing projects to cloud-based instances to minimize the work you need to do to move up.

Changed

- We've standardized terminology around instances and data source connections in the Portal. Among other things, we wanted to fix the common confusion around data sources. Now, in the Portal, you add "data source connections" that can be used by "data sources" in the ODX in Desktop.

Fixed (Portal)

- On the Instances card on the Home page, the instances are now ordered with the newest first.
- On the 'Add/edit data source connection' page, the left-side section navigation was not displayed.
- On the 'Add/edit data source connection' page, SSL secrets are now hidden.
- The Portal would show incorrect data source variable names.
- A data source connection would fail to save due to an incorrect validation error.
- In some cases, the activity list would fail to load or the pagination would break.
- The Customer Details page would be displayed for deleted customers.
- We've improved the loading of the Home page with better loading animations.
- If a company can't be moved, you'll be notified without the Move modal popping up.

Fixed (Desktop)

- 18959: Updated SQL Server 2022 assembly dependencies.
- 18988: When executing an object in the data warehouse with 'Data on demand' enabled on the data source, the transfer from the data source to the ODX storage would not be visible in the log. Now, the transfer from the source has a separate entry in the log in the details for the "outbound" transfer.
- 19123: Fixed an issue with SQL spatial types and SQL Server 2022 in the MDW.
- 19191: Added support for data source connections without descriptions.
- 19199: Deploying only modified tables was very slow, while deploying all tables was faster.
- 19261: An issue where you could not add fields to semantic model tables with custom fields has been resolved.
- 19265: Changed a label from "data area" to "data warehouse" in the Execution Server Configuration tool.
- 19269: Fixed an out of memory exception.
- 19304: Empty tables would be created when using SQL Server as ODX storage.
- 19317: The logic behind the Step Row Count (StepRowCountLoggingExecute.cs, logging of rows) has been optimized.
- 19323: Mapping the same field to multiple fields in an MDW table from ADF is not possible. Azure Data Factory transfer from the ODX to the MDW doesn't support mapping the same column from the ODX to multiple fields on the same table in the MDW, so we have added a validation that blocks this scenario on deploy/execute.
- 19326: Fixed an issue with losing new lines when saving or updating a Query Table in the ODX.
- 19343: Improved labeling when editing MDW instances.
- 19358: The version number was sometimes replaced with a random word in the Copy to Instance dialog.
- 19367: Resolved an issue where, when adding a job, the color for an invalid selection did not get grayed out, and there was a misalignment on the control for validating an item.
- 19386: Fixed a scaling issue with the documentation template dialog.
- 19400: Fixed an issue where the D365 BC provider couldn't be authenticated on creation.
- 19412: Fixed an issue with "show translations" not working on custom measures for Power BI Premium.
- 19415: Fixed an issue where data formatting settings were not enabled on SSL fields for Power BI.
- 19429: Removed unnecessary warnings in the ODX when synchronizing OLE DB based data sources.
- 19457: Fixed an issue with remapping an SSL table when the mapped MDW gets deleted.
- 19464: Added syntax highlighting for various scripts.
- 19505: Fixed an issue with clone fields and lookup transformation templates.
- 19519: There was an issue with incremental load into Azure Data Lake storage where the source type is handled as UniversalDataTypes.Datetime. This caused the incremental datetime value to be UTC + local time offset.
- 19526: Improved the error message shown when loading an instance fails.
- 19533: Added support for OAuth login during the add data source flow in the ODX.
- 19540: Fixed an issue with enabling 'Keep fields up-to-date' with XML data type fields.
- 19560: The ODX would continue transferring data even though the task was stopped in the ODX Execution Queue.
- 19562: Fixed an issue with running a job schedule set to "run once on this day".
- 19627: Fixed an issue where running execution packages with prioritization didn't work.
- 19678: Fixed an issue where deleting a data source in the ODX would not always clean up its task history.

Related products: TimeXtender Desktop, TimeXtender Portal

TimeXtender 6346.1

Today, we've published a minor release of TimeXtender (Desktop v. 6346.1) with the following changes:

Changed

- You can now choose between a dynamic or static IP address filter when configuring firewall rules for instances in the Portal. This should help the - luckily - very few users for whom the dynamic system doesn't work.
- The 'description' field for data sources in the Portal is now optional, simplifying the data entry process.
- The system will now automatically scroll to validation errors when you try to save a data source in the Portal, making it easier to identify and correct issues.

Fixed (Desktop)

- 18883: In the Monitor window that shows the status of jobs, a job would show the status "Completed" with no indication of errors or warnings. Now, the status will tell you if there were issues during the execution.
- 19110: Switching between advanced and simple selection in the Add Data Source wizard would sometimes result in an "Object reference not set to an instance of an object" error.
- 19113: The Row-level Security Setup window would "forget" the settings for the "Values field" and "Members field" options when the window was closed and then opened again.
- 19133: On some data sources, usually with lots of tables, loading a list of tables would take so long that the Query Tool window would be unusable. The list will now load asynchronously to avoid this issue.
- 19163: The setting for 'Disable simple selection' was not included when cloning a data source.
- 19168: Execution would fail with the error "An item with the same key has already been added" when using Azure Data Lake transfer and having renamed, then re-added, a table in the data warehouse.
- 19173: On some data sources, usually with lots of tables, loading a list of tables would take so long that the Query Tables window would be unusable. The list will now load asynchronously to avoid this issue.
- 19178: Trying to delete a custom period in a date table would sometimes result in a "Deleted row information cannot be accessed" error.
- 19181: When using the Dynamics 365 Business Central data source provider, execution of the "G/L Entry" table would fail for tables with many records.

Fixed (Portal)

- Trying to configure the ODX Server service using a user account with an unverified e-mail address would result in a blank error in the ODX Configuration app. The error message now explains the issue and how to resolve it.
- Monthly usage details were not displaying correctly in the Portal.
- Comments were not being saved or displayed for certain entries in the activity log.
- Changing the email address of a user in the Portal required that the user was activated (invited/signed in). This requirement has been removed.
- On the Activities page, it was possible to select dates that do not make sense, e.g. a 'From' date in the future.
- It was possible to change your email address to the email address of another user.
- It was not possible to set up sign-in with social accounts even though it should only require a verified e-mail address.
- When listing new fields on the 'Update data source' page, new values in a drop-down field would not count as a change.
- The 'Team development' information icon was displayed in the wrong place on the Edit Instance page.
- The Delete button was positioned wrong on the Data Source page in the Portal.
- Trying to send a "critical notification" e-mail would result in an error if no username or password was provided.

Related products: TimeXtender Desktop, TimeXtender Portal

TimeXtender Desktop v. 6284.1

Today, we've published a minor release of TimeXtender Desktop (v. 6284.1) with the following changes:

Fixed

- 18896: Can't open Select Tables menu for data sources with large schemas. Added an option to default to advanced selection when creating or selecting tables from a data source.
- 18670: SSL field renaming not working for Tabular endpoint on Snowflake. Fixed an issue where renaming a field from a table on Snowflake storage would cause an error during execution.
- 18883: Job logs lack detailed information. We have addressed an issue where we failed to inform the user about job completion with errors or warnings. Now, when they check the job monitor, the status will be displayed as "completed" along with an indication of any issues encountered.
- 18838: Adding a Conditional Lookup Field to an SSL model gives a wrong field type until synchronized. When adding a Conditional Lookup Field to an SSL model, a wrong field type would be shown until the SSL model was synchronized.
- 18662: Data Factory resource names are not unique when using "copy instance". Fixed an issue where resources created for Data Factory would not be unique between instances created by the "copy instance" functionality. The naming convention has been changed to include an identifier for the instance in the Data Factory resource names.
- 18599: Data Lineage always uses dbo as the schema name when displaying object names. Data lineage was always using dbo as the schema. This has been corrected to show the actual schema being used.
- 18575: Default Hashing is referred to as Project Hashing and not Instance Hashing. Updated the incorrect display name of the Default Hashing option.
- 18720: Fix issue with MDW/SSL "copy to instance" corrupting the destination. Fixed an issue where using "copy to instance" for MDW and SSL instances could potentially corrupt the destination instance, if the destination instance was empty, due to multiple transactions.
- 18713: Improved the upgrade instance message for MDW and SSL instances.
- 18633: Incremental load with deletes not working with a data lake. Fixed an issue where incremental load with deletes didn't work in the ODX while using a data lake and no new data was transferred in the execution.
- 18619: Issue with ODX ProjectLockVersions check. Fixed an issue where a ProjectLock check was being run before the database was created.
- 18710: Issue with renaming script actions used in pre/post steps. Fixed an issue where pre/post steps could not be edited after changing the name of a script action used as the pre/post step.
- 18640: Issues with Lookup Transformation Templates. Fixed an issue where names were not qualified; an issue where generating the lookup transformation script would fail since the fixed join value was missing formatting and escaping; an issue where the lookup transformation script would fail because the template selection statement was not formatted and escaped; and an issue where the lookup transformation script would fail because the default fallback value was '' (empty string) for all data types. It is now NULL for all data types.
- 18688: Jobs sometimes select the wrong instance when using copied instances. Fixed an issue with jobs sometimes using the wrong instance in a "copy instance" setup.
  The execution packages in the job would use the first available instance across the copied instances when creating the job.
- 18610: Show error - details with incorrect redirect. Fixed an issue where the details button on warnings and errors would sometimes incorrectly redirect to the support site.
- 18585: Start page is not displaying the entire support site text. Fixed a UI issue where some of the text for the TimeXtender Support Site was not shown.
- 18682: The ODX sometimes uses old CData components. Fixed an issue where the ODX would sometimes use the wrong version of a managed ADO.NET component, because the identifiers of some external managed ADO.NET components were not unique.
- 18537: The Upgrade required pop-up appears with misaligned buttons. Fixed an issue with scaling on the Upgrade job repository confirmation button.
- 18571: Typo in Execution Server Configuration window. Fixed a typo in the Execution Server Configuration.
- 18907: Validate data source connection menu is not working. Removed the 'Validate Data Source Command' from ODX data sources, as the logic that happened in the command was automated elsewhere.
- 18787: You cannot add multiple schedules to a job at a time. Fixed an issue where trying to add multiple schedules to a job at the same time would only add one of them.

Related products: TimeXtender Desktop

TimeXtender 6221.1

Spring has sprung, and we're happy to announce the release of a new version of TimeXtender (Desktop v. 6221.1). See what we've been up to below.

Note: These release notes have been updated to reflect that the TimeXtender API is now live and no longer in closed beta.

New

- All semantic endpoints are now supported for Snowflake: If you have a data warehouse on Snowflake, you can now use it with all the semantic endpoints supported by TimeXtender. The Power BI, Tableau, and Tabular endpoints join Qlik and CSV file as supported endpoints for this type of data warehouse storage.
- SQL Server 2022 support: TimeXtender now supports the latest and greatest major release of Microsoft SQL Server for use as a data warehouse or ODX data storage.
- Official support for Amazon RDS for SQL Server: Amazon's cloud SQL Server offering is now officially supported for use as a data warehouse or ODX data storage. Some of our enterprising customers have already paved the way by just doing it, and we're happy to put the "officially supported" stamp on their endeavor.
- Easy data source provider updates: We've made it much simpler to update a data source provider to take advantage of new features or bug fixes. You'll now see an aptly named 'Update' button whenever an update is available. Previously, you'd have to add a new data source in the TimeXtender Portal and switch the connection in TimeXtender Desktop.
- TimeXtender API for integrating with external systems: As an important step in our march towards world domination, we've created an API that can be used by external systems that want to, among other things, trigger and monitor task executions (see the sketch at the end of this section). This feature can be compared to the feature in TimeXtender 20.10 and older that allows you to trigger an execution package from the command prompt.

Changed

- A job can now be scheduled multiple independent times.
- In the ODX, we've added support for data on demand for managed ADO.NET data sources.
- 'Show data types' has been implemented on semantic models.
- Tabular endpoints now show more details when an error occurs during execution.

Fixed

- Managed ADO.NET data sources now support multi-line properties that automatically add the correct line endings ('CR LF' or '\r\n').

Fixed (Portal)

- When adding or editing a Qlik endpoint, you would get a "some fields have invalid values" validation error.
- It was not possible to delete a data source if the name of the data source contained whitespace or special characters in a specific way.
- Add/edit/clone data sources would not show a loading spinner when loading the form.
- When cloning a data source, the 'Clone' submit button was not disabled if validation failed.
- Users on the Free tier could clone a data source to exceed the limit of data sources.
- Fixed various other issues with data source cloning.
- Minor tweaks and adjustments to the styling of the Add/Edit Instance forms.
- We fixed some technical debt relating to customer types left over from the implementation of the Free tier in our previous release.

Fixed (Desktop)

- A few outdated or incorrect icons have been changed.
- Data would be missing from the "valid" table on tables that had a specific setup with a mapping set, a primary key field, and a data selection rule.
- Configuring the Execution server would, in some cases, not take the lock on an instance.
- When a test notification failed, it would not give the user a useful error message.
- Changing a snippet didn't always update the script.
- Opening the Error view would result in an error in a specific setup involving the 'Keep field values up to date' option.
- Jobs would on rare occasions show execution packages from other instances if those instances were made as a copy of another instance.
- In the Add Jobs wizard, some text was truncated at the end.
- Fixed an issue with the "Execute ODX Data Factory Merge Transfer" step that caused data sources with 'Data on demand' enabled to fail or be skipped when transferring data from the ODX to the MDW using Azure Data Factory.
- Fixed an issue where excluding the "Execute ODX Data Factory Merge Transfer" step on execution was ignored and the step was executed anyway.
- Fixed an issue where transfers with Azure Data Factory from the ODX to the data warehouse did not set the batch count.
- Fixed an issue with transfers from the ODX to a data warehouse on Snowflake when the table had incremental load with updates enabled in the ODX.
- Resuming an execution would skip 'table insert' and 'related records' steps.
- Fixed a misleading label in the Table Settings window.
- For data warehouse storage, 'Additional connection properties' were not added to the connection string.
- After changing storage on a data warehouse instance from on-prem SQL Server to an Azure SQL database, deployment would fail because extended properties were not created for functions and views.
- For data warehouses on an Azure SQL database, 'custom table insert' requires the 'xact_abort' setting to be enabled, which it was not.
- When synchronizing a mapping set with lots of tables, the window would be bigger than the display, so you would not be able to see and click the buttons at the bottom.
- The CSV endpoint would always use UTF8-BOM encoding, ignoring the user's choice.
- It was possible to add fields from different source tables to a semantic model even though it should not be possible.
- In a semantic model, deleting a measure or a hierarchy that was included in a perspective would not clean up the perspective properly.
- In a semantic model, deleting a field that was included in a perspective would throw an error during deployment.
- In a semantic model, adding a field to a table when having a custom field would cause an error.
- In a semantic model, dynamic role security setup values were not reselected on edit.
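As a minimal sketch of the kind of integration the TimeXtender API enables, the snippet below triggers a task execution from an external system and polls for its status. The base URL, routes, payloads, and state names are hypothetical placeholders, not the documented API surface; see the official API documentation for the actual contract.

```python
import time
import requests

# Hypothetical placeholders -- not the documented TimeXtender API routes.
BASE_URL = "https://api.timextender.example.com"
HEADERS = {"Authorization": "Bearer <api-key>"}

# Trigger an execution of a task from an external system.
start = requests.post(
    f"{BASE_URL}/executions",
    headers=HEADERS,
    json={"instanceId": "...", "taskId": "..."},
    timeout=30,
)
start.raise_for_status()
execution_id = start.json()["id"]

# Poll until the execution reaches a terminal state (assumed state names).
while True:
    state = requests.get(
        f"{BASE_URL}/executions/{execution_id}", headers=HEADERS, timeout=30
    ).json()["state"]
    if state in ("Completed", "CompletedWithErrors", "Failed"):
        break
    time.sleep(10)

print(f"Execution {execution_id} finished with state {state}")
```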

Related products: TimeXtender Desktop, TimeXtender Portal

TimeXtender 6143.1

We've released a new version of TimeXtender (Desktop v. 6143.1) with a bunch of new features and even more fixes - see what's new below.

Warning: The new version of TimeXtender does not support version 11 of the following data source providers:

- Azure Data Factory - MySQL
- Azure Data Factory - Oracle
- Azure Data Factory - PostgreSQL
- Azure Data Factory - SQL Server

Please use version 12 of these data sources with the new release.

New

- Free tier replaces trials: You can now use TimeXtender for free for as long as you like without worrying about running out of credits. When you sign up for TimeXtender, you now start on the Free tier that never runs out but comes with a few limitations. Existing trial accounts will be converted to free. Limitations of the Free tier:
  - One user
  - One semantic model
  - One data warehouse
  - One ODX
  - One data source
  - Azure Data Lake Storage cannot be used for ODX storage
  - Dedicated SQL Pool (SQL DW) and Snowflake cannot be used for data warehouse storage
- Data warehouse on Snowflake: We've added support for Snowflake, and now, for the first time, you can deploy a TimeXtender data warehouse to non-SQL data storage and, of course, take advantage of Snowflake features. Our initial implementation requires an ODX that uses Azure Data Lake Storage with SAS authentication and only works with the Qlik and CSV file endpoints in the semantic layer. On the data warehouse, only features supported by simple mode are available. Read more on how to Use Snowflake as data warehouse.
- Improved scheduling (Desktop): You can now schedule execution packages from DWH and SSL instances in the same job. This is useful if you, for instance, want to execute a semantic model just after the relevant tables in your data warehouse. Note that the instances must be mapped to the same TimeXtender Execution Server service.
- On-demand data warehouse ingestion: When the data on demand option is enabled, the data source will refresh each table in the ODX storage before transferring it to the data warehouse storage. This works without configuring an explicit "transfer task" under the data source (see the sketch at the end of this section).

Changed (Portal)

- For consistency, we've added an 'Edit' button for each item on the 'Data sources' list.

Fixed (Portal)

- 17587: It was not possible to add a data warehouse with Azure AD as authentication (released as hotfix).
- 16800: 'Clone data source' had the wrong "breadcrumb".
- 17809: The input box for the 'Batch size' option on ODX and data warehouse storage would max out at 65536 when using the "up" button, which is far below the valid maximum value.
- 17485: The Permissions list is now hidden from 'Edit company details' when the list is empty.
- 17319: The Merge button is now disabled when you've clicked it to prevent accidental additional clicks.

Fixed (Desktop)

- 16902: Fixed misleading text in the Synchronize window when synchronizing a data warehouse with an ODX.
- 16686: An unnecessary 'Connection Changed' message could show up when using the Query Tool on the data warehouse.
- 17878: Fixed an issue where "resume execution" would skip Table Insert and Related Records.
- 16865: Data lineage for views in data warehouse to data warehouse fields was not working.
- 16599: Previewing a query table in the ODX sometimes wouldn't suggest the query table's statement, but would instead use "Select * from...".
- 16036: When reloading an instance using 'Save and Reload', the previously open tabs were not reopened accordingly. This has been fixed.
- 17482: Removing a table that was included in an Object Security Setup would cause the next deployment of that Object Security Setup to fail, as the references from the deleted table were still there.
- 16708: Using Export Deployment Steps to a CSV file would cause a null reference error.
- 17249: Allowing a table to be compressed could not be combined with having history enabled. Enabling page compression on such a table would result in the message "System field 'Is TombStone' cannot be removed".
- 16825: Data lineage tracing between a data warehouse view and a semantic model did not work. The semantic model did not track lineage through a mapped custom view.
- 17687: TimeXtender would crash when using the Deploy and Execute hotkey on views based on SQL snippets.
- 16704: Using Select Columns to remove columns from query tables would fail on execution when transferring from an ODX on Azure Data Lake Storage.
- 17653: The Edit Data Area dialog would allow more than 15 characters in the area name.
- 16645: Pressing Enter didn't call the search function in the remap table when remapping an ODX. This has now been corrected.
- 15407: A primary key validation error would remove all rows for the primary key in the valid table when using incremental load with hard deletes.
- 17148: It was not possible to change letter casing in the name of a conditional lookup field by clicking on the field and pressing the F2 "rename" keyboard shortcut.
- 17267: ODX data lake to Azure Synapse Dedicated SQL Pool incremental load with hard deletes resulted in valid table truncation. There was an issue where incremental load from an ODX using data lake storage to a data warehouse using Synapse Dedicated SQL Pool would not transfer primary keys when no new data existed in the ODX, which would cause the valid table to be truncated.
- 17591: Adding both pre- and post-steps on deployment for an incremental table would not redeploy the valid and incremental tables on "full load deploy".
- 16836: Trying to send a test mail in Notifications on Critical Errors would throw an error instead of sending an e-mail.
- 16729: Reconnecting to an Azure service in TimeXtender would fail after 12 hours without prior activity to the Azure service.
- 15995: Data lineage was missing information when a default relation was used instead of a join on a conditional lookup field.
- 17359: You would see an error message when testing a mail notification in Notifications on Critical Errors if the server returned "2.6.0 Queued mail for delivery", which isn't actually an error.
- 17115: SMTP authentication without a password did not work.
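The sketch below illustrates our reading of the on-demand ingestion flow described above: with the option enabled, executing a data warehouse table first refreshes the mapped tables in ODX storage. All function and parameter names are invented for illustration; they are not TimeXtender internals.

```python
# Conceptual sketch of 'data on demand' (names invented for illustration).
# With the option enabled, executing a data warehouse table first refreshes
# the mapped tables in ODX storage, so no explicit transfer task is needed.

def refresh_into_odx_storage(table: str) -> None:
    print(f"Refreshing {table} from the source into ODX storage")

def copy_from_odx_to_dw(table: str) -> None:
    print(f"Copying {table} from ODX storage into the data warehouse")

def execute_dw_table(mapped_odx_tables: list[str], data_on_demand: bool) -> None:
    for table in mapped_odx_tables:
        if data_on_demand:
            refresh_into_odx_storage(table)  # the implicit, on-demand transfer
        copy_from_odx_to_dw(table)

execute_dw_table(["Customers", "Orders"], data_on_demand=True)
```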

Related products: TimeXtender Desktop, TimeXtender Portal

Welcome to the new cloud-enabled version of TimeXtender!

Introduction

Welcome to the new cloud-enabled version of TimeXtender! The biggest change by far is the introduction of cloud-powered ODX, data warehouse, and semantic model instances and, with that, a turn towards a more web-based infrastructure and software-as-a-service business model. Instances are self-contained objects that can be strung together in the software to create the flow of data you need. No more messing with local repositories and projects; everything is managed by TimeXtender to create a more trouble-free experience.

What's new?

The new version of TimeXtender is a major reimagining of the application. All the changes are too many to list, but below is an overview of the major additions and changes. At the end of this article, you'll find links to all the new documentation - including screenshots - if you want to get into the nitty-gritty of the new release.

- Cloud-powered instances as building blocks of the Data Estate: As mentioned above, you build your TimeXtender Data Estate with ODX, data warehouse, and semantic model instances. They are created and managed by company admins in the TimeXtender Portal, and available to developers in TimeXtender Desktop when they've signed in. Instances are the basis for our new usage-based pricing model.
- Manage data sources in the Portal: Admins can set up data sources in the Portal. Data source credentials can be kept confidential while still allowing users to use the data source.
- User management and access control in the Portal: Users - and what instances they can access - are managed in the Portal.
- No more hassle with license keys and client secrets: Sign-in is now required to use TimeXtender Desktop. This means that you no longer need to enter license keys and activate the software. Setting up ODX services also makes use of sign-in to remove the need for messing with client secrets.
- Full documentation - incl. the ODX: You can now create documentation of your ODX just like you know it from the data warehouse.
- Context-sensitive help: In TimeXtender Desktop, you can click '?' in the title bar or press F1 in the different windows to be redirected to a relevant article on the support site, support.timextender.com. If there's no relevant article to show, you'll be redirected to the front page. We'll continuously improve the list of articles to ensure that there's a relevant article for the windows that spawn the most requests.
- ODX: Oracle data source: A TimeXtender-enhanced data source with date and numeric range conversion and exclusion of system schemas.
- ODX: Dynamics 365 Business Central data sources: In the TimeXtender-enhanced data source for Dynamics 365 Business Central, you can select accounts. This saves you the hassle of filtering out tables that belong to accounts you don't need. In addition to that, we have a data source for getting a table with Business Central "option values" that you can use to enrich your data model.
- ODX: Rollup data files in Azure Data Lake: If you're using an Azure Data Lake for your ODX storage, each incremental load from the source creates a new set of files with data. You can now set up storage management tasks to rollup - or merge - these files to improve performance when loading from the ODX. The rollup utilizes Azure Data Factory.
- ODX: Sync data sources through Azure Data Factory: You no longer need direct access from an ODX to sync a data source that you are transferring data from using Azure Data Factory. You can now use Azure Data Factory to sync data sources as well.
ODX: "Subtraction from value" option for incremental load You can now subtract a value or an amount of time from the incremental selection rule when you do incremental load, which is nice for making sure all changes are copied from the data source. MDW: Data profiling As a supplement to the plain preview option in TimeXtender, you can run a standard analysis on a table to get an overview of the data profile. MDW: Execute PowerShell Script as 'external step' You can execute PowerShell scripts as an 'external step' which provides endless possibilities for interacting with external components. MDW: Lookup transformation With a new kind of transformation, you can manage small or trivial lookups in a more effective way compared to conditional lookup fields. MDW: Multiple ODX table mappings based on filters You can now map multiple tables into one data warehouse table from the ODX based on filters. MDW: Support for multiple ODXs as sources for one data warehouse You can now use data from multiple ODXs in one data warehouse. SSL: Support for Power BI XMLA endpoint We've added a new semantic endpoint to semantic models, Power BI Premium, to support the Power BI Premium XMLA Read/Write Endpoint. SSL: Semantic endpoint for CSV files We've added a new endpoint to enable export to CSV files. This was previously enabled by Data Export, which we have removed from the semantic layer.Other changesIn addition to all the new stuff, we have changed or removed a bunch of the existing functionality in TimeXtender. That includes the following:Improved synchronization from MDW->SSL and ODX->MDW Synchronization has been improved to give you a better overview of changes with the option to accept or reject changes as well as remapping fields in bulk. ODX: Improvements to table selection You can now select tables for copying to the ODX from a list, much the same way as you might be used to from the business unit. This is an addition to the current rule-based system that you use to select what tables to copy from a data source in the ODX can be cumbersome to use when you're only interested in a few specific tables or the data source naming schema is too "chaotic" for rules to make sense. ODX: Improved creation of ADF pipelines for transfer between for data lake and data warehouse We've changed the logic for transferring data from an ODX data storage on Azure Data Lake to a data warehouse through Azure Data Factory to use fewer pipelines which improves performance and reduces cost. MDW: Removed SSIS as a transfer option The MDW layer has been simplified by removing SSIS as a transfer option. MDW: Goodbye to Business Units Using the ODX has long been the preferred way to copy data from data sources. In the new version, it will be the only way, which means a goodbye to the legacy Business Units. SSL: Goodbye to SSAS Multidimensional (OLAP) Cubes, the Qlik modeler, and Data Export We've simplified the semantic layer by removing everything that is not semantic models built in the shared semantic layer. Removed deprecated features We've removed all previously deprecated features. 
  This includes the following: Azure Data Lake Storage Gen1 as ODX data storage; regular expressions in the ODX; 'data aggregations' on data warehouse tables; the 'Enable BK hash key' and 'Use left outer join' options on data warehouse tables; the 'Force sub select' and 'Use temporary table' options on conditional lookup fields; the 'SQL mode->Partition By' option on lookup fields; the 'split' and 'concatenate' options on field-to-field data movement; and 'time tables' in the data warehouse.

Next steps - planned features

When you build software, you're never truly done, and we also have a bunch of stuff planned for the next releases. This includes the following:

- Improved multiple environments: Currently, you can copy one instance to another instance as a basic form of multiple environments functionality. However, we plan to implement much more extensive support for multiple environments. It will be available in the web interface and will include updating the relevant connections to save manual work.
- End-to-end scheduling: We plan to offer the option to execute and schedule ODX tasks and MDW/SSL execution packages together in one job that can automatically calculate dependencies between the included objects to ensure the correct execution order.
- Automated migration: The new TimeXtender contains some big and breaking changes, but we will provide a tool that allows you to migrate projects from previous versions of TimeXtender into the new structure in an automated fashion.
- Custom(er) data sources (Open Interface Data Source): With a new interface, you can create your own data source providers for the ODX.

Learn more about all the new features

Our customer success team has been hard at work documenting all the new stuff. Below you'll find links to the articles they've created - click one and take a deep dive into the latest release of the world's premier data estate builder!

Getting Started - Setup TimeXtender
- Install TimeXtender
- Setup and Configure an ODX Instance
- TimeXtender Prerequisites
- Configure your Firewall
- Copying Instances to Implement Multiple Environments

Getting Started - Configure Azure Services
- Use Azure Data Factory for Data Movement

Knowledge Base - Connecting to Data
- Tasks in an ODX Instance (includes the incremental rollup storage management task)
- Table and Column Selection in an ODX Instance
- Transfer data from an ODX to a Data Warehouse Instance
- TimeXtender Dynamics 365 Business Central
- TimeXtender Oracle Data Source provider for ODX

Knowledge Base - TimeXtender Portal
- Grant and Revoke Access to Instances
- Add an ODX Instance
- Add a Data Warehouse instance
- Add a Semantic Model Instance (includes CSV endpoint)
- Credits and Billing

Knowledge Base - Incremental Load, Execution & Scheduling
- Execute PowerShell Script as External Executable
- Scheduling Executions using Jobs
- Incremental load in Data Warehouse Instances
- Incremental load in an ODX instance

Knowledge Base - Data Validation, Quality, and Profiling
- Documentation for Instances (includes documentation for ODX instances)

Knowledge Base - Design, Modelling and Transformations
- Lookup Transformation Template
- Mapping Set
- Data Profiling

Knowledge Base - Semantic Models
- Semantic Model Synchronization
- Power BI XMLA Endpoint
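As a minimal sketch of the "subtraction from value" idea referenced earlier, the snippet below widens the incremental window by subtracting a grace period from the last loaded maximum before selecting changed rows. The field names, the stored value, and the query shape are illustrative assumptions, not TimeXtender's actual implementation.

```python
from datetime import datetime, timedelta

# Value stored after the previous incremental load (illustrative).
last_loaded_max = datetime(2022, 11, 30, 23, 45)

# The configured subtraction widens the selection window so late-arriving
# changes are still picked up on the next load.
subtraction = timedelta(days=2)
lower_bound = last_loaded_max - subtraction

# Illustrative selection rule the load would apply against the source.
query = (
    "SELECT * FROM Sales "
    f"WHERE ModifiedDate > '{lower_bound:%Y-%m-%d %H:%M:%S}'"
)
print(query)
```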

Related products: TimeXtender Desktop, TimeXtender Portal