
TimeXtender Data Integration 6814.1

We hope you've had time to put the pumpkins away, because now it's time for a new major release of TimeXtender (v. 6814.1). The release focuses on data ingestion, with improved synchronization from data source to Ingest instance, new data source providers, and better orchestration and scheduling - but that's not all; check out the list below!

New

- Redesigned metadata synchronization and table selection: We've completely reimagined how you manage metadata and select tables in the Ingest layer. With these changes, we aim to make it easier to combat schema drift, e.g. when you change data source providers, and put you in firm control of what goes into your Ingest storage. 'Synchronize Tasks' are now known as 'Metadata Import Tasks' and will no longer do a full synchronization of the data source. Instead, they import the metadata from the data source and store it in the data storage of the Ingest instance. The Data Source Explorer has become the Metadata Manager and is now the place for synchronizing data sources - selecting and mapping tables in the data source to tables in the Ingest storage - all based on the metadata imported by the Metadata Import Tasks.
- Easier orchestration with synchronization from TDI: Your transfer tasks and execution packages in TimeXtender Data Integration can now be synchronized with TimeXtender Orchestration for more feature-rich orchestration and scheduling than is possible with Jobs in TDI. To get started, grab an API key from the TDI Portal and use it to create a new "TimeXtender Data Integration" data provider in TimeXtender Orchestration.
- Redesigned Instances page: We've redecorated the Instances page to make it easier to use. Among the changes are a new list view to complement the card-based view, collapsible cards to help you focus on the environments you're working on, and a consolidated toolbar with a search box and buttons to add instances and manage environments.
- Prepare instance on Microsoft Fabric Lakehouse: You can now use Fabric Lakehouse as Prepare instance storage. However, in this first version, the functionality for Prepare instances on Fabric Lakehouse is limited to what's possible with Simple Mode enabled.
- New data sources: In our quest to make connecting to data sources easier and more consistent, we're ready with three new TimeXtender-branded data source providers: Parquet (similar to the existing CSV and Excel providers), OData (similar to the existing REST provider), and Finance & Operations OneLake, which supports transferring data to Ingest instances using Azure Data Lake Gen2 or Fabric storage. If both the Ingest and Prepare instances use Fabric storage, the data will bypass the Ingest storage and be transferred directly into the Prepare storage, leading to better performance and saved storage space.
- Bring instances back from the dead: Possibly inspired by the recent Halloween spookiness, we've implemented a soft delete feature for instances. You can now restore a deleted instance for up to 30 days after deletion.

Improvements

- The Migrate Instance modal has been restructured into steps, includes a review section, and lets you select the source instance and environment in the modal.
- In the top-right corner of the TDI Portal, you'll now find a nine-dot menu for easy navigation to TimeXtender MDM, TimeXtender DQ, and TimeXtender Orchestration.
- A banner on the Home page will now let you know about upcoming system maintenance.
- The Upgrade data source page has received a new coat of paint to match the new TDI Portal design.
- On CSV data sources, you can now define custom null values, such as "N/A" and "-", in the aptly named "Null Values" field.
- On SAP Table data sources, we have added a table name filter that makes it possible to filter out irrelevant tables before you even see them in TDI. This can make importing metadata from the source much faster and makes it easier to manage the notoriously large number of tables in SAP.
- To prevent accidental password leakage, we've applied password protection to more relevant fields in the TimeXtender-branded data source providers.
- You can now connect to Azure Blob Storage (or ADLS) using principal user credentials. This applies to the TimeXtender-branded CSV, Excel, and Parquet data sources.
- We've made the Ingest authentication refresh logic more robust to prevent potential issues.
- We've changed SQL queries to include a 30-second command timeout, preventing client lockups during cloud database issues, and improved TimeXtender Data Integration logging for clearer task tracking.
- When you upgrade TimeXtender Data Integration, you can now see more information about what is being imported from the old version in the first run of the new version.

Fixed

- On the Migrations page in the TDI Portal, cards now accommodate longer instance names.
- On the Instances page in the TDI Portal, a non-data estate admin user would sometimes get "User not authorized" or "Missing data estate permission" errors.
- In the TDI Portal, Test Connection would return "successful connection" for non-existing paths in cloud-type locations (AWS, Azure, GCS).
- In TimeXtender Data Integration, we have improved the visualization of invalid data sources under Ingest instances. They'll now have "(invalid)" appended to their name, which will be displayed in red.
- Fixed a "Task was canceled" error when opening TimeXtender Data Integration with over 250 instances and adjusted the HTTP timeout settings to improve loading.
- Using the integrate existing objects feature in TimeXtender Data Integration would sometimes cause a "duplicate key" error due to unfiltered duplicate keys. Duplicate keys are now properly handled to prevent this error.
- In TimeXtender Data Integration, we fixed an issue with a radio button that prevented you from switching between the valid and raw tables when creating indexes.
- In the Filter Rows window in TimeXtender Data Integration, you could click the Preview button even when the data source did not support preview.
- In TimeXtender Data Integration, we fixed an issue where changes in Edit SQL Snippet Transformation were not being saved.
- In TimeXtender Data Integration, we have improved the message displayed when an error is thrown on Reports > Errors.
- In TimeXtender Data Integration, tables with selection rules would fail when dragged from one data area to another on a Prepare instance that uses Snowflake as storage.
- In TimeXtender Data Integration 6766.1, SAP data sources experienced degraded performance due to the accidental release of a 32-bit version of the TXIntegrationServices component.
- We updated the stored procedures for executing Prepare instances to sort data by 'DW_ODXBatchNumber' for insertion into the valid table during a full load. If 'DW_ODXBatchNumber' is not available, sorting defaults to [DW_Id] in ascending order.
- The execution of execution packages would sometimes fail with the error "terminated unexpectedly". To solve the issue, we made the access token refresh logic more robust. It now permits refreshes up to 4 hours before expiration, incorporates retries for failed attempts, and includes an automatic refresh when the execution service restarts.
- The Execution Service would ignore proxy settings when executing packages, which could result in misleading error descriptions for the end user.
- The TimeXtender REST data source provider now handles empty property names, property names that start or end with a colon, and property names with more than one colon.
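The refresh-ahead approach described in the access token fix can be pictured like this - a minimal Python sketch of the general pattern, not TimeXtender's actual implementation; the retry count and the `fetch_token` callback are assumptions for illustration:

```python
import time

REFRESH_WINDOW = 4 * 60 * 60  # refresh up to 4 hours before expiration
MAX_RETRIES = 3               # assumed retry count for failed attempts

class TokenCache:
    def __init__(self, fetch_token):
        # fetch_token() -> (token, expires_at_epoch_seconds); hypothetical callback
        self._fetch = fetch_token
        self._token = None
        self._expires_at = 0.0

    def get(self):
        now = time.time()
        # Refresh proactively once we're inside the window before expiry.
        if self._token is None or self._expires_at - now <= REFRESH_WINDOW:
            last_err = None
            for _ in range(MAX_RETRIES):
                try:
                    self._token, self._expires_at = self._fetch()
                    break
                except Exception as err:  # retry failed refresh attempts
                    last_err = err
            else:
                raise RuntimeError("token refresh failed") from last_err
        return self._token
```

Refreshing well ahead of expiry means a token that is still technically valid never expires mid-execution, which is what caused the "terminated unexpectedly" failures.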

Related products: TimeXtender Data Integration

TimeXtender Data Integration 6766.1

Today, we’ve published a hotfix release of TimeXtender Data Integration and TimeXtender Ingest Service (v. 6766.1) that contains the changes listed below.

Fixed

- Replaced the lock with a mutex to prevent thread conflicts in the workspace.
- Fixed a syntax error in the generated Snowflake script caused by an incorrectly placed semicolon in the data cleansing procedure.
- Fixed an issue where the max degree of Deliver instance parallelism reset to 1 after reloading the execution service configuration.
- Resolved an issue preventing new fields from being added to a DW table with existing mappings.
- Fixed migration errors between Prepare instances due to missing extended properties.
- Updated the fiscal week calculation to fix a month comparison issue.
- Fixed a loading issue with the Custom View Tracing Object when views were split across multiple data warehouses.
- Updated the instance list order in the Ingest Service Configuration tool.
- Fixed an issue in Synchronize with Ingest Instance where fields weren't auto-selected if field names had changed.
- Fixed a UI issue where the persist view info box was visible by default and the icon was misaligned when resizing the dialog.
- Resolved an issue where the parameter rename dialog in the custom view script was partially obscured by Windows scaling.
- Resolved an issue where job statuses failed to update after task completion by implementing retry logic.
- Fixed an issue where the dynamic role-level security table was not included in "end-to-end" Dynamic Perspectives.
- Fixed a missing command timeout for Dynamic Security queries against the Deliver storage.
- Fixed an issue where adding a private data source (ADO.NET) threw an error saying the assembly file was not found.
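Retry logic, as used in the job-status fix above, generally looks like the following - an illustrative Python sketch with assumed attempt counts and delays, not the shipped implementation:

```python
import time

def with_retries(action, attempts=3, delay=0.1):
    """Run action(); on failure, retry with a fixed delay between attempts.
    The attempt count and delay are assumed values for illustration."""
    for attempt in range(1, attempts + 1):
        try:
            return action()
        except Exception:
            if attempt == attempts:
                raise          # give up after the final attempt
            time.sleep(delay)  # brief pause before retrying
```

A transient failure (e.g. a status update racing the task's completion) then succeeds on a later attempt instead of leaving the job status stale.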

Related products: TimeXtender Data Integration

TimeXtender Data Integration 6744.1

Today, we’ve published a new release of TimeXtender (v. 6744.1) that contains the changes listed below.

New

- Product renaming:
  - TimeXtender Desktop is now known as TimeXtender Data Integration (TDI).
  - TimeXtender ODX Service is now known as TimeXtender Ingest Service (TIS).
  - ODX is now known as Ingest.
  - MDW is now known as Prepare.
  - SSL is now known as Deliver.
- String aggregation as an option for aggregation tables. The output will be separated by a comma and ordered by the content of the column.
- Added persist view functionality, which will persist a view as a table.
- Introduced a filter for available data providers to improve data source selection.
- Added a feature that allows users to clone instances for more efficient management.
- Added hidden fields support in data source connection forms.
- In the Execution Service Configuration tool, it's now possible to set how many parallel executions of Deliver instances are allowed at a time. The default is 1.

Improved

- The TimeXtender database cleanup tool can now run without a data area existing.
- The solution explorer in the TDI client will now remember if a node is collapsed when refreshing the solution tree.
- Changed the error message for opening incompatible instances.
- The Environments page has been renamed to the 'Migrations' page at /migrations, and all non-transfer related functionality has been removed. The Instances page has been merged with the Environments page, now accessible at /instances, featuring drag-and-drop functionality for moving instances between environments.
- A new design has been implemented for data tables.
- Improved the loading speed of the organization table.

Fixed

- Double-clicking a view field in data lineage would not select the view field node.
- Shutting down TDI (TimeXtender client) with steps in the execution queue would ask if you wish to exit, but would exit if you clicked No instead of Yes.
- Resizing the column width in the TIS (ODX) incremental load dialog was disabled.
- Dragging a table node from the semantic source database browser to the semantic tables node did not work.
- Dragging a field node from the semantic source database browser to a semantic table node did not work.
- The context menu for editing an environment was shown on the default environment in the TDI (TimeXtender client).
- The buttons for showing schemas on the TIS (ODX) ADO and OLE DB data source advanced settings were hidden.
- Null reference error when validating a Deliver (SSL) Qlik endpoint of the type QlikView while using Snowflake as the source Prepare (MDW) instance.
- Wrong label for the table distribution setting for Dedicated SQL Pool.
- Synchronizing the Prepare (MDW) instance with an Ingest (ODX) instance would show tables as missing when the data source had "data on demand" enabled and no execution tasks were added on the data source.
- Searching for tables for the mapping sets in the Prepare (MDW) instance would not find tables where the data source had "data on demand" enabled and no execution tasks were added on the data source.
- Listing constraint suggestions on the Prepare (MDW) instance would not find tables where the data source had "data on demand" enabled and no execution tasks were added on the data source.
- Adding new view fields to a standard view was blocked after selecting a field in the overview pane.
- Some Ingest (ODX) instance execution logs sometimes went missing due to a timing issue in the Ingest (ODX) execution logic.
- Potential null reference error when two users concurrently committed a change on an Ingest (ODX) data source or task.
- Some semantic dialogs could only be resized horizontally.
- Tabular security DAX was incorrect when having more than one setup.
- Moving Ingest (ODX) data type override rules up and down sometimes caused an index out of bounds exception.
- A circular reference in data lineage with history tables caused the data lineage to run forever.
- The label in warning and error report dialogs showed database instead of data area.
- When adding a private data source (ADO.NET), an error saying the assembly file was not found was thrown.
- When right-clicking a table with a mapping type of Integration Table, the client sometimes crashed.
- Updated the layout of the Deliver (SSL) instance Role dialog.
- Fixed an issue where the TIS (ODX Service) would get stuck in the 'Starting' state if the authentication token could not be renewed.
- Addressed alignment issues with dropdowns and the search icon in multi-select filters.
- Corrected the incorrect tooltip displayed for the delete instance button.
- Resolved an issue where the hand cursor appeared on mouse-over of an instance on the Environments page, but clicking did not trigger any action.
- Fixed an issue where it was not possible to save a Prepare (MDW) instance after previously experiencing an error.
- Fixed a bug that caused the customer table not to refresh after deleting a customer.
- Resolved an issue that made it impossible to delete some data source connections because of hidden double spaces in the name.
- Fixed an issue where important setup properties were accidentally overwritten during updates, and ensured that the connection string is securely encrypted when edited.
- Fixed an issue to ensure that repository information and related secrets stay properly aligned and consistent.
- Resolved an issue that caused organization creation to fail due to incorrect data handling.
- Ensured that comments are now properly added to the activity log when removing a company, and removed duplicate comments that appeared when deleting an organization.
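The string aggregation option for aggregation tables described under New - comma-separated output, ordered by the content of the column - can be pictured with this Python sketch; the function name is invented for illustration, and TimeXtender itself generates the equivalent SQL (e.g. STRING_AGG) rather than running code like this:

```python
def string_aggregate(values, sep=","):
    # Comma-separated output, ordered by the column's content, as the
    # release notes describe; the exact separator handling is assumed.
    return sep.join(sorted(str(v) for v in values))
```

For example, aggregating the group `["beta", "alpha", "gamma"]` yields `"alpha,beta,gamma"`.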

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender Desktop 6691.1

Today, we've published a hotfix release of TimeXtender Desktop (v. 6691.1) that contains the changes listed below.

Fixed

Desktop

- 21596: Can't deploy DWH with service principal authentication. Fixed an issue where service principal authentication was failing when deploying a SQL data warehouse.
- 21604: SaaS data selection rules - source vs. destination fields. Fixed an issue where the data selection rule pane showed destination field names instead of source field names in table mappings.
- 21641: Implement 'WITH RECOMPILE' in Direct Read procedures. All data movement procedures for moving data between data areas are now created with 'WITH RECOMPILE'. This will be picked up by the differential deployment.
- 21673: Fixed an issue when connecting to MDW SQL storage when it is an Azure SQL Database.
- 21679: Duplicated advanced features on SQL/Azure DWH. Fixed an issue where some advanced DWH features were listed twice.
- 21706: SSL instances upgraded from 1.0.0.0 to 2.0.0.0 were missing extended properties on one table. Fixed an issue in the SSL repository where semantic notifications were not copied when using "Copy instance" in the Portal. This resulted in the SSL not being able to open if references to semantic notifications were made in one or more semantic execution packages.
- 21650: Integrate Existing Objects wizard. Fixed an issue which caused the wizard to fail when the MDW used a case-sensitive collation. Fixed an issue where having multiple objects with the same name, but in different schemas, would cause errors when running the wizard.
- Fixed a missing command timeout for the MDW when running queries.

Related products: TimeXtender Data Integration

TimeXtender Desktop 6675.2

Today, we’ve published a new release of TimeXtender Desktop with the following changes:

New

- Redesigned TimeXtender Portal UI with a new layout, colors, and dark mode: We've remodeled the Portal and given it a fresh coat of paint to enhance both the look and the user experience. The new design features a collapsible left-side menu for the features related to the data flow, while user account settings, support, and admin options live in the revamped top menu. In addition, the new colors give the Portal a fresh and modern look, and on top of that, we've added a dark mode for those who prefer to turn down the light a bit. The new colors are complemented by new, lighter icons and a new, more readable font. In our quest for greater consistency across the suite, Exmon Turnkey has been updated to use the same colors, font, and icons as the Portal.
- Shared login for TimeXtender and Exmon: You can now use the same login for TimeXtender and Exmon (web and the desktop DG and DM products). Less hassle, and one less password to remember! However, we haven't centralized company accounts just yet, so if you're not using Exmon already, you'll still have to have an Exmon account created for you. The same applies, of course, if you're using Exmon but not TimeXtender.
- Keep destination settings when you transfer an instance: You can now choose if you want to override security, roles, and notifications in the destination instance when you transfer an instance in Environments. The first time you transfer between two instances, you must override the destination settings, but on subsequent transfers, you decide. Previously, these settings would always be overridden.
- Map endpoints when transferring a semantic model: Related to the improvement listed above, you can now map semantic endpoints when transferring one semantic model instance to another. The endpoints must be of the same type. Previously, the endpoints in the destination instance would have been overridden.
- Integrate existing data warehouses in TimeXtender: With the new Integrate Existing Objects feature, you can easily use data from your old data warehouse even before you've converted it to a TimeXtender data warehouse - or if converting the old data warehouse isn't feasible. Any non-TimeXtender table that happens to be in your data warehouse storage can be integrated into the TimeXtender data warehouse instance. If you're using Xpert BI (acquired by TimeXtender in 2023), you can import additional metadata for the tables in the form of descriptions and tags.
- New data source providers for Excel and CSV files: With the new native data source providers, getting data out of Excel and CSV files just got a lot easier.

Improved

- Firewall rules can now be configured on the aptly named Firewall Rules page under Data Estate instead of on the individual instance's details page. This way, it's easier to get an overview of firewall rules across all instances.
- You no longer need to run the ODX Service Configuration tool on the destination server after transferring an ODX instance under Environments. Instead, you simply need to restart the ODX service.
- Listing instances in TimeXtender Desktop is now a lot faster.
- Service requests from user-installed software will now include custom headers to ease support cases.
- When you're using Snowflake as data warehouse storage, aggregate tables, table inserts, and custom table inserts are now supported.
- When you're using Snowflake as data warehouse storage, deployment is significantly faster.
- You can now use Windows, Entra Password, Entra Integrated, and Entra Service Principal authentication for ODX SQL storage in addition to the existing SQL Server authentication.
- You can now use Entra Service Principal authentication for data warehouse SQL storage connections.
- Added strict encryption support for ODX and data warehouse SQL storage (SQL Server 2022 and Azure SQL Database).

Fixed

Portal

- Optimized environment page load times.
- Optimized customer table load times.

Desktop

- Jobs that were not completed did not set their state to 'Failed' after a restart.
- Fixed an issue where a Fabric workspace name containing spaces would make the ODX Fabric Lakehouse unusable.
- On an ODX, adding an incremental rule with updates and deletes to an empty table resulted in an error.
- Fixed a performance issue with the CSV semantic endpoint for models that contained tables with lots of rows.
- Parameters would be removed from custom views created using drag-and-drop between two data areas.
- In the Performance Recommendations window, the info icons were not properly aligned.
- In the Selection Rules pane on mapping tables, some fields, including conditional lookup fields and system fields, would be missing for tables from another data area.
- Fixed an issue where dragging tables from a 'TimeXtender Dynamics 365 Business Central - SQL Server' or 'TimeXtender Dynamics 365 Finance - SQL Server' data source into the ODX's query areas would result in nothing happening.
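Custom headers on service requests, mentioned under Improved, are a common way to let support correlate a request with the client build that sent it. A rough Python illustration using the standard library - the header names and URL here are invented examples, not the headers TimeXtender actually sends:

```python
import urllib.request

def make_request(url, client_version):
    # Attach identifying headers so support can match a request to a
    # client build; header names are illustrative assumptions.
    return urllib.request.Request(
        url,
        headers={
            "User-Agent": "TimeXtenderDesktop/" + client_version,
            "X-Client-Version": client_version,
        },
    )
```

The server side can then log these headers alongside errors, which is what "ease support cases" amounts to in practice.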

Related products: TimeXtender Data Integration

TimeXtender 6618.1

It's been just one month since our last major release of TimeXtender, but we already have a new release ready for you. This release, however, is all about consolidation. It doesn't contain a lot of new features. Instead, we've been busy improving the existing features and fixing various bugs.

Improved

- Easier selection for data source connections: When you set up a new data source connection in the Portal, you can now choose the provider and the version of the provider from separate lists, which makes it much easier to get an overview of the provider list. With the possible exception that proves the rule, the TimeXtender-branded providers will be the best choice when multiple providers are available for the same data source. For that reason, we've also created a Recommended category in the list for our "homemade" providers.
- More complete data lineage with improved UI: Data lineage now traces through aggregate tables, measures, and lookups. We've simplified the UI to give more space to the actual content, and a dearly missed feature from the 20.10 branch returns: you can now double-click an object to jump to that object in the main window. To facilitate that, the Data Lineage window is now non-modal, which means that it can be opened and used next to the main window.
- Links from Desktop to Portal for better integration of the two: In TimeXtender, you sometimes need to go back and forth between the Desktop and the Portal a lot, especially when you set up new data sources. To make the Desktop and the Portal feel more integrated, we've added links in Desktop that open the relevant page in the Portal, e.g. for adding a new data source connection, managing environments, adding a new instance, and managing semantic model endpoints.
- REST API table builder: Taking data from a REST API and transforming it into a table structure can be a bit of a hassle unless you really like writing XSLT scripts. Our REST API data source provider now includes a tool that can generate the script for you. You just need to drag and drop the structure you'd like.
- Data sources that use a connection string (e.g. ODBC, OLE DB, TimeXtender SQL, TimeXtender Oracle) now support adding additional connection string properties.
- You can now use the TimeXtender REST data source when you use Data Fabric as your ODX storage.
- You can now set the background job timeout in the TimeXtender SAP Table data source.
- The TimeXtender Oracle data source now supports the ODP.NET provider, which has improved performance.
- It's now possible to change the security filtering direction on semantic relations for Tabular/Power BI.
- The character limit on FormatString in a calculation group item on a semantic model has been increased to 1000 characters, and the field is now a multiline textbox.
- We've improved the Tabular model generated by the Tabular semantic endpoint for better efficiency and a smaller model size.

Fixed

Portal

- Password fields were unmasking incorrectly.
- Fixed an error that would occur when accepting the license agreement.
- Fixed an issue where the connection cache was duplicated.
- Data source connections with a null category would fail to render.
- Error messages for transferring instances have been improved in case of timeout issues.
- Optimized calls to retrieve instance lists on the Portal frontend.
- Optimized calls to retrieve data source connections on the Portal frontend.
- Fixed the password field on the Add/Edit Semantic Instance page not showing the correct string when unmasked.

Desktop

- SSL: Previously, decimal fields were being deployed as double. This has been corrected.
- SSL: Removed the option to include tables and fields in a Custom Field script other than the table the field is being created on, as it didn't work and would just be empty anyway.
- MDW: Using an aggregate table on SQL Synapse would fail during execution due to a wrong script. This has been corrected.
- TimeXtender SAP Table data source: Fixed an issue where subtraction settings were not applied on incremental transfers.
- Fixed an issue where deployment would fail when disabling physical tables on a Snowflake data warehouse.
- Fixed an issue where ODX transfer tasks were blocking other tasks from running concurrently during bulk inserts.
- Desktop proxy settings were not passed from the Execution Service to TimeXtender.
- Fixed an issue where open tabs did not refresh on Save and Reload.
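The additional connection string properties mentioned under Improved boil down to appending key-value pairs to the provider's base connection string. A rough Python illustration of the idea - the function and the property names are examples for this sketch, not a TimeXtender API:

```python
def extend_connection_string(base, extra):
    """Append extra key=value properties to a semicolon-delimited
    connection string, overriding any keys already present."""
    parts = dict(
        p.split("=", 1) for p in base.split(";") if p.strip()
    )
    parts.update(extra)  # later properties win over the base string
    return ";".join(f"{k}={v}" for k, v in parts.items())
```

For example, `extend_connection_string("Driver=SQL Server;Server=srv01", {"Encrypt": "yes"})` produces `Driver=SQL Server;Server=srv01;Encrypt=yes`.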

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender 6590.1

It's officially spring in the northern hemisphere, and incidentally, we have a bouquet of features and improvements ready for you in TimeXtender 6590.1.

New

- ODX on OneLake: We are excited to introduce Microsoft Fabric OneLake as ODX storage on our platform. This enhancement enables users to seamlessly harness OneLake for all ODX operations, from initial setup and configuration in the Portal to comprehensive integration within TimeXtender workflows. This is the first of many planned integrations with Microsoft Fabric, so stay tuned! Note, however, that you currently cannot use OneLake as your ODX storage if you use Snowflake as your data warehouse storage.
- New data source provider for OneLake: A new data source provider for seamless and efficient ingestion of delta parquet tables from Microsoft Fabric OneLake, directly into your preferred storage solution via the TimeXtender ODX. Optimized for ingesting data from OneLake, this feature is not intended for use with Fabric OneLake ODX storage.
- Publish data as a REST API endpoint: Added a new semantic endpoint, REST API, that works together with a server component installed on-premises or on a virtual machine in the cloud to publish data through REST API endpoints. As getting data through a REST API is a very common use case, the new endpoint type opens up a host of opportunities for integrating TimeXtender with other tools. In our previous major release, 6505, we introduced a new REST data source provider. This means that you can now both publish and ingest data from your TimeXtender solution through a REST API using first-party components.
- New and improved data source providers for Hubspot and Exact Online: The new providers dramatically improve upon the previous CData options with enhanced usability and performance. These new options allow you to easily add custom endpoints and flatten complex tables. To upgrade to the new connectors today, just search for "TimeXtender Hubspot" or "TimeXtender Exact" when adding a new data source connection. Then, in the ODX, you can edit an existing data source configuration and change to the new TimeXtender data source connection. Read more about editing data sources.

Improved

- You can now have multiple data warehouse instances open at the same time in Desktop.
- We've reshuffled the shortcut menu on data sources in ODX instances. "Add Data Source" now redirects to the 'Add data source connection' page in the Portal, while the previous "Add Data Source" functionality is now "Map Existing Connection". The intention is to make it clearer that adding a brand new data source happens in the Portal, while "adding" a data source in Desktop means using one of the data source connections mapped to the instance in the Portal.
- We've upgraded a lot of our underlying UI framework. You might notice a few changes and improvements to the Portal UI as a result.
- When adding firewall rules to instances, your own IP is now automatically suggested.

Fixed (Portal)

- Fixed an issue where data source connection categories would not be shown if the category was assigned a null value.
- Fixed an issue where MDW/SSL transfers could lead to errors.
- Cloning TimeXtender REST data sources could lead to incorrect data in password fields.
- The ODX connection timeout value did not get set correctly.
- Changes to users would not get propagated to the identity provider.
- Inputs did not get correctly disabled on the SSL Qlik endpoint form.
- Disabled fields would show up as required on the ODX form.
- Fixed an issue where the SSL form would sometimes incorrectly flag inputs as invalid, making it impossible to save.
- Fixed an incorrect short name suggestion on data source mapping.

Fixed (Desktop)

- Default settings for key stores were not remembered in the key generation menu for data warehouse instances using SQL Synapse or Snowflake.
- The parameter '%Instance%' was not working for email notifications.
- The CSV endpoint was not escaping the text qualifier.
- On-demand execution from ODX to data warehouse would fail when having the same ODX table in a data area multiple times while using ADF to transfer from an ODX ADLS2 storage.
- ODX to data warehouse transfer would fail when using ADF and having the same table mapped multiple times, but using different columns in each table.
- When using a custom field as a parameter in a custom measure or calculation group item in a semantic model, the parameter would disappear after closing and reopening the semantic model instance and editing the custom measure or calculation group item.
- Cleaning up mapped objects between an ODX and a data warehouse would never clean up the instance mapping, causing Copy Instance in the Portal to always request a remapping of an instance even though it was no longer used.
- Fixed an issue in the semantic model repository upgrade scripts where calculation group and custom script tables were not marked as tables to be included in instance copying.
- Fixed an issue where jobs completed with errors or warnings were incorrectly displayed as "Completed". They are now accurately labeled as "Completed With Errors or Warnings".
- Custom data on a data warehouse table did not rename the custom data column when the corresponding field was renamed.
- The data warehouse field validation type 'Is Empty' would mark empty string values as invalid.
- Fixed an issue where raw-only fields were included in the Data Movement pane.
- Fixed an issue where preview in Filter Rows in the ODX would always select the first table.
- Fixed an issue where incremental transfer from an ADLS2 ODX with the 'Limit memory use' setting failed for empty tables.
- Fixed an issue where transfers from an ADLS2 ODX to SQL Synapse would fail if the source table contained reserved system field names.
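Text qualifier escaping, the subject of the CSV endpoint fix above, is conventionally done by doubling the qualifier inside quoted values (the RFC 4180 convention). A quick Python illustration using the standard csv module - a sketch of the convention, not the endpoint's actual code:

```python
import csv
import io

def to_csv_row(values, qualifier='"'):
    # Write one CSV row; an embedded qualifier is escaped by doubling it,
    # so the value 'He said "hi"' becomes "He said ""hi""" in the output.
    buf = io.StringIO()
    writer = csv.writer(buf, quotechar=qualifier, quoting=csv.QUOTE_ALL)
    writer.writerow(values)
    return buf.getvalue().strip("\r\n")
```

Without this doubling, a qualifier inside a value prematurely terminates the field, which is exactly the kind of corruption the fix addresses.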

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender 6505.1

New year, new features to make your TimeXtender life more enjoyable and productive! We're happy to announce the release of a new major version of TimeXtender (Desktop v. 6505.1) that includes all of the belated holiday gifts listed below.

New
- Automatic data source creation in the ODX: When you map a data source connection to an ODX instance, a data source using that connection is automatically created in the ODX. In addition, when you add a new data source connection, you can now map it to an ODX instance right on the same page.
- Test connection from the Portal: You can now test if the ODX can establish a connection to a data source when you add or edit a data source connection in the Portal.
- Improved step-by-step Get Started guide: We've created a new and improved step-by-step Get Started guide in the Portal. You can access it right from the Home page, where it has its very own card. As you check off the steps, your progress is saved - on a company basis - so you can see how far you've come. And if you're already a TimeXtender champion, the card can be dismissed so it doesn't clutter up your Home page.
- New TimeXtender REST provider: The brand-new TimeXtender REST data source provider simplifies connecting to REST-based data sources. Among other improvements, the new provider lets you set up endpoints without fiddling with configuration files.
- Instances grouped by environment in Desktop: As an improvement to the multiple environments feature we added in our previous major release, instances are now grouped by environment in TimeXtender Desktop. We hope this will bring some peace to people who like things well organized!
- Generate end-to-end execution packages and tasks: To make it easier to set up a refresh of the data in a specific semantic model, you can now generate the data warehouse execution packages and ODX tasks that will update all data for that semantic model. When you make changes to the semantic model, you can regenerate the flow, and the logic is smart enough to keep any customizations you made to the auto-generated objects.
- Calculation groups in semantic models: You can now add calculation groups to semantic models and deploy them to Tabular and Power BI semantic endpoints. To make that work, we've added the 'discourage implicit measures' option to the endpoints. It defaults to 'automatic', which means 'true' when you've added calculation groups and 'false' otherwise.
- Snippets in semantic models: It's now possible to add DAX, Qlik, and Tableau snippets and use them in semantic custom fields, custom measures, and calculation group items.

Changed
- We've tightened up the design of the add/edit data source connection pages in the Portal. In addition to the general improvements, some connections now have nice-to-have fields and categories hidden in an 'Advanced' section by default so you can set up a new connection faster.
- We've improved the Desktop logic to more gracefully handle renaming instances in the Portal.
- In custom scripts in semantic models, you can now use the 'Value' parameter.

Fixed (Portal)
- Fixed an issue where users could see, but not access, instances that they hadn't been granted access to.
- Public job endpoints weren't able to handle unknown states.
- Endpoints were added out of order in the SSL form.
- Fixed an issue with the "move customer" operation.
- Storage types weren't always loaded on the MDW form.
- Fixed a floating info icon on the SSL form.
- Fixed an issue where the Portal throws a "not signed in" error, usually due to an expired token, but then fails to route you back to sign in.
- The deployment target option for Analysis Services 2022 was missing from the Tabular SSL endpoint.
- Cloning a data source connection would route you to the original form instead of the clone form.
- Disabling automatic firewall rules didn't always get handled correctly when handing out connections.

Fixed (Desktop)
- Fixed an issue with data lineage sometimes failing when trying to aggregate the display values in SQL.
- Fixed an issue where the ODX service would sometimes fail to validate the data source connection version of the TimeXtender enhanced and TimeXtender ADF transfer components, causing an error.
- Updated some logic to better handle unsupported data sources instead of throwing an unclear error message.
- Fixed an issue where using an already-created SQL database as storage for an ODX instance would reject the database due to the validation of the data storage version.
- Fixed an issue with data lineage and reverse sign transformations not working.
- Fixed an issue where using a dot (.) as the last character of a table name would cause executing a task in the ODX using a data lake to fail. The dot is now replaced by an underscore when it is the last character of a folder name in the data lake.
- Fixed an issue where deployment would fail when a source table's DW_Id is mapped to a destination table's DW_Id.
- Fixed an issue where the TimeXtender BC365 online data source was failing to validate before inserting system fields during transfer.
- Fixed an issue where a Synapse data warehouse would fail when adding a selection rule on a renamed field.
- Fixed an issue with setting up an incremental rule with year subtraction.
- Fixed an issue where generating documentation while only having an ODX open would throw an error.
- Fixed an issue where mapping would fail for tables that used system fields as column names.
- Fixed an issue where a table with multiple lookup fields would return incorrect results in a Snowflake data warehouse.
- TimeXtender SAP Table data source provider:
  - Added support for the DecimalFloatingPoint34 and DecimalFloatingPoint16 data types.
  - Fixed an issue where fields starting with '/' could not be added to incremental rules.
  - Fixed an issue where the max. row setting was limiting the number of data rows to be transferred.
  - Improved logging.
- Fixed an issue where the default relation was not set correctly when relating tables in a semantic model.
- Optimized instance updates during task initialization.
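The trailing-dot fix above (a dot as the last character of a table name breaking data lake task execution) comes down to name sanitization, since data lake folder names generally should not end in a dot. A minimal sketch of the substitution the notes describe; the helper name is illustrative, not TimeXtender code:

```python
def data_lake_folder_name(table_name: str) -> str:
    # Replace a trailing dot with an underscore, mirroring the behavior
    # described in the release notes; dots elsewhere in the name, and all
    # other characters, pass through untouched.
    if table_name.endswith("."):
        return table_name[:-1] + "_"
    return table_name
```

For example, a table named "Sales." would be stored in a folder named "Sales_", while "Sa.les" is left as-is.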

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender 6429.1

And it's about time for a new release of TimeXtender! The new version (Desktop v. 6429.1) includes a bunch of much-requested and, dare we say, exciting features that we hope will improve your day-to-day. It doesn't have to be crazy at work.

New
As with any innovative feature release, there may be some quirks along the way. Though we've done extensive initial testing, we encourage you to report any bugs you find so we can release further improvements as rapidly as possible.
- Multiple environments and instance transfers in the Portal: You can now group instances in environments to keep everything nice and ordered. In addition, you can transfer the contents of one instance to another, enabling a true DEV -> TEST -> PROD workflow right in the Portal.
- Data source "adapters" for selected ERP systems: We've added new data sources for Microsoft Dynamics 365 Finance & Operations ("AX") and Microsoft Dynamics 365 Business Central ("NAV") that make it easier to handle accounts, along with other functionality that makes these systems easier to work with. In the Portal, you'll find them in the data sources list as "TimeXtender Dynamics 365 Business Central" and "TimeXtender Dynamics 365 Finance" with "- SQL Server" or "- Online" appended.
- Improved support for Snowflake as a data warehouse: We've taken a big step towards supporting each and every TimeXtender feature when you use Snowflake as data warehouse storage. The newly supported features include incremental load, conditional lookup fields, field transformations, field validations, history tables, supernatural keys, and custom views. Aggregate tables, custom data, custom hash fields, junk dimensions, pre- & post-scripts, related records, and table inserts are not supported yet.
- XPilot integrated in Desktop: You'll now find a handy link to XPilot, our data integration chatbot, right in the toolbar.
- Try all the features for 14 days: You can now try all TimeXtender features for free for 14 days before you decide if you're ready to sign up for a paid subscription. The feature- and resource-limited Free tier has been retired.
- Automated migration from 20.10 to the newest version of TimeXtender: If you are still on the 20.10 branch of TimeXtender, you can now upgrade to the newest version without starting from scratch. The 20.10.45 release of TimeXtender can convert existing projects to cloud-based instances to minimize the work you need to do to move up.

Changed
- We've standardized terminology around instances and data source connections in the Portal. Among other things, we wanted to fix the common confusion around data sources. Now, in the Portal, you add "data source connections" that can be used by "data sources" in the ODX in Desktop.

Fixed (Portal)
- On the Instances card on the Home page, instances are now ordered with the newest first.
- On the 'Add/edit data source connection' page, the left-side section navigation was not displayed.
- On the 'Add/edit data source connection' page, SSL secrets are now hidden.
- The Portal would show incorrect data source variable names.
- A data source connection would fail to save due to an incorrect validation error.
- In some cases, the activity list would fail to load or the pagination would break.
- The Customer Details page would be displayed for deleted customers.
- We've improved the loading of the Home page with better loading animations.
- If a company can't be moved, you'll be notified without the Move modal popping up.

Fixed (Desktop)
- 18959: Updated SQL Server 2022 assembly dependencies.
- 18988: When executing an object in the data warehouse with 'Data on demand' enabled on the data source, the transfer from the data source to the ODX storage would not be visible in the log. Now, the transfer from the source has a separate entry in the log in the details for the "outbound" transfer.
- 19123: Fixed an issue with SQL spatial types and SQL Server 2022 in the MDW.
- 19191: Added support for data source connections without descriptions.
- 19199: Deploying only modified tables was very slow, while deploying all tables was faster.
- 19261: Resolved an issue where you could not add fields to semantic model tables with custom fields.
- 19265: Changed a label from "data area" to "data warehouse" in the Execution Server Configuration tool.
- 19269: Fixed an out-of-memory exception.
- 19304: Empty tables would be created when using SQL Server as ODX storage.
- 19317: Optimized the logic behind the Step Row Count (logging of rows).
- 19323: Mapping the same field to multiple fields in an MDW table from ADF is not possible. Azure Data Factory transfer from the ODX to the MDW doesn't support mapping the same column from the ODX to multiple fields on the same table in the MDW, so we have added a validation that blocks this scenario on deploy/execute.
- 19326: Fixed an issue with losing new lines when saving or updating a query table in the ODX.
- 19343: Improved labeling when editing MDW instances.
- 19358: The version number was sometimes replaced with a random word in the Copy to Instance dialog.
- 19367: Resolved an issue where, when adding a job, the color for an invalid selection did not get grayed out, and there was a misalignment on the control for validating an item.
- 19386: Fixed a scaling issue with the documentation template dialog.
- 19400: Couldn't authenticate the D365 BC provider on creation.
- 19412: Fixed an issue with "show translations" not working on custom measures for Power BI Premium.
- 19415: Fixed an issue where data formatting settings were not enabled on SSL fields for Power BI.
- 19429: Removed unnecessary warnings in the ODX when synchronizing OLE DB-based data sources.
- 19457: Fixed an issue with remapping an SSL table when the mapped MDW gets deleted.
- 19464: Added syntax highlighting for various scripts.
- 19505: Fixed an issue with clone fields and lookup transformation templates.
- 19519: Incremental load into Azure Data Lake storage handled the source type as UniversalDataTypes.Datetime, causing the incremental datetime value to be UTC plus the local time offset.
- 19526: Improved the error message shown when loading an instance fails.
- 19533: Added support for OAuth login during the add data source flow in the ODX.
- 19540: Fixed an issue with enabling 'Keep fields up-to-date' with XML data type fields.
- 19560: The ODX would continue transferring data even though the task was stopped in the ODX Execution Queue.
- 19562: Fixed an issue with running a job schedule set to "run once on this day".
- 19627: Fixed an issue where running execution packages with prioritization didn't work.
- 19678: Fixed an issue where deleting a data source in the ODX would not always clean up its task history.
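Bug 19519 above is a classic timezone pitfall: a watermark value that is already in UTC gets the local offset applied on top of it, producing "UTC + local time offset". The release notes don't show the internal code, so this is only a hedged sketch of the safe rule, convert naive local timestamps once, and pass already-aware values through unchanged:

```python
from datetime import datetime, timedelta, timezone

def watermark_utc(value: datetime, local_offset: timedelta) -> datetime:
    # Naive values are treated as local time and shifted to UTC exactly once;
    # aware values are converted directly. Applying local_offset to an
    # already-UTC value would double-shift it, the symptom in bug 19519.
    if value.tzinfo is None:
        return (value - local_offset).replace(tzinfo=timezone.utc)
    return value.astimezone(timezone.utc)

offset = timedelta(hours=2)
aware = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
naive_local = datetime(2024, 1, 1, 14, 0)  # same instant on a UTC+2 clock
```

Both inputs describe the same instant, so both must map to the same UTC watermark; a double-shift would put the naive value two hours off and cause incremental loads to skip or re-read rows.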

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender 6346.1

Today, we’ve published a minor release of TimeXtender (Desktop v. 6346.1) with the following changes:

Changed
- You can now choose between a dynamic or static IP address filter when configuring firewall rules for instances in the Portal. This should help the (luckily) very few users for whom the dynamic system doesn't work.
- The 'description' field for data sources in the Portal is now optional, simplifying data entry.
- The system will now automatically scroll to validation errors when you try to save a data source in the Portal, making it easier to identify and correct issues.

Fixed (Desktop)
- 18883: In the Monitor window that shows the status of jobs, a job would show the status “Completed” with no indication of errors or warnings. Now, the status will tell you if there were issues during the execution.
- 19110: Switching between advanced and simple selection in the Add Data Source wizard would sometimes result in an “Object reference not set to an instance of an object” error.
- 19113: The Row-level Security Setup window would “forget” the settings for the “Values field” and “Members field” options when the window was closed and then opened again.
- 19133: On some data sources, usually those with many tables, loading the list of tables would take so long that the Query Tool window would be unusable. The list now loads asynchronously to avoid this issue.
- 19163: The setting for ‘Disable simple selection’ was not included when cloning a data source.
- 19168: Execution would fail with the error “An item with the same key has already been added” when using Azure Data Lake transfer after renaming and then re-adding a table in the data warehouse.
- 19173: On some data sources, usually those with many tables, loading the list of tables would take so long that the Query Tables window would be unusable. The list now loads asynchronously to avoid this issue.
- 19178: Trying to delete a custom period in a date table would sometimes result in a “Deleted row information cannot be accessed” error.
- 19181: When using the Dynamics 365 Business Central data source provider, execution of the “G/L Entry” table would fail for tables with many records.

Fixed (Portal)
- Trying to configure the ODX Server service using a user account with an unverified e-mail address would result in a blank error in the ODX Configuration app. The error message now explains the issue and how to resolve it.
- Monthly usage details were not displaying correctly in the Portal.
- Comments were not being saved or displayed for certain entries in the activity log.
- Changing the email address of a user in the Portal required that the user was activated (invited/signed in). This requirement has been removed.
- On the Activities page, it was possible to select dates that don't make sense, e.g. a 'From' date in the future.
- It was possible to change your email address to the email address of another user.
- It was not possible to set up sign-in with social accounts even though it should only require a verified e-mail address.
- When listing new fields on the 'Update data source' page, new values in a drop-down field would not count as a change.
- The 'Team development' information icon was displayed in the wrong place on the Edit Instance page.
- The Delete button was positioned incorrectly on the Data Source page in the Portal.
- Trying to send a “critical notification” e-mail would result in an error if no username or password was provided.

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender Desktop v. 6284.1

Today, we’ve published a minor release of TimeXtender Desktop (v. 6284.1) with the following changes:

Fixed
- 18896: Couldn't open the Select Tables menu for data sources with large schemas. Added an option to default to advanced selection when creating or selecting tables from a data source.
- 18670: SSL field renaming was not working for the Tabular endpoint on Snowflake. Fixed an issue where renaming a field from a table on Snowflake storage would cause an error during execution.
- 18883: Job logs lacked detailed information. We have addressed an issue where we failed to inform the user about job completion with errors or warnings. Now, when you check the job monitor, the status will be displayed as "completed" along with an indication of any issues encountered.
- 18838: Adding a Conditional Lookup Field to an SSL model would show a wrong field type until the SSL model was synchronized.
- 18662: Data Factory resource names were not unique when using "copy instance". Fixed an issue where resources created for Data Factory would not be unique between instances created by the "copy instance" functionality. The naming convention now includes an identifier for the instance in the Data Factory resource names.
- 18599: Data lineage was always using dbo as the schema name when displaying object names. This has been corrected to show the actual schema in use.
- 18575: Updated an incorrect display name where the Default Hashing option was referred to as Project Hashing instead of Instance Hashing.
- 18720: Fixed an issue where using "copy to instance" for MDW and SSL instances could corrupt the destination instance if it was empty, due to multiple transactions.
- 18713: Improved the upgrade instance message for MDW and SSL instances.
- 18633: Fixed an issue where incremental load with deletes didn't work in the ODX when using a data lake and no new data was transferred in the execution.
- 18619: Fixed an issue where a ProjectLockVersions check was being run before the database was created.
- 18710: Fixed an issue where pre/post steps could not be edited after changing the name of a script action used as the pre/post step.
- 18640: Issues with Lookup Transformation Templates:
  - Fixed an issue where names were not qualified.
  - Fixed an issue where generating the lookup transformation script would fail because the fixed join value was missing formatting and escaping.
  - Fixed an issue where the lookup transformation script would fail because the template selection statement was not formatted and escaped.
  - Fixed an issue where the lookup transformation script would fail because the default fallback value was '' (empty string) for all data types. It is now NULL for all data types.
- 18688: Fixed an issue with jobs sometimes using the wrong instance in a "copy instance" setup. The execution packages in the job would use the first available instance across the copied instances when the job was created.
- 18610: Fixed an issue where the details button on warnings and errors would sometimes incorrectly redirect to the support site.
- 18585: Fixed a UI issue where some of the text for the TimeXtender Support Site was not shown on the start page.
- 18682: Fixed an issue where the ODX would sometimes use an old version of a managed ADO.NET component because the identifier of some external managed ADO.NET components was not unique.
- 18537: Fixed a scaling issue with misaligned buttons on the Upgrade job repository confirmation pop-up.
- 18571: Fixed a typo in the Execution Server Configuration window.
- 18907: Removed the 'Validate Data Source Command' from ODX data sources, as the logic behind the command has been automated elsewhere.
- 18787: Fixed an issue where trying to add multiple schedules to a job at the same time would only add one of them.
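The 18640 fallback change above (empty string replaced by NULL as the default fallback in lookup transformation scripts) reflects a general rule: '' is only a valid value for text columns, while NULL is accepted by every SQL data type. A small hedged sketch, using Python's None to stand in for NULL; the helper is illustrative, not TimeXtender code:

```python
from datetime import date

def conditional_lookup(key, table, fallback=None):
    # None stands in for SQL NULL: a type-safe fallback for any column type,
    # whereas a fallback of '' would be rejected by numeric or date columns.
    return table.get(key, fallback)

# Example lookup table mapping order keys to dates (illustrative data).
order_dates = {"A1": date(2024, 5, 1)}
```

With an empty-string fallback, a missed lookup into a date or numeric column would produce a conversion error; with NULL it simply yields a missing value.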

Related products: TimeXtender Data Integration