
TimeXtender Data Integration 6766.1

Today, we've published a hotfix release of TimeXtender Data Integration and TimeXtender Ingest Service (v. 6766.1) that contains the changes listed below.

Fixed
- Replaced the lock with a mutex to prevent thread conflicts in the workspace.
- Fixed a syntax error in the generated Snowflake script caused by an incorrectly placed semicolon in the data cleansing procedure.
- Fixed an issue where the max degree of Deliver instance parallelism reset to 1 after reloading the execution service configuration.
- Resolved an issue preventing new fields from being added to a DW table with existing mappings.
- Fixed migration errors between Prepare instances due to missing extended properties.
- Updated the fiscal week calculation to fix a month comparison issue.
- Fixed a loading issue with the Custom View Tracing Object when views were split across multiple data warehouses.
- Updated the instance list order in the Ingest Service Configuration Tool.
- Fixed an issue in Synchronize with Ingest Instance where fields weren't auto-selected if field names had changed.
- Fixed a UI issue where the persist view info box was visible by default and the icon was misaligned when resizing the dialog.
- Resolved an issue where the parameter rename dialog in the custom view script was partially obscured by Windows scaling.
- Resolved an issue where job statuses failed to update after task completion by implementing retry logic.
- Fixed an issue where the dynamic role-level security table was not included in "end-to-end" Dynamic Perspectives.
- Fixed an issue with a missing command timeout for Dynamic Security queries against the "Deliver" storage.
- Fixed an issue where adding a private data source (ADO.NET) threw an error saying the assembly file was not found.
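The job-status fix above relies on retry logic. TimeXtender's implementation isn't public, but the general pattern, retrying a transient failure a bounded number of times with exponential backoff, can be sketched in Python (all names here are illustrative):

```python
import time

def update_with_retry(update_fn, max_attempts=5, base_delay=0.5):
    """Call update_fn until it succeeds, retrying transient failures
    with exponential backoff. Re-raises the error if all attempts fail."""
    for attempt in range(1, max_attempts + 1):
        try:
            return update_fn()
        except ConnectionError:  # stand-in for a transient failure
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical usage: a flaky status update that succeeds on the third try.
calls = {"n": 0}
def flaky_update():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("status service busy")
    return "Completed"

print(update_with_retry(flaky_update, base_delay=0.05))  # prints "Completed"
```

The key design point is that only transient errors are retried; a final failure still surfaces to the caller so the job isn't silently left in a stale state.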

Related products: TimeXtender Data Integration

TimeXtender Data Integration 6744.1

Today, we've published a new release of TimeXtender (v. 6744.1) that contains the changes listed below.

New
- Product renaming:
  - TimeXtender Desktop is now known as TimeXtender Data Integration (TDI).
  - TimeXtender ODX Service is now known as TimeXtender Ingest Service (TIS).
  - ODX is now known as Ingest.
  - MDW is now known as Prepare.
  - SSL is now known as Deliver.
- String aggregation as an option for aggregation tables. The output will be separated by a comma and ordered by the content of the column.
- Added persist view functionality, which will persist a view as a table.
- Introduced a filter for available data providers to improve data source selection.
- Added a feature that allows users to clone instances for more efficient management.
- Added hidden fields support in data source connection forms.
- In the Execution Service Configuration tool, it's now possible to set how many parallel executions of Deliver instances are allowed at a time. The default is 1.

Improved
- The TimeXtender database cleanup tool can now run without the need for a data area to exist.
- The solution explorer in the TDI client will now remember if a node is collapsed when refreshing the solution tree.
- Changed the error message for opening incompatible instances.
- The Environments page has been renamed to the 'Migrations' page at /migrations, and all non-transfer-related functionality has been removed.
- The Instances page has been merged with the Environments page, now accessible at /instances, featuring drag-and-drop functionality for moving instances between environments.
- A new design has been implemented for data tables.
- Improved the loading speed of the organization table.

Fixed
- Double-clicking a view field in data lineage would not select the view field node.
- Shutting down TDI (TimeXtender client) with steps in the execution queue would ask if you wished to exit, but exit would happen if you clicked No instead of Yes.
- Resizing the column width in the TIS (ODX) incremental load dialog was disabled.
- Dragging a table node from the semantic source database browser to the semantic tables node did not work.
- Dragging a field node from the semantic source database browser to a semantic table node did not work.
- The context menu for editing an environment was shown on the default environment in the TDI (TimeXtender client).
- The buttons for showing schemas in the TIS (ODX) ADO and OLE DB data source advanced settings were hidden.
- Null reference error when validating a Deliver (SSL) Qlik endpoint of the type QlikView while using Snowflake as the source Prepare (MDW) instance.
- Wrong label for the table distribution setting for Dedicated SQL Pool.
- Synchronizing the Prepare (MDW) instance with an Ingest (ODX) instance would show tables as missing when the data source had "data on demand" enabled and no execution tasks were added on the data source.
- Searching for tables for the mapping sets in the Prepare (MDW) instance would not find tables where the data source had "data on demand" enabled and no execution tasks were added on the data source.
- Listing constraint suggestions in the Prepare (MDW) instance would not find tables where the data source had "data on demand" enabled and no execution tasks were added on the data source.
- Adding new view fields to a standard view was blocked after selecting a field in the overview pane.
- Some Ingest (ODX) instance execution logs sometimes went missing due to a timing issue in the execution logic.
- Potential null reference error when two users concurrently committed a change on an Ingest (ODX) data source or task.
- Some Semantic dialogs could only be resized horizontally.
- Tabular Security DAX was incorrect when having more than one setup.
- Moving Ingest (ODX) data type override rules up and down sometimes caused an index out of bounds exception.
- A circular reference in data lineage with history tables caused the data lineage to run forever.
- The label in the warning and error reports dialogs showed "database" instead of "data area".
- When adding a private data source (ADO.NET), an error saying the assembly file was not found was thrown.
- When right-clicking a table with a mapping type of Integration Table, the client sometimes crashed.
- Updated the layout of the Deliver (SSL) instance Role dialog.
- Fixed an issue where the TIS (ODX service) would get stuck in the 'Starting' state if the authentication token could not be renewed.
- Addressed alignment issues with dropdowns and the search icon in multi-select filters.
- Corrected the incorrect tooltip displayed for the delete instance button.
- Resolved an issue where the hand cursor appeared on mouse-over of an instance on the Environment page, but clicking did not trigger any action.
- Fixed an issue where it was not possible to save a Prepare (MDW) instance after previously experiencing an error.
- Fixed a bug that caused the customer table to not refresh after deleting a customer.
- Resolved an issue that made it impossible to delete some data source connections because of hidden double spaces in the name.
- Fixed an issue where important setup properties were accidentally overwritten during updates, and ensured that the connection string is securely encrypted when edited.
- Fixed an issue to ensure that repository information and related secrets stay properly aligned and consistent.
- Resolved an issue that caused organization creation to fail due to incorrect data handling.
- Ensured that comments are now properly added to the activity log when removing a company, and removed duplicate comments that appeared when deleting an organization.
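The string aggregation option for aggregation tables concatenates a column's values into one comma-separated string, ordered by the column's content, similar to SQL's STRING_AGG with an ordering clause. A minimal Python sketch of that behavior, using hypothetical data:

```python
def string_agg(rows, group_key, value_column):
    """Group rows by group_key, then join each group's values with a comma,
    ordered by the content of the column (alphabetical sort here)."""
    groups = {}
    for row in rows:
        groups.setdefault(row[group_key], []).append(row[value_column])
    return {key: ",".join(sorted(values)) for key, values in groups.items()}

# Hypothetical customer/product rows.
rows = [
    {"customer": "A", "product": "Widget"},
    {"customer": "A", "product": "Bolt"},
    {"customer": "B", "product": "Nut"},
]
print(string_agg(rows, "customer", "product"))
# {'A': 'Bolt,Widget', 'B': 'Nut'}
```

This is only an illustration of the aggregation semantics described in the notes, not TimeXtender's generated SQL.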

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender Desktop 6691.1

We've published a hotfix release of TimeXtender Desktop (v. 6691.1) that contains the changes listed below.

Fixed
- 21596: Can't deploy DWH with service principal authentication. Fixed an issue where service principal authentication was failing when deploying a SQL data warehouse.
- 21604: SaaS Data Selection Rules - source vs destination fields. Fixed an issue where the data selection rule pane showed destination field names instead of source field names in table mappings.
- 21641: Implement 'WITH RECOMPILE' in Direct Read procedures. All data movement procedures for moving data between data areas are now created with 'WITH RECOMPILE'. This will be picked up by the differential deployment.
- 21673: Issue when connecting to MDW SQL storage when it is an Azure SQL Database. This has now been fixed.
- 21679: Duplicated advanced features on SQL/Azure DWH. Fixed an issue where some advanced DWH features were listed twice.
- 21706: SSL instances upgraded from 1.0.0.0 to 2.0.0.0 are missing extended properties on one table. Fixed an issue in the SSL repository where semantic notifications were not copied when using "Copy instance" in the Portal. This resulted in the SSL instance not being able to open if one or more semantic execution packages referenced semantic notifications.
- 21650: Integrate Existing Objects wizard. Fixed an issue which caused the wizard to fail when the MDW used a case-sensitive collation. Also fixed an issue where having multiple objects with the same name, but in different schemas, would cause errors when running the wizard.
- Fixed a missing command timeout for the MDW when running queries.

Related products: TimeXtender Data Integration

TimeXtender Desktop 6675.2

Today, we've published a new release of TimeXtender Desktop with the following changes:

New
- Redesigned TimeXtender Portal UI with a new layout, colors, and dark mode: We've remodeled the Portal and given it a fresh coat of paint to enhance both the look and the user experience. The new design features a collapsible left-side menu for the features related to the data flow, while user account settings, support, and admin features live in the revamped top menu. In addition, the new colors give the Portal a fresh and modern look, and on top of that, we've added a dark mode for those who prefer to turn down the light a bit. The new colors are complemented by new, lighter icons and a more readable font. In our quest for greater consistency across the suite, Exmon Turnkey has been updated to use the same colors, font, and icons as the Portal.
- Shared login for TimeXtender and Exmon: You can now use the same login for TimeXtender and Exmon (web and the desktop DG and DM products). Less hassle, and one less password to remember! However, we haven't centralized company accounts just yet, so if you're not using Exmon already, you'll still have to have an Exmon account created for you. The same applies, of course, if you're using Exmon but not TimeXtender.
- Keep destination settings when you transfer an instance: You can now choose if you want to override security, roles, and notifications in the destination instance when you transfer an instance in Environments. The first time you transfer between two instances, you must override the destination settings, but on subsequent transfers you decide. Previously, these settings would always be overridden.
- Map endpoints when you transfer a semantic model: Related to the improvement listed above, you can now map semantic endpoints when transferring one semantic model instance to another. The endpoints must be of the same type. Previously, the endpoints in the destination instance would have been overridden.
- Integrate existing data warehouses in TimeXtender: With the new Integrate Existing Objects feature, you can easily use data from your old data warehouse even before you've converted it to a TimeXtender data warehouse, or if converting the old data warehouse isn't feasible. Any non-TimeXtender table that happens to be in your data warehouse storage can be integrated into the TimeXtender data warehouse instance. If you're using Xpert BI (acquired by TimeXtender in 2023), you can import additional metadata for the tables in the form of descriptions and tags.
- New data source providers for Excel and CSV files: With the new native data source providers, getting data out of Excel and CSV files just got a lot easier.

Improved
- Firewall rules can now be configured on the aptly named Firewall Rules page under Data Estate instead of on the individual instance's details page. This way, it's easier to get an overview of firewall rules across all instances.
- You no longer need to run the ODX Service Configuration tool on the destination server after transferring an ODX instance under Environments. Instead, you simply need to restart the ODX service.
- Listing instances in TimeXtender Desktop is now a lot faster.
- Service requests from user-installed software will now include custom headers to ease support cases.
- When you're using Snowflake as data warehouse storage, aggregate tables, table inserts, and custom table inserts are now supported.
- When you're using Snowflake as data warehouse storage, deployment is significantly faster.
- You can now use Windows, Entra Password, Entra Integrated, and Entra Service Principal authentication for ODX SQL storage in addition to the existing SQL Server authentication.
- You can now use Entra Service Principal authentication for data warehouse SQL storage connections.
- Added strict encryption support for ODX and data warehouse SQL storage (SQL Server 2022 and Azure SQL Database).

Fixed
Portal
- Optimized Environments page load times.
- Optimized customer table load times.
Desktop
- Jobs that were not completed did not set their state to 'Failed' after a restart.
- Fixed an issue where a Fabric workspace name containing spaces would make the ODX Fabric Lakehouse unusable.
- On an ODX, adding an incremental rule with updates and deletes to an empty table resulted in an error.
- Fixed a performance issue with the CSV semantic endpoint for models that contained tables with lots of rows.
- Parameters would be removed from custom views created using drag-and-drop between two data areas.
- In the Performance Recommendations window, the info icons were not properly aligned.
- In the Selection Rules pane on mapping tables, some fields, including conditional lookup fields and system fields, would be missing for tables from another data area.
- Fixed an issue where dragging tables from a 'TimeXtender Dynamics 365 Business Central - SQL Server' or 'TimeXtender Dynamics 365 Finance - SQL Server' data source into the ODX's query areas would result in nothing happening.

Related products: TimeXtender Data Integration

TimeXtender 6618.1

It's been just one month since our last major release of TimeXtender, but we already have a new release ready for you. This release, however, is all about consolidation. It doesn't contain a lot of new features. Instead, we've been busy improving the existing features and fixing various bugs.

Improved
- Easier selection for data source connections: When you set up a new data source connection in the Portal, you can now choose the provider and the version of the provider from separate lists, which makes it much easier to get an overview of the provider list. With the possible exception that proves the rule, the TimeXtender-branded providers will be the best choice when multiple providers are available for the same data source. For that reason, we've also created a Recommended category in the list for our "homemade" providers.
- More complete data lineage with improved UI: Data lineage now traces through aggregate tables, measures, and lookups. We've simplified the UI to give more space to the actual content, and a dearly missed feature from the 20.10 branch returns: you can now double-click an object to jump to that object in the main window. To facilitate that, the Data Lineage window is now non-modal, which means that it can be opened and used next to the main window.
- Links from Desktop to Portal for better integration of the two: In TimeXtender, you sometimes need to go back and forth between the Desktop and the Portal a lot, especially when you set up new data sources. To make the Desktop and the Portal feel more integrated, we've added links in Desktop that open the relevant page in the Portal, e.g. for adding a new data source connection, managing environments, adding a new instance, and managing semantic model endpoints.
- REST API table builder: Taking data from a REST API and transforming it into a table structure can be a bit of a hassle unless you really like writing XSLT scripts. Our REST API data source provider now includes a tool that can generate the script for you. You just need to drag and drop the structure you'd like.
- Some data sources that use a connection string when connecting (e.g. ODBC, OLE DB, TimeXtender SQL, TimeXtender Oracle) now support adding additional connection string properties.
- You can now use the TimeXtender REST data source when you use Data Fabric as your ODX storage.
- You can now set the background job timeout in the TimeXtender SAP Table data source.
- The TimeXtender Oracle data source now supports the ODP.NET provider, which has improved performance.
- It's now possible to change the security filtering direction on semantic relations for Tabular/Power BI.
- The character limit on FormatString in a calculation group item on a semantic model has been increased to 1000 characters, and it is now a multiline textbox.
- We've improved the Tabular model generated by the Tabular semantic endpoint for better efficiency and a reduced model size.

Fixed
Portal
- Password fields were unmasking incorrectly.
- Fixed an error that would occur when accepting the license agreement.
- Fixed an issue where the connection cache was duplicated.
- Data source connections with a null category would fail to render.
- Error messages for transferring instances have been improved in case of timeout issues.
- Optimized calls to retrieve instance lists on the Portal frontend.
- Optimized calls to retrieve data source connections on the Portal frontend.
- Fixed the password field on the Add/Edit Semantic Instance page not showing the correct string when unmasked.
Desktop
- SSL: Previously, decimal fields were being deployed as double. This has been corrected.
- SSL: Removed the option to include tables and fields other than the table the field is being created on in a Custom Field script, as it didn't work and would just be empty anyway.
- MDW: Using an aggregate table on SQL Synapse would fail during execution due to a wrong script. This has been corrected.
- TimeXtender SAP Table data source: Fixed an issue where subtraction settings were not applied on incremental transfers.
- Fixed an issue where deployment would fail when disabling physical tables on a Snowflake data warehouse.
- Fixed an issue where ODX transfer tasks were blocking other tasks from running concurrently during bulk inserts.
- Desktop proxy settings were not passed from the Execution Service to TimeXtender.
- Fixed an issue where open tabs did not refresh on Save and Reload.
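The additional connection string properties improvement amounts to appending extra key=value pairs to the provider's semicolon-delimited connection string. A hedged sketch of that merge, the property names below are illustrative and not TimeXtender's actual settings:

```python
def with_extra_properties(base, extra):
    """Append additional key=value pairs to a semicolon-delimited
    connection string, letting the extras override existing keys."""
    pairs = dict(
        part.split("=", 1) for part in base.split(";") if part
    )
    pairs.update(extra)
    return ";".join(f"{k}={v}" for k, v in pairs.items())

# Illustrative ODBC-style connection string with two extra properties.
base = "Driver={Some ODBC Driver};Server=srv01;Database=dwh"
print(with_extra_properties(base, {"Encrypt": "yes", "Connection Timeout": "30"}))
# Driver={Some ODBC Driver};Server=srv01;Database=dwh;Encrypt=yes;Connection Timeout=30
```

Real connection string grammars have escaping rules this sketch ignores; it only shows the "extras override and extend the base" semantics.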

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender 6590.1

It's officially spring in the northern hemisphere, and incidentally, we have a bouquet of features and improvements ready for you in TimeXtender 6590.1.

New
- ODX on OneLake: We are excited to introduce Microsoft Fabric OneLake as ODX storage on our platform. This enhancement enables users to seamlessly harness OneLake for all ODX operations, from initial setup and configuration in the Portal to comprehensive integration within TimeXtender workflows. This is the first of many planned integrations with Microsoft Fabric, so stay tuned! Note, however, that you currently cannot use OneLake as your ODX storage if you use Snowflake as your data warehouse storage.
- New data source provider for OneLake: A new data source provider for seamless and efficient ingestion of delta parquet tables from Microsoft Fabric OneLake, directly into your preferred storage solution via the TimeXtender ODX. Optimized for ingesting data from OneLake, this feature is not intended for use with Fabric OneLake ODX storage.
- Publish data as a REST API endpoint: Added a new semantic endpoint, REST API, that works together with a server component installed on-premises or on a virtual machine in the cloud to publish data through REST API endpoints. As getting data through a REST API is a very common use case, the new endpoint type opens up a host of opportunities for integrating TimeXtender with other tools. In our previous major release, 6505, we introduced a new REST data source provider. This means that you can now publish and ingest data from your TimeXtender solution through a REST API using first-party components.
- New and improved data source providers for HubSpot and Exact Online: The new providers dramatically improve upon the previous CData options with enhanced usability and performance. These new options allow you to easily add custom endpoints and flatten complex tables. To upgrade to the new connectors today, just search for "TimeXtender Hubspot" or "TimeXtender Exact" when adding a new data source connection. Then, in the ODX, you can edit an existing data source configuration and change to the new TimeXtender data source connection. Read more about editing data sources.

Improved
- You can now have multiple data warehouse instances open at the same time in Desktop.
- We've reshuffled the shortcut menu on Data Sources in ODX instances. "Add Data Source" now redirects to the 'Add data source connection' page in the Portal, while the previous "Add Data Source" functionality is now "Map Existing Connection". The intention is to make it clearer that adding a brand new data source happens in the Portal, while "adding" a data source in Desktop means using one of the data source connections mapped to the instance in the Portal.
- We've upgraded a lot of our underlying UI framework. You might notice a few changes and improvements to the Portal UI as a result.
- When adding firewall rules to instances, your own IP is now automatically suggested.

Fixed (Portal)
- Fixed an issue where data source connection categories would not be shown if the category was assigned a null value.
- Fixed an issue where MDW/SSL transfers could lead to errors.
- Cloning TimeXtender REST data sources could lead to incorrect data in password fields.
- The ODX connection timeout value did not get set correctly.
- Changes to users would not get propagated to the identity provider.
- Inputs did not get correctly disabled on the SSL Qlik endpoint form.
- Disabled fields would show up as required on the ODX form.
- Fixed an issue where the SSL form would sometimes incorrectly flag inputs as invalid, thereby making it impossible to save.
- Fixed an incorrect short name suggestion on data source mapping.

Fixed (Desktop)
- Default settings for key stores were not remembered in the key generation menu for data warehouse instances using SQL Synapse or Snowflake.
- The parameter '%Instance%' was not working for email notifications.
- The CSV endpoint was not escaping the text qualifier.
- On-demand execution from the ODX to the data warehouse would fail when having the same ODX table in a data area multiple times while using ADF to transfer from an ODX ADLS2 storage.
- ODX to data warehouse transfer would fail when using ADF and having the same table mapped multiple times, but using different columns in each table.
- When using a custom field as a parameter in a custom measure or calculation group item in a semantic model, the parameter would disappear after closing and reopening the semantic model instance and editing the custom measure or calculation group item.
- Cleaning up mapped objects between an ODX and a data warehouse would never clean up the instance mapping, causing copy instance in the Portal to always request a remapping of an instance even though it was no longer used.
- Fixed an issue in the semantic model repository upgrade scripts where calculation group and custom script tables were not marked as tables to be included in instance copying.
- Fixed an issue where jobs completed with errors or warnings were incorrectly displayed as "Completed". They are now accurately labeled as "Completed With Errors or Warnings".
- Custom Data on a data warehouse table did not rename the custom data column when the corresponding field was renamed.
- The data warehouse field validation type 'Is Empty' would mark empty string values as invalid.
- Fixed an issue where raw-only fields were included in the Data Movement pane.
- Fixed an issue where the preview in Filter Rows in the ODX would always select the first table.
- Fixed an issue where incremental transfer from an ADLS2 ODX with the 'Limit memory use' setting failed for empty tables.
- Fixed an issue where transfers from an ADLS2 ODX to SQL Synapse would fail if the source table contained reserved system field names.
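The CSV text-qualifier fix concerns escaping the qualifier character when it appears inside a value. The standard convention, which Python's csv module implements, is to double the qualifier inside a quoted field so a reader can tell it apart from the closing quote. This is an illustration of the general rule, not TimeXtender's exact output:

```python
import csv
import io

# A value containing the text qualifier (") must have it doubled ("")
# inside the quoted field, otherwise the field terminates early on read.
buffer = io.StringIO()
writer = csv.writer(buffer, quoting=csv.QUOTE_ALL, quotechar='"')
writer.writerow(["Acme", 'The "best" vendor'])
print(buffer.getvalue().strip())
# "Acme","The ""best"" vendor"
```

Without this escaping, a consumer reading the file back would split the second field at the first embedded quote, which is exactly the class of bug the release fixes.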

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender 6505.1

New year, new features to make your TimeXtender life more enjoyable and productive! We're happy to announce the release of a new major version of TimeXtender (Desktop v. 6505.1) that includes all of the belated holiday gifts listed below.

New
- Automatic data source creation in the ODX: When you map a data source connection to an ODX instance, a data source using the data source connection will automatically be created in the ODX. In addition, when you add a new data source connection, you can now map it to an ODX instance right on the same page.
- Test connection from the Portal: You can now test if the ODX can establish a connection to a data source when you add or edit a data source connection in the Portal.
- Improved step-by-step Get Started guide: We've created a new and improved step-by-step Get Started guide in the Portal. You can access it right from the Home page, where it has its very own card. As you check off the steps, your progress is saved, on a company basis, so you can see how far you've come. And if you're already a TimeXtender champion, the card can be dismissed so it doesn't clutter up your Home page.
- New TimeXtender REST provider: The brand new TimeXtender REST data source provider simplifies connecting to REST-based data sources. Among other improvements, the new provider allows you to set up endpoints without fiddling with configuration files.
- Instances grouped by environment in Desktop: As an improvement to the multiple environments feature we added in our previous major release, instances are now grouped by environment in TimeXtender Desktop. We hope this will bring some peace to people who like things well organized!
- Generate end-to-end execution packages and tasks: To make it easier to set up a refresh of the data in a specific semantic model, you can now generate the data warehouse execution packages and ODX tasks that will update all data for that semantic model. When you make changes to the semantic model, you can regenerate the flow, and the logic is smart enough to keep any customizations you made to the auto-generated objects.
- Calculation groups in semantic models: You can now add calculation groups to semantic models and deploy them to Tabular and Power BI semantic endpoints. To make that work, we've added the 'discourage implicit measures' option to the endpoints. It defaults to 'automatic', which means 'true' when you've added calculation groups and 'false' otherwise.
- Snippets in semantic models: It's now possible to add DAX, Qlik, and Tableau snippets and use them in semantic custom fields, custom measures, and calculation group items.

Changed
- We've tightened up the design of the add/edit data source connection pages in the Portal. In addition to the general improvements, some connections now have nice-to-have fields and categories hidden in an 'Advanced' section by default so you can set up a new connection faster.
- We've improved the Desktop logic to more seamlessly support renaming instances in the Portal.
- In custom scripts in semantic models, you can now use the 'Value' parameter.

Fixed
Portal
- Fixed an issue where users could see, but not access, instances that they hadn't been granted access to.
- Public job endpoints weren't able to handle unknown states.
- Endpoints were added out of order in the SSL form.
- Fixed an issue with the "move customer" operation.
- Storage types weren't always loaded on the MDW form.
- Fixed a floating info icon on the SSL form.
- Fixed an issue where the Portal throws a "not signed in" error, usually due to your token having expired, but then fails to route you back to sign in.
- The deployment target option for Analysis Services 2022 was missing from the Tabular SSL endpoint.
- Cloning a data source connection would route you to the original form instead of the clone form.
- Disabling automatic firewall rules didn't always get handled correctly when handing out connections.
Desktop
- Fixed an issue with data lineage sometimes failing when trying to aggregate the display values in SQL.
- Fixed an issue where the ODX service would sometimes fail to validate the data source connection version of TimeXtender enhanced and TimeXtender ADF transfer components, causing an error.
- Updated some logic to better handle unsupported data sources instead of throwing an unclear error message.
- Fixed an issue where using an already created SQL database as storage for an ODX instance would reject the database due to the validation of the data storage version.
- Fixed an issue with data lineage and reverse sign transformations not working.
- Fixed an issue where using a dot (.) as the last character of a table name would cause executing a task in the ODX using a data lake to fail. The dot character will be replaced by an underscore when the dot is the last character of a folder name in the data lake.
- Fixed an issue where deployment would fail when a source table's DW_Id was mapped to a destination table's DW_Id.
- Fixed an issue where the TimeXtender BC365 Online data source was failing to validate before inserting system fields during transfer.
- Fixed an issue where a Synapse data warehouse would fail when adding a selection rule on a renamed field.
- Fixed an issue setting up an incremental rule with year subtraction.
- Fixed an issue where generating documentation when only having an ODX open would throw an error.
- Fixed an issue where mapping would fail for tables that used system fields as column names.
- Fixed an issue where a table with multiple lookup fields would return incorrect results in a Snowflake data warehouse.
- TimeXtender SAP Table data source provider:
  - Added support for DecimalFloatingPoint34 and DecimalFloatingPoint16 data types.
  - Fixed an issue where fields starting with '/' could not be added to incremental rules.
  - Fixed an issue where the max row setting was limiting the number of data rows to be transferred.
  - Improved logging.
- Fixed an issue where the default relation was not set correctly when relating tables in a semantic model.
- Optimized instance updates during task initialization.
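The trailing-dot fix in this release boils down to a small sanitization rule for data lake folder names: when a table name ends with a dot, that last character is replaced with an underscore before it becomes a folder name. A minimal sketch of that rule as described in the notes (the function name is hypothetical):

```python
def safe_folder_name(table_name):
    """Replace a trailing dot with an underscore so the table's folder
    name is valid in the data lake; interior dots pass through unchanged."""
    if table_name.endswith("."):
        return table_name[:-1] + "_"
    return table_name

print(safe_folder_name("Orders."))   # Orders_
print(safe_folder_name("Dim.Date"))  # Dim.Date (interior dots unchanged)
```

Per the release note, only the final character is rewritten; names without a trailing dot are left exactly as-is, so existing folder mappings are unaffected.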

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender 6429.1

And it's about time for a new release of TimeXtender! The new version (Desktop v. 6429.1) includes a bunch of much-requested and, dare we say, exciting features, that we hope will improve your day-to-day. It doesn't have to be crazy at work.NewAs with any innovative feature release, there may be some quirks along the way. Though we’ve done extensive initial testing, we encourage you to report any bugs you may find so we may release further improvements as rapidly as possible.Multiple environments and instance transfers in the Portal: You can now group instances in environments to keep everything nice and ordered. In addition to that, you can transfer the contents of one instance to another, enabling a true DEV -> TEST -> PROD workflow right in the Portal​​​​​   Data source "adapters" for selected ERP systems: We've added new data sources for Microsoft Dynamics 365 Finance & Operations ("AX") and Microsoft Dynamics 365 Business Central ("NAV") that make it easier for you to handle accounts as well as other functionality that make these systems easier to work with. In the Portal, you'll find them in the data sources list as "TimeXtender Dynamics 365 Business Central" and "TimeXtender Dynamics 365 Finance" with "- SQL Server" or "- Online" appended.   Improved support for Snowflake as a data warehouse: We've taken a big step towards supporting each and every TimeXtender feature when you use Snowflake as data warehouse storage. The newly supported features include incremental load, conditional lookup fields, field transformations, field validations, history tables, supernatural keys, and custom views. Aggregate tables, custom data, custom hash fields, junk dimensions, pre- & post-scripts, related records, and table inserts are not supported yet.    XPilot integrated in the Desktop: You'll now find a handy link to XPilot, our data integration chatbot, right from the toolbar. 
- Try all the features for 14 days: You can now try all TimeXtender features for free for 14 days before you decide if you're ready to sign up for a paid subscription. The feature- and resource-limited Free tier has been retired.
- Automated migration from 20.10 to the newest version of TimeXtender: If you are still on the 20.10 branch of TimeXtender, you can now upgrade to the newest version without starting from scratch. The 20.10.45 release of TimeXtender can convert existing projects to cloud-based instances to minimize the work you need to do to move up.

Changed

We've standardized terminology around instances and data source connections in the Portal. Among other things, we wanted to fix the common confusion around data sources. Now, in the Portal, you add "data source connections" that can be used by "data sources" in the ODX in the Desktop.

Fixed (Portal)

- On the Instances card on the Home page, the instances are now ordered with the newest first.
- On the 'Add/edit data source connection' page, the left-side section navigation was not displayed.
- On the 'Add/edit data source connection' page, SSL secrets are now hidden.
- The Portal would show incorrect data source variable names.
- A data source connection would fail to save due to an incorrect validation error.
- In some cases, the activity list would fail to load or the pagination would break.
- The Customer Details page would be displayed for deleted customers.
- We've improved the loading of the Home page with better loading animations.
- If a company can't be moved, you'll be notified without the Move modal popping up.

Fixed (Desktop)

- 18959: Updated SQL Server 2022 assembly dependencies.
- 18988: When executing an object in the data warehouse with 'Data on demand' enabled on the data source, the transfer from the data source to the ODX storage would not be visible in the log. Now, the transfer from the source has a separate entry in the log in the details for the "outbound" transfer.
- 19123: Fixed an issue with SQL spatial types and SQL Server 2022 in the MDW.
- 19191: Added support for data source connections without descriptions.
- 19199: Deploying only modified tables was very slow, while deploying all tables was faster.
- 19261: Resolved an issue where you could not add fields to semantic model tables with custom fields.
- 19265: Changed a label from "data area" to "data warehouse" in the Execution Server Configuration tool.
- 19269: Fixed an out-of-memory exception.
- 19304: Empty tables would be created when using SQL Server as ODX storage.
- 19317: Optimized the logic behind the Step Row Count (row count logging in StepRowCountLoggingExecute.cs).
- 19323: Transferring from the ODX to the MDW with Azure Data Factory doesn't support mapping the same ODX column to multiple fields on the same MDW table. We have added a validation that blocks this scenario on deploy/execute.
- 19326: Fixed an issue with losing new lines when saving or updating a Query Table in the ODX.
- 19343: Improved labeling when editing MDW instances.
- 19358: The version number was sometimes replaced with a random word in the Copy to Instance dialog.
- 19367: Resolved an issue where, when adding a job, the color for an invalid selection did not get grayed out, and the control for validating an item was misaligned.
- 19386: Fixed a scaling issue with the documentation template dialog.
- 19400: Fixed an issue where the D365 BC provider could not be authenticated on creation.
- 19412: Fixed an issue with "show translations" not working on custom measures for Power BI Premium.
- 19415: Fixed an issue where data formatting settings were not enabled on SSL fields for Power BI.
- 19429: Removed unnecessary warnings in the ODX when synchronizing OLE DB-based data sources.
- 19457: Fixed an issue with remapping an SSL table when the mapped MDW gets deleted.
- 19464: Added syntax highlighting for various scripts.
- 19505: Fixed an issue with clone fields and lookup transformation templates.
- 19519: Fixed an issue with incremental load into Azure Data Lake storage where the source type is handled as UniversalDataTypes.Datetime. This caused the incremental datetime value to be UTC plus the local time offset.
- 19526: Improved the error message shown when loading an instance fails.
- 19533: Added support for OAuth login during the add data source flow in the ODX.
- 19540: Fixed an issue with enabling 'Keep fields up-to-date' on XML data type fields.
- 19560: The ODX would continue transferring data even though the task was stopped in the ODX Execution Queue.
- 19562: Fixed an issue with running a job schedule set to "run once on this day".
- 19627: Fixed an issue where running execution packages with prioritization didn't work.
- 19678: Fixed an issue where deleting a data source in the ODX would not always clean up its task history.
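Fix 19519 is a timezone-normalization bug: a datetime that was already UTC had the local UTC offset applied on top, shifting the incremental watermark. A generic sketch of the correct normalization (not TimeXtender's actual code; the helper name is hypothetical):

```python
from datetime import datetime, timedelta, timezone

def to_utc_watermark(value: datetime) -> datetime:
    """Normalize an incremental-load watermark to UTC.

    Naive values are assumed to already be in UTC and are only tagged;
    aware values are converted. The bug pattern was the opposite:
    re-applying a local offset to a value that was already UTC.
    """
    if value.tzinfo is None:
        return value.replace(tzinfo=timezone.utc)
    return value.astimezone(timezone.utc)

plus_one = timezone(timedelta(hours=1))
print(to_utc_watermark(datetime(2024, 6, 1, 12, 0, tzinfo=plus_one)))
# 2024-06-01 11:00:00+00:00
```

Normalizing once, at the boundary where the watermark is read, avoids double-applying offsets later in the pipeline.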

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender 6346.1

Today, we've published a minor release of TimeXtender (Desktop v. 6346.1) with the following changes:

Changed

- You can now choose between a dynamic or static IP address filter when configuring firewall rules for instances in the Portal. This should help the - luckily - very few users for whom the dynamic system doesn't work.
- The 'description' field for data sources in the Portal is now optional, simplifying the data entry process.
- The system will now automatically scroll to validation errors when you try to save a data source in the Portal, making it easier to identify and correct issues.

Fixed (Desktop)

- 18883: In the Monitor window that shows the status of jobs, a job would show the status "Completed" with no indication of errors or warnings. Now, the status will tell you if there were issues during the execution.
- 19110: Switching between advanced and simple selection in the Add Data Source wizard would sometimes result in an "Object reference not set to an instance of an object" error.
- 19113: The Row-level Security Setup window would "forget" the settings for the 'Values field' and 'Members field' options when the window was closed and then opened again.
- 19133: On some data sources, usually those with many tables, loading the list of tables would take so long that the Query Tool window would be unusable. The list now loads asynchronously to avoid this issue.
- 19163: The 'Disable simple selection' setting was not included when cloning a data source.
- 19168: Execution would fail with the error "An item with the same key has already been added" when using Azure Data Lake transfer after renaming and then re-adding a table in the data warehouse.
- 19173: On some data sources, usually those with many tables, loading the list of tables would take so long that the Query Tables window would be unusable. The list now loads asynchronously to avoid this issue.
- 19178: Trying to delete a custom period in a date table would sometimes result in a "Deleted row information cannot be accessed" error.
- 19181: When using the Dynamics 365 Business Central data source provider, execution of the "G/L Entry" table would fail for tables with many records.

Fixed (Portal)

- Trying to configure the ODX Server service using a user account with an unverified e-mail address would result in a blank error in the ODX Configuration app. The error message now explains the issue and how to resolve it.
- Monthly usage details were not displaying correctly in the Portal.
- Comments were not being saved or displayed for certain entries in the activity log.
- Changing the email address of a user in the Portal required that the user was activated (invited/signed in). This requirement has been removed.
- On the Activities page, it was possible to select dates that do not make sense, e.g. a 'From' date in the future.
- It was possible to change your email address to the email address of another user.
- It was not possible to set up sign-in with social accounts even though it should only require a verified e-mail address.
- When listing new fields on the 'Update data source' page, new values in a drop-down field would not count as a change.
- The 'Team development' information icon was displayed in the wrong place on the Edit Instance page.
- The Delete button was positioned wrong on the Data Source page in the Portal.
- Trying to send a "critical notification" e-mail would result in an error if no username or password was provided.
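Fixes 19133 and 19173 both apply the same general pattern: move a slow listing call off the UI thread so the window stays responsive, then hand the result back via a callback. A minimal, framework-agnostic sketch of that pattern (the function names and callback wiring are hypothetical, not TimeXtender's actual code):

```python
import threading
from concurrent.futures import ThreadPoolExecutor

def load_tables_async(fetch_tables, on_done):
    """Run a slow table-listing call on a worker thread.

    fetch_tables: callable returning a list of table names (slow in
    practice, e.g. a metadata query against the source).
    on_done: callback receiving the result; a real UI would marshal
    this back onto the UI thread before touching widgets.
    """
    executor = ThreadPoolExecutor(max_workers=1)
    future = executor.submit(fetch_tables)
    future.add_done_callback(lambda f: on_done(f.result()))
    executor.shutdown(wait=False)  # worker finishes in the background
    return future

# Demo: the "window" opens immediately; the list fills in when ready.
done = threading.Event()
results = []

def on_done(tables):
    results.extend(tables)
    done.set()

load_tables_async(lambda: ["dbo.Customers", "dbo.Orders"], on_done)
done.wait(timeout=5)
print(results)  # ['dbo.Customers', 'dbo.Orders']
```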

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender Desktop v. 6284.1

Today, we've published a minor release of TimeXtender Desktop (v. 6284.1) with the following changes:

Fixed

- 18896: Can't open Select Tables menu for data sources with large schemas. Added an option to default to advanced selection when creating or selecting tables from a data source.
- 18670: SSL field renaming not working for Tabular endpoint on Snowflake. Fixed an issue where renaming a field from a table on Snowflake storage would cause an error during execution of the SSL Tabular endpoint.
- 18883: Job logs lack detailed information. We have addressed an issue where we failed to inform the user about job completion with errors or warnings. Now, when checking the job monitor, the status will be displayed as "completed" along with an indication of any issues encountered.
- 18838: Adding a Conditional Lookup Field to an SSL model shows a wrong field type until synchronized. When adding a Conditional Lookup Field to an SSL model, a wrong field type would be shown until the SSL model was synchronized.
- 18662: Data Factory resource names are not unique when using "copy instance". Fixed an issue where resources created for Data Factory would not be unique between instances created by the "copy instance" functionality. The naming convention now includes an identifier for the instance in the Data Factory resource names.
- 18599: Data lineage always used dbo as the schema name when displaying object names. This has been corrected to show the actual schema being used.
- 18575: Default Hashing was referred to as Project Hashing instead of Instance Hashing. Updated the incorrect display name of the Default Hashing option.
- 18720: Fixed an issue where using "copy to instance" for MDW and SSL instances could corrupt the destination instance if it was empty, due to multiple transactions.
- 18713: Improved the upgrade instance message for MDW and SSL instances.
- 18633: Fixed an issue where incremental load with deletes didn't work in the ODX when using a data lake and no new data was transferred in the execution.
- 18619: Fixed an issue where a ProjectLockVersions check was run before the database was created.
- 18710: Fixed an issue where pre/post steps could not be edited after changing the name of a script action used as the pre/post step.
- 18640: Fixed several issues with Lookup Transformation Templates:
  - Names were not qualified.
  - Generating the lookup transformation script would fail because the fixed join value was missing formatting and escaping.
  - The lookup transformation script would fail because the template selection statement was not formatted and escaped.
  - The lookup transformation script would fail because the default fallback value was '' (empty string) for all data types. It is now NULL for all data types.
- 18688: Fixed an issue where jobs would sometimes use the wrong instance in a "copy instance" setup. The execution packages in the job would use the first available instance across the copied instances when creating the job.
- 18610: Fixed an issue where the details button on warnings and errors would sometimes incorrectly redirect to the support site.
- 18585: Fixed a UI issue where some of the text for the TimeXtender Support Site was not shown on the start page.
- 18682: Fixed an issue where the ODX would sometimes use the wrong version of a managed ADO.NET component because the identifiers of some external managed ADO.NET components were not unique.
- 18537: Fixed a scaling issue with misaligned buttons on the 'Upgrade job repository' confirmation pop-up.
- 18571: Fixed a typo in the Execution Server Configuration window.
- 18907: Removed the 'Validate Data Source Command' from ODX data sources, as the logic behind the command has been automated elsewhere.
- 18787: Fixed an issue where trying to add multiple schedules to a job at the same time would only add one of them.
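Fix 18640 above is a classic case of values being spliced into generated SQL without formatting or escaping. A generic sketch of the escaping step (hypothetical helper, not TimeXtender's actual code): double any embedded single quotes, and fall back to NULL rather than '' when no value is given, since an empty string is not a sensible default for non-string data types.

```python
def sql_literal(value):
    """Render a Python value as a safe SQL literal for generated scripts.

    None becomes NULL (a valid fallback for every data type), strings
    are quoted with embedded single quotes doubled, and numbers pass
    through unquoted.
    """
    if value is None:
        return "NULL"
    if isinstance(value, str):
        return "'" + value.replace("'", "''") + "'"
    return str(value)

print(sql_literal("O'Brien"))  # 'O''Brien'
print(sql_literal(None))       # NULL
print(sql_literal(42))         # 42
```

Without the doubling step, a fixed join value like O'Brien would terminate the string literal early and break the generated lookup script.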

Related products: TimeXtender Data Integration

TimeXtender 6221.1

Spring has sprung, and we're happy to announce the release of a new version of TimeXtender (Desktop v. 6221.1). See what we've been up to below.

Note: These Release Notes have been updated to reflect that the TimeXtender API is now live and no longer in closed BETA.

New

- All semantic endpoints are now supported for Snowflake: If you have a data warehouse on Snowflake, you can now use it with all the semantic endpoints supported by TimeXtender. The Power BI, Tableau, and Tabular endpoints join Qlik and CSV file as supported endpoints for this type of data warehouse storage.
- SQL Server 2022 support: TimeXtender now supports the latest and greatest major release of Microsoft SQL Server for use as a data warehouse or ODX data storage.
- Official support for Amazon RDS for SQL Server: Amazon's cloud SQL Server offering is now officially supported for use as a data warehouse or ODX data storage. Some of our enterprising customers have already paved the way by just doing it, and we're happy to put the "officially supported" stamp on their endeavor.
- Easy data source provider updates: We've made it much simpler to update a data source provider to take advantage of new features or bug fixes. You'll now see an aptly named 'Update' button whenever an update is available. Previously, you'd have to add a new data source in the TimeXtender Portal and switch the connection in TimeXtender Desktop.
- TimeXtender API for integrating with external systems: As an important step in our march towards world domination, we've created an API that external systems can use to, among other things, trigger and monitor task executions. Now live, this feature can be compared to the feature in TimeXtender 20.10 and older that allows you to trigger an execution package from the command prompt.

Changed

- A job can now be scheduled multiple independent times.
- On the ODX, we've added support for data-on-demand for managed ADO.NET data sources.
- 'Show data types' has been implemented on semantic models.
- Tabular endpoints now show more details when an error occurs during execution.

Fixed

- Managed ADO.NET data sources now support multi-line properties that automatically add the correct line endings ('CR LF' or '\r\n').

Portal

- When adding or editing a Qlik endpoint, you would get a "some fields have invalid values" validation error.
- It was not possible to delete a data source if its name contained whitespace or special characters in a specific way.
- The add/edit/clone data source pages would not show a loading spinner when loading the form.
- When cloning a data source, the 'Clone' submit button was not disabled if validation failed.
- Users on the Free tier could clone a data source to exceed the limit on data sources.
- Fixed various other issues with data source cloning.
- Minor tweaks and adjustments to the styling of the Add/Edit Instance forms.
- We fixed some technical debt relating to customer types left over from the implementation of the Free tier in our previous release.

Desktop

- A few outdated or incorrect icons have been changed.
- Data would be missing from the "valid" table on tables that had a specific setup with a mapping set, a primary key field, and a data selection rule.
- Configuring the Execution server would, in some cases, not take the lock on an instance.
- When a test notification failed, it would not give the user a useful error message.
- Changing a snippet didn't always update the script.
- Opening the Error view would result in an error in a specific setup involving the 'Keep field values up to date' option.
- Jobs would, on rare occasions, show execution packages from other instances if those instances were made as copies of another instance.
- In the Add Jobs wizard, some text was truncated at the end.
- Fixed an issue with the "Execute ODX Data Factory Merge Transfer" step that caused data sources with 'Data on demand' enabled to fail or be skipped when transferring data from the ODX to the MDW using Azure Data Factory.
- Fixed an issue on execution where excluding the "Execute ODX Data Factory Merge Transfer" step was ignored and the step was executed anyway.
- Fixed an issue where transfers with Azure Data Factory from the ODX to the data warehouse did not set the batch count.
- Fixed an issue with transfers from the ODX to a data warehouse on Snowflake when the table had incremental load with updates enabled in the ODX.
- Resuming an execution would skip 'table insert' and 'related records' steps.
- Fixed a misleading label in the Table Settings window.
- For data warehouse storage, 'Additional connection properties' were not added to the connection string.
- After changing storage on a data warehouse instance from on-prem SQL Server to an Azure SQL database, deployment would fail because extended properties were not created for functions and views.
- For data warehouses on an Azure SQL database, 'custom table insert' requires the 'xact_abort' setting to be enabled, which it was not.
- When synchronizing a mapping set with many tables, the window would be bigger than the display, making the buttons at the bottom impossible to see and click.
- The CSV endpoint would always use UTF-8-BOM encoding, ignoring the user's choice.
- It was possible to add fields from different source tables to a semantic model even though it should not be possible.
- In a semantic model, deleting a measure or a hierarchy that was included in a perspective would not clean up the perspective properly.
- In a semantic model, deleting a field that was included in a perspective would throw an error during deployment.
- In a semantic model, adding a field to a table that had a custom field would cause an error.
- In a semantic model, dynamic role security setup values were not reselected on edit.
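The CSV-encoding fix above hinges on the difference between UTF-8 with and without a byte order mark: the BOM is three extra bytes (EF BB BF) at the start of the file that some consumers (notably Excel) expect and others reject. A quick, generic Python illustration of what the endpoint's encoding choice controls:

```python
# Write the same CSV text with and without a UTF-8 byte order mark (BOM).
text = "name;amount\nMüller;42\n"

with_bom = text.encode("utf-8-sig")   # "utf-8-sig" prepends EF BB BF
without_bom = text.encode("utf-8")

print(with_bom[:3].hex())             # efbbbf
print(with_bom[3:] == without_bom)    # True: only the 3-byte prefix differs
```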

Related products: TimeXtender Data Integration, TimeXtender Portal

TimeXtender 6143.1

We've released a new version of TimeXtender (Desktop v. 6143.1) with a bunch of new features and even more fixes - see what's new below.

Warning: The new version of TimeXtender does not support version 11 of the following data source providers:

- Azure Data Factory - MySQL
- Azure Data Factory - Oracle
- Azure Data Factory - PostgreSQL
- Azure Data Factory - SQL Server

Please use version 12 of these data sources with the new release.

New

- Free tier replaces trials: You can now use TimeXtender for free for as long as you like without worrying about running out of credits. When you sign up for TimeXtender, you now start on the Free tier, which never runs out but comes with a few limitations: one user, one semantic model, one data warehouse, one ODX, and one data source; Azure Data Lake Storage cannot be used for ODX storage; and Dedicated SQL Pool (SQL DW) and Snowflake cannot be used for data warehouse storage. Existing trial accounts will be converted to Free.
- Data warehouse on Snowflake: We've added support for Snowflake, and now, for the first time, you can deploy a TimeXtender data warehouse to non-SQL Server data storage and, of course, take advantage of Snowflake features. Our initial implementation requires an ODX that uses Azure Data Lake Storage with SAS authentication and only works with the Qlik and CSV file endpoints in the semantic layer. On the data warehouse, only features supported by simple mode are available. Read more in Use Snowflake as data warehouse.
- Improved scheduling (Desktop): You can now schedule execution packages from DWH and SSL instances in the same job. This is useful if you, for instance, want to execute a semantic model just after the relevant tables in your data warehouse. Note that the instances must be mapped to the same TimeXtender Execution Server service.
- On-demand data warehouse ingestion: When the 'data on demand' option is enabled, the data source will refresh each table in the ODX storage before transferring it to the data warehouse storage. This works without configuring an explicit transfer task under the data source.

Changed (Portal)

- For consistency, we've added an 'Edit' button for each item on the 'Data sources' list.

Fixed (Portal)

- 17587: It was not possible to add a data warehouse with Azure AD as authentication (released as a hotfix).
- 16800: 'Clone data source' had the wrong breadcrumb.
- 17809: The input box for the 'Batch size' option on ODX and data warehouse storage would max out at 65536 when using the "up" button, which is far below the valid maximum value.
- 17485: The Permissions list is now hidden from 'Edit company details' when the list is empty.
- 17319: The Merge button is now disabled once you've clicked it, to prevent accidental additional clicks.

Fixed (Desktop)

- 16902: Fixed misleading text in the Synchronize window when synchronizing a data warehouse with an ODX.
- 16686: An unnecessary 'Connection Changed' message could show up when using the Query Tool on the data warehouse.
- 17878: Fixed an issue where "resume execution" would skip Table Insert and Related Records steps.
- 16865: Data lineage for views in data warehouse to data warehouse fields was not working.
- 16599: Previewing a query table in the ODX sometimes wouldn't suggest the query table's statement, instead using "Select * from...".
- 16036: When reloading an instance using 'Save and Reload', the previously open tabs were not reopened. This has been fixed.
- 17482: Removing a table that was included in an Object Security Setup would cause the next deployment of that Object Security Setup to fail, as the references from the deleted table were still there.
- 16708: Using Export Deployment Steps to a CSV file would cause a null reference error.
- 17249: Allowing a table to be compressed could not be combined with having history enabled.
- Enabling page compression on a table would result in the message "System field 'Is TombStone' cannot be removed".
- 16825: Data lineage tracing between a data warehouse view and a semantic model did not work. The semantic model did not track lineage through a mapped custom view.
- 17687: TimeXtender would crash when using the Deploy and Execute hotkey on views based on SQL snippets.
- 16704: Using Select Columns to remove columns from query tables would fail on execution when transferring from an ODX on Azure Data Lake Storage.
- 17653: The Edit Data Area dialog would allow more than 15 characters in the area name.
- 16645: Pressing Enter didn't trigger the search function in the remap table when remapping an ODX. This has now been corrected.
- 15407: A primary key validation error would remove all rows for the primary key in the valid table when using incremental load with hard deletes.
- 17148: It was not possible to change letter casing in the name of a conditional lookup field by clicking the field and pressing the F2 "rename" keyboard shortcut.
- 17267: Incremental load from an ODX using data lake storage to a data warehouse on Azure Synapse Dedicated SQL Pool would not transfer primary keys when no new data existed in the ODX, which caused the valid table to be truncated.
- 17591: Adding both pre- and post-steps on deployment for an incremental table would not redeploy the valid and incremental tables on "full load deploy".
- 16836: Trying to send a test mail in Notifications on Critical Errors would throw an error instead of sending an e-mail.
- 16729: Reconnecting to an Azure service in TimeXtender would fail after 12 hours without prior activity against the Azure service.
- 15995: Data lineage was missing information when a default relation was used instead of a join on a conditional lookup field.
- 17359: You would see an error message when testing a mail notification in Notifications on Critical Errors if the server returned "2.6.0 Queued mail for delivery", which isn't actually an error.
- 17115: SMTP authentication without a password did not work.
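Fix 17359 comes down to how SMTP replies are classified: per RFC 5321 (and RFC 3463 for enhanced status codes like "2.6.0"), the first digit gives the status class, and class 2 means success - so "2.6.0 Queued mail for delivery" is a success, not an error. A minimal sketch of such a check (hypothetical helper, not TimeXtender's actual code):

```python
def is_smtp_success(reply: str) -> bool:
    """Treat an SMTP server reply as success when its status class is 2.

    Works for both basic reply codes ("250 OK") and enhanced status
    codes ("2.6.0 Queued mail for delivery"): in both notations the
    first digit is the class, and class 2 signals success.
    """
    reply = reply.strip()
    return reply[:1] == "2" if reply else False

print(is_smtp_success("2.6.0 Queued mail for delivery"))  # True
print(is_smtp_success("250 OK"))                          # True
print(is_smtp_success("550 Mailbox unavailable"))         # False
```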

Related products: TimeXtender Data Integration, TimeXtender Portal