See what’s new in our product in the updates below.
Today, we’ve published a minor release of TimeXtender Desktop (6534.1) that contains the changes listed below.

Fixed
- Resolved issue: truncation of the valid table and data override during initial execution after migrating a data warehouse table with Incremental load set to true.
We’ve published a new release of TimeXtender Portal with the following changes:

New
- New organization type: Subscription. The new pricing is now reflected in the Portal by the inclusion of a new organization type, subscription, and the four subscription packages.
- New organization type: Sandbox. Another addition coming from the new pricing is the new sandbox organization type. It allows TimeXtender and partners to provide a sandbox environment for customers on the Premium and Enterprise subscriptions where this is included. Sandbox organizations have a time limit, similar to trials, and can only add one of each instance.
- TimeXtender ❤ Exmon: You can now unlock and access the Exmon features right from the Home page of the Portal.

Changed
- It is no longer possible to add Free or Premium customers. "Premium" has been renamed "Legacy Premium".
- The search bar on tables is now disabled while loading.
- We’ve added a search bar to the Users list under Admin > Users.
- We’ve added a sidebar to the TimeXtender REST data source connection form.

Fixed
- Long data source names were not fully visible in the transfer instance modal.
- Editing preferred support would sometimes fail with an unhandled exception.
- Fixed values appearing twice under data source connection details.
- Moving a user on a Premium customer would throw a "customer manager permission required" error.
- The tooltip on the environment description was clickable despite being empty.
- Fixed tooltip placement for missing mapping in environments.
- Fixed spacing on the TimeXtender REST data source connection form.
- Instance names are now truncated if they are too long in the transfer instance modal.
- Fixed an issue where mapping a data source connection would result in an error.
- Fixed an issue in the add/edit semantic model instance form where certain fields in the CSV endpoint wouldn't be correctly disabled.
- The user/customers tables are now refreshed when a new user or customer is added.
- Fixed an issue where the environments page would return an error if no instance had been created.
- Creating a static firewall rule now also creates a connection that can be reused.
Today, we’ve published a minor release of TimeXtender Desktop (6521.1) that contains the changes listed below.

Fixed
- Fixed a race condition where the application services could create new connections to the repositories on each repository action, filling the repository with firewall rules when "automatic firewall rules" is enabled.
Today, we’ve published a minor release of TimeXtender Desktop (6512.1) that contains the changes listed below.

Fixed
- Renaming a table in the SSL would cause measures to stop working for the Tabular endpoint. All measure types were affected except measures using custom scripts or Row Count.
We’ve published a minor release of TimeXtender Desktop (v. 6506.1) and TimeXtender ODX Server (v. 6506.1) that contain the following changes:

Fixed
- Upgrading the ODX to 6505.1 could prevent connections from being reused. This has been resolved.
- Upgrading to 6505.1 could make custom measures and custom fields in Semantic Model instances change endpoint. This has been resolved.

Portal
- Addressed an issue in transferring instances.
New year, new features to make your TimeXtender life more enjoyable and productive! We're happy to announce the release of a new major version of TimeXtender (Desktop v. 6505.1) that includes all of the belated holiday gifts listed below.

New
- Automatic data source creation in the ODX: When you map a data source connection to an ODX instance, a data source using the data source connection will automatically be created in the ODX. In addition to that, when you add a new data source connection, you can now map it to an ODX instance right on the same page.
- Test connection from the Portal: You can now test if the ODX can establish a connection to a data source when you add or edit a data source connection in the Portal.
- Improved step-by-step Get Started guide: We've created a new and improved step-by-step Get Started guide in the Portal. You can access it right from the Home page, where it has its very own card. As you check off the steps, your progress is saved - on a company basis - so you can see how far you've come. And if you're already a TimeXtender champion, the card can be dismissed so it doesn't clutter up your Home page.
- New TimeXtender REST provider: The brand new TimeXtender REST data source provider simplifies connecting to REST-based data sources. Among other improvements, the new provider allows you to set up endpoints without fiddling with configuration files.
- Instances grouped by environment in Desktop: As an improvement to the multiple environments feature we added in our previous major release, instances are now grouped by environment in TimeXtender Desktop. We hope this will bring some peace to people who like things well organized!
- Generate end-to-end execution packages and tasks: To make it easier to set up a refresh of the data in a specific semantic model, you can now generate the data warehouse execution packages and ODX tasks that will update all data for a specific semantic model. When you make changes to the semantic model, you can regenerate the flow, and the logic is smart enough to keep any customizations you made to the auto-generated objects.
- Calculation groups in semantic models: You can now add calculation groups to semantic models and deploy them to Tabular and Power BI semantic endpoints. To make that work, we've added the 'discourage implicit measures' option to the endpoints. It defaults to 'automatic', which means 'true' when you've added calculation groups and 'false' otherwise.
- Snippets in semantic models: It's now possible to add DAX, Qlik, and Tableau snippets and use them in semantic custom fields, custom measures, and calculation group items.

Changed
- We've tightened up the design of the add/edit data source connection pages in the Portal. In addition to the general improvements, some connections now have nice-to-have fields and categories hidden in an 'Advanced' section by default so you can set up a new connection faster.
- We've improved the Desktop logic to more gracefully handle renaming instances in the Portal.
- In custom scripts in semantic models, you can now use the 'Value' parameter.

Fixed

Portal
- Fixed an issue where users could see - but not access - instances that they hadn't been granted access to.
- Public job endpoints weren't able to handle unknown states.
- Endpoints were added out of order in the SSL form.
- Fixed an issue with the "move customer" operation.
- Storage types weren't always loaded on the MDW form.
- Fixed a floating info icon on the SSL form.
- Fixed an issue where the Portal throws a "not signed in" error - usually due to your token having expired - but then fails to route you back to sign in.
- The deployment target option for Analysis Services 2022 was missing from the Tabular SSL endpoint.
- Cloning a data source connection would route you to the original form instead of the clone form.
- Disabling automatic firewall rules didn't always get handled correctly when handing out connections.

Desktop
- Fixed an issue with data lineage sometimes failing when trying to aggregate the display values in SQL.
- Fixed an issue where the ODX service would sometimes fail to validate the data source connection version of TimeXtender enhanced and TimeXtender ADF transfer components, causing an error.
- Updated some logic to better handle unsupported data sources instead of throwing an unclear error message.
- Fixed an issue where using an already created SQL database as storage for an ODX instance would reject the database due to the validation of the data storage version.
- Fixed an issue with data lineage and reverse sign transformations not working.
- Fixed an issue where using a dot (.) as the last character of a table name would cause executing a task in the ODX using a data lake to fail. The dot character is now replaced by an underscore when it is the last character of a folder name in the data lake.
- Fixed an issue where deployment would fail when a source table's DW_Id is mapped to a destination table's DW_Id.
- Fixed an issue where the TimeXtender BC365 Online data source was failing to validate before inserting system fields during transfer.
- Fixed an issue where a Synapse data warehouse would fail when adding a selection rule on a renamed field.
- Fixed an issue setting up an incremental rule with year subtraction.
- Fixed an issue where generating documentation with only an ODX open would throw an error.
- Fixed an issue where mapping would fail for tables that used system fields as column names.
- Fixed an issue where a table with multiple lookup fields would return incorrect results in a Snowflake data warehouse.
- Improved logging for the TimeXtender SAP Table data source provider and fixed two issues: fields starting with '/' could not be added to incremental rules, and the max. rows setting was limiting the number of data rows to be transferred.
- Fixed an issue where the default relation was not set correctly when relating tables in a semantic model.
- Optimized instance updates during task initialization.
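The trailing-dot fix above can be illustrated with a short sketch. The function name and shape here are a hypothetical illustration of the rule described in the note, not TimeXtender's actual code:

```python
def data_lake_folder_name(table_name: str) -> str:
    """Return a safe data lake folder name for a table.

    A folder name ending with a dot fails in the data lake, so a trailing
    dot is replaced with an underscore, as described in the fix above.
    (Illustrative sketch only.)
    """
    if table_name.endswith("."):
        return table_name[:-1] + "_"
    return table_name
```

So a table named "Sales.Orders." would get the folder name "Sales.Orders_", while dots elsewhere in the name are left untouched.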
Today, we’ve published a minor release of TimeXtender Desktop (v. 6436.1) that contains the following changes:

Fixed
- 19801: Importing settings from a previous version did not choose the latest version. Fixed an issue where the wrong version would be chosen to import settings from.
- 19805: A missing check allowed Raw Only fields to be added to a table in a different area. Fixed an issue where a missing validation allowed raw-only fields to be added from one table to another.
- 19824: Upgrading to 6431.1 could make ODX data sources lose columns when using simple select. This has been resolved.
- 19852: Azure Data Factory did not work in version 13. The issue was fixed. Azure Data Factory data source connections need to be upgraded to version 14.
Today, we’ve published a minor release of TimeXtender Desktop (v. 6431.1) that contains the following changes:

Fixed
- When a data warehouse had ADF transfer from the ODX enabled, a validation would fail, which made it impossible to deploy tables with mapping sets.
Today, we’ve published a minor release of TimeXtender Desktop (v. 6430.1) that contains the following changes:

Changed
- Removed the ‘Copy to Instance’ feature from Desktop, as this is now done in the Portal.

Fixed
- Executing an execution package with an incrementally loaded table added to the Full Load objects list would perform a wrong cast and throw an error.
And it's about time for a new release of TimeXtender! The new version (Desktop v. 6429.1) includes a bunch of much-requested and, dare we say, exciting features that we hope will improve your day-to-day. It doesn't have to be crazy at work.

New

As with any innovative feature release, there may be some quirks along the way. Though we’ve done extensive initial testing, we encourage you to report any bugs you may find so we can release further improvements as rapidly as possible.

- Multiple environments and instance transfers in the Portal: You can now group instances in environments to keep everything nice and ordered. In addition to that, you can transfer the contents of one instance to another, enabling a true DEV -> TEST -> PROD workflow right in the Portal.
- Data source "adapters" for selected ERP systems: We've added new data sources for Microsoft Dynamics 365 Finance & Operations ("AX") and Microsoft Dynamics 365 Business Central ("NAV") that make it easier for you to handle accounts, along with other functionality that makes these systems easier to work with. In the Portal, you'll find them in the data sources list as "TimeXtender Dynamics 365 Business Central" and "TimeXtender Dynamics 365 Finance" with "- SQL Server" or "- Online" appended.
- Improved support for Snowflake as a data warehouse: We've taken a big step towards supporting each and every TimeXtender feature when you use Snowflake as data warehouse storage. The newly supported features include incremental load, conditional lookup fields, field transformations, field validations, history tables, supernatural keys, and custom views. Aggregate tables, custom data, custom hash fields, junk dimensions, pre- & post-scripts, related records, and table inserts are not supported yet.
- XPilot integrated in the Desktop: You'll now find a handy link to XPilot, our data integration chatbot, right in the toolbar.
- Try all the features for 14 days: You can now try all TimeXtender features for free for 14 days before you decide if you're ready to sign up for a paid subscription. The feature- and resource-limited Free tier has been retired.
- Automated migration from 20.10 to the newest version of TimeXtender: If you are still on the 20.10 branch of TimeXtender, you can now upgrade to the newest version without starting from scratch. The 20.10.45 release of TimeXtender can convert existing projects to cloud-based instances to minimize the work you need to do to move up.

Changed
- We've standardized terminology around instances and data source connections in the Portal. Among other things, we wanted to fix the common confusion around data sources. Now, in the Portal, you add "data source connections" that can be used by "data sources" in the ODX in Desktop.

Fixed

Portal
- On the Instances card on the Home page, the instances are now ordered with the newest first.
- On the 'Add/edit data source connection' page, the left-side section navigation was not displayed.
- On the 'Add/edit data source connection' page, SSL secrets are now hidden.
- The Portal would show incorrect data source variable names.
- A data source connection would fail to save due to an incorrect validation error.
- In some cases, the activity list would fail to load or the pagination would break.
- The Customer Details page would be displayed for deleted customers.
- We've improved the loading of the Home page with better loading animations.
- If a company can't be moved, you'll be notified without the Move modal popping up.

Desktop
- 18959: Updated SQL Server 2022 assembly dependencies.
- 18988: When executing an object in the data warehouse with ‘Data on demand’ enabled on the data source, the transfer from the data source to the ODX storage would not be visible in the log. Now, the transfer from the source has a separate entry in the log in the details for the "outbound" transfer.
- 19123: Fixed an issue with SQL spatial types and SQL Server 2022 in the MDW.
- 19191: Added support for data source connections without descriptions.
- 19199: Deploying only modified tables was very slow, while deploying all tables was faster.
- 19261: An issue where you could not add fields to semantic model tables with custom fields has been resolved.
- 19265: Changed a label from "data area" to "data warehouse" in the Execution Server Configuration tool.
- 19269: Fixed an out-of-memory exception.
- 19304: Empty tables would be created when using SQL Server as ODX storage.
- 19317: Optimized StepRowCountLoggingExecute.cs (logging of rows). The logic behind the Step Row Count has been optimized.
- 19323: Mapping the same field to multiple fields in an MDW table from ADF is not possible. Using Azure Data Factory transfer from the ODX to the MDW doesn't support mapping the same column from the ODX to multiple fields on the same table in the MDW. We have added a validation that blocks this scenario on deploy/execute.
- 19326: We fixed an issue with losing new lines when saving or updating a query table in the ODX.
- 19343: Improved labeling when editing MDW instances.
- 19358: The version number was sometimes replaced with a random word in the Copy to Instance dialog.
- 19367: We resolved an issue where, when adding a job, the color for an invalid selection did not get grayed out, and there was a misalignment on the control for validating an item.
- 19386: Fixed a scaling issue with the documentation template dialog.
- 19400: Couldn't authenticate the D365 BC provider on creation.
- 19412: Fixed an issue with "show translations" not working on custom measures for Power BI Premium.
- 19415: Fixed an issue where data formatting settings were not enabled on SSL fields for Power BI.
- 19429: Removed unnecessary warnings in the ODX when synchronizing OLE DB-based data sources.
- 19457: Fixed an issue with remapping an SSL table when the mapped MDW gets deleted.
- 19464: Added syntax highlighting for various scripts.
- 19505: Fixed an issue with clone fields and lookup transformation templates.
- 19519: There was an issue with incremental load into Azure Data Lake storage where the source type is handled as UniversalDataTypes.Datetime. This caused the incremental datetime value to be UTC plus the local time offset.
- 19526: Improved the error message shown when loading an instance fails.
- 19533: Added support for OAuth login during the add data source flow in the ODX.
- 19540: Fixed an issue with enabling 'Keep fields up-to-date' with XML data type fields.
- 19560: The ODX would continue transferring data even though the task was stopped in the ODX Execution Queue.
- 19562: Fixed an issue with running a job schedule set to "run once on this day".
- 19627: Fixed an issue where running execution packages with prioritization didn't work.
- 19678: Fixed an issue where deleting a data source in the ODX would not always clean up its task history.
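The datetime issue in 19519 comes down to applying a local offset on top of a value that is already UTC. A minimal sketch of the correct normalization, using a hypothetical helper (not TimeXtender's code):

```python
from datetime import datetime, timedelta, timezone

def to_utc_watermark(value: datetime) -> datetime:
    """Normalize an incremental-load watermark to UTC.

    Converting with astimezone() applies the offset exactly once, which
    avoids the "UTC + local time offset" double shift described in 19519.
    (Hypothetical helper for illustration.)
    """
    if value.tzinfo is None:
        # Assume naive values carry the machine's local time.
        value = value.astimezone()
    return value.astimezone(timezone.utc)

# Example: 12:00 at UTC+2 normalizes to 10:00 UTC.
noon_plus_two = datetime(2024, 6, 1, 12, 0, tzinfo=timezone(timedelta(hours=2)))
```

The key point is that the offset is applied by the timezone conversion itself, never added manually to an already-converted value.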
Today, we’ve published a minor release of TimeXtender (Desktop v. 6346.1) with the following changes:

Changed
- You can now choose between a dynamic or static IP address filter when configuring firewall rules for instances in the Portal. This should help the - luckily - very few users for whom the dynamic system doesn't work.
- The 'description' field for data sources in the Portal is now optional, simplifying the data entry process.
- The system will now automatically scroll to validation errors when you try to save a data source in the Portal, making it easier to identify and correct issues.

Fixed (Desktop)
- 18883: In the Monitor window that shows the status of jobs, a job would show the status “Completed” with no indication of errors or warnings. Now, the status will tell you if there were issues during the execution.
- 19110: Switching between advanced and simple selection in the Add Data Source wizard would sometimes result in an “Object reference not set to an instance of an object” error.
- 19113: The Row-level Security Setup window would “forget” the settings for the “Values field” and “Members field” options when the window was closed and then opened again.
- 19133: On some data sources, usually those with lots of tables, loading a list of tables would take so long that the Query Tool window would be unusable. The list will now load asynchronously to avoid this issue.
- 19163: The setting for ‘Disable simple selection’ was not included when cloning a data source.
- 19168: Execution would fail with the error “An item with the same key has already been added” when using Azure Data Lake transfer after having renamed, then re-added, a table in the data warehouse.
- 19173: On some data sources, usually those with lots of tables, loading a list of tables would take so long that the Query Tables window would be unusable. The list will now load asynchronously to avoid this issue.
- 19178: Trying to delete a custom period in a date table would sometimes result in a “Deleted row information cannot be accessed” error.
- 19181: When using the Dynamics 365 Business Central data source provider, execution of the “G/L Entry” table would fail for tables with many records.

Fixed (Portal)
- Trying to configure the ODX Server service using a user account with an unverified e-mail address would result in a blank error in the ODX Configuration app. The error message now explains the issue and how to resolve it.
- Monthly usage details were not displaying correctly in the Portal.
- Comments were not being saved or displayed for certain entries in the activity log.
- Changing the email address of a user in the Portal required that the user was activated (invited/signed in). This requirement has been removed.
- On the Activities page, it was possible to select dates that do not make sense, e.g. a 'From' date in the future.
- It was possible to change your email address to the email address of another user.
- It was not possible to set up sign-in with social accounts even though it should only require a verified e-mail address.
- When listing new fields on the 'Update data source' page, new values in a drop-down field would not count as a change.
- The 'Team development' information icon was displayed in the wrong place on the Edit Instance page.
- The Delete button would be positioned wrong on the Data Source page in the Portal.
- Trying to send a “critical notification” e-mail would result in an error if no username or password was provided.
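The asynchronous-loading fixes in 19133 and 19173 follow a common UI pattern: fetch the long list on a background thread and hand the result back when done, so the window stays responsive. A generic sketch of that pattern (the function names are hypothetical, not TimeXtender's code):

```python
import threading

def load_tables_async(fetch_tables, on_loaded):
    """Run a slow table-list fetch on a background thread.

    The UI stays responsive while fetch_tables() runs; on_loaded is called
    with the result when the fetch completes. (Generic pattern sketch;
    names are hypothetical.)
    """
    def worker():
        on_loaded(fetch_tables())

    thread = threading.Thread(target=worker, daemon=True)
    thread.start()
    return thread
```

In a real UI toolkit, on_loaded would typically marshal the result back to the UI thread before updating the list control.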
Today, we’ve published a minor release of TimeXtender Portal with the following changes:

New
- Custom firewall rules for instances: You can now grant specific IP addresses access to an instance. Access to your TimeXtender instances is protected by dynamically created firewall rules, but in rare cases, the automatic system will not create the correct rules to allow access from TimeXtender Desktop. With the new feature, you can add additional rules if the automated system, for some reason, does not work for you.

Changed
- When you add and edit data sources, you’ll now find a handy menu on the left to help you navigate the forms, which can be rather long for some data source providers.

Fixed

This list includes all fixes released since TimeXtender 6221.1, including hotfixes.
- Added missing activity logging for data sources.
- When moving a user, the user would get all possible user-level permissions at the destination company.
- "Create instance" would sometimes fail if a company had previously deleted instances.
- "Move company" could not find any valid destinations.
- When merging two companies, an outdated warning about ODX client secrets would be displayed.
- Editing your company on the ‘My company’ page would fail.
- Some error messages from the backend would be replaced with generic messages when displayed in the UI.
- Checking the status of a running job in TimeXtender Desktop would fail.
- Fixed an issue where loading the list of data sources could fail.
- Fixed an intermittent threading error on public endpoints.
- Updated the .NET framework.
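Conceptually, a custom firewall rule is an allow-list entry that a client IP is checked against. A minimal sketch of that check, assuming rules are given as single addresses or CIDR ranges (illustrative only, not the Portal's implementation):

```python
import ipaddress

def ip_allowed(client_ip: str, rules: list[str]) -> bool:
    """Return True if client_ip matches any allow-list rule.

    Rules may be single addresses ("203.0.113.7") or CIDR ranges
    ("203.0.113.0/24"). (Conceptual sketch only.)
    """
    addr = ipaddress.ip_address(client_ip)
    return any(
        addr in ipaddress.ip_network(rule, strict=False) for rule in rules
    )
```

A dynamic filter would compute the client's current address and generate such a rule automatically; a static filter pins the rule to the addresses you enter.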
Today, we’ve published a minor release of TimeXtender Desktop (v. 6284.1) with the following changes:

Fixed
- 18896: Couldn't open the Select Tables menu for data sources with large schemas. Added an option to default to advanced selection when creating or selecting tables from a data source.
- 18670: SSL field renaming not working for the Tabular endpoint on Snowflake. Fixed an issue where renaming a field from a table on Snowflake storage would cause an error during execution.
- 18883: Job logs lacked detailed information. We have addressed an issue where we failed to inform the user about job completion with errors or warnings. Now, when they check the job monitor, the status will be displayed as "completed" along with an indication of any issues encountered.
- 18838: Adding a Conditional Lookup Field to an SSL model gave a wrong field type until synchronized. When adding a Conditional Lookup Field to an SSL model, a wrong field type would be shown until the SSL model was synchronized.
- 18662: Data Factory resource names were not unique when using "copy instance". Fixed an issue where resources created for Data Factory would not be unique between instances created by the "copy instance" functionality. The naming convention has been changed to include an identifier for the instance in the Data Factory resource names.
- 18599: Data lineage was always using dbo as the schema name when displaying object names. This has been corrected to show the actual schema that is being used.
- 18575: Default Hashing was referred to as Project Hashing and not Instance Hashing. Updated the incorrect display name of the Default Hashing option.
- 18720: Fixed an issue where using "copy to instance" for MDW and SSL instances could potentially corrupt the destination instance if the destination instance was empty, due to multiple transactions.
- 18713: Improved the upgrade instance message for MDW and SSL instances.
- 18633: Incremental load with deletes not working with a data lake. Fixed an issue where incremental load with deletes didn't work in the ODX when using a data lake and no new data was transferred in the execution.
- 18619: Issue with the ODX ProjectLockVersions check. Fixed an issue where a ProjectLock check was being run before the database was created.
- 18710: Issue with renaming script actions used in pre/post steps. Fixed an issue where pre/post steps could not be edited after changing the name of a script action used as the pre/post step.
- 18640: Issues with Lookup Transformation Templates. Fixed an issue where names were not qualified; an issue where generating the lookup transformation script would fail because the fixed join value was missing formatting and escaping; an issue where the lookup transformation script would fail because the template selection statement was not formatted and escaped; and an issue where the lookup transformation script would fail because the default fallback value was '' (empty string) for all data types. It is now NULL for all data types.
- 18688: Jobs sometimes selected the wrong instance when using copied instances. Fixed an issue where jobs sometimes used the wrong instance in a "copy instance" setup. The execution packages in the job would use the first available instance across the copied instances when creating the job.
- 18610: Show error - details with incorrect redirect. Fixed an issue where the details button on warnings and errors would sometimes incorrectly redirect to the support site.
- 18585: Start page was not displaying the entire support site text. Fixed a UI issue where some of the text for the TimeXtender Support Site was not shown.
- 18682: The ODX sometimes used old CData components. Fixed an issue where the ODX would sometimes use the wrong version of a managed ADO.NET component, due to the identifier of some external managed ADO.NET components not being unique.
- 18537: The Upgrade required pop-up appeared with misaligned buttons. Fixed an issue with scaling on the Upgrade job repository confirmation button.
- 18571: Fixed a typo in the Execution Server Configuration window.
- 18907: The validate data source connection menu was not working. Removed the 'Validate Data Source Command' from ODX data sources, as the logic in the command was automated elsewhere.
- 18787: You could not add multiple schedules to a job at a time. Fixed an issue where trying to add multiple schedules to a job at the same time would only add one of them.
Spring has sprung, and we're happy to announce the release of a new version of TimeXtender (Desktop v. 6221.1). See what we've been up to below.

Note: These release notes have been updated to reflect that the TimeXtender API is now live and no longer in closed beta.

New
- All semantic endpoints are now supported for Snowflake: If you have a data warehouse on Snowflake, you can now use it with all the semantic endpoints supported by TimeXtender. The Power BI, Tableau, and Tabular endpoints join Qlik and CSV file as supported endpoints for this type of data warehouse storage.
- SQL Server 2022 support: TimeXtender now supports the latest and greatest major release of Microsoft SQL Server for use as a data warehouse or ODX data storage.
- Official support for Amazon RDS for SQL Server: Amazon's cloud SQL Server offering is now officially supported for use as a data warehouse or ODX data storage. Some of our enterprising customers have already paved the way by just doing it, and we're happy to put the "officially supported" stamp on their endeavor.
- Easy data source provider updates: We've made it much simpler to update a data source provider to take advantage of new features or bug fixes. You'll now see an aptly named 'Update' button whenever an update is available. Previously, you'd have to add a new data source in the TimeXtender Portal and switch the connection in TimeXtender Desktop.
- TimeXtender API for integrating with external systems: As an important step in our march towards world domination, we've created an API that can be used by external systems that want to, among other things, trigger and monitor task executions. This feature can be compared to the feature in TimeXtender 20.10 and older that allows you to trigger an execution package from the command prompt.

Changed
- A job can now be scheduled multiple independent times.
- On the ODX, we've added support for data-on-demand for Managed ADO.NET data sources.
- 'Show data types' has been implemented on semantic models.
- Tabular endpoints now show more details when an error occurs during execution.

Fixed
- Managed ADO.NET data sources now support multi-line properties that automatically add the correct line endings ('CR LF' or '\r\n').

Portal
- When adding or editing a Qlik endpoint, you would get a "some fields have invalid values" validation error.
- It was not possible to delete a data source if the name of the data source contained whitespace or special characters in a specific way.
- Add/edit/clone data sources would not show a loading spinner when loading the form.
- When cloning a data source, the 'Clone' submit button was not disabled if validation failed.
- Users on the Free tier could clone a data source to exceed the limit of data sources.
- Fixed various other issues with data source cloning.
- Minor tweaks and adjustments to the styling of the Add/Edit Instance forms.
- We fixed some technical debt relating to customer types left over from the implementation of the Free tier in our previous release.

Desktop
- A few outdated or incorrect icons have been changed.
- Data would be missing from the "valid" table on tables that had a specific setup with a mapping set, a primary key field, and a data selection rule.
- Configuring the Execution Server would, in some cases, not take the lock on an instance.
- When a test notification failed, it would not give the user a useful error message.
- Changing a snippet didn't always update the script.
- Opening the Error view would result in an error in a specific setup involving the 'Keep field values up to date' option.
- Jobs would on rare occasions show execution packages from other instances if these instances were made as a copy of another instance.
- In the Add Jobs wizard, some text was truncated at the end.
- Fixed an issue with the "Execute ODX Data Factory Merge Transfer" step that caused data sources with 'Data on demand' enabled to fail or be skipped when transferring data from the ODX to the MDW using Azure Data Factory.
- Fixed an issue on execution where excluding the "Execute ODX Data Factory Merge Transfer" step was ignored and the step was executed anyway.
- Fixed an issue where transfers with Azure Data Factory from the ODX to the data warehouse did not set the batch count.
- Fixed an issue with transfer from the ODX to a data warehouse on Snowflake when the table had incremental load with updates enabled in the ODX.
- Resuming an execution would skip 'table insert' and 'related records' steps.
- Fixed a misleading label in the Table Settings window.
- For data warehouse storage, 'Additional connection properties' were not added to the connection string.
- After changing storage on a data warehouse instance from on-prem SQL Server to an Azure SQL database, deployment would fail because extended properties were not created for functions and views.
- For data warehouses on an Azure SQL database, 'custom table insert' requires the 'xact_abort' setting to be enabled, which it was not.
- When synchronizing a mapping set with lots of tables, the window would be bigger than the display, so you would not be able to see and click the buttons at the end.
- The CSV endpoint would always use UTF8-BOM encoding, ignoring the user's choice.
- It was possible to add fields from different source tables to a semantic model even though it should not be possible.
- In a semantic model, deleting a measure or a hierarchy that was included in a perspective would not clean up the perspective properly.
- In a semantic model, deleting a field that was included in a perspective would throw an error during deployment.
- In a semantic model, adding a field to a table when having a custom field would cause an error.
- In a semantic model, dynamic role security setup values were not reselected on edit.
We've released a new version of TimeXtender (Desktop v. 6143.1) with a bunch of new features and even more fixes. See what's new below.

Warning: The new version of TimeXtender does not support version 11 of the following data source providers:
- Azure Data Factory - MySQL
- Azure Data Factory - Oracle
- Azure Data Factory - PostgreSQL
- Azure Data Factory - SQL Server

Please use version 12 of these data sources with the new release.

New

Free tier replaces trials: You can now use TimeXtender for free as long as you like without worrying about running out of credits. When you sign up for TimeXtender, you now start on the Free tier that never runs out, but comes with a few limitations. Existing trial accounts will be converted to free.

Limitations of the Free tier:
- One user
- One semantic model
- One data warehouse
- One ODX
- One data source
- Azure Data Lake Storage cannot be used for ODX storage
- Dedicated SQL Pool (SQL DW) and Snowflake cannot be used for data warehouse storage

Data warehouse on Snowflake: We've added support for Snowflake, and now, for the first time, you can deploy a TimeXtender data warehouse to non-SQL data storage and, of course, take advantage of Snowflake features. Our initial implementation requires an ODX that uses Azure Data Lake Storage with SAS authentication and only works with the Qlik and CSV file endpoints in the semantic layer. On the data warehouse, only features supported by simple mode are available. Read more on how to Use Snowflake as data warehouse.

Improved scheduling (Desktop): You can now schedule execution packages from DWH and SSL instances in the same job. This is useful if you, for instance, want to execute a semantic model just after the relevant tables in your data warehouse. Note that the instances must be mapped to the same TimeXtender Execution Server service.
On-demand data warehouse ingestion: When the 'data on demand' option is enabled, the data source will refresh each table in the ODX storage before transferring it to the data warehouse storage. This works without configuring an explicit transfer task under the data source.

Changed (Portal)
- For consistency, we've added an 'Edit' button for each item on the 'Data sources' list.

Fixed (Portal)
- 17587: It was not possible to add a data warehouse with Azure AD as authentication (released as hotfix).
- 16800: 'Clone data source' had the wrong breadcrumb.
- 17809: The input box for the 'Batch size' option on ODX and data warehouse storage would max out at 65536 when using the "up" button, which is far below the valid maximum value.
- 17485: The Permissions list is now hidden from 'Edit company details' when the list is empty.
- 17319: The Merge button is now disabled when you've clicked it to prevent accidental additional clicks.

Fixed (Desktop)
- 16902: Issue with misleading text in the Synchronize window when synchronizing a data warehouse with an ODX.
- 16686: An unnecessary 'Connection Changed' message could show up when using the Query Tool on the data warehouse.
- 17878: Issue where "resume execution" would skip Table Insert and Related Records.
- 16865: Data lineage for views in data-warehouse-to-data-warehouse fields was not working.
- 16599: Previewing a query table in the ODX sometimes wouldn't suggest the query table's statement, but instead use "Select * from...".
- 16036: When reloading an instance using 'Save and Reload', the previously open tabs were not reopened accordingly. This has been fixed.
- 17482: Removing a table that was included in an Object Security Setup would cause the next deployment of that Object Security Setup to fail, as the references from the deleted table were still there.
- 16708: Using Export Deployment Steps to a CSV file would cause a null reference error.
- 17249: Allowing a table to be compressed could not be combined with having history enabled.
Enabling page compression on a table would result in the message "System field 'Is TombStone' cannot be removed".
- 16825: Data lineage tracing between a data warehouse view and a semantic model did not work. The semantic model did not track lineage through a mapped custom view.
- 17687: TimeXtender would crash when using the Deploy and Execute hotkey on views based on SQL snippets.
- 16704: Using Select Columns to remove columns from query tables would fail on execution when transferring from an ODX on Azure Data Lake Storage.
- 17653: The Edit Data Area dialog would allow more than 15 characters in the area name.
- 16645: Pressing Enter didn't call the search function in the remap table when remapping an ODX. This has now been corrected.
- 15407: A primary key validation error would remove all rows for the primary key in the valid table when using incremental load with hard deletes.
- 17148: It was not possible to change letter casing in the name of a conditional lookup field by clicking on the field and pressing the F2 "rename" keyboard shortcut.
- 17267: Incremental load with hard deletes from an ODX using data lake storage to a data warehouse on Azure Synapse Dedicated SQL Pool would not transfer primary keys when no new data existed in the ODX, which would cause the valid table to be truncated.
- 17591: Adding both pre- and post-steps on deployment for an incremental table would not redeploy the valid and incremental tables on "full load deploy".
- 16836: Trying to send a test mail in Notifications on Critical Errors would throw an error instead of sending an e-mail.
- 16729: Reconnecting to an Azure service in TimeXtender would fail after 12 hours without prior activity to the Azure service.
- 15995: Data lineage was missing information when a default relation was used instead of a join on a conditional lookup field.
- 17359: You would see an error message when testing a mail notification in Notifications on Critical Errors if the server returned "2.6.0 Queued mail for delivery", which isn't actually an error.
- 17115: SMTP authentication without a password did not work.
Modern software is built on open source third-party frameworks and components, and TimeXtender is no exception. To use these pieces of software, we are required to include license notices, which we do with thanks to their authors. Below, you'll find notices related to TimeXtender Portal. For the Desktop, you can find the notices in the ThirdPartyNotices.txt file distributed with the program files.

The MIT license

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

Components used under the MIT license

Bootstrap
Copyright (c) 2011-2020 Twitter, Inc.
Copyright (c) 2011-2020 The Bootstrap Authors

Angular
Copyright (c) 2010-2020 Google LLC.
http://angular.io/license

@angular-devkit/build-angular
Copyright (c) 2017 Google, Inc.

@angular/animations
@angular/common
@angular/core
@angular/forms
@angular/platform-browser
@angular/router

@ng-select/ng-select

@swimlane/ngx-datatable
Copyright (c) 2016 Swimlane <firstname.lastname@example.org>

@yellowspot/ng-truncate

angular-archwizard
Copyright (c) 2016 madoar

angular-tree-component
Copyright (c) 2016 500Tech LTD

core-js
Copyright (c) 2014-2020 Denis Pushkarev

file-saver
Copyright © 2016 Eli Grey, http://eligrey.com

mobx
Copyright (c) 2015 Michel Weststrate

moment-mini

ngx-bootstrap/chronos
Copyright (c) Valor Software
Copyright (c) Dmitriy Shekhovtsov <email@example.com>
Copyright (c) moment/moment
Copyright (c) JS Foundation and other contributors

ngx-clipboard

ngx-cookie-service
Copyright (c) 2017 7leads GmbH

ngx-pagination
Copyright (c) 2016 Michael Bromley

ngx-toastr
Copyright (c) Scott Cooper <firstname.lastname@example.org>

ngx-window-token

regenerator-runtime
Copyright (c) 2014-present, Facebook, Inc.

util
Copyright Joyent, Inc. and other Node contributors. All rights reserved.

webpack
Copyright JS Foundation and other contributors

zone.js
Copyright (c) 2016-2018 Google, Inc.

Auth0.AuthenticationApi
SendGrid
Auth0.ManagementApi

The Apache 2.0 license

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Components used under the Apache 2.0 license

rxjs
Copyright (c) 2015-2018 Google, Inc., Netflix, Inc., Microsoft Corp. and contributors

rxjs-compat
Copyright (c) 2015-2018 Google, Inc., Netflix, Inc., Microsoft Corp.
and contributors

tslib
FluentValidation.AspNetCore
Microsoft.Extensions.Configuration.FileExtensions
Microsoft.Extensions.Configuration.Json
Selenium.Support
Selenium.WebDriver

The ISC license

Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

Components used under the ISC license

inherits
Copyright (c) Isaac Z. Schlueter

The Font Awesome Free License (CC-BY-4.0 AND MIT)

Font Awesome Free is free, open source, and GPL friendly. You can use it for commercial projects, open source projects, or really almost whatever you want. Full Font Awesome Free license: https://fontawesome.com/license/free.

# Icons: CC BY 4.0 License (https://creativecommons.org/licenses/by/4.0/)
In the Font Awesome Free download, the CC BY 4.0 license applies to all icons packaged as SVG and JS file types.

# Fonts: SIL OFL 1.1 License (https://scripts.sil.org/OFL)
In the Font Awesome Free download, the SIL OFL license applies to all icons packaged as web and desktop font files.

# Code: MIT License (https://opensource.org/licenses/MIT)
In the Font Awesome Free download, the MIT license applies to all non-font and non-icon files.

# Attribution
Attribution is required by MIT, SIL OFL, and CC BY licenses.
Downloaded Font Awesome Free files already contain embedded comments with sufficient attribution, so you shouldn't need to do anything additional when using these files normally.

We've kept attribution comments terse, so we ask that you do not actively work to remove them from files, especially code. They're a great way for folks to learn about Font Awesome.

# Brand Icons
All brand icons are trademarks of their respective owners. The use of these trademarks does not indicate endorsement of the trademark holder by Font Awesome, nor vice versa. **Please do not use brand logos for any purpose except to represent the company, product, or service to which they refer.**

Components used under the Font Awesome Free License

@fortawesome/angular-fontawesome
@fortawesome/fontawesome-svg-core
@fortawesome/free-brands-svg-icons
@fortawesome/free-solid-svg-icons

Components used under mixed licenses

See details under the individual component.

lodash
Copyright OpenJS Foundation and other contributors <https://openjsf.org/>
Based on Underscore.js, copyright Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors <http://underscorejs.org/>

This software consists of voluntary contributions made by many individuals. For exact contribution history, see the revision history available at https://github.com/lodash/lodash

The following license applies to all parts of this software except as documented below:

====

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

====

Copyright and related rights for sample code are waived via CC0. Sample code is defined as all source code displayed within the prose of the documentation. CC0: http://creativecommons.org/publicdomain/zero/1.0/

====

Files located in the node_modules and vendor directories are externally maintained libraries used by this software which have their own licenses; we recommend you read them, as their terms may differ from the terms above.

Components used under other licenses

Follow the links for licensing information.

Jdenticon-net
https://github.com/dmester/jdenticon-net/blob/master/LICENSE.txt

Bogus
https://raw.githubusercontent.com/bchavez/Bogus/master/LICENSE

xunit.runner.visualstudio
https://raw.githubusercontent.com/xunit/xunit/master/license.txt

xunit
https://raw.githubusercontent.com/xunit/xunit/master/license.txt

Selenium.WebDriver.ChromeDriver
https://licenses.nuget.org/Unlicense

Microsoft.NET.Test.Sdk
http://www.microsoft.com/web/webpi/eula/net_library_eula_enu.htm

Moq
https://raw.githubusercontent.com/moq/moq4/master/License.txt

PDFsharp-MigraDoc
http://www.pdfsharp.net/MigraDoc_License.ashx

Microsoft.EntityFrameworkCore.SqlServer
https://raw.githubusercontent.com/aspnet/Home/2.0.0/LICENSE.txt
This Service Level Agreement (SLA) is between TimeXtender and all users of the TimeXtender SaaS portal. The SLA covers TimeXtender's responsibility for maintaining the quality and availability of the TimeXtender SaaS portal.

Availability of services

The TimeXtender SaaS portal runs in the cloud and depends on various cloud services to run and function as expected. Incidents can happen, and we do our very best to mitigate any possible downtime of the services.

Our availability target is a guaranteed uptime of 99.9% for the TimeXtender SaaS portal. The metrics used to determine the availability of the service are 1) resource availability and 2) response time.

The current status and availability of the SaaS portal can be viewed and monitored at https://status.timextender.com/. Additionally, any incidents or scheduled maintenance that can affect the availability of the service can be seen there.

Exceptions and limitations

TimeXtender is not liable in any circumstances where the downtime or availability of the SaaS portal causes an end-user impact. A scheduled maintenance window can be needed to perform routine maintenance and upgrades. Maintenance windows will always be announced on the status page in advance.
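For a sense of scale, a 99.9% uptime target corresponds to a concrete downtime budget, which is straightforward to compute (an illustrative calculation only, not part of the SLA itself):

```python
def allowed_downtime_minutes(uptime_pct: float, period_hours: float) -> float:
    """Downtime budget, in minutes, implied by an uptime percentage."""
    return period_hours * 60 * (1 - uptime_pct / 100)

# 99.9% uptime over a 30-day month allows roughly 43.2 minutes of downtime,
# and roughly 8.8 hours over a full year.
monthly = allowed_downtime_minutes(99.9, 30 * 24)   # ~43.2 minutes
yearly = allowed_downtime_minutes(99.9, 365 * 24)   # ~525.6 minutes
```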
Release date: 2022-12-16

Fixed
- 17074: A duplicate semantic field was added with a suggested relation. This has now been corrected.
- 17102: Auto-generated indexes were missing on the valid table when a new table was created from an ODX table with incremental load. This has now been corrected.
- 17706: Issue with TimeXtender Execution Server and repository connections. The Execution Server would cause too many new connections to be created. This has now been corrected.
Introduction

Welcome to the new cloud-enabled version of TimeXtender! The biggest change by far is the introduction of cloud-powered ODX, data warehouse, and semantic model instances and, with that, a turn towards a more web-based infrastructure and a software-as-a-service business model. Instances are self-contained objects that can be strung together in the software to create the flow of data you need. No more messing with local repositories and projects; everything is managed by TimeXtender to create a more trouble-free experience.

What's new?

The new version of TimeXtender is a major reimagining of the application. The changes are too many to list in full, but below is an overview of the major additions and changes. At the end of this article, you'll find links to all the new documentation, including screenshots, if you want to get into the nitty-gritty of the new release.

Cloud-powered instances as building blocks of the Data Estate: As mentioned above, you build your TimeXtender Data Estate with ODX, data warehouse, and semantic model instances. They are created and managed by company admins in the TimeXtender Portal and available to developers in TimeXtender Desktop when they've signed in. Instances are the basis for our new usage-based pricing model.

Manage data sources in the Portal: Admins can set up data sources in the Portal. Data source credentials can be kept confidential while still allowing users to use the data source.

User management and access control in the Portal: Users, and what instances they can access, are managed in the Portal.

No more hassle with license keys and client secrets: Sign-in is now required to use TimeXtender Desktop. This means that you no longer need to enter license keys and activate the software. Setting up ODX services also makes use of sign-in to remove the need for messing with client secrets.

Full documentation, incl. the ODX: You can now create documentation of your ODX just like you know it from the data warehouse.
Context-sensitive help: In TimeXtender Desktop, you can click '?' in the title bar or press F1 in the different windows to be redirected to a relevant article on the support site, support.timextender.com. If there's no relevant article to show, you'll be redirected to the front page. We'll continuously improve the list of articles to ensure that there's a relevant article for the windows that spawn the most requests.

ODX: Oracle Data Source: A TimeXtender-enhanced data source with date and numeric range conversion and exclusion of system schemas.

ODX: Dynamics 365 Business Central data sources: In the TimeXtender-enhanced data source for Dynamics 365 Business Central, you can select accounts. This saves you the hassle of filtering out tables that belong to accounts you don't need. In addition to that, we have a data source for getting a table with Business Central "option values" that you can use to enrich your data model.

ODX: Rollup data files in Azure Data Lake: If you're using an Azure Data Lake for your ODX storage, each incremental load from the source creates a new set of files with data. You can now set up storage management tasks to roll up, or merge, these files to improve performance when loading from the ODX. The rollup utilizes Azure Data Factory.

ODX: Sync data sources through Azure Data Factory: You no longer need direct access from an ODX to sync a data source that you are transferring data from using Azure Data Factory. You can now use Azure Data Factory to sync data sources as well.

ODX: "Subtraction from value" option for incremental load: You can now subtract a value or an amount of time from the incremental selection rule when you do incremental load, which is useful for making sure all changes are copied from the data source.

MDW: Data profiling: As a supplement to the plain preview option in TimeXtender, you can run a standard analysis on a table to get an overview of the data profile.
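To illustrate the "subtraction from value" idea above: an incremental rule normally loads rows newer than the highest value seen so far; subtracting an offset re-reads a small overlap so late-arriving changes aren't missed. A conceptual sketch with hypothetical names, not TimeXtender's actual implementation:

```python
from datetime import datetime, timedelta

def incremental_window_start(last_loaded, offset):
    """Shift the incremental watermark back by 'offset' so the next load
    overlaps the previous one and picks up late-arriving changes."""
    return last_loaded - offset

# With a one-hour offset, a load whose watermark previously stopped at 12:00
# will next time select rows newer than 11:00.
start = incremental_window_start(datetime(2023, 1, 10, 12, 0), timedelta(hours=1))
print(start)  # 2023-01-10 11:00:00
```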
MDW: Execute PowerShell Script as 'external step': You can execute PowerShell scripts as an 'external step', which provides endless possibilities for interacting with external components.

MDW: Lookup transformation: With a new kind of transformation, you can manage small or trivial lookups in a more effective way compared to conditional lookup fields.

MDW: Multiple ODX table mappings based on filters: You can now map multiple ODX tables into one data warehouse table based on filters.

MDW: Support for multiple ODXs as sources for one data warehouse: You can now use data from multiple ODXs in one data warehouse.

SSL: Support for Power BI XMLA endpoint: We've added a new semantic endpoint to semantic models, Power BI Premium, to support the Power BI Premium XMLA Read/Write Endpoint.

SSL: Semantic endpoint for CSV files: We've added a new endpoint to enable export to CSV files. This was previously enabled by Data Export, which we have removed from the semantic layer.

Other changes

In addition to all the new stuff, we have changed or removed a bunch of the existing functionality in TimeXtender. That includes the following:

Improved synchronization from MDW->SSL and ODX->MDW: Synchronization has been improved to give you a better overview of changes, with the option to accept or reject changes as well as remap fields in bulk.

ODX: Improvements to table selection: You can now select tables for copying to the ODX from a list, much the same way as you might be used to from the business unit. This is an addition to the current rule-based system, which can be cumbersome to use when you're only interested in a few specific tables or when the data source naming scheme is too "chaotic" for rules to make sense.
ODX: Improved creation of ADF pipelines for transfer between data lake and data warehouse: We've changed the logic for transferring data from ODX data storage on Azure Data Lake to a data warehouse through Azure Data Factory to use fewer pipelines, which improves performance and reduces cost.

MDW: Removed SSIS as a transfer option: The MDW layer has been simplified by removing SSIS as a transfer option.

MDW: Goodbye to Business Units: Using the ODX has long been the preferred way to copy data from data sources. In the new version, it is the only way, which means goodbye to the legacy Business Units.

SSL: Goodbye to SSAS Multidimensional (OLAP) Cubes, the Qlik modeler, and Data Export: We've simplified the semantic layer by removing everything that is not semantic models built in the shared semantic layer.

Removed deprecated features: We've removed all previously deprecated features. This includes the following: Azure Data Lake Storage Gen1 as ODX data storage, regular expressions in the ODX, 'data aggregations' on data warehouse tables, 'Enable BK hash key' and 'Use left outer join' options on data warehouse tables, 'Force sub select' and 'Use temporary table' options on conditional lookup fields, the 'SQL mode->Partition By' option on lookup fields, 'split' and 'concatenate' options on field-to-field data movement, and 'time tables' in the data warehouse.

Next steps - planned features

When you build software, you're never truly done, and we have a bunch of stuff planned for the next releases. This includes the following:

Improved multiple environments: Currently, you can copy one instance to another instance as a basic form of multiple-environments functionality. However, we plan to implement much more extensive support for multiple environments. It will be available in the web interface and will include updating the relevant connections to save manual work.
End-to-end scheduling: We plan to offer the option to execute and schedule ODX tasks and MDW/SSL execution packages together in one job that can automatically calculate dependencies between the included objects to ensure the correct execution order.

Automated migration: The new TimeXtender contains some big and breaking changes, but we will provide a tool that allows you to migrate projects from previous versions of TimeXtender into the new structure in an automated fashion.

Custom(er) Data Sources (Open Interface Data Source): With a new interface, you can create your own data source providers for the ODX.

Learn more about all the new features

Our customer success team has been hard at work documenting all the new stuff. Below you'll find links to the articles they've created. Click one and take a deep dive into the latest release of the world's premier data estate builder!

Getting Started - Setup TimeXtender
- Install TimeXtender
- Setup and Configure an ODX Instance
- TimeXtender Prerequisites
- Configure your Firewall
- Copying Instances to Implement Multiple Environments

Getting Started - Configure Azure Services
- Use Azure Data Factory for Data Movement

Knowledge Base - Connecting to Data
- Tasks in an ODX Instance (includes incremental rollup storage management task)
- Table and Column Selection in an ODX Instance
- Transfer data from an ODX to a Data Warehouse Instance
- TimeXtender Dynamics 365 Business Central
- TimeXtender Oracle Data Source provider for ODX

Knowledge Base - TimeXtender Portal
- Grant and Revoke Access to Instances
- Add an ODX Instance
- Add a Data Warehouse Instance
- Add a Semantic Model Instance (includes CSV endpoint)
- Credits and Billing

Knowledge Base - Incremental Load, Execution & Scheduling
- Execute PowerShell Script as External Executable
- Scheduling Executions using Jobs
- Incremental load in Data Warehouse Instances
- Incremental load in an ODX instance

Knowledge Base - Data Validation, Quality, and Profiling
- Documentation for Instances (includes documentation for ODX instances)

Knowledge Base - Design, Modelling and Transformations
- Lookup Transformation
- Template
- Mapping Set
- Data Profiling

Knowledge Base - Semantic Models
- Semantic Model Synchronization
- Power BI XMLA Endpoint
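The planned end-to-end scheduling described above boils down to dependency-ordered execution: every object runs only after the objects it depends on. Conceptually this is a topological sort, which a rough sketch (hypothetical job names, not TimeXtender code) can show with Python's standard library:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical job: a semantic model (SSL) depends on data warehouse (MDW)
# tables, which in turn depend on an ODX transfer task.
dependencies = {
    "ssl_model": {"mdw_tables"},
    "mdw_tables": {"odx_transfer"},
}

# static_order() yields the objects so that every dependency runs first.
execution_order = list(TopologicalSorter(dependencies).static_order())
print(execution_order)  # ['odx_transfer', 'mdw_tables', 'ssl_model']
```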
New features
- A TimeXtender solution is now based on ODX, data warehouse, and semantic model instances set up in the TimeXtender Portal
- New pricing model based on usage
- Manage data sources in the Portal
- User management and access control in the Portal
- License keys and client secrets replaced with sign-in to the Desktop
- Full documentation for the ODX
- Context-sensitive help
- ODX: Oracle Data Source
- ODX: Dynamics 365 Business Central data sources
- ODX: Rollup data files in Azure Data Lake
- ODX: Sync data sources through Azure Data Factory
- ODX: "Subtraction from value" option for incremental load
- MDW: Data profiling
- MDW: Execute PowerShell Script as 'external step'
- MDW: Lookup transformation
- MDW: Multiple ODX table mappings based on filters
- MDW: Support for multiple ODXs as sources for one data warehouse
- SSL: Support for Power BI XMLA endpoint
- SSL: Semantic endpoint for CSV files

Changed
- Improved synchronization from MDW->SSL and ODX->MDW
- ODX: Select tables from a list in addition to the rule-based selection
- ODX: Improved creation of ADF pipelines for transfer between data lake and data warehouse for better performance and reduced cost
- MDW: SSIS has been removed as a transfer option
- MDW: Business Units feature has been removed
- SSL: SSAS Multidimensional (OLAP) Cubes, the Qlik modeler, and Data Export have been removed
- All deprecated features have been removed