TimeXtender Desktop Q&A
Ask questions and find answers about the TimeXtender Desktop Application
Intermittent execution issues
I regularly run into the error “Cannot open server 'sql-instances-prod' requested by the login. Client with IP address '22.214.171.124' is not allowed to access the server.”, especially during overnight executions by the execution scheduler. But I can’t replicate the connection issue; it seems to be intermittent. Has anybody else encountered this issue, and do you know what could be causing it or what a workaround might be?
tableAccount is not contained. Parameter name: tableAccount
Dear Community, I’m working with TimeXtender 20.10.22 and my source is a Business Unit with the NAV (Business Central) adapter. In my project I switched the source from Production (BC) to Acceptance (BC). Now I’m getting the error below and can no longer do anything; switching back does not work either. Can anyone help me? Thanks!

tableAccount is not contained.
Parameter name: tableAccount
Details: tableAccount is not contained. ...
Module: timeXtender
System.ArgumentException
at TimeXtender.DataManager.TableUsageMatrix.GetTableUsage(Table_NAV tableAccount, Account account)
at TimeXtender.DataManager.Adapter_NAV.GetExpectedDeployedObjectsPrivate(Table_NAV tableToDelpoy, Guid projectId, Guid tableId, List`1 sqlObjects, ProviderDestinationSql providerDestinationSql)
at TimeXtender.DataManager.Adapter_NAV.GetExpectedDeployedObjects(IDataAdapterTable table, Guid projectId, Guid tableId, List`1 sqlObjects)
at TimeXtender.DataManager.StepTableSimpleDeploy.GetExpectedDeployedObjects(L
Job logs lack detailed information
Hi, we are currently trying to figure out the new generation of TimeXtender, and I’m finding the setup with jobs a bit lacking. Perhaps I’ve just missed something, but here are a few things that bug me. As far as I understand, the way to schedule in the DW is to set up your execution packages in the execution tab and then set up a job that schedules those execution packages. In my trials I intentionally set up a package to fail. First of all, the monitoring view for jobs does not show any information about the error (see the jobs monitoring screenshot). If I go into the execution log, I can see an error message that essentially just says the job didn’t succeed (see the execution log for the test job). For debugging, that means I have to check the contents of the job (which could contain multiple execution packages) and then head over to the execution tab to check the log for the package, where I see all the details (see the execution log in the execution tab). I would think it was nice to be able to reac
Scheduler stopped working
The scheduler stopped working in production. I restarted it yesterday, but it is still not running. How can I fix this issue? I have gone through the document below: Scheduled Execution issues - Did it not start, did it fail, or is my execution still running? – TimeXtender Support. On my computer the recovery options were disabled.
ADO.NET Transfer consistently slow on incremental table with no new rows
Hi! I am experiencing an issue with ADO.NET transfer times on incrementally loaded tables: the ADO.NET transfer takes the same amount of time regardless of how many new rows come into the table. A full load of the table, containing about 45 million rows, takes about 25 minutes; the next incremental load still takes the same time, with ADO.NET accounting for around 24 of those minutes. Our current data flow is: SQL Server → TimeXtender → Azure elastic pool (this is where all of our TimeXtender databases reside). Full load: 45 million rows. Incremental load: 11 thousand new rows in the _R table. Even if I have 0 rows in the _R table, the ADO.NET transfer time is the same. Has anyone experienced a similar issue? My best guess is that the problem resides in the Azure elastic pool, where the ADO.NET transfer is being throttled. Thank you!
Run transfer job only when ODX transfer task completed without errors
I want my transfer-to-MDW job to run only when my ODX transfer task has completed without any errors. What sometimes happens now is that, for some reason, there is an error during extraction and some tables are empty; they then get pushed to the MDW and our reports break because the tables are empty. I see that you can use instance variables, but I don’t see that option on my ODX. How can I set this up?
Extracting data from an external database
Hi TX Community! We pull data from an external database every day. Since we aim to load the extracted data incrementally, we use query tables against the data source to create an incremental-load key. In doing so we are experiencing two issues:

1. TX cannot read the date formats extracted from the database (see the format shown in the screenshot). We can right-click each field and edit the data type, but we would have to do that every time we synchronize the data source, because each synchronization resets all the date fields to the “unknown” format, so we would have to right-click each field and edit its data type again. We’ve tried the “Data type overrides” feature, but it doesn’t seem able to convert from an “unknown” format. How can we solve this problem? As mentioned, the tables are query tables, so we would like to think the date formatting could be solved with a CAST or a CONVERT function. Any ideas?

2. In
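Since these are query tables, one option is to hand TimeXtender an already-typed value straight from the query. Purely as an illustration of the conversion logic, assuming a hypothetical source format of yyyyMMdd strings (the actual “unknown” format is not shown in the question), here it is sketched in Python:

```python
from datetime import datetime

# Hypothetical raw values; the real source format is an assumption here.
raw_values = ["20230115", "20231224"]

# Parse each string into a proper date using an explicit format string.
parsed = [datetime.strptime(v, "%Y%m%d").date() for v in raw_values]

print(parsed[0].isoformat())  # 2023-01-15
```

In the query table itself, the equivalent idea would be a CAST or TRY_CONVERT on the date column in the SELECT, so the field already arrives typed and the mapping survives a synchronization.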
Incorrect syntax for parameters on custom fields in Qlik semantic models
Hi team, TimeXtender allows adding parameters from a different table to a custom field in a semantic data model (Qlik). The resulting syntax/Qlik script combination is always broken. When adding a custom field parameter from a different table, TimeXtender fully qualifies the Qlik syntax regardless of the settings (both the “Qualified” and the “Fully qualified” setting). The resulting syntax on the Qlik side then no longer matches the syntax in the views created by TimeXtender.

The resulting Qlik script:

"Sales_Targets":
LOAD
"KPI",
"Target",
"DIM_Boekdatum.DayName" AS "Test";
SQL SELECT
"KPI",
"Target"
FROM "Test"."dbo"."Test QVD_SLQV";

But the view has the following syntax:

CREATE VIEW [dbo].[Test QVD_SLQV]
-- Copyright 2011 timeXtender a/s
-- All rights reserved
--
-- This code is made available exclusively as an integral part of
-- timeXtender. You may not make any other use of it and
-- you may not redistribute it without the written permission of
-- timeXtender a/s.
AS
SELECT
[KPI] AS [KPI]
,[Target] AS [Targ
Transferring multiple tables from Theobald: “Error while copying content to a stream”
I am setting up TimeXtender to extract tables from SAP with Theobald. I have a selection of tables that I can successfully extract when I add just one table to a transfer task. However, when I add all relevant tables to a single transfer task, I run into errors. The extraction seems to be successful on the Theobald side, but moving the data to the ODX storage gives the error “Error while copying content to a stream”. I can’t figure out why the transfer works when I do it per table but fails when I do multiple tables. Full error sample below:

Executing table rest_mara_generalarticledata: failed with error:
System.AggregateException: One or more errors occurred. ---> System.Net.Http.HttpRequestException: Error while copying content to a stream. ---> System.ObjectDisposedException: Cannot access a closed Stream.
at System.IO.__Error.StreamIsClosed()
at System.IO.MemoryStream.get_Position()
at System.Net.Http.StreamToStreamCopy.StartAsync()
--- End of inner exce
Scheduled execution: prioritization not working
Hi Support, we are experiencing an issue with the prioritization in our execution package. Our trip table is updated with an update script using data from the number_per_trip table. However, the trip table is loaded before the number_per_trip table. This results in missing new data in the trip table, as number_per_trip has not yet been loaded when the script action is executed. We would like to change the loading order of these tables and have tried to do this by adding prioritization, but this has no effect on the load whatsoever. I cannot work out what the problem is. Are there any settings blocking the prioritization feature, or are we using it the wrong way? See settings below.
ODX "loses" transactions
When loading regularly (every hour) using incremental load, the ODX seems to “lose” transactions from the source, and the next incremental run does not catch the missing transactions. The workaround is to full load the data, but this takes a long time, and it renders the incremental load useless because it cannot be trusted. We are running 20.10.34. Do any of the newer 20.10 versions have a fix for this? Regards, Mads
Remove path name from File Name in table column
Hello, I currently have a column in one of my tables containing the following file name: C:\Users\John\OneDrive - Sales Solutions\Desktop\TimeXtender\TMX-DataSamples\RLZ\BESTERS - BESTERS POINT - BESTERS IND_DEC 22_101602_0.txt. Is there a way to remove the path from the column so it only shows the file name? I have a file for each month and have merged the files so all the data is in one table. For example, remove this piece: C:\Users\John\OneDrive - Sales Solutions\Desktop\TimeXtender\TMX-DataSamples\RLZ\ and only show this piece: BESTERS - BESTERS POINT - BESTERS IND_DEC 22 as the file name. I also need to have the month and year (DEC 22) of each file copied into a new column called “Date”. Is this at all possible? Thank you.
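One way to approach this is a pair of custom fields with string transformations. Here is the splitting logic sketched in Python; the file-name pattern <name>_<MON YY>_<id>_<seq>.txt is assumed from the single example given, and in TimeXtender the equivalent would be T-SQL string functions (RIGHT/REVERSE/CHARINDEX and the like) in custom field transformations:

```python
import ntpath

full = (r"C:\Users\John\OneDrive - Sales Solutions\Desktop\TimeXtender"
        r"\TMX-DataSamples\RLZ"
        r"\BESTERS - BESTERS POINT - BESTERS IND_DEC 22_101602_0.txt")

# Keep only the file name; ntpath handles Windows-style backslashes
# regardless of the OS this runs on.
file_name = ntpath.basename(full)

# Drop the extension, then split on "_" and discard the trailing
# "<id>" and "<seq>" tokens (assumed pattern).
stem = file_name.rsplit(".", 1)[0]
parts = stem.split("_")

display_name = "_".join(parts[:-2])  # name plus month/year token
month_year = parts[-3]               # e.g. 'DEC 22' for the "Date" column
```

The month_year value could then feed the new “Date” column, e.g. parsed further into a proper date if needed.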
Business Unit ODX: error with _R tables
How can I solve _R tables failing to create? I am working on an old project that was copied from another project repository. Environment: Sandbox. Version: 126.96.36.199. Data source: Business Unit. Provider: SQL Server data source. The connection with the data source is good and synchronizes well. When I deploy the tables in the ODX storage, I get this error:

An error occurred during create a table. See exception details for the failing object: Create failed for Table 'BSA.BSA_dbo_Inspection_R'. An exception occurred while executing a Transact-SQL statement or batch. The specified schema name "BSA" either does not exist or you do not have permission to use it.
Excel Online connector sometimes fails with “Error while listing workbooks for drive”
Hi, I am running a legacy version of TimeXtender (version 188.8.131.52), where we have an Excel Online connection set up using the CData ADO.NET Provider for Microsoft Excel Online 2022 (22.0.8389.0) as the data source. This data source is configured to authenticate via the Azure ‘client flow’ (see picture); the app registration in Azure has the permissions shown. The data source works properly most of the time: it can list the worksheets it finds on the SharePoint site and can fetch data. However, scheduled execution packages sometimes fail with the following error message:

Could not execute the specified command: Error while listing workbooks for drive: [generalException] General exception while processing. Details: Error while listing workbooks for drive: [generalException] General exception while processing. Module: System.Data.CData.ExcelOnline
fx220l.yg
at fx220l.IIu.m(Boolean )
at fx220l.IIu.X(tbp`1 , Boolean )
at fx220l.IIu.YI(LoL )
Unable to set up MDW connection to Azure SQL database using the Azure AD Integrated authentication type
Has anyone tried to connect to an Azure SQL database using Azure AD Integrated authentication? I noticed MFA was missing as an auth type, and AD Integrated fails with the error below. Let me know if anyone has resolved this.

One or more errors occurred. Could not discover endpoint for Integrate Windows Authentication. Check your ADFS settings. It should support Integrate Widows Authentication for WS-Trust 1.3 or WS-Trust 2005.
Intent of RECOMPILE: default/pro/con
The clean procedures generated by TX seem to have RECOMPILE embedded in the procedure itself as a general rule. I think this is to keep plans fresh as other changes are introduced in a project, but do we know the exact reason why, and is there a way to remove it other than making a customized change table by table?
An item with the same key has already been added
Dear Support, the reload in TimeXtender is giving the error: “An item with the same key has already been added.” It seems that one column is mapped to two different columns in the same table. Is there a quick way to find the column that is causing this error? Thanks in advance! Christian
Incremental load based on a history table with soft deletes
I have table A with history enabled, all columns SCD type 2 except the PK, and a record is marked as deleted when it is deleted in the source (with a separate record). In the next data area / data warehouse (DSA) I want to create an incremental table B based on table A, but I do not want the deleted records from A to appear in table B. When following the incremental rule wizard, the incremental selection rule can be set based on the incremental timestamp, but deletes cannot be handled, because table A has no hard deletes (only IsTombstone = 1 records). What is the easiest way to solve this? I can only think of carrying the IsTombstone field into table B and deleting the records in a post script, but I would say this should be possible much more easily. Suggestions?
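The carry-the-tombstone-field-and-delete-in-a-post-script approach effectively needs two steps per incremental run. A minimal sketch of that logic, with hypothetical field names and toy data:

```python
# Rows from history table A (field names are hypothetical).
rows_a = [
    {"pk": 1, "value": "open",   "is_tombstone": 0, "ts": 10},
    {"pk": 2, "value": "closed", "is_tombstone": 0, "ts": 11},
    {"pk": 2, "value": None,     "is_tombstone": 1, "ts": 12},  # soft delete
]
last_loaded_ts = 9  # high-water mark of the previous incremental run

# Incremental selection: only rows newer than the last load.
changed = [r for r in rows_a if r["ts"] > last_loaded_ts]

# Step 1: insert/update only the non-tombstone rows into table B.
upserts = [r for r in changed if r["is_tombstone"] == 0]

# Step 2 (the post script): delete from B every key that received a
# tombstone record since the last load.
deleted_pks = {r["pk"] for r in changed if r["is_tombstone"] == 1}
surviving = [r for r in upserts if r["pk"] not in deleted_pks]
```

In T-SQL the post script would be a DELETE on table B joining the tombstoned keys from table A's change set.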
Incremental loading on table using a custom table insert with a union
I have 3 tables in my src data warehouse:

- src.SalesOrderItemsDelta: filled every day with a delta (changes today vs. yesterday) of our order lines.
- src.SalesOrderItems: filled every Saturday night with all the order lines available.
- src.preFactSalesOrderItems: filled via a custom table insert with the following statement:

SELECT [SalesOrderItemID], [SapClient], [SalesOrderNumber], [SalesOrderItemNumber], [SalesOrganization], [DistributionChannel], [Division], [FaboryArticleNumber], [SoldToCustomerCode], [BinCode], [CreatedOnDate], [CreatedOnDateID], [ChangedOnDate], [ReasonForRejectionCode], [PromisedDeliveryDate], [PromisedDeliveryDateID], [CommunicatedDeliveryDate], [CommunicatedDeliveryDateID], [PlantCode], [SalesAmount], [SalesCurrency], [ItemCategoryCode], [OrderedQuantity], [BinQuantity], [SalesOrderCategoryCode], [SalesOfficeCode], [ConfirmedDeliveryDate], [ConfirmedDeliveryDateID], [CommittedDeliveryDate], [CommittedD
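Since the daily delta and the weekly full load overlap, a union in the custom insert effectively needs most-recent-wins deduplication per order line (in T-SQL this is typically done with ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ChangedOnDate DESC)). A toy sketch of that behaviour, with a hypothetical column subset and made-up values:

```python
# Weekly full snapshot and daily delta; the delta should win where
# both contain the same SalesOrderItemID.
full_load = [
    {"SalesOrderItemID": 1, "SalesAmount": 100, "ChangedOnDate": "2024-01-06"},
    {"SalesOrderItemID": 2, "SalesAmount": 200, "ChangedOnDate": "2024-01-06"},
]
delta = [
    {"SalesOrderItemID": 2, "SalesAmount": 250, "ChangedOnDate": "2024-01-08"},
    {"SalesOrderItemID": 3, "SalesAmount": 300, "ChangedOnDate": "2024-01-08"},
]

# Most-recent-wins merge keyed on SalesOrderItemID.
merged = {}
for row in full_load + delta:
    key = row["SalesOrderItemID"]
    if key not in merged or row["ChangedOnDate"] > merged[key]["ChangedOnDate"]:
        merged[key] = row

result = sorted(merged.values(), key=lambda r: r["SalesOrderItemID"])
```

Here order line 2 keeps the delta's newer values while lines 1 and 3 come from whichever source has them.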
Environment Transfer, Deploy and Execution dependencies
We have a setup with 3 environments (Dev/Test/Prod) running in the legacy version, with a BU-based ODX, DSA, MDW and several SSLs, based on an on-prem SQL Server setup. Fairly common, I guess. We are multiple developers on a shared project, making changes daily, and therefore need some QA process. Our target is to have changes and new functionality running on Test for a week before transferring to production. In addition to our centralised BI org, we support the data needs of analysts in the different departments. For this we have established replica databases of the ODX and MDW. The analysts can read these replicas without interfering with the centralised data processing (the primary reason for the replicas). Along with the data read access, we have a database the analysts have the rights to create objects in (typically views and stored procedures). The analyst environment will only be exposed on our prod platform. We would like to provide a better SLA for our analysts for new ta