Legacy & Upgrades
Ask questions about legacy TimeXtender products and upgrading
- 40 Topics
- 140 Replies
Tables made empty after ADF load from ODX to DSA
Hi, we wonder if any of you have already run into this issue and perhaps have an idea on how to solve it.

Setup: TX 20.10.38 using Azure SQL DB for the DWH and ADLS Gen2 for the ODX Server. ADF is used to load data from the ODX Server into the DSA. All databases have private endpoints, so ADF uses a self-hosted integration runtime.

Issue: From time to time (randomly, not always on the same tables, and for both incremental and full-load tables) we notice that the ADF pipeline takes the normal amount of time to load data from the ODX into the DSA R-table. However, the cleansing takes 0 (zero) seconds and the record count log reports 0 (zero) records in the R-table. Neither ADF nor the TimeXtender execution reports an error.

In the ODX Service log we often notice this error: "Failed to encrypt sub-resource payload…" (not at exactly the same moment as the execution, so we are not sure it is related to the issue).

Work-around: ADO.NET. We had a similar issue last year and started using ADO.NET to load data from the ODX Server into the DSA. We
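Since neither ADF nor TimeXtender raises an error in this scenario, a post-execution row-count comparison can at least surface the symptom early. A minimal sketch, assuming the counts can be read from an execution log (the table names and log shape below are hypothetical, not TimeXtender's actual logging schema):

```python
def find_suspect_loads(current_counts, previous_counts):
    """Flag tables whose R-table row count fell to zero compared with
    the previous successful execution -- the silent failure mode
    described above (no error raised, cleansing takes 0 seconds)."""
    return [
        table
        for table, count in current_counts.items()
        if count == 0 and previous_counts.get(table, 0) > 0
    ]

# Mock counts from two consecutive executions
previous = {"DSA.Customer_R": 1200, "DSA.Invoice_R": 5000}
current = {"DSA.Customer_R": 1200, "DSA.Invoice_R": 0}
print(find_suspect_loads(current, previous))  # ['DSA.Invoice_R']
```

A check like this could run as a post-execution step and alert when a normally populated table suddenly comes back empty.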
Error when deploying tabular model - "This row has been removed"
Hi all, when deploying our tabular model/endpoint (Azure Analysis Services), we get a strange error: "This row has been removed from a table and does not have any data. BeginEdit() will allow creation of new data in this row". The error message does not show which table or row this relates to, and we don't know whether it concerns a data warehouse table or a repository table. All the data warehouse tables in this tabular model contain data and look good. In addition, the error occurs in one of our environments but not in the others. Does anyone have experience with this error? Any thoughts on how we can debug this one? Thanks.
Using PowerBI Premium Tabular for TimeXtender Legacy instead of AAS
There's plenty of documentation describing AAS and Power BI Premium tabular models as mostly identical in use. Currently we use two expensive AAS servers to host our development and testing semantic models; moving these two environments to a PPU environment within Power BI would save a lot of money. The current version of TimeXtender has a separate option to select a Premium tabular model instead of AAS, which the legacy version does not have. But why would TimeXtender need to know the difference? Whether you work with Premium or AAS, the XMLA connection is exactly the same. Setting up a migration from AAS to Premium and then simply exchanging the links in the environment properties seems like a simple enough plan. As I found in the following link, the important part would be having an updated client library to do this data transfer: https://learn.microsoft.com/en-us/analysis-services/client-libraries?view=azure-analysis-services-current Other than this I don't see
Pre-script results in error
v18.104.22.168

Hi, I have set up a simple script action to set the LAST_CHANGE_DATETIME column to the ENTRY_DATETIME value when it is NULL. [screenshot: script action] This works fine when selecting from the table in SSMS. [screenshot: SSMS results using the same script action] The script is set as a pre-script on the ADO.NET transfer in the execute step, but when I execute the table, the following error appears: "Incorrect syntax near 'LAST_CHANGE_DATETIME'." Can I implement this simple script action as a pre-script? The idea is that I can then use a fully populated LAST_CHANGE_DATETIME column for incremental loads on my ledger table. Thanks, Richard
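The intended fallback (use ENTRY_DATETIME whenever LAST_CHANGE_DATETIME is NULL) has COALESCE semantics. A minimal Python sketch of that logic on mock rows, just to pin down what the pre-script is supposed to do (the row data is illustrative; the column names come from the post above):

```python
def backfill_last_change(rows):
    """Equivalent of: UPDATE t SET LAST_CHANGE_DATETIME = ENTRY_DATETIME
    WHERE LAST_CHANGE_DATETIME IS NULL."""
    for row in rows:
        if row["LAST_CHANGE_DATETIME"] is None:
            row["LAST_CHANGE_DATETIME"] = row["ENTRY_DATETIME"]
    return rows

rows = [
    {"ENTRY_DATETIME": "2023-01-05", "LAST_CHANGE_DATETIME": None},
    {"ENTRY_DATETIME": "2023-01-06", "LAST_CHANGE_DATETIME": "2023-02-01"},
]
print(backfill_last_change(rows))
```

Whether the pre-script box accepts the corresponding multi-line T-SQL verbatim is a separate question; the syntax error suggests the script is not being passed through unchanged.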
Data Export - From Business Unit
I have used the data export tool to export to .csv files. However, I cannot do it straight from a Business Unit. Is that a feature that could be added, or is it perhaps not included in the client license? The client wants a few tables from the source transferred to Azure Data Lake as CSV files so they can do some transformation there.
I'm encountering a truncation error when synchronizing Dynamic AX Adapter data source in my Business Unit
We have an upgraded MS Dynamics 365 environment, from which we publish to an Azure SQL database. When I attach an existing project to it and synchronize the data source (using a SQL data source connector, as the project normally does), the test connection succeeds, but after building the selection tree TimeXtender throws the error "The given value of type String from the data source cannot be converted to type nvarchar of the specified target column. String or binary data would be truncated." Since this isn't yet part of project execution but rather a TimeXtender comparison operation, I'm somewhat befuddled as to how to proceed. I'm not sure if this belongs in data sources or desktop, so I took a guess on placement.
Schedule job stops executing
Once in a while my scheduled job does not finish, and without raising an error; it just stops doing anything. My event log states that around the time my job stopped running, both the server and the scheduler started. I cannot find anything in the logs saying the server and scheduler stopped, so I have no idea why they restarted. Would the server/scheduler being down explain why the job stops running rather than failing with an error? Does anybody experience the same? Any ideas where I can find more information on why the server/scheduler stopped or restarted? And, of course, how I can prevent this?
Does anyone have experience switching from ADO.NET to SSIS?
We would like to possibly switch from ADO.NET to SSIS. At the moment, both the TX application (ODX is not being used) and the SQL server are on the same box using ADO.NET. We want to split off TX to its own server and use SSIS instead of ADO.NET. Has anybody done this and can share the high level steps? Or even how you think it might be best accomplished? Specifically, once the application server is up and running with all the pertinent software, how would one make this transition?
How to identify modified objects using TimeXtender repository
Hi, we need to identify the objects modified in the last few days so we can deploy them as part of a differential deployment to various environments. To do this, we were looking at the DiscoveryHub metadata database, which contains tables such as DataTables, DataColumns, and Transformations. However, to identify the delta we only have the two columns ValidFrom and ValidTo, which are stored as integers. We need a way to identify the data, either by converting these to dates or by filtering on a range of dates. Please help.
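Assuming the integer ValidFrom/ValidTo values are version IDs that can be mapped to creation dates via a versions table in the repository (an assumption; verify against your repository schema before relying on it), the delta query reduces to resolving those IDs through such a mapping. A sketch with mock data:

```python
from datetime import date

def objects_modified_since(objects, version_dates, cutoff):
    """objects: rows with a 'name' and an integer 'ValidFrom' version ID.
    version_dates: mapping of version ID -> creation date (assumed to be
    recoverable from a versions table in the repository).
    Returns names of objects whose ValidFrom version is on/after cutoff."""
    return [
        o["name"]
        for o in objects
        if version_dates.get(o["ValidFrom"], date.min) >= cutoff
    ]

versions = {101: date(2023, 1, 2), 102: date(2023, 1, 9)}  # mock data
objs = [{"name": "DimCustomer", "ValidFrom": 101},
        {"name": "FactSales", "ValidFrom": 102}]
print(objects_modified_since(objs, versions, date(2023, 1, 5)))  # ['FactSales']
```

In SQL terms this would be a join from DataTables/DataColumns on ValidFrom to the version table's ID, filtered on its date column.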
Reconnecting to a source getting "Object reference not set to an instance of an object" error
Thanks in advance for any help. We had a dev deploy, but not execute, a table before he left for the day. Someone else loaded the table, and when he closed the project, no one else could get in. We imported a day-old backup and were able to reload all of the tables. We then started creating some new dimensions, and when we tried to sync to the source system (Epic Clarity) SQL Server database, we got the above error. Full message:

Object reference not set to an instance of an object.
Module: timeXtender
System.NullReferenceException
   at TimeXtender.DataManager.BulkedInformationSchema.ReadSchema(Guid datasourceId, String overrideSql, Boolean deleteExistingRecords)
   at TimeXtender.DataManager.DataSource.BulkTransferADO(Boolean readForeignKeys)
   at TimeXtender.DataManager.DataSource.RefreshInformationSchema()
   at TimeXtender.DataManager.ReadDataSourceObjectsCommand.<>c__DisplayClass6_0.<ExecuteCommand>b__0()
   at TimeXtender.DataManager.ConnectingThread.ExecuteConnectingThread(O
Can you protect your project from piracy?
I know you can lock a project from editing, but is there a way to password protect a project we install on a client's server so they can't view how we built it? We've spent hundreds of hours building an ERP integration that we want to sell multiple times and don't want a client or competitor to simply rebuild it by exporting, or just looking at the project design and mimicking it.
Multiple environments, different load settings
I have a question about multiple environments and transferring. In addition to production, we have set up development and test environments in TimeXtender with separate databases. This all works great. However, we want to work with a limited set of data in the development environment: we don't want to constantly wait for executions there, and we don't want the test and development environments to fill up the hard disk. Now the question: is it possible to use a filter (for example, load all data from the last 30 days) in the development environment instead of incremental load, while continuing to use incremental load in the test and production environments? The filter should not overwrite the incremental setting during a transfer. In short, we want to transfer data in different environments with different load settings.
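The idea above amounts to generating a different data-selection predicate per environment. A minimal sketch (the column name, window, and environment names are illustrative; this illustrates the logic and does not replace TimeXtender's own incremental-load or data-selection settings):

```python
def transfer_filter(environment, date_column="ModifiedDate", days=30):
    """Return an extra WHERE-clause fragment per environment: a rolling
    window in Development, nothing elsewhere, so that Test and
    Production keep their normal incremental load."""
    if environment == "Development":
        return f"{date_column} >= DATEADD(DAY, -{days}, GETDATE())"
    return ""  # Test/Production: leave incremental load untouched

print(transfer_filter("Development"))
print(transfer_filter("Production"))
```

The key design point is that the filter is derived from the environment at execution time, so promoting the project between environments never overwrites the load settings themselves.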
How to create one central ODX for both Test and Production environments
I have separate Test and Production environments. Currently each environment has its own ODX database, but some cloud API source systems charge money for loading records through an API connection, so I'm checking how I can limit data transactions on the source systems. Also, we only use production data in Test and Production, and since we use Azure I don't want to load all the production data twice. One way is to use a single shared ODX for both Test and Production; then I only have to load the source data once. I can think of the following options:

1. Use the ODX server. I've tested the ODX server, but I can't use historic tables, and incremental loads don't work on most API connections. I also encountered limitations on a Firebird SQL data source. So the ODX server doesn't seem like a workable solution.

2. Use an external ODX business unit. An external business unit works, but how can I use a single external ODX for both Test and Production when these environments run on different project repo
Use tables from one project in another project (from the same database)
We have multiple TimeXtender projects that use the same databases for their ODX/DSA/MDWs. What is the best way to use tables from one project in another? "Add External SQL Connection" seems to be an option, but should it still be used if the table we want to connect to is in the same database?
Subscription license activation failed with the following reason: access to codeletter.xml is denied
Hi, since this morning I get the message displayed in the title. It is an Azure environment; TimeXtender was working on Friday and nothing changed on the license key. Any idea where to find or how to solve this issue? Thanks in advance, Mrt
Issue: after deploying the project in the development environment, data goes missing on scheduled execution in production, with no errors
Every time I deploy my project in the development environment, I now also need to deploy it in the production environment, even though I don't want to transfer my development project to production. If I don't, scheduled project executions finish without error, but the tables end up empty. I don't know whether this is because I changed version (version 22.214.171.124) or because I have configured something wrong; last year I didn't have this issue.
Global Database Catalog missing
I am following the TimeXtender Basics course, and at a certain point I am asked to create a new Global Database (specifically a data warehouse) through the following steps (Create Global Database for DSA):

1. In the Tools menu, select Environment Properties.
2. Click New Global Database.
3. Type DSA in the Name box.
4. Select Data Warehouse in the Type drop-down list and click OK.
5. Click the settings text box.
6. Type localhost in the Server text box.
7. Type TX_DSA_DEV in the Catalog text box.
8. Right-click the settings text box, select Create Database, and click OK when notified of successful creation.

However, at step 7 I get stuck, as there is no Catalog row. Does someone know how to fix this issue? I'm on version 126.96.36.199
TimeXtender Server Can't Start After Upgrade
I just upgraded from Discovery Hub 188.8.131.52 to TimeXtender 184.108.40.206 and upgraded the repository. Everything works well with deploy and execute, but when I try to restart the service, I get an error in the event log:

Service cannot be started. System.Data.SqlClient.SqlException (0x80131904): A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40

Could someone please assist?
Does anyone have a data dictionary that has been pulled from the TimeXtender system database? The ideal one I am looking for is either a searchable HTML page where you can look up a table with all its columns, descriptions, and data mappings (source), or, if anyone has done a similar one in an Excel document, that would equally help me.
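Once the metadata has been extracted (e.g. from the repository's table/column tables mentioned in another thread here), turning it into a single static HTML page is straightforward; the browser's built-in search then covers the lookup need. A sketch with mock metadata (all names below are illustrative):

```python
import html

def build_dictionary_page(columns):
    """columns: (table, column, description, source) tuples.
    Emits one static HTML table, escaping values so descriptions
    containing <, >, or & render safely."""
    rows = "\n".join(
        "<tr>" + "".join(f"<td>{html.escape(v)}</td>" for v in row) + "</tr>"
        for row in columns
    )
    return (
        "<table>\n<tr><th>Table</th><th>Column</th>"
        "<th>Description</th><th>Source</th></tr>\n"
        f"{rows}\n</table>"
    )

page = build_dictionary_page(
    [("DimCustomer", "CustomerKey", "Surrogate key", "CRM.Customer.Id")]
)
print(page)
```

The same tuples could just as easily be written to an Excel sheet with a library such as openpyxl if the spreadsheet form is preferred.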