TimeXtender Desktop Q&A
Ask questions and find answers about the TimeXtender Desktop Application
Using Power BI semantic models requires the XMLA endpoint, which means we need a Power BI PPU license. That is fine. The question is: why not use the deployment pipelines feature in Power BI PPU? The main problem I have found is that it is not possible to parameterize the data source connections, which is needed to use this feature properly. Does anybody have experience with this topic?
Greetings, I have an odd error and am trying to narrow down where this issue is coming from. I have a server running SQL Server 2016 with TX running on top of it. The version of TX I am running on Server 1 is 126.96.36.199. The errors I am getting are attached. TX is running OK on Server 1, which is the DB server. Analysis Server 2 seems to be the problem and is producing the errors above. The attached error comes from the Power BI Report Server interface; we are using Microsoft Power BI for our reports. I have increased the memory allocation on Server 1, where TX is running, but the issue still seems to be there. Our products are all 64-bit versions. Any idea what could be causing this?
Hello, we're using TimeXtender with ADF for the data movement between our Data Lake storage and SQL Database. Whether we run the execution from TimeXtender or rerun it within ADF, the performance is poor (slow!). Our setup at the moment follows the specs in the topic "Create a Demo Template Project TX SaaS". We wanted to start basic and then tweak the settings for better performance. Executing against the first SQL Database:
- Service tier: General Purpose
- Compute tier: Serverless
- Hardware: Standard (Gen5), Min/Max vCores: 10, Min/Max memory: 30 GB
- Auto-pause delay: 1 hour
Integration Runtime:
- Type: Azure
- Compute size: small
We tried using a different compute size on the IR and also changed the SQL DB tier for more DTUs, but there was no performance improvement after changing. Looking at one of the jobs, we are transferring 2048 rows from our Data Lake to our Azure SQL, and the queue is taking most of the time. We did some research on the TimeXtender support page (no topics) and also in the Microsoft community.
We are implementing a data warehouse with data from Dynamics 365 F&O, and I expect we will have monetary amounts from the UK in pounds and from Spain in euros, so we will need time-based currency conversion to handle this. Has anybody dealt with this situation? How should it be implemented?
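Not TimeXtender-specific, but the usual pattern is a date-effective exchange-rate table joined on currency and transaction date. A minimal T-SQL sketch, assuming hypothetical dbo.Transactions and dbo.ExchangeRates tables (names and columns are illustrative, not taken from D365 F&O):

```sql
-- Illustrative only: convert each amount into a reporting currency (EUR)
-- using the exchange rate that was valid on the transaction date.
SELECT
    t.TransactionId,
    t.TransactionDate,
    t.CurrencyCode,
    t.Amount,
    t.Amount * r.RateToEUR AS AmountEUR
FROM dbo.Transactions AS t
JOIN dbo.ExchangeRates AS r
    ON  r.CurrencyCode = t.CurrencyCode
    AND t.TransactionDate >= r.ValidFrom
    AND t.TransactionDate <  r.ValidTo;   -- date-effective rate row
```

GBP and EUR rows would both exist in dbo.ExchangeRates (EUR with RateToEUR = 1), so every transaction converts through the same join.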
For one of our clients I need to connect to a database on Amazon Redshift. The IT vendor who made the system is telling us to use SSL mode "require" ( https://docs.aws.amazon.com/redshift/latest/mgmt/connecting-ssl-support.html#connect-using-ssl ). Using this setting in the ODBC connector from Amazon itself works fine, but the CData connector only offers a true/false dropdown. I've checked whether this is perhaps one of those fields where you can override the dropdown with manual input, but that's not the case for this one. Is there any way to get this working in the CData connector, or would our only option for now be to create a DSN and then use the ADO ODBC option in the ODX to connect with the DSN?
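For reference, a DSN typically exposes the full range of SSL modes the Amazon driver supports. A hedged sketch of what a DSN-less ODBC connection string could look like (driver name, host, and option spelling are assumptions to verify against your installed driver version):

```
Driver={Amazon Redshift (x64)};Server=examplecluster.abc123.eu-west-1.redshift.amazonaws.com;Port=5439;Database=dev;UID=awsuser;PWD=********;SSLMode=require
```

If the CData property really only accepts true/false, building a DSN with SSLMode=require and pointing the ADO/ODBC provider in the ODX at that DSN is the workaround you describe.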
The following article describes several types of data masking: https://legacysupport.timextender.com/hc/en-us/articles/5463363539101-Dynamic-Data-Masking. Does TimeXtender allow custom data masking as well? The example table shown at the top of this article is exactly what I am trying to accomplish. However, the article doesn't describe this type of masking, does it?
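For what it's worth, the underlying SQL Server feature does allow a limited form of custom masking via the partial() function, in addition to default(), email(), and random(). A minimal T-SQL sketch (table, column, and role names are made up for illustration):

```sql
-- Illustrative only: expose just the last four characters of a phone number,
-- padding the rest with a fixed string.
ALTER TABLE dbo.Customer
ALTER COLUMN PhoneNumber ADD MASKED WITH (FUNCTION = 'partial(0, "XXX-XXX-", 4)');

-- Only principals granted UNMASK see the real values.
GRANT UNMASK TO ReportingAdmin;   -- hypothetical role
```

Whether TimeXtender lets you pick partial() with your own prefix/padding/suffix from its masking dialog is exactly the part the article leaves open.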
Hi Support, Mount Anvil have an Incremental project that runs the incremental load of the finance system data. Changing the incremental execution rule on a table called G/L Budget Entry from the 'Modified At' field to the 'Last Date Modified' field is preventing the execution from running. The 'Modified At' field is a date/time format, while the 'Last Date Modified' field is a date-only format, which appears to be causing the execution failure. The error messages are below:
Finished executing project 'Incremental' Execution Package 'Update Project'
Execution failed
Start Time: 24/01/2023 16:31:05
End Time: 24/01/2023 16:32:33 on server: MAV01APP01
- Execute Execution Package Update Project 'Failed'
- Execute Business Units 'Failed'
- Execute Business Unit Business Unit 'Failed'
- 'One or more errors occurred.'
- Execute JetBCStage_I 'Failed'
- 'One or more errors occurred.'
- Execute Table JetBCStage_I TEST.BC_G/L Entry (17) 'Successful'
- Execute Table JetBCStage_I TEST.RowCountGL 'Successful'
I need to be able to call an API using a reference number from the ERP system (which is already in the ODX) as part of the URL. It looks like the dynamic parameters described on the old support site could work; however, it is stated that this only works in the old-style business unit. Is there a solution that would work in the ODX, or would a Power Automate / Logic App be the only option?
I have a transaction table in the ODX with the following fields (simplified): Key, Value, transactiondate. Example:
100, V100, 2023-01-01
100, V100A, 2023-01-31 09
100, V100B, 2023-01-31 10
Each day multiple transactions are added, of course each with another transaction date. Note that in the example I have only shown the date part (not the full hours, minutes, and seconds). What I need is a table in the DSA with the latest update. I created the same table, set the primary key to Key, and added an incremental selection rule on transactiondate. I added history settings to update Value and transactiondate based on the key. Where it goes wrong is when we get multiple transactions for the same key in one day:
100, V100A, 2023-01-31 09
100, V100B, 2023-01-31 10
I traced the cleansing procedure: TX detects that the same key occurs twice. So far so good. Next it puts the ID of the latest transaction into <table>_L and only processes the IDs that are not in <table>_L. The result is that I get 100, V100A, 2023-01-31 09, which is incorrect.
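As a point of comparison while this is being sorted out, the intended end state can be expressed directly in T-SQL with ROW_NUMBER(), e.g. in a view or custom transformation. A minimal sketch, assuming an illustrative source table dbo.Transactions:

```sql
-- Illustrative only: keep just the newest transaction per Key.
WITH Ranked AS (
    SELECT
        [Key],
        [Value],
        transactiondate,
        ROW_NUMBER() OVER (PARTITION BY [Key]
                           ORDER BY transactiondate DESC) AS rn
    FROM dbo.Transactions
)
SELECT [Key], [Value], transactiondate
FROM Ranked
WHERE rn = 1;   -- for key 100 this returns V100B, 2023-01-31 10
```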
Dear Community, I like to build my data estates with supernatural keys, but on larger datasets the data cleansing starts to take very, very long. Do you happen to have the same issues? Is there a way to make the supernatural keys load faster? Even with incremental loading it gets super slow. I've run a test on 435,397 records, on an Azure SQL database with 10 vCores:
1. A full load on the table with 7 supernatural keys.
2. A full load on the same table without the supernatural keys.
1 has data cleansing of 1 second; 2 has data cleansing of 104 seconds!
Next I ran a test on the same tables, but now with incremental loads:
1. Incremental load with 7 supernatural keys.
2. Incremental load without supernatural keys.
1 has a data cleansing time of 1.6 seconds and 2 a data cleansing of a whopping 129 seconds!
I'm not so sure I want to keep using the supernatural keys. What do you guys do? Take care, Daniel
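Not a TimeXtender feature as such, but one pattern people fall back on when lookup-based key stores become the bottleneck is a deterministic hash key computed inline, so no per-row lookup is needed. A minimal T-SQL sketch with illustrative column names:

```sql
-- Illustrative only: derive a deterministic surrogate key by hashing the
-- business-key columns instead of looking each row up in a key store.
SELECT
    CompanyCode,
    CustomerNo,
    HASHBYTES('SHA2_256',
              CONCAT_WS('|', CompanyCode, CustomerNo)) AS CustomerKey
FROM dbo.Customer;
```

The trade-off is that the key is a binary hash rather than a compact integer, and you accept a vanishingly small collision risk in exchange for skipping the lookup.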
Hello, I'm working on a project (new license model) where we want to use Azure Data Factory for data movement. I've installed TX version 6143.1 but got the message "Data Factory source is out of date". Looking at the release notes, this version of TX doesn't support ADF data movement at the moment: "Warning: The new version does not support the data source providers that move data using Azure Data Factory (e.g. "Azure Data Factory - SQL Server (188.8.131.52) 64 bit")." I've decided to downgrade to TX version 6117, but I'm getting the message shown below (screenshot). It looks like there is a separate repository for each of the two installations. How can I solve this and remove one of the repositories? I want to use ADF data movement, so the repository for version 6117 is the one I need. Thanks in advance! Vince, Victa B.V.
Hi, I have a list that maps product_codes to product_IDs. What I would like to create is a key store that, when fed a product_code from the list, produces the matching product_ID, and when fed a new code (so not in the list) produces a supernatural key as usual. Is it possible to force the key store to create the IDs as shown above?
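Outside of the built-in key store, the desired behaviour can at least be prototyped in plain T-SQL: take the known product_ID when the mapping has one and fall back to a generated surrogate otherwise. A sketch with illustrative table names (the fallback here is a hash, not TimeXtender's own key-store value):

```sql
-- Illustrative only: reuse the mapped product_ID where it exists,
-- otherwise derive a deterministic fallback key from the code.
SELECT
    p.product_code,
    COALESCE(
        m.product_ID,
        ABS(CHECKSUM(HASHBYTES('SHA2_256', p.product_code)))
    ) AS product_key
FROM dbo.Products AS p
LEFT JOIN dbo.ProductCodeMap AS m
    ON m.product_code = p.product_code;
```

Whether the key store itself can be pre-seeded with the existing product_code to product_ID pairs is the part that needs a TimeXtender answer.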
Hi, I have the following setup in my Dev/Prod environments:
- ODX: shared, all on my live environment
- DSA / MDW: separate for each environment
A daily refresh starts with the ODX and, on success, moves on to the DSA. So far so good. On the ODX step there is a usage condition for Environment (project variable) = 'Prod'. I added this because I don't want to start two refreshes of the same ODX from both Prod and Dev. However, the Dev environment goes straight into the DSA refresh because the ODX step 'starts' and finishes instantly, meaning the DSA only gets a few thousand rows from the still-refreshing ODX (the largest table has 2,000 rows where I'd expect 2 million+). Is my assumption about using the usage condition correct? How can I use the same execution packages for Dev/Prod while still using the same ODX? Thanks for any help!