Hi, I am syncing my data between the ODX (CSV files) and my DSA_Clean data area (data warehouse). Now I'm constantly getting an error that one of my previously inserted rows no longer exists. I've tried to synchronize all the possible steps (ODX, source, data area), but I keep getting the error and I don't know how to resolve this problem. I tried several steps:
1. Add the table as a new table in my data warehouse area, but still the same error.
2. Remove and add the column again.
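In case it helps with diagnosis, this is a check to list the previously loaded keys that no longer arrive from the ODX (a sketch only: the _R/_V suffixes follow TimeXtender's raw/valid table naming, and [MyTable]/[Id] are hypothetical stand-ins for the table and its primary key):

-- Keys present in the valid (previously loaded) table but missing from the
-- latest raw load; these are the "row doesn't exist anymore" candidates.
SELECT v.[Id]
FROM [DSA_Clean].[MyTable_V] AS v
WHERE NOT EXISTS (
    SELECT 1
    FROM [DSA_Clean].[MyTable_R] AS r
    WHERE r.[Id] = v.[Id]
);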
Hello, I have built a data model in TimeXtender with a Power BI endpoint. In the Power BI report I see certain fields summarized by 'Sum'. In the SSL layer I only get the option to change the name; I am missing the 'Summarize by' and 'Sort by' options. What am I doing wrong?
Good morning, what is the status of the upgrade wizard or tool to migrate TimeXtender to the new version 6xxx? We have been waiting for it for quite a long time now, and the only status update I can find in this community is from four months ago, saying it is still not available. Customers are even beginning to consider other data estate solutions because of the silence from your side and the uncertainty about when it becomes available. Can we please have more clarity on the timelines for this tool?
Good day, I have set up a perspective in TimeXtender (v20.10.39.64) that includes three tables and a Tabular cube. When I deploy this perspective from DEV to QA, the tables are listed in the deploy tasks but not the cube. I tried the following fix, but it did not resolve the issue:
1. Remove the Tabular cube from the perspective
2. Save the TX project
3. Re-add the Tabular cube to the perspective
4. Save the TX project
5. Deploy the TX project (multi-environment transfer)
Has anyone run into this issue before? I am going to try adding a new perspective to see if that resolves it, but it worries me for future promotes that objects may be missing from my deployments. Kerry
I have table A with history enabled, all columns SCD type 2 except the PK, and a record is marked as deleted when it is deleted in the source (with a separate record). In the next data area / data warehouse (DSA) I want to create an incremental table B based on table A, but I do not want the deleted records from A in table B. When following the incremental rule wizard, the incremental selection rule can be set based on the incremental timestamp, but deletes cannot be handled because table A has no hard deletes (only Is tombstone = 1). What is the easiest way to solve this? The only thing I can think of is carrying the Is tombstone field along to table B and deleting the records in a post script, but I would expect this to be possible in a much easier way. Suggestions?
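For completeness, the post-script workaround I have in mind would be roughly this (a sketch; [DSA].[TableB] is a hypothetical name, and it assumes the Is tombstone flag is carried along into table B):

-- Post script on table B: drop the rows that table A flagged as deleted.
DELETE FROM [DSA].[TableB]
WHERE [Is tombstone] = 1;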
Hi, I am running a legacy version of TimeXtender (version 20.10.40.64), where we have an Excel Online connection set up using the CData ADO.NET Provider for Microsoft Excel Online 2022 (22.0.8389.0) as a data source. The data source is configured to authenticate via the Azure 'client flow' (see picture), and the app registration in Azure has the permissions shown. The data source works properly most of the time: it is able to list the worksheets it finds on the SharePoint site and can fetch data. However, scheduled execution packages sometimes fail with the following error message:
[500] Could not execute the specified command: Error while listing workbooks for drive: [generalException] General exception while processing. Details: Error while listing workbooks for drive: [generalException] General exception while processing. Module: System.Data.CData.ExcelOnline fx220l.yg at fx220l.IIu.m(Boolean ) at fx220l.IIu.X(tbp`1 , Boolean ) at fx220l.IIu.YI(LoL )
Hello, I use the semantic layer as an endpoint to analyze data through Microsoft Excel. In my model I have a Calendar table (auto-generated by TX) covering dates from 2015 up to 2030. The fact table in my dataset only has values up to the year 2026. When I open the Analysis Services endpoint through Excel and want to filter values based on the Calendar, I see an empty row in the filter. I assume this is because the Calendar dimension has values up until 2030 while the fact only goes up to 2026. Is there any way to hide empty values coming from dimensions (values that have no matches in the fact) in Analysis Services from the semantic layer? Thanks!
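One workaround I have been considering (an assumption on my part, not a confirmed fix) is trimming the Calendar table so it stops at the last fact date, e.g. with a custom data selection rule along these lines, where [DSA].[FactSales] and [OrderDate] are hypothetical stand-ins for my fact table and its date column:

-- Selection rule fragment: keep only calendar rows up to the newest fact date.
[Date] <= (SELECT MAX([OrderDate]) FROM [DSA].[FactSales])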
The ODX server stopped synchronizing its job status with the portal (as of 25/8). Restarting the ODX server does not solve the issue. Any tips?
Hello to all the community! 😃 When I make any change to any table and run deploy and execute, at the end of the procedure I find the table with very little data compared to what should be present (for example, out of 90,000 records it loads only 4,100). I have tried checking the various settings but I don't see anything wrong. Can you give me some advice? TimeXtender version 20.10.32.64. Thank you in advance.
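For anyone investigating something similar, a quick way to see at which step the rows disappear is to compare the raw and valid instances of the table (a sketch: the _R/_V suffixes follow TimeXtender's raw/valid naming convention, and [DSA].[MyTable] is a hypothetical name):

-- Count rows in the raw landing table versus the cleansed valid table.
SELECT 'raw' AS instance, COUNT(*) AS row_count FROM [DSA].[MyTable_R]
UNION ALL
SELECT 'valid', COUNT(*) FROM [DSA].[MyTable_V];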
This article clarifies two different methods of adding a SharePoint site as a data source in the TimeXtender Portal.
Method 1: Username and password. This method specifies a user account that does not have MFA (multi-factor authentication) enabled and has been granted access to the SharePoint site. The configuration in the TimeXtender Portal for the SharePoint data source is as follows:
Provider: Microsoft SharePoint
Auth Scheme: Basic
SharePoint Edition: SharePoint Online
URL: <SharePoint Site URL>, e.g. https://contoso.sharepoint.com/sites/PBI
User: <username> of a user that has been granted access to the site
Password: <password> for the specified user
Method 2: OAuth. This method entails entering the appropriate OAuth information for the specified SharePoint site. This information is outlined below and may be available via the SharePointAccess resource in the Azure Portal. In the TimeXtender Portal, create a new data source and set the provider to "Microsoft SharePoint".
Hello, at the moment we are processing our MDW data warehouse to an Azure SQL Database and then copying the data to Snowflake. We want to change this to target Snowflake directly, but only once it is possible to fully deploy to Snowflake (transformations included). I know it is hard to give an exact timeline for when this becomes available, but should I be thinking in terms of a couple of months or a couple of quarters, or is this low priority at the moment? Looking forward to your response. Greetings, Roy
Hello, to my understanding TimeXtender determines the data type of a column based on the first x rows, depending on your setup in the portal. The date fields in my source come in as dd.mm.yyyy, for example "05.03.2021". I use a Storage Account with a CSV connection. I would expect TimeXtender to create a date data type out of this. However, it seems that if a value starts with a '0', the leading zero is dropped: in the ODX the field shows up as the integer value '5032023'. Whether a date arrives correctly thus depends on whether it starts with a '0'. This affects all date fields and costs me a lot of time to create a workaround each time. I tried to override the data type in my data source, but without success. Help would be much appreciated.
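The kind of workaround I keep having to build is a transformation that rebuilds the date from the mangled integer (a sketch with a hypothetical column name [OrderDate_raw]; it assumes the only damage is the lost leading zero):

-- Left-pad the integer back to ddmmyyyy, re-insert the dots, and convert
-- using style 104 (dd.mm.yyyy).
TRY_CONVERT(date,
    STUFF(STUFF(RIGHT('00000000' + CAST([OrderDate_raw] AS varchar(8)), 8),
                3, 0, '.'),
          6, 0, '.'),
    104)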
I have a table with two timestamps, start and end, which I have split into start_hour, start_min, end_hour and end_min. There is a break between 12:00 and 12:30; if people work over the break, 30 minutes are deducted from their total work time. If they start or end their shift during the break, I want a transformation that subtracts only the time actually spent inside the break. I have two fields, Start_in_lunch and End_in_work, which I made into boolean conditions so they trigger if someone starts or ends in the lunch window. How can I do this? I tried DateDiff with TimeFromParts, but with no luck. The simple solution of start - 30 and end - 30 works, but dividing that by 60 does not. Any good ideas?
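In case it helps, the standard interval-overlap trick avoids DateDiff/TimeFromParts entirely if you work in minutes since midnight (a sketch: dbo.Shifts is a hypothetical table, the 12:00-12:30 window is hardcoded as 720-750 minutes, and the column names follow the post):

-- Worked minutes = shift length minus the part of the shift that overlaps
-- the 12:00-12:30 break; the CASE clamps a negative overlap (no break) to 0.
SELECT s.*,
       (s.end_total - s.start_total)
       - CASE WHEN b.overlap_min > 0 THEN b.overlap_min ELSE 0 END AS worked_minutes
FROM (
    SELECT *,
           start_hour * 60 + start_min AS start_total,
           end_hour * 60 + end_min     AS end_total
    FROM dbo.Shifts
) AS s
CROSS APPLY (
    SELECT IIF(s.end_total < 750, s.end_total, 750)
         - IIF(s.start_total > 720, s.start_total, 720) AS overlap_min
) AS b;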
What does “Remove Unused Objects” do? I cannot find documentation about it. Thx!
Why does the app registration for the ODX need owner permissions on the resource? I would think that read/write permissions would be sufficient. Why does TimeXtender need the “extra” rights?