Hello,

We designed our TX solution in Azure and set up an Azure SQL DB as our repository database, with a specific Azure AD group (which contains our developers) as ‘Azure Active Directory admin’. Our organization has enforced MFA, which unfortunately makes it impossible for us to connect to our Project Repository through AAD authentication; the option is simply not available. Have you encountered this before? How can we work with this? It would be really nice to use the ‘Azure Active Directory admin’ setting instead of working with SQL logins.
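Outside TimeXtender, one way to confirm that MFA-backed Azure AD authentication works against the repository database at all is the Microsoft ODBC driver's ‘ActiveDirectoryInteractive’ mode, which triggers the MFA prompt. A minimal sketch, assuming placeholder server and database names (whether the TimeXtender UI itself exposes this mode is a separate question):

```python
# Sketch: verify MFA-backed Azure AD authentication against the repository
# database outside TimeXtender, using the ODBC driver's interactive mode.
# Server and database names are placeholders, not taken from the post.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=TX_Repository;"
    "Authentication=ActiveDirectoryInteractive;"  # opens the Azure AD/MFA prompt
    "Encrypt=yes;"
)

with pyodbc.connect(conn_str) as conn:
    row = conn.execute("SELECT SUSER_SNAME(), DB_NAME();").fetchone()
    print(f"Connected as {row[0]} to {row[1]}")
```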
At one of our customers, we have a separate cloud environment for testing software and infrastructure changes. We are running 20.10.35 and want to evaluate 20.10.40. I would like to be able to upgrade the ODX Cloud repository for only one of the projects/environments instead of upgrading both to 20.10.40 and risking production issues. Is this possible, or can this be requested?
I have a setup with TimeXtender 20.10.40.64, ODX, ADL, ADF and a self-hosted IR, which works just fine. I need to extract data from an IFS10 source (an Oracle database), and it works until syncing; then it seems to start an endless loop that never finishes and never gives an error. In TX I have:

- Set up a new data source, Azure Data Factory - Oracle, and included all connection information
- Clicked “Search” in Select tables and got all the tables available in the Oracle DB
- Included a couple of tables
- Set up a transfer task

Syncing starts running, and I can see in ADF that it is running, but it never receives any data from the data source. I have tested a SQL data source with ADF and it works just fine, both sync and transfer. In ADF, I have tested previewing tables in the dataset created by TX, and I can preview tables from the Oracle DB just fine. I have even tried to create my own pipeline in ADF to copy data from the same data source to the same sink that TX created, and it works just fine. It creates a parquet…
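To rule out the Oracle source itself, a quick check outside ADF and TimeXtender is to query one of the included tables directly, for example with python-oracledb. A minimal sketch with placeholder host, service name, credentials and table name:

```python
# Sketch: confirm the Oracle source returns rows at all, independently of
# ADF/TimeXtender, using python-oracledb in thin mode. Host, service name,
# credentials and table name are placeholders, not taken from the post.
import oracledb

conn = oracledb.connect(
    user="ifs_reader",
    password="***",
    dsn="oracle-host:1521/IFS10",
)

with conn.cursor() as cur:
    cur.execute("SELECT COUNT(*) FROM SOME_IFS_TABLE")
    print("Row count:", cur.fetchone()[0])

conn.close()
```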
Hi Community,

I'm trying to connect to the database from CBS (in both the old and new version of TX). The original request URI gives an error response at test connection saying that the query exceeds 10,000 rows. This is a well-known issue with data from CBS, so I adapted the URI to include a filter. I tested this new query in both Postman and Qlik Sense Desktop; there the filter works and I retrieve approx. 2,500 rows, from 2020 until 2022. Unfortunately, using this query in TX does not work. The error that normally appears when testing the connection does not pop up, but the table still retrieves 10,000 rows, probably just random rows from all available years (1995 through 2022).

I am using the request URI below:
https://opendata.cbs.nl/ODataFeed/OData/70072NED/TypedDataSet?$filter=substring(Perioden,0,4) ge '2020' and substring(Perioden,4,2) eq 'JJ'
with custom header: accept:application/json

Any ideas what causes this? Thanks!
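A way to narrow this down is to replay the filtered request outside TimeXtender and count the rows that come back; if the server honours the $filter here but TX still returns 10,000 rows, the filter is likely being dropped or re-encoded before the request is sent. A minimal sketch using the URL, filter and header from the post:

```python
# Sketch: replay the filtered CBS OData request and count the rows returned,
# to confirm the $filter is applied server-side. URL, filter and accept
# header are taken from the post.
import requests

url = "https://opendata.cbs.nl/ODataFeed/OData/70072NED/TypedDataSet"
params = {
    "$filter": "substring(Perioden,0,4) ge '2020' and substring(Perioden,4,2) eq 'JJ'",
}

resp = requests.get(url, params=params, headers={"accept": "application/json"}, timeout=60)
resp.raise_for_status()
rows = resp.json()["value"]
print("Rows returned:", len(rows))
```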
I have an Excel sheet with percentages (11 decimals). The decimal sign is a comma. See the example below.

Year  Period  CostPercentage
2020  1       0,01800188
2020  2       0,016852519

But when I read the data in TX, I see the result below. I have tried to use the culture feature (see the screen dump below), but with no success. Why does TX/CData not read the data as is? The “,” is interpreted as a thousands separator. Any help will be appreciated.

When I have a file with a “.” (dot) as the decimal sign, I convert the data to numeric with precision 38 and scale 11, and that works.
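As a sanity check outside TX/CData, the values can be parsed explicitly with the comma treated as the decimal sign, which shows what a correct culture setting should produce. A minimal sketch with placeholder file and column names:

```python
# Sketch: convert comma-decimal text values to proper decimals outside
# TimeXtender, to confirm the underlying values are fine and only the
# decimal-separator interpretation differs. File and column names are
# placeholders, not taken from the post.
import pandas as pd

# Read the column as text first so nothing gets reinterpreted on the way in.
df = pd.read_excel("cost_percentages.xlsx", dtype={"CostPercentage": str})

# "0,01800188" -> 0.01800188 (comma as decimal sign, not a thousands separator)
df["CostPercentage"] = (
    df["CostPercentage"].str.replace(",", ".", regex=False).astype(float)
)

print(df.head())
```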
You can keep track of what the ODX Server is doing and has done in a number of ways.

Contents
- View Executing and Queued Tasks
- View Previously Executed Tasks
- Main Execution Log
- Specific Execution Log
- View Log of Service Calls
- View Data Source and Data Storage Statistics

View Executing and Queued Tasks

In the Execution Queue, you can see the tasks that are currently executing, waiting to start, or have just finished executing.

To open the Execution Queue, make sure the ODX instance is selected and, on the Tools menu, click ODX Execution Queue. The ODX Execution Queue menu can be opened whenever this icon is available. The ODX Execution Queue shows currently running, pending, and recently completed tasks.

You have the following options:
- Click Refresh or press F5 to see updated tasks.
- Stop a task: if you click on a running task, a Stop button appears that can be used to stop the task's execution.
- Click Remove to remove a pending or completed task.

View Previously Executed Tasks…
Hi,

We are using the Excel Online connector, which is authenticated with a service account and uses delegated permissions to access Excel files on SharePoint. The idea is that all relevant files will be shared with this account and then loaded into our DWH. The required files are visible on SharePoint when signing in with the user, but for some reason they are not visible when using the Excel Online connector (with the option Show shared documents = ‘True’). I know that the connector uses the /SharedWithMe (OneDrive) call to fetch shared items, since that is what I inferred from the logging. This call indeed retrieves no results through the Graph Explorer, but the files are visible through another, similar call in the Graph Explorer (the one from ‘Insights’).

Why are files visible on SharePoint but not on OneDrive? Is there a way to work around this? I have seen use cases where they actually are visible on both SharePoint and OneDrive and the connector works properly.
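The difference between the two calls can be reproduced outside the connector by hitting both Microsoft Graph endpoints with the same delegated token as the service account. A minimal sketch; token acquisition is omitted and the token value is a placeholder:

```python
# Sketch: compare what the two Microsoft Graph calls return for the same
# delegated token. /me/drive/sharedWithMe is what the connector reportedly
# uses; /me/insights/shared is the "Insights" call that does show the files.
# ACCESS_TOKEN is a placeholder; acquire it for the service account first.
import requests

ACCESS_TOKEN = "<delegated token for the service account>"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

for label, url in [
    ("sharedWithMe", "https://graph.microsoft.com/v1.0/me/drive/sharedWithMe"),
    ("insights/shared", "https://graph.microsoft.com/v1.0/me/insights/shared"),
]:
    resp = requests.get(url, headers=headers, timeout=30)
    resp.raise_for_status()
    items = resp.json().get("value", [])
    print(f"{label}: {len(items)} item(s)")
```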
Hi,

We wonder if any of you have already run into this issue and perhaps have an idea on how to solve it.

Setup: TX 20.10.38 using Azure SQL DB for the DWH and ADLS Gen2 for the ODX Server. ADF is used to load data from the ODX Server into the DSA. All databases have private endpoints, so a self-hosted integration runtime is used by ADF.

Issue: From time to time (randomly, not always on the same tables, for both incremental and full-load tables) we notice that the ADF pipeline takes the normal amount of time to load data from the ODX into the DSA R-table. However, the cleansing takes 0 (zero) seconds and the record count log reports 0 (zero) records in the R-table. Neither ADF nor the TimeXtender execution reports an error. In the ODX service log we often notice this error: “Failed to encrypt sub-resource payload…” (not at exactly the same moment as the execution, so we are not sure it has something to do with the issue).

Work-around: ADO.NET. We had a similar issue last year and started using ADO.NET to load data from the ODX Server into the DSA. We…
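One way to catch these silent zero-record loads, since neither ADF nor TimeXtender raises an error, is a post-load check that flags raw tables that ended up empty. A minimal sketch, assuming the raw tables follow the “_R” suffix naming referred to in the post and using placeholder connection details:

```python
# Sketch: flag raw ("_R") tables that are empty after a load, so silent
# zero-record loads can be caught even when no error is reported. The "_R"
# suffix convention is an assumption based on the post; connection details
# are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:mydsaserver.database.windows.net,1433;"
    "Database=DSA;"
    "UID=tx_user;PWD=***;Encrypt=yes;"
)

sql = """
SELECT s.name AS schema_name, t.name AS table_name, SUM(p.rows) AS row_count
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
JOIN sys.partitions AS p ON p.object_id = t.object_id AND p.index_id IN (0, 1)
WHERE t.name LIKE '%[_]R'            -- raw tables only (assumed suffix)
GROUP BY s.name, t.name
HAVING SUM(p.rows) = 0
ORDER BY s.name, t.name;
"""

for row in conn.execute(sql):
    print(f"Empty raw table after load: {row.schema_name}.{row.table_name}")
```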
I'm following the steps explained on this page: https://support.timextender.com/data-sources-112/connect-to-excel-files-with-excel-online-628#Connect+with+the+Excel+Online+CData+provider, but I couldn't get it to work. I used the Microsoft SharePoint Excel provider. I have 48 Excel files. I uploaded them to my company's OneDrive and then gave OAuth consent to CData, but I got this error: ‘You must specify an Excel file: Set the FILE property to an .xlsx file.’ The URL of the folder where the Excel files are located looks like this: https://companyname-my.sharepoint.com/:f:/g/personal/xxx_companyname_onmicrosoft_com/xxxxxxxxxxxxxxxxxxxx

I have now written this address in the 'URL' section under 'Authentication'. Is this correct? Or should I specify it in the 'Folder' and 'File' sections in the 'Connection' section? Which Auth Scheme should I choose?
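This is not a verified CData configuration, but the error message suggests the provider expects a concrete .xlsx path in the File property rather than a folder link in the authentication URL field. A sketch of how the split between site URL and file path might look, using only the property names mentioned above (URL, File, AuthScheme) with placeholder values:

```python
# Sketch, not a verified CData configuration: illustrates splitting the
# connection into a site/OneDrive root (URL) and one concrete workbook
# (File). Property names come from the post/article; the values, and the
# "AzureAD" scheme, are assumptions to illustrate the idea.
connection_properties = {
    # Site or OneDrive root, not the shared link to the folder of files:
    "URL": "https://companyname-my.sharepoint.com/personal/xxx_companyname_onmicrosoft_com",
    # One specific workbook, relative to that root:
    "File": "Documents/Reports/Example.xlsx",
    # OAuth-based scheme, since the files sit behind Azure AD (assumption):
    "AuthScheme": "AzureAD",
}

for key, value in connection_properties.items():
    print(f"{key}={value}")
```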
To prevent accidental data loss, deleting a table in the TimeXtender UI does not delete the physical table in the data warehouse. The downside is that tables deleted in TimeXtender still take up space in the database.

Identifying and Deleting Unused Tables

The SQL Database Cleanup Tool enables you to identify tables left behind by TimeXtender and delete (drop) them to free up space. Note that database schemas are not deleted from the database; you will need to drop those manually in SQL Server after deleting them in TimeXtender.

Warning: When you drop a table with the SQL Database Cleanup Tool, it is permanently deleted. Use caution when running this tool.

To clean up your data warehouse, follow the steps below.

Right-click a data warehouse, click Advanced, and click SQL Database Cleanup Tool. TimeXtender will read the objects from the database and open the SQL Database Cleanup Tool window. The objects in the database that are no longer, or never were, part of the currently opened instance…
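Since the tool drops tables but not schemas, a possible follow-up is to list user schemas that no longer contain any objects and generate the DROP SCHEMA statements to run manually. A minimal sketch with placeholder connection details; review the output before executing anything:

```python
# Sketch: after running the SQL Database Cleanup Tool, list user schemas
# that contain no objects so they can be dropped manually, as the article
# says schemas are not removed by the tool. Connection details are
# placeholders; this only prints statements, it does not run them.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:mydwhserver.database.windows.net,1433;"
    "Database=DWH;"
    "UID=tx_user;PWD=***;Encrypt=yes;"
)

empty_schema_sql = """
SELECT s.name
FROM sys.schemas AS s
LEFT JOIN sys.objects AS o ON o.schema_id = s.schema_id
WHERE s.schema_id > 4 AND s.schema_id < 16384   -- skip built-in schemas and role schemas
GROUP BY s.name
HAVING COUNT(o.object_id) = 0;
"""

for (schema_name,) in conn.execute(empty_schema_sql):
    print(f"DROP SCHEMA [{schema_name}];")
```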
Hi everyone,

After successful synchronization of a CSV file, I get an error in the transfer job. The error is: ‘The schema XXXData_CSV is invalid. Execution is canceled.’ But with the Query Tool I can see all of the data. In addition, if I click to preview this source in the ODX, I get the error 'the sql server contains no data for the table'. What changes should I make to the schema of this CSV data?

Best Regards
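Because the Query Tool still shows the data, one thing worth checking outside TimeXtender is whether the CSV's structure has drifted since it was synchronized, for example a changed header or rows with a different number of fields. A minimal sketch with placeholder file name and delimiter:

```python
# Sketch: check the CSV itself for structural drift that would make the
# synchronized schema stale, e.g. a changed header or rows with a different
# number of fields. File name, delimiter and encoding are placeholders.
import csv
from collections import Counter

with open("XXXData.csv", newline="", encoding="utf-8-sig") as f:
    reader = csv.reader(f, delimiter=";")
    header = next(reader)
    field_counts = Counter(len(row) for row in reader)

print("Header columns:", len(header), header)
print("Field counts per data row:", dict(field_counts))
# If more than one field count shows up, or the header no longer matches
# what was synchronized, re-synchronizing the data source is the usual next step.
```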
Hello everyone,

After the ODX transfer task for an API source, when I click on preview, I get an “invalid column name” error. This error also occurs while executing in the data warehouse and stops the execution. What should I do? I wanted to use aliases, but I have 575 columns. I tried to re-deploy the table with differential deployment disabled.

Version: 6143.1

My column name example is below; it is very long:
‘properties_parameter_WS10M_RANGE_202109’

Thanks,
Best Regards
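An “invalid column name” error usually means a column exists on one side but not the other, which fits period-stamped names like the one above appearing over time. Rather than aliasing 575 columns by hand, a quick diff of the source column list against the deployed table can point at the offending names. A minimal sketch with placeholder connection details, schema/table name and column list:

```python
# Sketch: find which source columns are missing from the deployed table,
# since "invalid column name" usually points at a column that exists on one
# side only. Connection details, schema/table name and the source column
# list are placeholders, not taken from the post.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:mydwhserver.database.windows.net,1433;"
    "Database=DWH;"
    "UID=tx_user;PWD=***;Encrypt=yes;"
)

source_columns = [
    "properties_parameter_WS10M_RANGE_202109",
    # ... the remaining source columns ...
]

rows = conn.execute(
    "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS "
    "WHERE TABLE_SCHEMA = ? AND TABLE_NAME = ?",
    "dsa", "WeatherData",
).fetchall()
table_columns = {r.COLUMN_NAME for r in rows}

missing = [c for c in source_columns if c not in table_columns]
print("Columns in the source but not in the table:", missing)
```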