Hi,

We have been using ADF to transfer data from our ODX storage account to an Azure SQL database for a long time, but now our manual and scheduled reloads get stuck in ADF. It looks as if TX does not pass the integration runtime to ADF. This is required because our storage account is behind a private endpoint, so we need to use an IR that resolves inside our VNet in Azure.

Do you have any idea why the runtime does not appear on the pipeline? As you can see, there is no runtime assigned to this run: [screenshot] Normally the runtime appears in ADF: [screenshot]

We are on TX 20.10.45, by the way. Would love to hear from you!
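For reference, this is roughly what the missing piece looks like on the ADF side: a linked service pins its traffic to a specific integration runtime through a connectVia reference. The names below (OdxBlobStorage, MyVNetIR) are placeholders, and whether TimeXtender is supposed to emit this block is exactly the question here.

```json
{
  "name": "OdxBlobStorage",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "serviceEndpoint": "https://<storageaccount>.blob.core.windows.net/"
    },
    "connectVia": {
      "referenceName": "MyVNetIR",
      "type": "IntegrationRuntimeReference"
    }
  }
}
```

Without a connectVia block, ADF falls back to the AutoResolve runtime, which cannot reach a storage account behind a private endpoint.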
I am loading a fact table from a staging table; both are regular data warehouse tables. When I create a data selection rule on the mapping table, I'm unable to select the conditional lookup fields or the system fields from the source table: [screenshot] TX version is 6590.1.

In legacy TX, it is possible to select the conditional lookup fields and system fields when creating selection rules: [screenshot]

We have implemented a workaround where we create an extra field in the DSA table with a transformation that fills it with the value of the conditional lookup field. In turn, this extra field does show up in the selection rule fields because it is a regular field. But this is not a very nice or future-proof solution.

Is this an intentional change, and if so, can you explain why? Or perhaps this is a bug in TX SaaS?
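To make the workaround concrete, this is roughly the SQL equivalent of what it does (table and field names are made up for illustration):

```sql
-- Materialize the conditional lookup value as a regular field in the DSA
-- table, so a data selection rule downstream can reference it.
ALTER TABLE dsa.Orders ADD CustomerType NVARCHAR(50) NULL;

UPDATE o
SET    o.CustomerType = c.CustomerType   -- the value the conditional lookup returns
FROM   dsa.Orders    AS o
JOIN   dsa.Customers AS c ON c.CustomerID = o.CustomerID;

-- The fact table's selection rule then reduces to a plain predicate:
--   WHERE CustomerType = 'B2B'
```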
Hi, I am trying to use ADF to connect to an on-premises SQL database with some very large tables for transfer. I selected Azure Data Factory - SQL Server (14.0.0.0) as the source and created the data source.

When I tried to synchronize the ODX source in TX Desktop (6521.1), I got the error message that the assembly for the provider is not installed. I cannot change the provider without creating a new data source (product enhancement request!). Do I need to upgrade TX Desktop to get access to the drivers, or is it some other install that I need to do? I can't find any documentation on this.
Hi, I wanted to verify what I am seeing as recent changes to the TimeXtender Portal data source definitions. I distinctly remember being able to switch how a data source uploaded data to the ODX, i.e. whether it was going to use ADO.NET or ADF. If I selected ADF, then I had to enter the ADF credentials. This seems to have disappeared in recent changes.

I was using CData REST API connectors, and I can no longer switch between transfer methods without, it seems, defining new connections. Can someone confirm this and let me know how I can switch between these transfer methods without creating new data sources?
Hi, I have a very regular table that is loaded from a SQL data source. The table has a little over 3 million records and the following definition: [screenshot] Nothing out of the ordinary, as you can see. The table is loaded into the ODX with an incremental selection rule on the mut_dt column. The ODX load works fine; loading the table fully or incrementally is no problem, and the data is stored in the container as expected within 1 or 2 minutes.

However, when I drag the table to the DSA data area and load it there, something strange happens. Loading the table to the DSA hangs after 900,000 records are inserted into the raw table. We have tried waiting for several hours, but it just hangs there and nothing happens. Other tables from the same source load fine to the DSA.

The data source version is TimeXtender SQL Source v19.1.1.0 (but the load from source to ODX works fine). ODX and TX version is 6590.1.
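While the load hangs, a generic way to see what the insert is actually waiting on (not TimeXtender-specific; run it against the DSA database while the transfer is stuck):

```sql
-- Show active requests, their wait types, and any blocking session.
SELECT r.session_id,
       r.status,
       r.command,
       r.wait_type,
       r.wait_time            AS wait_time_ms,
       r.blocking_session_id,
       t.text                 AS running_sql
FROM   sys.dm_exec_requests AS r
CROSS  APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE  r.session_id <> @@SPID
ORDER  BY r.wait_time DESC;
```

A PAGEIOLATCH or LCK_M_* wait here would point to I/O pressure or blocking rather than to TimeXtender itself.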
Hello,

We are currently using the new TimeXtender REST API 6.1 and leveraging its built-in paging feature to connect to an external REST API. We're not using an RSD configuration for this setup.

I've encountered a challenge with pagination, specifically with the use of a parameter named "Cursor." This parameter is critical for the pagination logic in the API we're connecting to. Each row in the data returned by the API has a unique Cursor value, but for pagination purposes, only the first Cursor value from the current page should be used in the subsequent POST request to fetch the next page.

Here's how we've set up the Cursor parameter:

Name: cursor
Type: XPath
Value: /TX_Autogenerated_Root/TX_Autogenerated_Element/data/arTransactions/edges/cursor

The issue arises because I need to ensure that only the first Cursor value from each fetched page is used in the pagination post body for the next request. I'm unsure how to configure TimeXtender to use only the first Cursor value for this purpose.
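In standard XPath, a positional predicate restricts a step to the first matching node, so if the paging parameter is evaluated as a full XPath expression (an assumption worth verifying against the provider), the value would become:

```
/TX_Autogenerated_Root/TX_Autogenerated_Element/data/arTransactions/edges[1]/cursor
```

This selects the cursor of the first edges element only, instead of one value per row.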
v20.10.35.64

I just noticed there's a bug in the current version of our ERP related to the UpdateDate & UpdateTS fields which impacts incremental loading. Hence, I'm investigating the possibilities to set up a semi-incremental loading infrastructure. With my TX version I can also set up the following, which is limited: [screenshot]

My database has data ranging from 2013 to today. I'm looking for a way to load 2013-2023 just once and then only full-load the data with UpdateDate >= 01-01-2024. I'm thinking of achieving this the following way (sketch below):

1. Set up a Query Table for every table, selecting rows where UpdateDate >= 01-01-2024.
2. Set up a load task, LOAD_FULL, which has the original tables.
3. Set up a load task, LOAD_INCREMENTAL, which has the Query Tables with just the data from 2024.
4. Execute the LOAD_FULL task just once.
5. Schedule and execute LOAD_INCREMENTAL every 30 minutes.

If I do it this way, it's not a classic incremental loading process. Can I still utilize TimeXtender's ODX/DSA incremental loading functionality?
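A sketch of what such a Query Table definition could look like (table and column names are placeholders for the ERP source):

```sql
-- Query Table backing the LOAD_INCREMENTAL task: only rows touched
-- on or after the cutoff date. The 2013-2023 history is loaded once
-- by LOAD_FULL from the original table.
SELECT *
FROM   dbo.SalesOrderLines
WHERE  UpdateDate >= '2024-01-01';
```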
I'm trying to connect to an InfluxDB source, where a POST request is expected, including a Flux script in the body to filter data. The output is in CSV format. I have this working in Postman, but how should I configure this in TimeXtender? Anything I try results in the error below, which also means no RSD is generated.

[500] Could not execute the specified command: HTTP protocol error. 405 Method Not Allowed
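For reference, this is roughly the request that works in Postman, assuming an InfluxDB 2.x /api/v2/query endpoint (URL, org, bucket, and token are placeholders). A 405 typically means the connector is still issuing GET instead of POST:

```
curl -X POST "https://influx.example.com/api/v2/query?org=my-org" \
  -H "Authorization: Token $INFLUX_TOKEN" \
  -H "Content-Type: application/vnd.flux" \
  -H "Accept: application/csv" \
  -d 'from(bucket:"telemetry") |> range(start: -1h)'
```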
Dear community,

Is the only way to connect the ODX server to an Azure DB still SQL Server authentication? No Windows or Azure integrated security yet?

Thanks!
- Daniel
I have set up an Execution Package and am running a number of Perspectives in the order they need to build in the project. I then call 2 External Steps, which are Execution Packages in other TX projects in the same environment.

I am having an issue where, when a step fails in the project, the build then continues on to the External Steps, whereas I would like the entire package to fail. I thought this would be accomplished by selecting the FailPackage action on Failure Handling, but it is not working. Do I need to set this up a different way to get this to work? (I have hidden sensitive data in my screenshot.)
When choosing an execution job (just a single execution job), it is grayed out and I am not able to choose it. It is deployed, and the execution job works when I run it manually. I'm running TimeXtender version 6590.1.
Hi, I was wondering if it is possible to reference the contents of a local file on the ODX server from within an RSD file, such as with Get-Content in PowerShell. We have RSD files for some of the API calls that we are making. We are in the process of including these RSDs in our own source control system in order to keep track of the changes we make to them. One of the APIs is quite unstable and sometimes needs tweaking in the RSD file. (Source control on the RSD files / PowerShell script files could be another topic, to be honest.)

Anyway, we have to include an API key in the script that we would not like to have saved in source control. Therefore, an option to reference a file in which we can store the API key would be great. I know that you can also configure authentication in the portal, but we are looking to avoid putting credentials in the portal as much as possible. The portal also creates an OAuth map if you use OAuth in the portal, but I don't know how that works.
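One workaround along the lines of the Get-Content idea (an assumption, not a documented TimeXtender feature; all paths and the placeholder token are hypothetical): keep a placeholder in the version-controlled RSD and have a small deploy script splice in the key from a file that never enters the repo.

```powershell
# Replace an @@API_KEY@@ placeholder in the tracked RSD with the real key,
# which lives in a local file excluded from source control.
$apiKey = (Get-Content -Path 'D:\secrets\api.key' -Raw).Trim()
(Get-Content -Path '.\repo\MyApi.rsd' -Raw) -replace '@@API_KEY@@', $apiKey |
    Set-Content -Path 'D:\ODX\rsd\MyApi.rsd' -Encoding UTF8
```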
Hi all,

I have v20.10.45 running in Azure with the ODX server running for a client. We have about 15 ODX data sources. The syncs/loads run daily and the server has been up for about 9 months, so no extraordinary amount of logging, I think.

The issue that I'm having is that it is no longer possible to check the execution logs of some ODX task runs. The popup doesn't appear and TX goes into a "Not Responding" state. Any click in the client will then crash the application. This happens when I have no TX project opened, only the Manage tab for the ODX server (see screenshot).

I've upscaled the VM from DS2v2 (2 cores, 8 GB) to D4s_v3 (4 cores, 16 GB), but this doesn't change it. Restarting the ODX server also doesn't help.

The issue does not occur with all tasks. Tasks for which all recent executions were successful will show the logging; checking the event logging at ODX server level will crash, and tasks with a lot of recent failures also crash.

Edit 1: A connection that has this logging issue is a connection using the CData connector.
Current advice on partitioning on a non-date field calls for creating a partitioning "bucket" using field transformations. While attempting to follow the instructions outlined here, I encountered a problem. Partitioning works by creating a calculated field in the valid table that has the capacity to handle NULL values if the partition template is so configured. However, that same field, along with the calculated default value, is also present in the raw table. This means that when anyone following TimeXtender's guidance for non-date transformations executes the table, that execution will fail with the error "Cannot insert the value NULL into column 'DW_PartitionKey'".

Is this a bug, or does the documentation for creating partitions need to be updated?
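A minimal illustration of the failure mode, with hypothetical field names: if the bucket transformation applied to the raw table can produce NULL, the insert into DW_PartitionKey fails, so the transformation itself has to be made NULL-safe:

```sql
-- Hypothetical bucket transformation for a non-date partition field.
-- Without the ISNULL guard, rows where Region is NULL fail with
-- "Cannot insert the value NULL into column 'DW_PartitionKey'".
SELECT ISNULL(LEFT(Region, 1), '#') AS DW_PartitionKey   -- '#' = NULL bucket
FROM   dsa.Sales_R;
```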