I am loading a CSV (comma-separated) file with the ODX. After upgrading the ODX to the latest version, transferring any table from this data source gives the following error:

Syncing does work.

ODX version: 6521.1
TX version: 6536.1
CSV Provider version: tried both 22.0.8276.0 and 23.0.8749.0, but the same error occurs with both.
Hi,

I have set up a Storage Management task with only the Rollup option ticked, and filled in the correct ADF credentials. One of my source tables has been selected for this task.

TX reports the runs as successful and they finish in a fraction of a second, but nothing happens in the storage account: the version folders still contain the full-load file plus all the incremental files, as before. I have tried many different settings for the minimum and maximum rollup file size. I have not specified "folder (optional)". I have tried with and without specifying an integration runtime.

No pipeline run shows up in ADF after triggering the task.
We are running TX version 6536.1. Our data source is Dynamics AX 2012 on-premise. We have the issue with both the SQL Server connector and the OLE DB connector.

After synchronizing, we can't select any tables via right-click on the data source → Select Tables → Search. It also takes more than 30 minutes before the table selector pops up. Our only option is to select all tables.

The Transfer task → Select Tables option does let us select tables. However, we can't select any individual columns there, nor does it let us define any incremental selection rules or row filters.
We have several tables that are quite large. We are attempting to add both partitions and clustered columnstore indexes to make the data more manageable. This is not only possible but is recommended by Microsoft for certain use cases. However, enabling both partitions and clustered columnstore indexes appears to be impossible in our current version of TimeXtender, 20.10.29.

Attempting to do so produces a message saying that SQL Server does not support the combination of these two features, which, as previously mentioned, is not true. Can anyone shed some light on this?
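For reference, SQL Server itself does support the combination; outside TimeXtender the DDL is straightforward. The following is a sketch only, with hypothetical object names, for SQL Server 2014 or later:

```sql
-- Hypothetical names; sketch only.
CREATE PARTITION FUNCTION pfYear (int)
    AS RANGE RIGHT FOR VALUES (2022, 2023, 2024);
CREATE PARTITION SCHEME psYear
    AS PARTITION pfYear ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactSales (
    SalesYear int   NOT NULL,
    Amount    money NOT NULL
) ON psYear (SalesYear);

-- Creating a clustered columnstore index on the partitioned table
-- keeps the table's partitioning; the two features combine cleanly.
CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales ON dbo.FactSales;
```

So the limitation described above would be in the tool's deployment logic, not in SQL Server.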
I need some guidance on saving the contents of an endpoint as a BLOB. I can see that the version we use does not have an option for BLOB, but maybe Text or Binary with a maximum length can give me what I need.

This is the RSD that I created:

<api:script xmlns:api="http://apiscript.com/ns?v1" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- See Column Definitions to specify column behavior and use XPaths to extract column values from JSON. -->
  <api:info title="GetAllStops" desc="Generated schema file." xmlns:other="http://apiscript.com/ns?v1">
    <!-- You can modify the name, type, and column size here. -->
    <attr name="kortnavn" xs:type="string" readonly="false" other:xPath="/json/kortnavn" />
    <attr name="tittel" xs:type="string" readonly="false" other:xPath="/json/tittel" />
    <attr name="virksomhet" xs:type="string" readonly="false" other:xPath="/jso
Hello,

TimeXtender: 20.10.40.64
ODX: 20.10.31

We are using the ODX and TimeXtender's SQL database connector to get data from a Synapse database. Sometimes the Synapse database seems to have problems where a query runs and doesn't complete (a separate issue we are working on). What happens in the ODX, however, is that the transfer task keeps running indefinitely and never times out, so the TimeXtender execution package runs but just sits there waiting for the ODX transfer task to finish. Nothing fails anywhere, so we do not get an alert. We need some way to capture this.

Is there any way to set up one or both of these things in TimeXtender, or do you have any other ideas?
1. Kill the transfer task if it takes more than X minutes
2. Get an alert on failure or a long-running task

Thank you!
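As a workaround idea rather than a built-in TimeXtender feature: if the transfer can be launched from a script, a generic watchdog can kill the child process after X minutes and report the timeout so monitoring can alert on it. A minimal sketch (the wrapped command at the bottom is hypothetical):

```python
import subprocess

def run_with_timeout(cmd, timeout_s):
    """Run cmd, killing it if it exceeds timeout_s seconds.

    Returns (finished, returncode); returncode is None on timeout,
    which is the signal to raise an alert.
    """
    try:
        proc = subprocess.run(cmd, timeout=timeout_s)
        return True, proc.returncode
    except subprocess.TimeoutExpired:
        # subprocess.run kills the child before raising, so nothing
        # is left hanging; the caller can now notify/alert.
        return False, None

# Hypothetical usage; there is no real "tx-cli" command:
# finished, rc = run_with_timeout(["tx-cli", "run-transfer"], 60 * 30)
```

The same pattern works from a scheduled task that wraps whatever script normally triggers the execution package.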
Hello,

TimeXtender: 20.10.31.64
ODX: 20.10.31

We are currently running a database on Hyperscale with 12 vCores and premium memory-optimized hardware. The ODX storage is a data lake. We are running execution packages on 4 threads, and we want to reduce our ODX transfer times, as that is our biggest impact right now.

This is an example of a load. I am seeing long ODX transfer times for very small tables that I can't find the cause of or explain. We have a lot of small tables (see the example below) that take 1-3 minutes, while some tables doing data cleansing on 1 million+ rows finish in the same amount of time. See this small table with 400 rows, no transformations, and only 10 columns.

I can see we are peaking log IO at 100% during certain periods. However, that seems to be during the later parts of the loads and not during the ODX transfer at the start. Is there anything that can cause this, or any possibility to improve it?

Thank you,
Victor
Hi,

We are setting up a new Azure environment for a client and are having some issues with the Azure SQL DB. The app server VM is based on the one available in the Azure Marketplace:
https://azuremarketplace.microsoft.com/en-us/marketplace/apps/timextender.timextender-app-server?tab=Overview

The issue occurs when we try to test the storage connection for the MDW instance. When we test the connection in the desktop app using either Azure AD integrated authentication or Azure AD password authentication, we get an error that adalsql.dll is not able to load. The URL in the error message leads to a 404 Not Found: http://go.microsoft.com/fwlink/?LinkID=513072

I found two posts about AAD integrated auth, but it is not clear to me whether it is possible to get it to work. I didn't find any discussions about Azure AD password authentication.

Are these not supposed to work as authentication methods for a DW instance deployed in an Azure SQL DB? If they are, is t
Dear Support,

My customer is using AFAS. From this tool they load data into TimeXtender via an API. They wish to load the data from AFAS incrementally in the ODX. Is it possible to load incrementally from an API into the ODX server, and what would you suggest as a starting point?

Best regards,
Christian Koeken
Hi,

We are running TimeXtender 6536.1 with execution server 6536.1 on an on-premise SQL Server 2019. We seem to be seeing some strange behaviour with the "Manage schedule" option when editing a job: whenever we open the schedule, it seems to trigger the job itself. I don't know if others have experienced the same. I also don't know whether it is related to TimeXtender 6536.1, since we recently had to do a lot of updates and configuration changes on the server due to a tempdb problem. Below I added a screenshot of the option; it applies to any job, not only the ODX one I'm highlighting.
Hello,

A number of times now, a step included in an execution package has disappeared after pushing the SSL dev instance to SSL prod. I would expect a full copy of the instance. This only happens when pushing the SSL; pushing the MDW works just fine. Perhaps it is a bug; otherwise I am curious about a solution.
Hi guys,

We are creating a users table in TimeXtender based on Active Directory. However, there's also an on-premise Exchange server adding Exchange attributes to the users in Active Directory, and it seems we cannot read these attributes. We use the CData provider for Active Directory on our ODX server; when trying to add the fields, the Exchange attributes are not in the list. How can we add these attributes? Do we need the CData provider for Exchange? If so, has anyone set that one up who is willing to provide more details? Or how else can we manage to import all the Exchange attributes into our users table?

Thanks!
Stijn Hensen
v20.10.35.64

I've got a history table with ±16 million rows in PROD and ±2.3 million rows in TEST. To improve loading times, I tried to delete some columns of the table, first in TEST. There I was able to successfully deploy and execute the history table with Differential + Managed deployment, which finished in several minutes. However, when I try the same procedure in PROD, it stays stuck on the step "Deploy Valid Table Structure" as shown below. I also tried to deploy with Differential Deployment unchecked, but with no result; it just keeps loading forever.

Anybody have an idea?
v20.10.35.64

I just noticed there's a bug in the current version of our ERP related to the UpdateDate and UpdateTS fields, which impacts incremental loading. Hence, I'm investigating the possibilities to set up a semi-incremental loading infrastructure. With my TX version I can also set up the following, which is limited:

My database has data ranging from 2013 to today. I'm looking for a way to load 2013-2023 just once and then only fully load the data with UpdateDate >= 01-01-2024. I'm thinking to achieve this the following way:
- Set up a Query Table for every table, with UpdateDate >= 01-01-2024.
- Set up a load task, LOAD_FULL, which has the original tables.
- Set up a load task, LOAD_INCREMENTAL, which has the Query Tables with just the data from 2024.
- Execute the LOAD_FULL task just once.
- Schedule and execute LOAD_INCREMENTAL every 30 minutes.

If I do it this way, it's not a classic incremental loading process. Can I still utilize TimeXtender's ODX/DSA incremental loading functionaliti
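The Query Table filter in the first step of the plan above amounts to a simple date predicate. A sketch of that predicate, with a hypothetical table and column names taken from the post, demonstrated against SQLite purely to show the filtering behavior (TimeXtender would generate its own SQL):

```python
import sqlite3

# Hypothetical source table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE SalesOrder (Id INTEGER, UpdateDate TEXT)")
conn.executemany(
    "INSERT INTO SalesOrder VALUES (?, ?)",
    [(1, "2013-05-01"), (2, "2023-12-31"), (3, "2024-02-15")],
)

# The Query Table behind LOAD_INCREMENTAL: only rows touched on or
# after the cutover date; everything older is covered by LOAD_FULL.
incremental = conn.execute(
    "SELECT Id, UpdateDate FROM SalesOrder"
    " WHERE UpdateDate >= '2024-01-01'"
).fetchall()
print(incremental)  # only the 2024 row falls in the incremental window
```

Because the two load tasks target the same table from different sources, the open question remains how the frequent LOAD_INCREMENTAL runs merge with the one-off full load.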
Hi all,

I am relatively new to SQL. Although with help and searching I am making progress, I am not succeeding with the following question. Based on a date, I can easily set a filter in my reports to look ahead for x period of time. However, when a holiday comes into view, my report becomes blank. So I am looking for an index/row_number/sequence that skips the holiday dates. Can I set up a custom field for this within my BaseCalendar?
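One way to get such a sequence is a ROW_NUMBER over only the non-holiday dates, so the numbering simply skips holidays and the report can look ahead x steps on the sequence instead of on the raw date. A sketch, with hypothetical BaseCalendar/Holidays table names, demonstrated against SQLite just to show the idea:

```python
import sqlite3

# Illustrative only: BaseCalendar and Holidays are hypothetical names.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE BaseCalendar (DW_Date TEXT);
CREATE TABLE Holidays (HolidayDate TEXT);
INSERT INTO BaseCalendar VALUES ('2024-05-01'), ('2024-05-02'),
                                ('2024-05-03'), ('2024-05-04');
INSERT INTO Holidays VALUES ('2024-05-02');
""")

# ROW_NUMBER over the non-holiday dates yields a gap-free workday
# sequence: the holiday on 2024-05-02 is skipped entirely.
rows = conn.execute("""
SELECT c.DW_Date,
       ROW_NUMBER() OVER (ORDER BY c.DW_Date) AS WorkdaySeq
FROM BaseCalendar AS c
WHERE c.DW_Date NOT IN (SELECT HolidayDate FROM Holidays)
ORDER BY c.DW_Date
""").fetchall()
```

In TimeXtender this could live as a custom field or view on the calendar table; the same SELECT works in SQL Server, which also supports ROW_NUMBER.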