Ask questions about TimeXtender ODX Instances
- 66 Topics
- 271 Replies
Connecting to CBS data in ODX server
Hi Community, I'm trying to connect to the database from CBS (in both the old and new version of TX). The original request URI returns an error at test connection saying the query exceeds 10,000 rows. This is a well-known issue with data from CBS, so I adapted the URI to include a filter. I tested the new query in both Postman and Qlik Sense Desktop; there the filter works and I retrieve approx. 2,500 rows, covering 2020 through 2022.

Using this query in TX does not work, unfortunately. The error that normally appears when testing the connection does not pop up; however, the table still retrieves 10,000 rows, apparently a random selection from all available years (1995 through 2022).

I am using the request URI below:

https://opendata.cbs.nl/ODataFeed/OData/70072NED/TypedDataSet?$filter=substring(Perioden,0,4) ge '2020' and substring(Perioden,4,2) eq 'JJ'

with custom header: accept:application/json

Any ideas what causes this? Thanks!
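For reference, the same filtered request can be reproduced outside TX; a minimal sketch in Python (the requests usage is just mine for testing — the URL, filter and header are the ones above):

```python
import requests

# The filtered CBS OData feed from the post; the $filter keeps only
# yearly ('JJ') periods from 2020 onward.
url = "https://opendata.cbs.nl/ODataFeed/OData/70072NED/TypedDataSet"
params = {
    "$filter": "substring(Perioden,0,4) ge '2020' and substring(Perioden,4,2) eq 'JJ'"
}
headers = {"Accept": "application/json"}

response = requests.get(url, params=params, headers=headers)
response.raise_for_status()
rows = response.json()["value"]
print(f"Retrieved {len(rows)} rows")  # ~2,500 expected per the post
```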
Connect to a Datalake through JDBC -> ODBC Bridge
Dear Sir, For one of our customers we need to connect TX to a data lake (CSI Cloud). Our customer is able to connect PowerBI to their data lake using a JDBC → ODBC bridge, the ZappySys JDBC Bridge Driver. This works fine.

Now we need to connect TimeXtender without this third-party JDBC bridge software, to save our client the ZappySys license money. I checked the CData components in TimeXtender, but there seems to be no JDBC → ODBC bridge driver available.

Question: is there a way to do this with the best Data Management Platform in the world? Hope you can shine a light on this. Thanks in advance.

Regards from Amsterdam,
Arthur
Fourpoints TimeXtender Partner
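P.S. To rule out the ODBC side, it can be tested independently of TX with a short script; a rough sketch, assuming a hypothetical DSN ("CSI_CLOUD_LAKE") registered by whatever native ODBC driver the data lake might offer:

```python
import pyodbc

# Hypothetical DSN name; whatever ODBC entry the data lake's own
# ODBC driver (if one exists) registers on the machine.
conn = pyodbc.connect("DSN=CSI_CLOUD_LAKE", timeout=30)
cursor = conn.cursor()

# List the tables visible through the ODBC driver.
for table in cursor.tables(tableType="TABLE"):
    print(table.table_name)

conn.close()
```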
Upgrading only one ODX project/environment to new version for testing
At one of our customers we have a separate cloud for testing software/infrastructure changes. We are running 20.10.35 and want to evaluate 20.10.40. I would like to upgrade the ODX Cloud repository for only one of the projects/environments, instead of upgrading both to 20.10.40 and risking production issues. Is this possible, or can it be requested?
Incremental loading with APIs
Dear Support, My customer is using AFAS. From this tool they load data into TimeXtender with an API. They wish to load the data from AFAS incrementally into the ODX.

Is it possible to load incrementally from an API to the ODX server, and what are your suggestions to start with?

Best regards,
Christian Koeken
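P.S. To make the wish concrete, the pattern we have in mind is a client-side watermark; a rough sketch with a hypothetical endpoint and filter parameter (the real AFAS GetConnector URL, authentication, and filter syntax will differ):

```python
import json
import pathlib
import requests

STATE_FILE = pathlib.Path("afas_watermark.json")
# Hypothetical endpoint; the real AFAS GetConnector URL depends
# on the customer's environment.
BASE_URL = "https://example.rest.afas.online/connectors/SomeGetConnector"

def load_watermark() -> str:
    """Return the last-seen modification timestamp, or a floor value."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["last_modified"]
    return "1900-01-01T00:00:00"  # first run = full load

def incremental_fetch(token: str) -> list:
    """Fetch only rows changed since the stored watermark."""
    watermark = load_watermark()
    response = requests.get(
        BASE_URL,
        headers={"Authorization": f"AfasToken {token}"},   # assumption
        params={"modified_since": watermark},              # hypothetical parameter
    )
    response.raise_for_status()
    rows = response.json().get("rows", [])
    if rows:
        newest = max(r["modified"] for r in rows)          # hypothetical field
        STATE_FILE.write_text(json.dumps({"last_modified": newest}))
    return rows
```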
HTTP 404 error on incremental load with update & delete detection
I am using ODX version 6143.1. When loading a table from a source system, incremental load on a datetime column (with an offset of 7 days) works fine. But when I set up the incremental load rule with Detect deletes and Detect updates selected, the first incremental load (which is the initial load) runs successfully, but a second load (which should be the increment) gives me a 404 HTTP error:

Response status code does not indicate success: 404 (The specified path does not exist.).

Edit: the problem only arises when the Detect deletes option is set. When I remove this checkbox, the 404 error disappears (so the Detect primary key updates option does work correctly).
CData Excel Online not showing shared Excel files
Hi, We are using the Excel Online connector, which is authenticated with a service account and uses delegated permissions to access Excel files on SharePoint. The idea is that all relevant files will be shared with this account and then loaded into our DWH. The required files are visible on SharePoint when signing in with the user, but for some reason they are not when using the Excel Online connector (with the option Show shared documents = 'True').

I know that the connector uses the /sharedWithMe (OneDrive) call to fetch shared items, since that is what I inferred from the logging. This call indeed retrieves no results through the Graph Explorer, but the files are visible through another, similar call in the Graph Explorer (the one from 'Insights').

Why are files visible on SharePoint but not on OneDrive? Is there a way to work around this? I have seen use cases where they actually are visible on both SharePoint and OneDrive and the connector works properly.
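To illustrate the difference, the two Graph calls can be compared directly; a minimal sketch, assuming a delegated access token for the service account:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def call(token: str, path: str) -> list:
    """GET a Graph endpoint and return its 'value' collection."""
    response = requests.get(
        f"{GRAPH}{path}",
        headers={"Authorization": f"Bearer {token}"},
    )
    response.raise_for_status()
    return response.json().get("value", [])

token = "..."  # delegated token for the service account

# The call the connector appears to use (per the logging):
shared_with_me = call(token, "/me/drive/sharedWithMe")
# The similar call that does return the files in Graph Explorer:
insights = call(token, "/me/insights/shared")

print(len(shared_with_me), "items via /sharedWithMe")
print(len(insights), "items via /insights/shared")
```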
Automatic process to load CSV files
Hi community, I have to design a solution for the following: every day, a new CSV file arrives in the Azure storage account. The file's name changes each day, for example:

FormaExtract.net_new_commission_2023-03-01.csv
FormaExtract.net_new_commission_2023-03-02.csv
FormaExtract.net_new_commission_2023-03-03.csv

How could I create an automatic process to load each new file into the table in the DSA? Right now I do it manually, adding a new file every day and processing the data from ODX to DSA.

Another requirement is to create a new column, SourceMainTable, in the DSA table containing the file's name.

Regards,
Ignacio
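P.S. To show what I mean, this is roughly how the daily files could be discovered outside TX; a sketch using the azure-storage-blob package with a placeholder connection string and container name:

```python
import re
from azure.storage.blob import ContainerClient

# Connection string and container name are placeholders.
container = ContainerClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="extracts",
)

# Matches e.g. FormaExtract.net_new_commission_2023-03-01.csv
pattern = re.compile(r"FormaExtract\.net_new_commission_\d{4}-\d{2}-\d{2}\.csv")

for blob in container.list_blobs(name_starts_with="FormaExtract.net_new_commission_"):
    if pattern.fullmatch(blob.name):
        print(blob.name)  # blob.name is what would feed a SourceMainTable column
```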
20.10.x Theobald DeltaQ with ODX Server and updated records
We are running DeltaQ in ODX Server (20.10.35) to extract BSEG+BKPF data incrementally, but we are noticing that updates to records are not coming through. This is probably due to PK clashes when transporting to the DSA. What happens is that after a process in SAP, a document number is added to the transaction; this update may happen many days after the initial booking, and we want to avoid having to reload huge chunks of data to the DSA.

It looks like the option Add Serialization Info to Output could help us, but it does not seem to add the expected RequestID, DataPackageID and RowCounter fields to the output. Is this option actually supported?
CDATA Postgres .rsd Schema Files
Hi community, I have a client attempting to load a Postgres database into ODX Server (TX version 20.10.x). The table contains a field with dates that are invalid for SQL Server but valid in Postgres, for example 20011/01/01. Upon transfer, the task completes with errors saying it is unable to parse a datetime value.

All datetime fields have been converted to VARCHAR(MAX) using Override Datatypes, and this is confirmed by looking at the ODX storage in SQL Server Management Studio. Since the error still presents itself, it seems likely that it originates a step earlier, during the CData conversion. Unfortunately, there seems to be no way to generate RSD/schema files from the CData Postgres provider; normally I would change the data type there in these situations, since it seems to be going wrong somewhere in CData.

How can I load this table?

Kind regards,
Andrew - E-mergo
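P.S. To show the mismatch concretely, a minimal sketch (psycopg2, with placeholder connection details and hypothetical table/column names) demonstrating that Postgres happily handles dates outside SQL Server's range, and that a text cast on the Postgres side sidesteps the datetime parse entirely:

```python
import psycopg2

# Connection parameters are placeholders.
conn = psycopg2.connect("host=... dbname=... user=... password=...")
cur = conn.cursor()

# Postgres stores dates far beyond SQL Server's range; casting to text
# in the query avoids any datetime conversion downstream.
cur.execute("SELECT (DATE '20011-01-01')::text")
print(cur.fetchone())  # ('20011-01-01',) -- perfectly valid in Postgres

# The same cast applied to the real column (hypothetical names),
# e.g. as the basis for a query table:
cur.execute("SELECT some_date_column::text FROM some_table LIMIT 5")
print(cur.fetchall())
```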
Upgrading data source provider version
TX: 6143.1. I have a TimeXtender SQL data source, version 126.96.36.199, that I am trying to upgrade to 188.8.131.52. When I open Manage Data Sources in the ODX Server from the TimeXtender application, it tells me there is an update available. When I apply this, it tells me to 'Edit the data source' to update the connection string, but there does not seem to be an obvious way to do this. The User Portal also does not seem to have a way to update it. What is the process I should follow?
datediff function resulted in an overflow
Hi, I am using the ODX server to get data from a SQL Server database. After executing the transfer task, I am getting the error below. Any idea how to solve this?

"System.Data.SqlClient.SqlException (0x80131904): The datediff function resulted in an overflow. The number of dateparts separating two date/time instances is too large. Try to use datediff with a less precise datepart."
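For context on why this happens: DATEDIFF returns a 32-bit integer, so diffing a very old (often sentinel) date against a current date at second-level precision overflows after roughly 68 years' worth of seconds. A quick sketch of the arithmetic (1753-01-01 is just an illustrative SQL Server minimum date):

```python
from datetime import datetime

# DATEDIFF returns a 32-bit int; ~68 years of seconds is the ceiling.
INT_MAX = 2**31 - 1

span = datetime(2023, 1, 1) - datetime(1753, 1, 1)  # e.g. a sentinel/min date
seconds = int(span.total_seconds())
print(seconds, "seconds; overflows INT:", seconds > INT_MAX)   # True
minutes = seconds // 60
print(minutes, "minutes; overflows INT:", minutes > INT_MAX)   # False
```

On SQL Server 2016 and later, DATEDIFF_BIG is an alternative to choosing a coarser datepart.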
ODX mapping issue
Hi Support, We are using an ADO data source connection to load data in the ODX into Azure. From there we map the ODX table to the next layer in TimeXtender. This seemed to work.

But now we have changed the source table, with added and removed columns. We sync and transfer it in the ODX, and when we use preview in the ODX we see the new table as expected. But when we map it in TimeXtender again to the next layer, the changes are not picked up. It instead maps to what it used to be: it maps fields to columns that no longer exist, and it does not map the newly added columns.

We tried clearing the files in Azure to start fresh, but it still maps to the older version that no longer exists. Why would it still map to the old table that no longer exists in the source and no longer exists in Azure? Why would it not map to the new table that is in Azure now? Is there some metadata in TimeXtender that first has to be cleaned before changes can be mapped?

Kind regards,
Tamim
Cannot find files in Excel Online (Sharepoint with Delegated Permission)
We use SharePoint for multiple data sources. Connecting the SharePoint Lists works fine, and I have OAuth for Excel Online running as well, but my data source throws an error: Cannot find file. I need to handle multiple files according to this documentation: https://support.timextender.com/data%2Dsources%2D112/connect%2Dto%2Dexcel%2Dfiles%2Dwith%2Dexcel%2Donline%2D628

What's the reason for this? My app permissions are set accordingly, and my setup is shown in the attached screenshots.
DB2 connector querying the wrong schema when selecting tables
We have set up a DB2 connector in an installation with the new version of TimeXtender. The connection is OK, but when we search to select tables, nothing turns up, with both the schema and table filters set to <All>.

It seems that TimeXtender queries the schema [sys_tables], while the server uses the schema [systables], without the underscore. The schema [sys_tables] exists on the server but contains no data. I created a Query Table (Select * from [sys_tables]); it executed successfully in a transfer task but only returned empty columns.

Is it possible to change the script so that the query uses the schema systables? I couldn't find anything about it in CData's online manual. The TimeXtender version is 6143.1, and the CData DB2 version is 22.0.8389.9.
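As a sanity check, the catalog can also be queried directly over ODBC, mirroring the Query Table above but against the schema the server actually populates; a sketch with placeholder connection details:

```python
import pyodbc

# Placeholder DSN/credentials for the customer's DB2 ODBC setup.
conn = pyodbc.connect("DSN=MY_DB2;UID=user;PWD=secret")
cur = conn.cursor()

# Mirror the Query Table from the post, but without the underscore:
cur.execute("SELECT * FROM systables")
for row in cur.fetchmany(10):
    print(row)

# For reference, DB2 LUW exposes its catalog as SYSCAT.TABLES and
# DB2 for z/OS as SYSIBM.SYSTABLES; which applies depends on the platform.
```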
Extract CSV from SFTP with public key authentication
We have an SFTP source on which CSV files are stored and which can be authenticated to with a public key. When using the CData SFTP connector in ODX Server, we are able to connect; the most important setting is the SSH Auth Mode, where we specify PublicKey authentication. Since I am not familiar with this type of connector, I would not know how to retrieve the data from the CSV files (the only table we get from the source now is the 'Root' table, which contains the file names found in the folder).

I would rather use the CData CSV connector, but that leaves me with a problem when testing the connection: I cannot specify that I want to use the PublicKey authentication mode. That is, it is specified under 'SSH Auth Mode' but cannot be selected under 'Auth Scheme'. As a result, we keep getting the same error when testing the connection (see attached file). I have tried setting the Auth Scheme to 'None', 'Auto', 'SFTP' or 'B
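For comparison, retrieving a CSV over SFTP with public-key authentication is straightforward outside the connectors; a sketch using paramiko, with placeholder host, username, key path and file path:

```python
import csv
import io
import paramiko

# Placeholder host, username, and key path.
key = paramiko.RSAKey.from_private_key_file("/path/to/private_key")
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("sftp.example.com", username="user", pkey=key)

# Read one of the CSV files from the remote folder.
sftp = client.open_sftp()
with sftp.open("/outbound/data.csv") as remote_file:
    text = remote_file.read().decode("utf-8")
sftp.close()
client.close()

for row in csv.reader(io.StringIO(text)):
    print(row)
```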
CData IBM DB2 - extracting results in The column [xxxx] does not exist
TX: 20.10.39
ODX: 20.10.37
ODX Storage: SQL Server DB on-prem
DSA/MDW: SQL Server DB on-prem

We are able to connect to a DB2 system and synchronize, and we can also successfully use the Data Source Explorer to query tables. However, every table we try to extract fails with the following error (schema/table/column names obfuscated):

Executing table [schema].[table]: failed with error: edo220W.gm: The column [column] does not exist
at edo220W.ai.GetOrdinal(String )
at System.Data.CData.DB2.DB2DataReader.GetOrdinal(String name)
at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerCommon(Int32 columnCount)
at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader)
at DataStorageEngine.SQL.SQLStorageEngine.TransferWithBulkCopy(SqlCommand destinationCommand, String destinationTableName, IEnumerable`1 columnModels, IDbCommand transferCommand, DataTable transfe
CSV source loads all fields as bigint
I have created a CSV source in TimeXtender NG ODX. The data explorer shows the data types correctly for my table. However, when I drag the table to the DSA, all fields appear as type bigint. (By the way, I have set Quote Character to " and Row Scan Depth to 0.) Any help on how to resolve this issue would be appreciated.
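As a sanity check outside TX, type inference over a shallow scan versus the whole file can be compared on a local copy of the data; a small sketch using pandas (the file name is a placeholder):

```python
import pandas as pd

# If early rows aren't representative, a shallow scan can misjudge types;
# compare inference over the first rows vs. the whole file.
shallow = pd.read_csv("table.csv", quotechar='"', nrows=100)
full = pd.read_csv("table.csv", quotechar='"')

print(shallow.dtypes)
print(full.dtypes)
```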
PK table in ODX database is growing rapidly in terms of disk space
Hi community, I have a question about the 'PK' table that is created in the ODX when you enable 'Handle primary key updates' and 'Handle primary key deletes'.

I've noticed the ODX database (SQL database) is growing rapidly in disk space after enabling an incremental load schedule that reloads data every 5 minutes. When I used the default SQL 'Disk usage by top tables' report, I noticed the PK tables are the biggest in terms of disk space. The PK table of the GL Entry table contains 4.5 billion rows and is 95 GB, while the DATA table contains only 118 million records.

When I query the PK table and filter on a single primary key value, that value is saved 75 times, once for every odx_batch_number (0 through 74). Is this normal? It seems strange that the PK tables are the biggest tables in the ODX in terms of disk space. Even when I run the storage management task, it doesn't clean the PK tables; they always contain the primary key for every ODX batch load. The cust
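P.S. To quantify the duplication, the PK table can be inspected directly; a sketch with a placeholder connection string and a hypothetical PK table name:

```python
import pyodbc

# Placeholder connection string; the table name is a guess at the PK
# table that backs the GL Entry load.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=odx;Trusted_Connection=yes"
)
cur = conn.cursor()

# Row count per batch: if each batch re-stores every key, the counts
# will all be near the full key count of the table.
cur.execute("""
    SELECT odx_batch_number, COUNT(*) AS rows_in_batch
    FROM dbo.[GL Entry_PK]   -- hypothetical PK table name
    GROUP BY odx_batch_number
    ORDER BY odx_batch_number
""")
for batch, rows in cur.fetchall():
    print(batch, rows)
```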
ODX API version error
Hi! We have had our installation for a while now, and as far as I'm aware the application has the latest updates. As more people in the project are working at the same time, I have installed the client on both a new server and on my own laptop. The problem is that when I try to connect to the ODX service running on one of our "old" servers, I keep getting the error shown in the attached pictures. Why is this? As you can see, the application has the same version on both the "old" and the "new" server. The correct ports are opened between the servers, at least, so that should not be the issue, but I get the same error when trying to connect from my laptop. Does anyone have input on this?