Recently active
Hi, I have a customer for whom, as part of a PoC, we need to connect an Oracle data source using the API. I need help with either documentation or guidance on how to set up the connection. If anyone has experience connecting to Oracle this way, please share.
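For general reference (not TimeXtender-specific), a minimal sketch of verifying connectivity to an Oracle source from Python with the python-oracledb driver; the host, service name, and credentials below are placeholders, not real values:

```python
# Minimal connectivity check against an Oracle data source using the
# python-oracledb thin driver. All connection details are placeholders.
import oracledb

connection = oracledb.connect(
    user="example_user",
    password="example_password",
    dsn="dbhost.example.com:1521/orclpdb1",  # host:port/service_name
)

with connection.cursor() as cursor:
    # List the tables visible to this user as a quick sanity check.
    cursor.execute("SELECT table_name FROM user_tables")
    for (table_name,) in cursor:
        print(table_name)

connection.close()
```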
Hello, when you have multiple endpoints configured in the portal and endpoint B uses dynamic values from endpoint A, the data on demand functionality doesn't work for tables that need endpoint B. Tables derived from A get data on demand just fine. The exception message says: "OnDemand execution failed for data source '<datasource>' table '<table>' was not executed". I encountered this in a customer environment and then tried to replicate it locally in my sandbox setup. To my surprise, data on demand worked in my sandbox. However, I had an older version of TX installed there. After I updated to the latest version on my laptop, I encountered the exact same problem, which suggests that it's a bug in the new version. My TX version number is 6848.1.
Question: I have a project that contains the CData Google Ads connector. I need the ShoppingProducts view that was recently added to the connector. Unfortunately, I can't find it in Data Selection. It seems to have been added in version/build 9053 of the connector, but TX shows me that version 9036 is the latest. I have resynchronized the source, but as you can see below, it's not in the list. Is the latest CData connector available in some way? BR\Dirk
Hello, I want to get metadata from my Azure data lake using the Blob API. I wasn't seeing any data in the Ingest storage, so I turned on caching to file to try to see what's happening. There are three files in my caching folder:

Data_.raw: the return of the call, i.e. my actual data. This looks excellent, except that it's a .raw file. Contents:
<?xml version="1.0" encoding="utf-8"?><EnumerationResults ServiceEndpoint="https://xxxx.blob.core.windows.net/" ContainerName="datalake"> <Prefix>my_prefix</Prefix> <Blobs> <Blob> .... </Blob> </Blobs> <NextMarker/></EnumerationResults>

Data_.xml: basically the same as Data_.raw, but with the content of Data_.raw as the data of a value element. The data also contains the XML header (so now the document has two headers), and the angle brackets have been entity-encoded (i.e. all the `<` are now `&lt;`). <?xml version="1.0" encoding="utf-8"?><Table_flattening_name
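For context, the Data_.raw content matches the response shape of the Azure Blob "List Blobs" REST operation. A minimal sketch of unwrapping the doubly wrapped payload described above (entity-encoded XML nested inside a value element); the wrapper element names here are illustrative, not the actual cache file schema:

```python
# Sketch: unwrap an entity-encoded XML payload nested inside another
# XML document. Element names are illustrative placeholders.
import xml.etree.ElementTree as ET

wrapped = (
    '<?xml version="1.0" encoding="utf-8"?>'
    '<Value>&lt;?xml version="1.0"?&gt;&lt;EnumerationResults'
    ' ContainerName="datalake"&gt;&lt;Blobs&gt;&lt;Blob&gt;'
    '&lt;Name&gt;my_prefix/file1.parquet&lt;/Name&gt;&lt;/Blob&gt;'
    '&lt;/Blobs&gt;&lt;/EnumerationResults&gt;</Value>'
)

# Parsing the outer document decodes the &lt;/&gt; entities back to < and >.
inner_text = ET.fromstring(wrapped).text

# Strip the duplicated XML declaration before parsing the inner document.
if inner_text.startswith("<?xml"):
    inner_text = inner_text.split("?>", 1)[1]

inner = ET.fromstring(inner_text)
for name in inner.iter("Name"):
    print(name.text)  # -> my_prefix/file1.parquet
```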
Dear all, when creating a new data source connection, my colleague cloned it from an existing instance and then adjusted the ingest instance mapping. By mistake, there is an unwanted character in the short name, as shown below. After hitting the clone button, it appears to no longer be possible to change the short name, so the name of the storage account container is also incorrect. Could you advise how we can change this short name? The setup of the data source (such as PK and incremental settings) is complete, and DW_SourceCode is also used as a calculated column and loaded into the following layers, so while it is not impossible to recreate the data source from scratch, it would cost a lot of time. That is why we want to know whether the short name can still be edited. Thank you in advance.
Hi all, I have an API that uses nested XML to deliver the data. I have used a relational model to be able to retrieve this data in separate tables. Based on the key fields that are created, I am able to join the tables back together. All works fine; however, I want to reference a (different) field from my parent table in a nested table. Currently, the output adds a field _id to my parent table PerformanceInfoRow to be able to reference the underlying data back to the specific period. However, this _id contains an integer: each day I get three rows of data, so each day I get back the _ids 1, 2, and 3. In the end, this will not be a unique key when retrieving more days of data. Therefore, I want to use the Period field that is also available in the parent table; it is unique because it shows not only the date but also whether it is morning, afternoon, or evening. This would be unique, but unfortunately I am not able to add this field to the underlying tables. I have tried to adjust my
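A minimal sketch of the pattern being asked for: flattening nested XML into parent and child row sets while propagating a natural key (here a hypothetical Period attribute) into the child rows instead of relying on a per-day sequential _id. The element and field names are invented for illustration, not the poster's actual schema:

```python
# Sketch: flatten nested XML into parent/child tables, carrying the
# parent's natural key (Period) down to the child rows. Element and
# field names are hypothetical.
import xml.etree.ElementTree as ET

doc = """
<PerformanceInfo>
  <PerformanceInfoRow Period="2024-11-28-morning">
    <Detail Value="10"/>
    <Detail Value="20"/>
  </PerformanceInfoRow>
  <PerformanceInfoRow Period="2024-11-28-afternoon">
    <Detail Value="30"/>
  </PerformanceInfoRow>
</PerformanceInfo>
"""

parents, children = [], []
for row in ET.fromstring(doc).iter("PerformanceInfoRow"):
    period = row.get("Period")  # natural key, unique across days
    parents.append({"Period": period})
    for detail in row.iter("Detail"):
        # Child rows reference the parent's Period, not a sequential _id.
        children.append({"Period": period, "Value": detail.get("Value")})

print(parents)
print(children)
```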
Does anyone have a suggestion about how to use this XML to ingest data into ODX?
<?xml version="1.0" encoding="utf-8"?><soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"> <soap:Body> <ExportDocsProjectWithTendersResponse xmlns="http://publicprocurement.com"> <ExportDocsProjectWithTendersResult>{ "ProjectBodies": [ { "ExternalId": "666666", "DefaultEvaluationModel": "LowestPrice", "DocumentNumberingSchema": "Numeric", "SectionNumberingSchema": 0, "RequirementNumberingSchema": "Alphabetic", "Metadata": [ { "Name": "ProjectName", "Value": "Ingesting Data into TX" }, { "Name": "Reference", "Value": "123456" }, { "Name": "ContractType", "Value": "XML" } ], "DataFields": [ { "ExternalId": "1",
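The response embeds a JSON document inside the SOAP ExportDocsProjectWithTendersResult element, so one common approach is to extract that element's text and parse it as JSON before ingestion. A minimal sketch of that extraction step, using a shortened stand-in for the real payload:

```python
# Sketch: pull the JSON body out of the SOAP result element shown above.
# The payload here is a shortened stand-in for the real response.
import json
import xml.etree.ElementTree as ET

soap = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
  <soap:Body>
    <ExportDocsProjectWithTendersResponse xmlns="http://publicprocurement.com">
      <ExportDocsProjectWithTendersResult>{"ProjectBodies": [{"ExternalId": "666666"}]}</ExportDocsProjectWithTendersResult>
    </ExportDocsProjectWithTendersResponse>
  </soap:Body>
</soap:Envelope>"""

ns = {
    "soap": "http://www.w3.org/2003/05/soap-envelope",
    "pp": "http://publicprocurement.com",
}
result = ET.fromstring(soap).find(".//pp:ExportDocsProjectWithTendersResult", ns)
payload = json.loads(result.text)
print(payload["ProjectBodies"][0]["ExternalId"])  # -> 666666
```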
Hi, we are trying to connect to several CSV files stored in a local folder. While we can successfully synchronize the data source and perform a full load in the ODX, we encounter an error when attempting to add the table to our data area (DSA). The issue lies in the path to the Parquet file stored in Azure. The correct path should be:
CSV_DNB/csv_*/DATA_2024_11_28__11_09_50_2219585/DATA/DATA_0000.parquet
However, the path TimeXtender is looking for is:
CSV_DNB/csv_^*/DATA_2024_11_28__11_09_50_2219585/DATA/DATA_0000.parquet
It seems that TimeXtender is misinterpreting the automatically generated name and adds a ^ character. I also attempted to use a specific file aggregation pattern, such as H100.*.csv (all files in the folder have the prefix H100 followed by a random number), but I encountered the same error. Is there a way to specify the name of the table generated in the ODX? It seems like the "File aggregation pattern" is the issue. Do you have any idea how to fix this? -Execute Executi
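One thing worth checking, sketched below: a file aggregation pattern like H100.*.csv reads as a regular expression if the matcher treats it as regex (where . matches any character), whereas a shell-style wildcard for the same files would be H100*.csv. A small Python illustration contrasting the two interpretations, assuming nothing about TimeXtender's actual matcher:

```python
# Sketch: the same file list matched as a regex pattern versus a
# shell-style wildcard. Assumes nothing about TimeXtender's matcher.
import fnmatch
import re

files = ["H1001234.csv", "H100_56.csv", "other.csv"]

regex = re.compile(r"H100.*\.csv")  # regex: ".*" means "anything"
wildcard = "H100*.csv"              # glob:  "*"  means "anything"

print([f for f in files if regex.fullmatch(f)])  # regex interpretation
print(fnmatch.filter(files, wildcard))           # wildcard interpretation
```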