This article describes how to add a CSV endpoint to a Deliver instance. If you want to use data from the Deliver instance in an endpoint that is not natively supported by TimeXtender Data Integration, you can use the CSV endpoint to create files that can be read by almost everything under the sun. When you execute the endpoint, a file is created for each table in the model. Only regular fields are supported, not measures or custom fields.

In the TimeXtender Portal, navigate to Data Estate > Instances, then click Add Instance > Add Deliver instance. Select CSV Endpoint from the Add endpoints list, then click Add an endpoint. Fill out the following fields in the form:

- In the Directory box, enter the path to the folder where the files should be placed.
- In the File extension box, enter the file extension you want to use (usually "txt" or "csv").
- In the Encoding list, click the encoding you want to use.
- Under Field names in first row, click No if you do not want the field names to be output as the first row of each file.
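If you need to verify what the endpoint wrote, the output is plain delimited text, so it is easy to inspect programmatically. Below is a minimal Python sketch; the directory, extension, encoding, and delimiter are assumptions that should be replaced with whatever you entered in the form.

```python
# Read the files produced by the CSV endpoint (one file per table).
# EXPORT_DIR, EXTENSION, and ENCODING are hypothetical values; use the
# Directory, File extension, and Encoding you set on the endpoint.
import csv
from pathlib import Path

EXPORT_DIR = Path(r"C:\Deliver\csv")
EXTENSION = "csv"
ENCODING = "utf-8"

for path in EXPORT_DIR.glob(f"*.{EXTENSION}"):
    with path.open(newline="", encoding=ENCODING) as f:
        reader = csv.reader(f)
        # Only present if "Field names in first row" was set to Yes:
        header = next(reader)
        rows = sum(1 for _ in reader)
        print(f"{path.name}: {len(header)} fields, {rows} rows")
```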
As the title suggests, I wonder if anyone in this community has experience with ingesting data into TX from the Adnamics data source and their API. Thanks in advance!
Hey, I use the REST connector to consume a CSV file. Among my query parameters, I have one for an auth key. When I try to load the file, I get an error, and the reason is that the parameter gets encoded (% → %25). How can I avoid encoding in the query? /Bjørn A.
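For anyone hitting the same thing: the % → %25 symptom means the value is being percent-encoded a second time. Here is a standalone Python sketch of the mechanism (the URL and parameter name are made up, and this illustrates the general behavior, not the connector's internals):

```python
# If the auth key is already percent-encoded, encoding it again turns
# every "%" into "%25" and the API no longer recognizes the key.
from urllib.parse import urlencode

auth_key = "abc%2Fdef"  # hypothetical key that is already encoded

# What happens when a client encodes the parameter for you:
print(urlencode({"authkey": auth_key}))  # authkey=abc%252Fdef  <- broken

# Embedding the raw value in the URL avoids the second encoding pass:
url = f"https://api.example.com/data.csv?authkey={auth_key}"
print(url)  # ...?authkey=abc%2Fdef  <- intact
```

So the usual workarounds are to store the key in its un-encoded form and let the client encode it once, or to put the already-encoded value directly into the URL rather than into a parameter that gets encoded.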
This is a simple deploy, and it's been running for almost 20 minutes. What's wrong here?
It is possible to use OAuth 2.0 as the authentication method for the TimeXtender REST data source. One API that uses this is the Graph API.

Contents: Prerequisites · Application setup · Access Token URL · Scope · Initial setup · Set up OAuth Authentication · Main Endpoints (Users, Groups, Teams) · Set up pagination · Dynamic endpoints (Users messages, Team members)

Prerequisites
The Postman collection explained in the guide Use Postman is a good starting point, as the method is largely similar. What we will do is the application method, also known as client authentication.

Application setup
As mentioned above, you need to use application rights for client authentication, so the app you want to use must have the correct rights. Delegated rights are easier to set up, as they mostly do not require admin consent; that is not the case for most application rights, so get these rights consented before starting. I have one app where all the application rights are added. If you want access to groups and users
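For orientation, here is a minimal Python sketch of the client-credentials request that the application method boils down to, using the standard Microsoft identity platform token endpoint. The tenant ID, client ID, and secret are placeholders for your own app registration.

```python
# Client-credentials ("application") flow against Microsoft Entra ID,
# then a sample Graph call with the resulting token.
import requests

TENANT_ID = "<your-tenant-id>"        # placeholder
CLIENT_ID = "<your-app-client-id>"    # placeholder
CLIENT_SECRET = "<your-app-secret>"   # placeholder

token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
resp = requests.post(token_url, data={
    "grant_type": "client_credentials",
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET,
    # ".default" requests the application permissions granted to the app
    "scope": "https://graph.microsoft.com/.default",
})
resp.raise_for_status()
access_token = resp.json()["access_token"]

# Any Graph request then carries the token as a bearer header:
users = requests.get(
    "https://graph.microsoft.com/v1.0/users",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(users.status_code, len(users.json().get("value", [])))
```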
Hi, I am facing an issue with duplicate records when using incremental load on a table that receives data via a table insert from another table in the same data area. The incremental load selection rule is based on DW_TimeStamp, and there are three primary key fields in the target table. The issue is that the primary key does not seem to be enforced in the target table, as I am getting duplicate rows. I tested this with another table where data is inserted through a table insert (but without incremental load), and I still see duplicate values for the PK fields, and despite setting the table option to "Use instance setting (error)", no error is thrown when duplicates occur. I have tested this behavior in TDI 6935.1 and 6926.1, and the issue persists. My questions are: How do primary keys work with table inserts? Should they prevent duplicates, or do they not function as expected when using table inserts? Can incremental load be achieved using table inserts? If so, what are the necessary configurations?
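While this is being clarified, one way to quantify the duplicates is to group by the key fields directly on the SQL storage. A sketch with hypothetical connection string, schema, table, and column names:

```python
# Count duplicate combinations of the three primary key fields.
# The connection string and all identifiers below are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=DWH;Trusted_Connection=yes;"
)
sql = """
SELECT PK1, PK2, PK3, COUNT(*) AS row_count
FROM dbo.TargetTable
GROUP BY PK1, PK2, PK3
HAVING COUNT(*) > 1;
"""
for row in conn.execute(sql):
    print(row.PK1, row.PK2, row.PK3, row.row_count)
```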
Prerequisites
Create an Azure App Registration. The App Registration should use the Microsoft Graph API with the following permissions:

Finding the List ID
Use Postman to find the Site ID, then use the Site ID to find the List ID. You can find both by using Postman and looking at the Graph API collection Graph Fork. You can use the app to connect to this and locate the SharePoint folder under Application. In there you have two requests: https://graph.microsoft.com/v1.0/sites and https://graph.microsoft.com/v1.0/sites/{{SiteID}}/lists. Use sites to find the Site ID, and use the Site ID to find the List ID. You can also find the List ID by navigating to the SharePoint list, clicking List settings, and copying the List ID from the URL.

Use the TimeXtender REST data source connection
Connection Settings
Use the following Base URL (where {{SiteID}} is the Site ID found in Postman): https://graph.microsoft.com/v1.0/sites/{{SiteID}}/lists

Endpoints
Create the endpoint using the following
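The two Postman requests translate directly to plain HTTP calls, which can help when verifying the IDs. Here is a Python sketch, assuming you already have an application access token for Graph; everything else uses the URLs quoted above.

```python
# Find the Site ID, then use it to find the List ID, mirroring the two
# requests described above. access_token is a placeholder.
import requests

access_token = "<application-token-for-graph>"  # placeholder
headers = {"Authorization": f"Bearer {access_token}"}

# 1. Use sites to find the Site ID:
sites = requests.get("https://graph.microsoft.com/v1.0/sites",
                     headers=headers).json()
site_id = sites["value"][0]["id"]  # pick the site you need

# 2. Use the Site ID to find the List ID:
lists = requests.get(
    f"https://graph.microsoft.com/v1.0/sites/{site_id}/lists",
    headers=headers,
).json()
for lst in lists["value"]:
    print(lst["id"], lst.get("displayName"))
```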
Is it possible, as the title says, to copy an entire semantic layer to a new semantic layer, or to copy the existing tables with relations from one semantic layer to a new one?
Hello, I am trying to set up an endpoint to Power BI Premium in version 20 (20.10.58). The connection is set up as below, and it tests OK (see pic). When I deploy, I get the error: Failed to save modifications to the server. Error returned: 'Either the database with the ID of 'TXFinance1' does not exist in the server with the ID of 'host002_datasets-231', or the user does not have permissions to access the object.' The semantic model is created fine in Power BI, but there is nothing inside of it. Is there a setting I am missing? The strange thing is that it appears to work for a colleague in the same workspace using the same app credentials, but not on my machine. Many thanks, Tim
I'm trying to connect to MongoDB and load data into TimeXtender. I have an ODBC driver set up, and the test connection works. In TimeXtender I have an ODBC Generic Data source set up; its test connection works, and Synchronize Data Source works as well. But when I try to execute the data transfer from MongoDB into TimeXtender, I get an error.
Suddenly all the Excel sheets are failing to load, even though some of them haven't been touched in 6 months. The error is:

Executing table excel_sheet: failed with error:
Exception Type: System.Exception
Message: Invalid column count
Stack Trace:
at DataStorageEngine.DataLakeGen2.DataLakeGen2Transfer.<>c__DisplayClass41_5.<UploadData>b__14(DataTable dataTable, Partition partition)
at DataStorageEngine.DataLakeGen2.DataLakeGen2Transfer.UploadData(ADLSGen2APIManager api, DataSourceModel dataSourceModel, DataLakeGen2ExecutionMethod executionMethod, Action`2 writeExecutionLog)
at DataStorageEngine.DataLakeGen2.DataLakeGen2StorageEngine.<>c__DisplayClass35_2.<TransferDataAsync>b__7(DataLakeGen2ExecutionMethod executionMethod)
at DataSourceEngine.Custom.CustomSourceEngine.InvokeDataSourceExecution(Action`1 action)
at DataSourceEngine.Custom.CustomSourceEngine.RunExecution[M](DataExecutionDestination dataExecutionMethod, Action`1 action)
at DataStorageEngine.DataLakeGen2.DataLakeGen2StorageEngine.<>c
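"Invalid column count" typically points to a sheet whose width no longer matches what the data source was synchronized with. A quick, hedged way to check which sheet drifted, with a hypothetical file name and expected counts:

```python
# Compare each sheet's current column count against the count from the
# last good synchronization. File name and expected values are made up.
import pandas as pd

expected = {"Sheet1": 12, "Budget": 8}  # counts from the last good sync

sheets = pd.read_excel("finance.xlsx", sheet_name=None)
for name, df in sheets.items():
    want = expected.get(name)
    if want is not None and len(df.columns) != want:
        print(f"{name}: expected {want} columns, found {len(df.columns)}")
```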
Hi, we've updated our Data Integration version from 6766.1 to 6963.1, which means we had to update our outdated providers. So we followed the steps mentioned in this post: Change data source provider | Community (timextender.com). This results in an error when we try to full load the changed data source. It keeps giving us the error: "The schema …. already exists on the SQL storage and cannot be used". Which effectively means we can't update our providers/data sources; we would have to create new ones and map all our tables in the ODX again. Is anyone else facing this problem, and do you have a solution for it?
I'm using the dynamic values function in the TX REST connector 9.1.0.0. I use IDs from one endpoint to loop through in my second endpoint's path. This works well when I use "From Endpoint Table", but now I want to add a filter to only get the IDs with the flag "hasresponse=true". I've read the page, but I still get an error with my endpoint query: "No such table", the error message says. I've tried several things, like adding a schema, but all with the same response. Is there something wrong with my syntax? Error:
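For reference, "no such table" generally means the table name in the query doesn't match the name the query engine knows the cached endpoint data by, rather than the filter itself being wrong. The shape of the filter can be tried in isolation; below is a standalone SQLite sketch where the table and column names are assumptions, not the connector's actual cache names.

```python
# Standalone illustration of the kind of filter query the post wants:
# select only the ids where hasresponse is true. Names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE surveys (id INTEGER, hasresponse TEXT)")
conn.executemany("INSERT INTO surveys VALUES (?, ?)",
                 [(1, "true"), (2, "false"), (3, "true")])

ids = [r[0] for r in conn.execute(
    "SELECT id FROM surveys WHERE hasresponse = 'true'")]
print(ids)  # [1, 3]
```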
Hi, previously we used the SAP HANA (CDATA) connector to connect SAP HANA with TimeXtender. I understand from the documentation that we can use the Azure Data Factory - SAP Table and TimeXtender SAP Table data sources for connecting to SAP HANA. My questions: With ADF - SAP Table, does the customer need an ADF subscription/license? And if they want the deployment target to be SQL Server on-premises, how can we do that? For the TimeXtender SAP Table data source: can we use this without the Theobald connector, and if yes, how do we use it with SQL Server on-premises? Please share the steps and an example at the earliest.
Hi, in the past I was able to connect to SAP Public and Private Cloud, SAP C4C (SAP CRM), using the SAP Cloud for Customer (CDATA) connector, which is no longer supported by TX. Please let me know, as a priority, how we can connect to SAP Public and Private Cloud, C4C, and other SAP cloud products using TDI. With the new options available, which connector do I need to use, and how can I connect the above-mentioned data sources with TDI? I have a demo with a prospect, so I need help urgently. Please also share the steps and an example, if possible.
Hi! We are currently setting up TimeXtender with Snowflake and have a bit of an issue when it comes to TimeXtender not having the capability of managing the ODX in Snowflake (internal stage). As you know, an external stage is needed via Azure Data Lake Gen2. The issue for us is that we would like to use a private endpoint when creating this data lake in Azure, but that requires a Business Critical subscription to Snowflake, which is 30 percent more expensive. Are there any plans/timeline for when TimeXtender will be capable of handling the Ingest instance in Snowflake, and if so, could you please give a rough estimate of when that might be available? We need to make some decisions on our end on how to proceed and would like to know the answer before we move forward.