Recently active
Dear Sir,

We connect to our Excel files through a SharePoint site with some document libraries. This works fine, but the performance is bad 😑:
- 5 minutes wait to Select Tables
- 5 minutes wait to execute Synchronize
- 5 minutes wait to execute Full Load

Is there a way to squeeze some performance out of this? Thanks in advance for looking into this performance issue.

Regards,
Arthur
I have two particular use cases where I would like the TX REST connector to be able to handle nothing being returned. In one case I've configured the portal to do a PUT request. PUTs generally don't return any body. In a second case, the API is just configured to do this sometimes. We get IDs from another endpoint that we are then supposed to use dynamically, like: endpoint/{ID}. The problem is that the source system can have phantom IDs, meaning that ID=13 doesn't really point anywhere. endpoint/13 then returns a completely empty body. In both cases the problem seems to be that the connector can't handle this. We get back:

Failed to execute endpoint 'xxx': Unexpected character encountered while parsing value: }. Path 'TX_Autogenerated_Element', line 1, position 28.

Going off the error message, it would seem that the connector assumes JSON was returned and automatically starts trying to convert it somehow. I need to be able to handle these situations somehow, since I have no
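For illustration, this is the kind of empty-body guard the connector would need: check whether anything came back before handing the response to a JSON parser. A minimal sketch using Python's requests library (the endpoint URL and ID are hypothetical, and this is not the TX connector's internal code):

```python
import requests

def fetch_json(url: str):
    """Return parsed JSON, or None when the endpoint sends an empty body."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()

    # PUTs and "phantom ID" lookups can legitimately return nothing;
    # bail out before the JSON parser ever sees an empty string.
    if not response.text.strip():
        return None

    return response.json()

# Hypothetical phantom ID: returns None instead of raising a parse error.
payload = fetch_json("https://api.example.com/endpoint/13")
if payload is None:
    print("Empty body returned; skipping this ID.")
```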
Hi,

I updated our TX version to the newest 6814.1. I have added many tables in TimeXtender Ingest Server with incremental load. Why do I have to change all incremental tables in the desktop app to explicit incremental load; can't you use "Automatic" anymore? When executing with Automatic on a table with an incremental setup from the data source, I get this type of error: "Invalid object name 'shema.table_I'." Something I haven't done right...

Regards,
Bjørn A.
Today, we've published a hotfix release of TimeXtender Desktop (v. 6822.1) that contains the changes listed below.

New
- Support for Fabric SQL Database as Prepare storage.

Fixed
- Fixed issue with Ingest incrementally loaded tables that were added to a Prepare instance, which caused errors during deployment and execution.
- Fixed issue with the Database Clean up tool wanting to delete _I tables because of the error above.
- Fixed issue with wrong syntax when deploying objects in Snowflake.
- Fixed issue with wrong syntax when deploying objects in SQL Synapse.
Dear Sir,

I want to connect to a PostgreSQL database via ODBC. I found these posts on how to fix this, but I am using TX version 6766.1. Can someone please point me to the way to fix this with the latest and greatest version of TimeXtender? See the Word document in the attachment with my setup.

https://legacysupport.timextender.com/hc/en-us/articles/4410466782621-Create-a-ODBC-data-source-in-a-ODX-or-with-a-Any-Source-ADO-Net-provider-in-a-Business-Unit

Thanks in advance.

Regards,
Arthur
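For context, an ODBC connection to PostgreSQL generally comes down to a DSN or a driver-based connection string that the machine's ODBC driver manager can resolve. A minimal sketch using pyodbc and the psqlODBC driver; the driver name, host, database, and credentials are placeholders and depend on what is installed locally (this is not TimeXtender-specific setup):

```python
import pyodbc

# The driver name must match what is registered in the ODBC Data Source Administrator;
# "PostgreSQL Unicode(x64)" is typical for the official psqlODBC driver on Windows.
conn_str = (
    "Driver={PostgreSQL Unicode(x64)};"
    "Server=myhost.example.com;"
    "Port=5432;"
    "Database=mydb;"
    "Uid=myuser;"
    "Pwd=mypassword;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    cursor.execute("SELECT version();")
    print(cursor.fetchone()[0])
```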
Released in TimeXtender Data Integration 6024.1

This article describes how to deploy and execute the Deliver instance to the Power BI Premium XMLA Read/Write Endpoint.

Contents:
- Configure Power BI
- Add Power BI Premium Deliver Instance Endpoint
- Deploying the endpoint in TimeXtender Data Integration
- Configure the Power BI Data Source Credential
- (Optional) Authenticate using a Service Principal
- Create an App Registration in Azure
- Allow service principals to use Power BI APIs
- Give your App Registration Admin Access
- Update Credentials in Deliver instance

Configure Power BI
- Log in to app.powerbi.com.
- Select the workspace where the Deliver instance will be deployed to:
- Go to Workspace Settings.
- Click the Premium tab.
- Select the Premium per user or Premium per capacity license mode (this requires a Premium license).
- Copy the workspace connection.
- Go to the admin portal.
- Under Premium Per User / Dataset workload settings, set XMLA to "Read Write".

Add Power BI Premium Deliver Instance Endpoint
In th
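For reference, once XMLA is set to Read Write, client tools reach the workspace through the copied workspace connection; when a service principal is used, the credentials follow the app:{clientId}@{tenantId} convention. A minimal sketch of how such a connection string is typically composed (all IDs and names below are placeholders, and this is illustrative rather than TimeXtender's own configuration dialog):

```python
# Placeholders: workspace connection copied from Power BI, plus the app registration details.
workspace = "powerbi://api.powerbi.com/v1.0/myorg/My Deliver Workspace"
tenant_id = "00000000-0000-0000-0000-000000000000"
client_id = "11111111-1111-1111-1111-111111111111"
client_secret = "app-registration-secret"
dataset = "My Semantic Model"

# Service principals authenticate against the XMLA endpoint with the app:<clientId>@<tenantId> user format.
connection_string = (
    f"Data Source={workspace};"
    f"Initial Catalog={dataset};"
    f"User ID=app:{client_id}@{tenant_id};"
    f"Password={client_secret};"
)
print(connection_string)
```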
This article describes how to set up Prepare instances with Fabric storage.

Fabric Prepare instance storage is available as part of the Standard, Premium, or Enterprise package.

The following functionality is currently supported when using Fabric Prepare instance storage with the TimeXtender Data Integration 6814.1 release or later:
- Data extraction from Ingest instances using Fabric storage
- Simple Mode tables

Prerequisites
- Your Ingest instance must also use Fabric Lakehouse storage. Currently, Fabric Prepare instances can only use data from Ingest instances with Fabric storage. Using Fabric Prepare instances in combination with non-Fabric Ingest instances is currently not supported.
- You must set up an Azure App Registration as described here.
- In the Fabric/Power BI Admin Portal, enable "allow service principals to use Power BI APIs" as described here, in order to grant the app registration access to the Fabric workspace (see the sketch after this list).
- Create a workspace, or navigate to an existing workspace, in the Fabric p
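For illustration, granting the app registration access to the workspace can also be scripted; a hedged sketch using the Power BI REST API's add-group-user call. The workspace ID, the service principal's object ID, the access token, and the chosen access right are all placeholders and assumptions, not steps taken from this article:

```python
import requests

# Placeholders: obtain a real token via MSAL for the Power BI REST API scope.
access_token = "<bearer token for the Power BI REST API>"
workspace_id = "00000000-0000-0000-0000-000000000000"
service_principal_object_id = "11111111-1111-1111-1111-111111111111"

# Adds the service principal to the workspace so the app registration can access Fabric storage.
response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/users",
    headers={"Authorization": f"Bearer {access_token}"},
    json={
        "identifier": service_principal_object_id,
        "principalType": "App",
        "groupUserAccessRight": "Admin",
    },
    timeout=30,
)
response.raise_for_status()
print("Service principal added to the workspace.")
```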
I am deploying a project to Snowflake. On a table I get the following error:

Object reference not set to an instance of an object.

As read in another ticket, I did a 'Save and reload', but unfortunately that did not solve the issue. It's on deploying the data cleansing script; however, it does not show me the script in the deployment log, I only see the error message. Is there an easy way I can see which lookup column is causing the issue?