As a consultant at E-mergo, I was tasked with advising GGD Drenthe on their data warehousing strategy. The GGD (Municipal Health Services) is a public health service that was facing challenges in managing and analyzing a growing volume of data from various sources. Their existing solution, heavily reliant on Qlik Sense, lacked the flexibility and scalability required for future growth.

I recommended implementing TimeXtender as a robust and efficient data warehousing solution. This would allow them to:

- Centralize data: consolidate data from diverse sources into a single, unified repository.
- Improve data quality: implement robust data quality checks and data cleansing processes.
- Enhance data accessibility: make data readily available to various applications and users.
- Accelerate data analysis: streamline data integration and transformation processes.

I provided guidance on data modeling, ETL development, and best practices for data management. I also trained the GGD Drenthe team on how to use
Hi all,

I’m still quite new to TimeXtender, so I appreciate all the help you guys give me.

I have quite a lot of different XML files containing a lot of data. They are product catalogs with potentially 40K products, each containing different kinds of parameters. Each product’s parameters vary from product to product: some tell you the country of origin, some don’t; some products have information about delivery dates and some don’t.

So, all in all, the different products and the different files vary quite a lot. The content of these files can change each day, and some change multiple times a day. I would think that we are talking about 600,000 different products in total.

How would you approach this? Is it possible for TimeXtender to handle this, or do I have to make some kind of XML parser?

I guess I’m going to make a unique key based on the product no. and another parameter, but would you predefine the different parameters as a Field so I have to add a new
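For illustration only, here is a minimal sketch of what an external XML pre-processing step could look like before loading: it flattens each product's varying parameters into generic key/value rows, so a parameter that appears in tomorrow's files does not require a new predefined field. The file name, element names (`product`, `param`), and attribute names are all hypothetical, since the actual catalog schema isn't shown in the post.

```python
import csv
import xml.etree.ElementTree as ET

# Hypothetical catalog layout:
# <catalog><product no="123"><param name="CountryOfOrigin">DK</param>...</product></catalog>
def flatten_catalog(xml_path: str, csv_path: str) -> None:
    tree = ET.parse(xml_path)
    root = tree.getroot()
    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        # One row per (product, parameter) pair: varying parameters become rows, not columns.
        writer.writerow(["ProductNo", "ParamName", "ParamValue", "UniqueKey"])
        for product in root.iter("product"):
            product_no = product.get("no", "")
            for param in product.iter("param"):
                name = param.get("name", "")
                value = (param.text or "").strip()
                # Composite key based on the product no. plus the parameter name,
                # mirroring the "product no. and another parameter" idea above.
                unique_key = f"{product_no}|{name}"
                writer.writerow([product_no, name, value, unique_key])

if __name__ == "__main__":
    flatten_catalog("catalog.xml", "catalog_flat.csv")
```

With a long key/value shape like this, daily changes only add or update rows, and the pivot to a wide product table can happen later in the transformation layer if it is needed at all.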
I would like to test this new functionality. I’ve read: https://support.timextender.com/deliver-109/rest-api-endpoint-1808?tid=1808&fid=109

But I can’t understand how to send the tables from the SSL instance to the target DB using this technique. Can you explain it to me with a clearer example?
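As a generic illustration only (not the documented TimeXtender REST API Endpoint contract), pulling rows from a REST endpoint and writing them into a target table usually follows the pattern sketched below. The endpoint URL, JSON shape, and table name are hypothetical placeholders; the linked support article defines the real interface.

```python
import json
import sqlite3
import urllib.request

# Hypothetical endpoint and payload shape; replace with the values from the
# REST API Endpoint configuration described in the support article.
ENDPOINT = "https://example.com/api/tables/Customers"

def load_endpoint_into_target(endpoint: str, db_path: str) -> None:
    # Fetch the table contents exposed by the endpoint (assumed JSON array of objects).
    with urllib.request.urlopen(endpoint) as resp:
        rows = json.loads(resp.read())
    # SQLite used here only as a stand-in for the real target database.
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS Customers (CustomerId TEXT, Name TEXT)"
    )
    conn.executemany(
        "INSERT INTO Customers (CustomerId, Name) VALUES (?, ?)",
        [(r.get("CustomerId"), r.get("Name")) for r in rows],
    )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    load_endpoint_into_target(ENDPOINT, "target.db")
```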
Hi, we are in a project to update our on-premises ERP system Infor to the current version, Infor LN CE, which runs in the cloud. Since we then cannot access the database directly, we are currently checking how to ingest the data into TimeXtender in the future. There is a tool within the ERP suite to provide the data to "Infor Data Fabric", which is an Amazon S3 data lake where the ERP data is stored in JSON files. This data fabric provides a SQL endpoint to query the data, so it is a similar architecture to Microsoft Fabric with lakehouses. The data can be queried in two ways:

- JDBC: using a JDBC driver with API Gateway authentication, see Link
- API: handle API requests in a certain order to start a query job, ask for its status, and get the results, see link: https://www.youtube.com/watch?v=m4d0awIQ6Ag

For TimeXtender, which would be the preferred way? I could not find the possibility to add a JDBC data source with API authentication, and I am wondering if TimeXtender can handle the process order
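For the API route specifically, the order of operations described (start a query job, ask for its status, fetch the results) is a common submit-and-poll pattern. The sketch below only illustrates that pattern; the base URL, endpoint paths, payloads, and authentication header are hypothetical placeholders, not the actual Infor Data Fabric API.

```python
import json
import time
import urllib.request

BASE = "https://example-tenant.example.com/datafabric"  # hypothetical base URL
TOKEN = "Bearer <access-token>"                          # hypothetical auth header value

def _call(method: str, path: str, body: dict | None = None) -> dict:
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(
        BASE + path, data=data, method=method,
        headers={"Authorization": TOKEN, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def run_query(sql: str) -> list:
    # 1. Start the query job.
    job = _call("POST", "/jobs", {"sql": sql})
    job_id = job["jobId"]
    # 2. Poll the job status until it finishes.
    while True:
        status = _call("GET", f"/jobs/{job_id}/status")
        if status["state"] in ("FINISHED", "FAILED"):
            break
        time.sleep(5)
    if status["state"] == "FAILED":
        raise RuntimeError("query job failed")
    # 3. Fetch the results.
    return _call("GET", f"/jobs/{job_id}/results")["rows"]

if __name__ == "__main__":
    print(run_query("SELECT * FROM SomeErpTable"))
```

A JDBC connection would avoid this orchestration entirely, which is part of why the choice between the two routes matters for the ingestion design.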