I configured a REST API endpoint. It works as expected as long as I only add tables from my MDW to the endpoint. However, any views I add to the instance in TX are not visible through the API: they do not show up in /swagger, I cannot query data from them, and I can find no mention of them in the webserver configuration. Why is that?
Hi All, hopefully you can help me with this one. We have a source table from our POS systems containing sales transactions. A few of the value columns we transfer are stored as a money data type, and we see some weird things happening with their values: differences of 0.0001 up to 0.01 appear from nowhere in the transfer from ODX to DWH, and I don't know how this is happening. The columns with the money data type are called 'Bedrag', 'Korting' and 'NettoBedrag'. Take transaction 52. In the source the values look like this: [screenshot]. After the transfer with ODX, I looked in the parquet file using ParquetViewer and it looks like this: [screenshot]. Still the same, nothing wrong here. I imported the table without any modifications in TimeXtender and it looks like this: [screenshot]. After a transfer the data looks like this: [screenshot]. If you look closely, we lost 0.0001 on 'NettoBedrag' of the first record. We also lost 0.0001 on 'Bedrag' of the second record. I've tried a few things to mitigate this. Different numeri
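Drift of exactly 0.0001 on money values (four fractional digits) is the classic signature of an exact decimal being routed through a 32-bit float somewhere in a pipeline. The following is only a sketch of that effect in Python, not TimeXtender's actual conversion path, and the amounts are made up:

```python
from decimal import Decimal
import struct

def via_float32(v: Decimal) -> Decimal:
    """Round-trip an exact decimal through a 32-bit float, then back to 4 decimals."""
    f32 = struct.unpack("f", struct.pack("f", float(v)))[0]
    return Decimal(f"{f32:.4f}")

# Hypothetical money-typed amounts (4 fractional digits, like SQL Server's money type).
# Small amounts survive the round trip; larger ones can lose the last digit,
# because float32 spacing near 3000 is already ~0.000244.
for v in [Decimal("19.9500"), Decimal("3000.0001"), Decimal("5000.0002")]:
    w = via_float32(v)
    print(f"{v} -> {w}  drift: {w - v}")
```

If a narrowing conversion like this happens anywhere between the parquet file and the DWH table, sub-cent drift of the kind described above is expected behaviour for binary floats, not data corruption.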
Relates to TimeXtender 6024.1 and later versions. The following servers and ports used by TimeXtender and ODX Server should be opened in your firewall settings. Contents: TimeXtender Desktop, Azure service bus, ODX Server, Additional servers, Troubleshooting (Test-NetConnection, Turn off services in Subnet setup, worker.database-windows.net issue). TimeXtender Desktop: to use the new version of TimeXtender, the desktop software needs to be able to reach these URLs. Instance databases: sql-instances-prod.database.windows.net. Server outside Azure (on-prem): port 1433, the standard for SQL Server. Server inside Azure: port range 11000-11999. Note: the IPs can potentially change over time, and hostnames don't always resolve to the same IPs over time or across machines. The ranges of current IPs are published by Microsoft here. If you are configuring the Azure firewall, allow traffic for the tag "Sql.WestEurope" so that it deals with these lists automatically. More sophisticated on-prem firewalls can a
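The article names Test-NetConnection for troubleshooting; the same reachability check can be scripted. A minimal sketch in Python, using only the hostname and port listed above (the 11000-11999 Azure redirect range would need to be probed separately):

```python
import socket

# Endpoint named in the article; port 1433 applies when the server is outside Azure.
checks = [
    ("sql-instances-prod.database.windows.net", 1433),
]

for host, port in checks:
    try:
        # A successful TCP connect means the firewall allows this host:port.
        with socket.create_connection((host, port), timeout=5):
            print(f"OK    {host}:{port}")
    except OSError as exc:
        print(f"FAIL  {host}:{port} ({exc})")
```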
We are excited to announce the latest version of XPilot, packed with significant improvements and new features to enhance your experience. Here are the highlights: 5x Faster Responses: with a new and improved index, XPilot now delivers responses five times faster, ensuring you get the information you need without delay. 10x More Intelligent Responses: powered by GPT-4o, XPilot's responses are now ten times more intelligent. Additionally, XPilot can now remember previous interactions, providing more contextually relevant answers. A response not quite hitting the mark? Ask XPilot to clarify or provide a correction. More Knowledgeable: XPilot is now more knowledgeable than ever, incorporating all the latest knowledge base articles and Exmon user guides. And now even you can help make XPilot smarter, as it's also trained on "answered" user community questions. Improved Usability: the user interface has been completely rebuilt from the ground up, significantly improving u
Hi, does anyone know if it is possible to stop or kill a running execution within Exmon DG?
Hi, following an upgrade to the latest version of TimeXtender (6675.2) on one of our customers' environments, I have had a problem where some of my scheduled jobs do not run at their scheduled times. When digging about, I noticed the following in the event viewer: [screenshot]. Rerunning the [TimeXtender Execution Server Configuration 6675.2] tool has since resolved that particular issue, but I still have problems with the scheduled jobs. I have also deleted and recreated the problematic jobs, but this did not resolve the issue. Running the jobs manually works fine. Please advise on next steps.
Dear Sir, I am trying to set up a CData JSON data source connection with TimeXtender. This is my connection string: Auth Scheme=OAuth;Data Model=Document;Initiate OAuth=REFRESH;OAuth Access Token URL=https://pilot.binnenbeter.nl/api/v2/login/venray;OAuth Grant Type=PASSWORD;OAuth Refresh Token URL=https://pilot.binnenbeter.nl/api/v2/login/venray;Row Scan Depth=100;User=api@venray.nl
I am able to get the bearer token with Postman using this:
url: https://pilot.binnenbeter.nl/api/v2/login/venray
JSON body: { "id": "", "language": "en", "data": { "resourceType": "AUTH", "item": { "loginName": "api@venray.nl", "password": "password", "module": "API" } } }
This is the documentation I used for the connection string: Password: Finally, there is also a Password method. It is similar to the others, in that it requires an Access Token URL and a Refresh Token URL; the difference is that it uses a User and Password instead of a client and secret. Notice that these fields are
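For reference, the Postman call above translates directly into a scripted request, which can help confirm the token exchange works outside both Postman and TimeXtender before debugging the CData connection string. A small sketch using Python's requests library; the password value is of course a placeholder:

```python
import requests

# Mirrors the Postman request from the post.
url = "https://pilot.binnenbeter.nl/api/v2/login/venray"
body = {
    "id": "",
    "language": "en",
    "data": {
        "resourceType": "AUTH",
        "item": {
            "loginName": "api@venray.nl",
            "password": "password",  # placeholder
            "module": "API",
        },
    },
}

resp = requests.post(url, json=body, timeout=30)
resp.raise_for_status()
# Inspect the response to see where the bearer token sits; the CData provider
# needs to be able to extract it from this same structure.
print(resp.json())
```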
I have an API that requires an access token to be retrieved using a POST request, which is returned as JSON. Most of the endpoints also return their data in JSON, which I can already extract using custom rsd files. However, there is one endpoint that returns custom reports in text/CSV format. So I'm looking for a way to retrieve the token from a JSON response and use it in a second request to retrieve a CSV response. I've tried using the CSV provider to do this, but I get an error when I combine jsonproviderGet and csvproviderGet in a nested call using an rsd file based on my already working all-JSON rsd file: [error screenshot]. Is there a way to make this work? Thanks, Jasper
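Outside the rsd route, the two-step flow itself is straightforward; here is a sketch of the pattern in Python, with hypothetical endpoint URLs and field names standing in for the real API:

```python
import csv
import io

import requests

# Hypothetical endpoints; substitute the real token and report URLs.
TOKEN_URL = "https://api.example.com/auth/token"
REPORT_URL = "https://api.example.com/reports/custom"

# Step 1: the token comes back as JSON from a POST.
token = requests.post(
    TOKEN_URL, json={"user": "me", "password": "secret"}, timeout=30
).json()["access_token"]  # field name assumed

# Step 2: use the token in a second request that returns text/csv.
resp = requests.get(
    REPORT_URL, headers={"Authorization": f"Bearer {token}"}, timeout=30
)
resp.raise_for_status()

# Parse the CSV payload into dict rows keyed by the header line.
for row in csv.DictReader(io.StringIO(resp.text)):
    print(row)
```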
Hi, as mentioned here https://support.timextender.com/data-sources-112/advanced-rsd-file-options-884?postid=7688#post7688, I'm having some issues with nested API calls using the CData REST provider. It looks like it maxes out at 1,000 calls per iteration. Is this the expected behaviour, or is there a setting I haven't found? Thanks!
Prerequisites: ODX Server machine has access to the URLs *.googleapis.com/*. Profile: 9-digit number connected to the GA profile you want to collect data from. Property/Properties: 9-digit number relating to the property (e.g. website) you will collect data from; you will need a separate data source for each property (I think). Account with admin privileges in the API for granting access as well as authenticating OAuth. Migrating Universal Analytics to V4: if you already have a connection to GA Universal Analytics with the CData for Google Analytics 2022 provider, setting up the connection to V4 will be easy. I recommend creating a new CData data source for V4, so that you can compare the tables from Universal Analytics and V4; the structure of the data model has changed significantly! Copy the connection string from the GA Universal CData data source, create a new data source with the GA CData connector and version, import the connection string, and change the version to V4 at 'Authentication' → 'Schema' as wel
I am connecting to a REST API with multiple endpoints. Does anyone have experience with setting up a control table for TimeXtender to use to dynamically change the endpoints, instead of creating an endpoint per table?
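Not a TimeXtender feature as such, but the shape of the idea is simple: keep one row per endpoint in a control table and loop over it, rather than defining one data-source endpoint per table. A minimal sketch in Python; the table contents, base URL, and field names are all hypothetical:

```python
import requests

# Stand-in for a control table: one row per endpoint to extract.
control_table = [
    {"table_name": "customers", "path": "/api/v1/customers"},
    {"table_name": "orders",    "path": "/api/v1/orders"},
    {"table_name": "products",  "path": "/api/v1/products"},
]

BASE_URL = "https://api.example.com"  # hypothetical

for row in control_table:
    # One generic fetch, parameterized by the control row, replaces
    # a separately configured endpoint per table.
    resp = requests.get(BASE_URL + row["path"], timeout=30)
    resp.raise_for_status()
    print(row["table_name"], len(resp.json()), "records")
```

Adding an endpoint then means inserting a row into the control table rather than reconfiguring the data source.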
We have an issue with an Oracle data source having a different "Owner" between the acceptance and production environments. We're on TimeXtender 20.10.51 using Business Units. The Oracle data source is configured using a global database. On acceptance it looks like this: [screenshot]. When the data source is added on acceptance, objects are read as accpOwner.tableName. The production global database properties look like this, with a different owner: [screenshot]. However, on production TimeXtender expects the same owner that was configured in acceptance: [screenshot]. These selected tables don't exist in the production database, because there they exist as prodOwner.tableName. This means the tables cannot be found in the data source, and consequently cannot be executed. When syncing the data source on prod, all tables are dropped from the business unit, losing the mappings to DSA, and no production tables are selected. One solution I can think of is to add ACCP and PROD tables in both environments and use an environment variable to compens