Recently active
Hi, I must use tenant_id (company-specific) when authorizing my endpoint with OAuth2 (credentials in body and request header). Parameters are client_id, client_secret, tenant_id, etc. I really don't want to create a connection for each of my tenant_ids. So my question: how can I use dynamic values to update the authentication for each tenant_id? Regards, Bjørn A.
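For illustration only, a per-tenant client-credentials token request might look like the minimal Python sketch below; the token endpoint, parameter names, and tenant list are assumptions, not the TimeXtender REST connector's configuration.

import requests

# All endpoint and parameter names below are assumptions, not the connector's settings.
TOKEN_URL = "https://auth.example.com/oauth2/token"
CLIENT_ID, CLIENT_SECRET = "<client-id>", "<client-secret>"

for tenant_id in ["tenant-001", "tenant-002"]:           # stand-in list of tenant_ids
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "tenant_id": tenant_id,                      # the per-tenant value from the question
        },
        timeout=30,
    )
    resp.raise_for_status()
    token = resp.json()["access_token"]                  # use this token for the tenant's API calls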
I need to calculate a measure that reflects year-over-year revenue growth per category, as a percentage. Currently I have a custom measure in the semantic layer with the following formula: Revenue Growth = ([Sales Revenue] / [Sales Revenue LY]) - 1. The categories in the dimension are Cat1, Cat2 (no data in the fact table), and Cat3. However, when I show this in Excel, it includes Cat2, which has no records in the fact table, and also gives me an "empty" category which, I assume, is a summarization row, like this: Is there any way to make sure that the empty row and the category with no data in the fact table do not appear?
Question: I have a project that contains the CData Google Ads connector. I need the ShoppingProducts view that was recently added to the connector. Unfortunately, I can't find it in Data Selection. It seems to have been added in version/build 9053 of the connector, but TX shows me that version 9036 is the latest. I have resynchronized the source, but as you can see below, it's not in the list. Is the latest CData connector available in some way? BR, Dirk
I wanted to clean up the jobs in TX Integration version 6822.1. When I deleted the last job, the folder disappeared and I am not able to add a job anymore. Does anyone have a solution for this issue?
It's our pleasure to announce release 24.3 of TimeXtender Orchestration & Data Quality and TimeXtender Master Data Management, featuring exciting new updates and enhancements.

Summary
This release brings several enhancements to boost usability, flexibility, and user experience across TimeXtender's platforms. Key highlights include improved integration with TDI, new capabilities for optimizing cloud resources, better time zone management, and easier access to previous versions. We have also updated product naming and expanded database permissions for users. These updates demonstrate our commitment to meeting customer needs and delivering a more seamless, intuitive experience.

General
Access to Previous Versions
Users can now easily download executable versions of O&DQ and MDM through the provided links to TimeXtender's SharePoint. These versions don't require installation, allowing for seamless switching between different versions as needed. No special permissions are required t
Hello, I have an Acceptance and a Production environment and am using multiple environments transfer. Both servers are new and TX has been upgraded (latest version), and the connection between them seems to be gone after this. How do I set up the multiple environments again? Just by adding an environment, or can I get it back from the 'old' TimeXtender? Thanks, Ronny
Good morning, I am in the TimeXtender DG Desktop version 24.3. When I click on Tools → Configuration → Advanced, I see a lot of timeout settings. I cannot see what the unit of measurement is; are they all in minutes? I could not find this in the help. Greetings, Roy
We're experiencing projects opening very slowly using TX 20.10.38 (we're above version 10000 of the project). Is there an easy way to remove old versions of a project? We are using multiple environments with global databases, and don't want to go through the hassle of exporting, creating a new repository DB, and importing. We're also trying to delete the logs, but deleting them from within TX seems to take forever.
Hi, TimeXtender v.6848.1. Can we use SQL version "Microsoft SQL Azure (RTM) - 12.0.2000.8 Oct 2 2024 11:51:41 Copyright (C) 2022 Microsoft Corporation" as our ingest database? I keep getting this error loading the tables: The execution failed with error: Exception Type: Microsoft.Data.SqlClient.SqlException Message: Reference to database and/or server name in 'xxxxxx.sys.extended_properties' is not supported in this version of SQL Server. Best regards, Bjørn A.
Hi support, I am experiencing some strange behaviour when the Scheduler service starts execution packages. The following is happening: the schedule is supposed to start at every full hour, and I know that it can take up to 2 minutes for an execution to actually start. Until last Monday 18:00 everything was fine: the packages started within 2 minutes and were finished within 10 minutes. Since last Monday 18:00, we are experiencing a delay in the start. Almost every time, the packages start with a delay of about 12 minutes. It also looks like the execution takes much longer than 10 minutes; however, when I look at the Gantt chart, I see it still takes up to about 10 minutes to finish. According to the screenshot below it was supposed to start at 03:00 and it actually started at 03:12, but if you look closely at the actual times in the Gantt chart it only started at 03:34 and finished at 03:42, which is still only 8 minutes. It is TX v20.10.39.64. The database is Azure SQL; sometimes we use 100% CPU
Hi, I just upgraded existing TX Ingest and Data Integration to the latest versions (6848.1) on the dev server, and now I get a failure when running my ODX data source steps, specifically what used to be known as the synchronize step but is now called the import metadata task. Originally each data source had 3 steps (synchronize, transfer, and storage management) and they used to run fine. Now I get seemingly random job failure errors, so far only on what was the synchronize step. If I run the failed step manually, it processes without issue. However, when I run the entire ODX job I have set up, it seems that one or more data source sync steps now fail. The execution failed with error: Exception Type: Microsoft.Data.SqlClient.SqlException Message: Transaction (Process ID 76) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction. Stack Trace: at Microsoft.Data.SqlClient.SqlConnection.OnError(SqlException e
This is a follow-up of Using XML to ingest data, which I have managed to solve. I need some help with creating a nested statement. The first RSD, which lists out all the IDs, is this:

<api:script xmlns:api="http://apiscript.com/ns?v1" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- See Column Definitions to specify column behavior and use XPaths to extract column values from XML. -->
  <api:info title="contract" desc="Generated schema file." xmlns:other="http://apiscript.com/ns?v1">
    <attr name="contractid" xs:type="string" readonly="false" other:xPath="/Envelope/Body/contracts/contract@contractid" />
    <attr name="description" xs:type="string" readonly="false" other:xPath="/Envelope/Body/contracts/contract@description" />
  </api:info>
  <api:set attr="DataModel" value="RELATIONAL" />
  <api:set attr="URI" value="https://my.soap.endpoint/service.asmx?WSDL" />
  <api:set attr="PushAttributes" value="true" />
  <api:s
This article clarifies the concepts of slowly changing dimensions and explains how to implement a History table in TimeXtender Data Integration.

What is SCD? · SCD Types · History Fields · Example · Instructions · Enable History · Configure Fields · Additional Settings · Surrogate Keys

What is SCD?
A Slowly Changing Dimension, or SCD, is a complex data warehouse pattern that maintains historical data for reporting even when this data is no longer available in the source. Consider this example scenario: you report on customer shipments. Unfortunately, your data source only supports a primary address for each customer, and previous versions are not available. Subsequently, a customer moves to a new location and the account manager updates the customer address in the source system. The Prepare instance is updated, but now the historical shipping records point to the row in the customer table that has the new address, instead of the original address they were shipped to. By implementing a slowl
Hello, I'm working with a REST API where the response is valid JSON but has Content-Type: text/plain. The response looks like this:

[{"OrderId":"46D2A6BD-8B34-4F55-B816-B90017CDBA28","OrderNumber":131,"DeviceId":"POS41232","VendorId":1,"VendorName":"Test Facility","StoreId":114}]

However, the log shows the following:

2025-01-10T15:33:16.340+01:00 2 [1|Q-Id] [HTTP|Res: 0] HTTP/1.1 200 OK, 1615 Bytes Transferred
2025-01-10T15:33:16.340+01:00 2 [1|Q-Id] [HTTP|Res: 0] Request completed in 676 ms.
2025-01-10T15:33:16.344+01:00 3 [1|Q-Id] [META|Schema: JSONValidation] https://pos.no/getsales is not a valid JSON resource.
2025-01-10T15:33:16.345+01:00 3 [1|Q-Id] [META|Schema: JSONValidation] Invalid JSON markup. Expected json, but instead found [text/plain; charset=utf-8].

To attempt to fix this, I set Accept: application/json in the request headers, but this results in a response wrapped in a string, like this:

"[{\"OrderId\":\"46D2A6BD-8B34-4F55-B816-B90017CD
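For reference, the string-wrapped response described above can be reproduced outside TimeXtender with plain Python; the payload below is a shortened, hypothetical version of the one in the post, and the point is only that such a body has to be decoded twice before it becomes a usable JSON array.

import json

# A body returned as a JSON *string* containing the real JSON array (hypothetical, shortened).
wrapped = '"[{\\"OrderId\\":\\"46D2A6BD-8B34-4F55-B816-B90017CDBA28\\",\\"OrderNumber\\":131}]"'
inner_text = json.loads(wrapped)   # first decode: yields the inner JSON text as a str
orders = json.loads(inner_text)    # second decode: yields the actual list of order dicts
print(orders[0]["OrderNumber"])    # -> 131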
Hi, we are using a REST data source with dynamic values to fetch data for approximately 12,000 records, where each record corresponds to an individual API call. Occasionally, we encounter issues with the API provider where some calls return a 400 error. Unfortunately, it's unpredictable which dynamic values will cause these errors. Currently, it appears that the TimeXtender REST data source stops processing upon encountering a 400 error, resulting in a "Completed with errors" status. Is there a way to configure the REST data source to continue processing the remaining dynamic values, skipping over those that result in a 400 error? Ideally, it would also log the calls that failed for review. Best regards, Pontus
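For comparison, the behaviour being asked for (skip 400s, log them, keep going) could be sketched in plain Python as follows; the endpoint URL and record IDs are placeholders, and this is not a description of an existing REST data source setting.

import requests

dynamic_values = ["id-001", "id-002", "id-003"]   # stand-in for the ~12,000 dynamic values
failed = []

for record_id in dynamic_values:
    resp = requests.get(f"https://api.example.com/records/{record_id}", timeout=30)
    if resp.status_code == 400:
        failed.append(record_id)   # remember the failing value and move on
        continue
    resp.raise_for_status()        # still fail hard on unexpected errors
    rows = resp.json()             # handle the successful payload here

print(f"Skipped {len(failed)} call(s) that returned 400: {failed}")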
Hello, when you have multiple endpoints configured in the portal and endpoint B uses dynamic values from endpoint A, the data on demand functionality doesn't work for tables that need endpoint B. Tables derived from A get the data on demand just fine. The exception message says: "OnDemand execution failed for data source '<datasource>' table '<table>' was not executed". I encountered this in a customer environment and then tried to replicate it locally in my sandbox setup. To my surprise, the data on demand worked in my sandbox. However, I had an older version of TX installed. I updated to the latest version on my local laptop and then encountered the exact same problem. This would suggest that it's a bug in the new version. My version number for TX is 6848.1.
Hello, with TimeXtender Business Unit on-prem on an internal server, we were able to call an API using the settings below and load the data successfully. However, when we set up the same data source in TimeXtender Portal with basically the same settings, the data source test either times out or fails with the error "Exception has been thrown by the target of an invocation.", or the connection tests okay but no metadata can be imported/synced. I have tried all the connection types below but couldn't get it to work. Do you know why (for example, whether a firewall or proxy needs to be set up) or have any advice? Thanks in advance.
Does anyone have a suggestion about how to use this XML to ingest data into the ODX?

<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <soap:Body>
    <ExportDocsProjectWithTendersResponse xmlns="http://publicprocurement.com">
      <ExportDocsProjectWithTendersResult>{
        "ProjectBodies": [
          {
            "ExternalId": "666666",
            "DefaultEvaluationModel": "LowestPrice",
            "DocumentNumberingSchema": "Numeric",
            "SectionNumberingSchema": 0,
            "RequirementNumberingSchema": "Alphabetic",
            "Metadata": [
              { "Name": "ProjectName", "Value": "Ingesting Data into TX" },
              { "Name": "Reference", "Value": "123456" },
              { "Name": "ContractType", "Value": "XML" }
            ],
            "DataFields": [
              { "ExternalId": "1",
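If pre-processing outside the connector is an option, one generic approach is to pull the embedded JSON out of the SOAP envelope and work with it directly. The sketch below assumes the response has been saved locally as response.xml and uses only the element names visible above; it is not a TimeXtender feature.

import json
import xml.etree.ElementTree as ET

ns = {"p": "http://publicprocurement.com"}
tree = ET.parse("response.xml")    # assumed local copy of the SOAP response
result = tree.getroot().find(".//p:ExportDocsProjectWithTendersResult", ns)
payload = json.loads(result.text)  # the element's text is the embedded JSON document

for project in payload["ProjectBodies"]:
    print(project["ExternalId"], [m["Value"] for m in project["Metadata"]])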
Use Microsoft Fabric Lakehouse as Ingest instance storage, and ingest data sources into Microsoft Fabric OneLake in Delta Parquet format. Fabric Ingest instance storage is available as part of the Standard, Premium or Enterprise package.

Prerequisites
Create an App Registration in the Azure Portal. It is recommended to use a dedicated app registration to ensure this account is the only one with access to the client credentials.
Add a user that does not require multi-factor authentication (i.e. a non-MFA account) as Owner of the App Registration, in order to allow for unattended re-authentication.
Add a platform, select Mobile and desktop application, enter https://localhost as the custom redirect URI, then click Configure and Save. This will result in the following Redirect URIs being added automatically.
Navigate to the Authentication settings for the App Registration and set Allow public client flows to Yes (a minimal token-acquisition sketch follows after these prerequisites).
In the Fabric/Power BI Admin Portal, enable "allow service principals to use Power BI
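As a rough illustration of what the public-client, non-MFA setup above enables, a token acquisition with MSAL for Python might look like the sketch below; the client ID, tenant, account, and scope are all placeholders, and the scope in particular is an assumption, not a documented TimeXtender requirement.

import msal

# All values below are placeholders; the scope is an assumed Storage/OneLake audience.
app = msal.PublicClientApplication(
    client_id="<app-registration-client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)
result = app.acquire_token_by_username_password(
    username="svc-nonmfa@example.com",               # the non-MFA owner account
    password="<password>",
    scopes=["https://storage.azure.com/.default"],   # assumed audience
)
print("token acquired" if "access_token" in result else result.get("error_description"))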
TimeXtender Exact Online Data Source
Access and ingest data from Exact Online REST APIs using the TimeXtender Exact Online data source.

Authentication Setup · Create an APP in Exact · Set up the Authorization request to retrieve the code value to use to generate the tokens · Use the code value to generate the initial tokens · Find the division value · Endpoints · Query parameters · Table flattening · Other

Authentication
Exact Online REST APIs use the OAuth Refresh Token flow for authentication. This requires the user to perform initial steps before it is possible to set up the authentication in the TimeXtender Exact Online data source. When you have generated a client ID, client secret, and the initial access token and refresh token, you can fill in the corresponding fields for authentication.
NOTE: the initial access token timestamp needs to be converted to the UTC time zone and then to a UNIX timestamp in seconds.
To find these values you need to go through a three-
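To illustrate the NOTE about the token timestamp, a minimal Python conversion from a local issue time to a UTC UNIX timestamp in seconds could look like this; the example date and time are made up.

from datetime import datetime, timezone

issued_local = datetime(2025, 1, 10, 15, 33, 16)     # token issue time on the local clock (made up)
issued_utc = issued_local.astimezone(timezone.utc)    # convert the local time to UTC
unix_seconds = int(issued_utc.timestamp())            # seconds since 1970-01-01 UTC
print(unix_seconds)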
Hello, I want to get metadata from my Azure Data Lake using the Blob API. I wasn't seeing any data in the Ingest storage, so I turned on caching to file to try to see what's happening. There are three files in my caching folder:

Data_.raw: the return of the call, i.e. my actual data. This looks excellent, except that it's a .raw file. Contents:

<?xml version="1.0" encoding="utf-8"?><EnumerationResults ServiceEndpoint="https://xxxx.blob.core.windows.net/" ContainerName="datalake"> <Prefix>my_prefix</Prefix> <Blobs> <Blob> .... </Blob> </Blobs> <NextMarker/></EnumerationResults>

Data_.xml: basically the same as Data_.raw, but with the content of Data_.raw as the data of a value element. The data also contains the XML header (so now the document has two headers) and the brackets have been encoded (i.e. all the `<` are now `&lt;`).

<?xml version="1.0" encoding="utf-8"?><Table_flattening_name
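As a sanity check on the Data_.raw contents, the List Blobs response shown above can be read with a few lines of Python; the file name matches the caching folder described in the post, and Name is the standard child element of each Blob in the Blob service response.

import xml.etree.ElementTree as ET

tree = ET.parse("Data_.raw")               # the raw XML returned by the Blob List API
root = tree.getroot()                      # <EnumerationResults ...>
for blob in root.findall("./Blobs/Blob"):
    print(blob.findtext("Name"))           # blob path, e.g. my_prefix/somefile.parquet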
Hi all, I have an API that uses nested XML to deliver the data. I have used a relational model to be able to retrieve this data in separate tables. Based on the key fields that are created, I am able to join the tables back together. All works fine; however, I want to reference a (different) field from my parent table in a nested table. Currently, the output adds a field _id to my parent table PerformanceInfoRow to be able to reference the underlying data back to the specific period. However, this _id contains an integer. Each day I will get three rows of data, so each day I will get back the _ids 1, 2 and 3. In the end this will not be a unique key when retrieving more days of data. Therefore, I want to specify the Period that is also available in the parent table, which is unique since it shows not only the date but also whether it is morning, afternoon or evening. This will be unique, but unfortunately I am not able to add this field to the underlying tables. I have tried to adjust my
In this article, you will read about querying Windows service status and restarting services with exMon. The PowerShell data provider in exMon can be used to query Windows services and make sure they are up and running. In this example, we will query the status of the MapManager service within Windows. If the service is stopped for some reason, we will attempt to restart it and send a message to an IT administrator.

Follow these steps to create the query:
Create a Query in exMon Administrator
Choose "PowerShell" as a Data Provider
Enter the following PowerShell script

$serviceName = '{@servicename[preview:exMon Command Service]}'
$service = Get-Service -Name $serviceName | select DisplayName,Status,@{label="NewStatus";expression={$_.Status}}
if ($service.Status -ne 'Running') {
    # The service is not running. Restart and fetch the new status
    Start-Service $serviceName
    $service.NewStatus = (Get-Service -Name $serviceName | select Status).Status
    $service
} else {
    # return empty results
    $exMonResult = Ne