Recently active topics
Hello, I’m using the CData ADO.NET Provider for JSON 2021 and get this error message in the log file:

[HTTP|Res: 29] HTTP/1.1 200 OK, 548950 Bytes Transferred
[HTTP|Res: 29] Request completed in 6828 ms.
[EXEC|Page ] Page successful: 1000 results (6844 ms) NextPage:
[EXEC|Messag] Executed query: [SELECT [Actualizations_id], [ActualizationType.Code],
[META|Schema] Engine Invalid object name 'sys_resultset_close'
[MDUL|SbMDUL] Drop BulkRows table:ErrorInfo
[INFO|Connec] Executed sys_disconnect: Success: (16 ms)
[INFO|Connec] Closed JSON connection

The execution log file reports:

Executing table [JSON].[LCSOC_EC_Actualizations]: failed with error: System.Data.SqlClient.SqlException (0x80131904): Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding. ---> System.ComponentModel.Win32Exception (0x80004005): The wait operation timed out

What does it mean, and what can I do about it? Cheers/Reith
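"Execution Timeout Expired" is the standard ADO.NET SqlClient error raised when a command runs longer than its CommandTimeout (30 seconds by default); it is not a connection failure. A minimal sketch of the knob involved, assuming a plain SqlConnection with placeholder connection details:

using System.Data.SqlClient;

// The default CommandTimeout is 30 seconds; a long-running transfer hits
// "Execution Timeout Expired" unless it is raised (0 = wait indefinitely).
using (var conn = new SqlConnection("Server=.;Database=ODX;Integrated Security=true"))
using (var cmd = new SqlCommand("SELECT COUNT(*) FROM [JSON].[LCSOC_EC_Actualizations]", conn))
{
    cmd.CommandTimeout = 600; // seconds; set well above the observed runtime
    conn.Open();
    var rows = (int)cmd.ExecuteScalar();
}

If TimeXtender exposes a command timeout setting on the data source or storage, raising that is the no-code equivalent of the line above.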
Hi to all, in order to improve the performance of data loading from NetSuite, we were advised to use the ODBC connector provided by NetSuite and use SuiteAnalytics. I have downloaded the ODBC connector provided by NetSuite and installed it on the TimeXtender VM, and I get a successful connection to NS using the ODBC test connection. However, I cannot access the ODBC driver in the TimeXtender ODX, so how do I do that? Keep in mind that the NetSuite CData connector uses REST calls limited to 1000 records, so that is not feasible. Thank you in advance, Peter
Data Sources
The ODX Server can connect to a data source through the following types of providers:

- TimeXtender's own providers (with tweaks and improvements). This is usually the best choice if one is available for your data source.
- CData: For setting up CData providers, please refer to Add a CData Data Source.
- ADF: Providers from TimeXtender that use Azure Data Factory for transferring data.
- ADO: Providers installed on the machine that are built on ADO.NET.
- OLE DB: Providers installed on the machine that support the OLE DB standard.

Adding a Data Source in TimeXtender Portal

Follow the steps in Add and map a data source connection. Under Connection settings, enter the connection information required by the data source you've selected. The content of the page depends on the provider you have chosen. Below is an example of the TimeXtender SQL Server provider.

Configuring a Data Source in TimeXtender Desktop

To add a new data source, follow the steps in Configuring a data source on the desktop.

Synchronizing Obj
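To make the ADO provider type above concrete, here is a minimal sketch of what "built on ADO.NET" means: any provider installed and registered on the machine can be loaded through the common factory API. The invariant name below is the built-in SqlClient one (registered by default on .NET Framework); installed third-party providers follow the same pattern under their own names.

using System.Data.Common;

// Machine-installed ADO.NET providers are registered under an invariant
// name and loaded generically; connection strings are provider-specific.
DbProviderFactory factory = DbProviderFactories.GetFactory("System.Data.SqlClient");
using (DbConnection conn = factory.CreateConnection())
{
    conn.ConnectionString = "Server=.;Database=master;Integrated Security=true";
    conn.Open();
}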
For one of our clients I need to connect to a database on Amazon Redshift. The IT vendor who made the system is telling us to use SSL mode Required ( https://docs.aws.amazon.com/redshift/latest/mgmt/connecting-ssl-support.html#connect-using-ssl ). Using this setting in the ODBC connector from Amazon itself works fine, but the CData connector only gives a true/false dropdown. I've checked whether this is perhaps one of those fields where you can overrule the dropdown with manual input, but that's not the case for this one. Is there any way to get this working in the CData connector, or would our only option for now be to create a DSN and then use the ADO ODBC option in the ODX to connect with the DSN?
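One possible workaround, sketched below via the ADO ODBC route you mention: a DSN-less connection string through Amazon's own ODBC driver, where SSLMode accepts the full range of values. The driver name and the SSLMode keyword are taken from Amazon's driver documentation and may differ per driver version, so treat them as assumptions; server, database and credentials are placeholders.

using System.Data.Odbc;

// DSN-less ODBC connection to Redshift with SSL mode "require".
var connStr = "Driver={Amazon Redshift (x64)};" +
              "Server=examplecluster.abc123.eu-west-1.redshift.amazonaws.com;" +
              "Port=5439;Database=dev;UID=awsuser;PWD=secret;SSLMode=require";
using (var conn = new OdbcConnection(connStr))
{
    conn.Open();
}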
I have been trying for quite a while to connect to a REST endpoint requiring authentication. I have gone through the CData documentation and various resources on the support site. I have working setups both in Postman and in an SSIS Script Task, but I am very keen on migrating to a more native TX setup (maintainability), and will be migrating several more in the time to come. The process is as follows:

1. Get an authentication token (HTTP POST). The POSTed information is sent as 'x-www-form-urlencoded' in the body (Postman). The response is a JSON object containing the token in an attribute.
2. Get the actual data using the authentication token (HTTP GET with the token as a URL parameter).

The source in this case is ArcGIS, and the documentation does not mention anything about a standardised authentication scheme. Is there a way around implementing this in any of the standardised authentication schemes, or do I need to go in the direction of RSD files? Please advise. Thanks in advance.
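For reference, the two-step flow described above looks like this outside TimeXtender; a sketch with HttpClient, assuming placeholder URLs and field names (check the ArcGIS token endpoint documentation for the real ones):

using System.Collections.Generic;
using System.Net.Http;
using System.Text.Json;

var http = new HttpClient();

// Step 1: POST the credentials as x-www-form-urlencoded and read the
// token from an attribute of the JSON response.
var form = new FormUrlEncodedContent(new Dictionary<string, string>
{
    ["username"] = "user", ["password"] = "secret", ["f"] = "json"
});
var authResponse = await http.PostAsync("https://example.com/sharing/rest/generateToken", form);
using var doc = JsonDocument.Parse(await authResponse.Content.ReadAsStringAsync());
var token = doc.RootElement.GetProperty("token").GetString();

// Step 2: GET the actual data with the token as a URL parameter.
var data = await http.GetStringAsync($"https://example.com/rest/services/layer/query?f=json&token={token}");

As far as I know, a custom token handshake like this is not one of the standardised schemes in the CData providers, so it typically does end up in an RSD file that scripts the POST and injects the token into the data URL.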
I have a transaction table in the ODX with the following fields (simplified): Key, Value, TransactionDate. Example:

100, V100, 2023-01-01
100, V100A, 2023-01-31 09
100, V100B, 2023-01-31 10

Each day multiple transactions are added, of course with another transaction date. Note that in the example I have only shown the date and hour (not minutes and seconds). What I need is a table with the latest update in the DSA. I created the same table, set the primary key to Key, added an incremental selection rule on TransactionDate, and added history settings to update the Value and TransactionDate based on the key. Where it goes wrong is when we get multiple transactions in one day:

100, V100A, 2023-01-31 09
100, V100B, 2023-01-31 10

I traced down the cleansing procedure, and TX detects that the same key occurs twice. So far so good. Next it puts the ID of the latest transaction into <table>_L and only processes the IDs that are not in <table>_L. The result is that I get 100, V100A, 2023-01-31 09, which is incorrect
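For clarity, the result being asked for per key is "order by transaction date, take the newest". A small sketch of that rule (the record shape is illustrative, taken from the example data above):

using System;
using System.Linq;

// When several transactions share a key, keep only the one with the
// highest transaction date.
var rows = new[]
{
    (Key: 100, Value: "V100",  Date: new DateTime(2023, 1, 1)),
    (Key: 100, Value: "V100A", Date: new DateTime(2023, 1, 31, 9, 0, 0)),
    (Key: 100, Value: "V100B", Date: new DateTime(2023, 1, 31, 10, 0, 0)),
};

var latestPerKey = rows
    .GroupBy(r => r.Key)
    .Select(g => g.OrderByDescending(r => r.Date).First());
// -> 100, V100B, 2023-01-31 10:00 (the row the history table should end up with)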
Dear Community, I like to build my data estates with supernatural keys, but in larger datasets the data cleansing starts to take very, very long. Do you happen to have the same issues? Is there a way to make the supernatural keys load faster? Even with incremental loading it begins to be super slow. I've run a test on 435,397 records on an Azure SQL with 10 vCores:

1. A full load on the table with 7 supernatural keys.
2. A full load on the same table without the supernatural keys.

1 has a data cleansing time of 1 second; 2 has a data cleansing time of 104 seconds! Second, I've done a test on the same tables but now with incremental loads:

1. Incremental load with 7 supernatural keys.
2. Incremental load without supernatural keys.

1 has a data cleansing time of 1.6 seconds and 2 a data cleansing time of a whopping 129 seconds! I'm not so sure I want to keep using the supernatural keys. What do you guys do? Take care, Daniel
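For readers unfamiliar with the term: a supernatural key is a deterministic surrogate derived from one or more business key fields, so every row pays a hash-plus-lookup cost against the key store during cleansing, which is where the time tends to go. A simplified illustration of the idea only, not TimeXtender's actual implementation:

using System;
using System.Security.Cryptography;
using System.Text;

// Simplified illustration of a deterministic surrogate key: the same
// business key always hashes to the same value, so the per-row cost is
// one hash plus one key-store lookup, multiplied by the number of keys.
static Guid SupernaturalKey(params string[] businessKeyFields)
{
    var input = string.Join("|", businessKeyFields);
    using var md5 = MD5.Create();
    return new Guid(md5.ComputeHash(Encoding.UTF8.GetBytes(input)));
}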
Incremental Load in an ODX Instance
Relates to TimeXtender 6024.1 and later versions. The Subtract from value feature was released in TimeXtender 6024.1.

This article describes how to set up incremental load in an ODX instance; for more information on setting up incremental load in a data area within a Data Warehouse instance, see Incremental load in Data Warehouse Instances.

Setup Incremental Loading in an ODX Instance

When you have created a data source in the ODX, you have the option of setting up incremental load. In there you can add a rule with the following options. You can set it up for specific schemas, tables, or specific columns. Most important is the column it will look for. In the above, I look for the ModifiedDateTime field across all tables. Subtract from value is an option to subtract from the field your rule applies to. The ability to apply offset incremental selection rules can be used for data sources where the modified date is a Date field rather than a DateTime field. It can also be used for data
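In other words, the selection rule effectively becomes "value greater than or equal to the last seen maximum minus the offset". A sketch of that logic under the assumption of a one-day offset; this illustrates the described behaviour, not TimeXtender's generated SQL:

using System;

// Subtracting a day guards against a Date-only modified field, where
// changes made later on the same date would otherwise be missed.
DateTime lastMax = new DateTime(2023, 1, 31);  // highest value from the previous load
TimeSpan subtract = TimeSpan.FromDays(1);      // the "Subtract from value" offset
DateTime loadFrom = lastMax - subtract;
string where = $"WHERE [ModifiedDateTime] >= '{loadFrom:yyyy-MM-dd HH:mm:ss}'";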
Hello, I'm working on a project (new license model) where we want to use Azure Data Factory for data movement. I've installed the TX 6143.1 version but got the message "Data Factory source is out of date". Looking at the release notes, this version of TX doesn't support ADF data movement at the moment: "Warning: The new version does not support the data source providers that move data using Azure Data Factory (e.g. "Azure Data Factory - SQL Server (126.96.36.199) 64 bit")." I've decided to downgrade to TX version 6117, but I'm getting the message shown below (screenshot). It looks like there is a separate repository from each installation. How can I solve this and remove one of the repositories? I want to use ADF data movement, so the repository for version 6117 is needed. Thanks in advance! Vince, Victa B.V.
Hi, I have a list that maps product_codes to product_IDs. What I would like to create is a key store that, when fed a product_code from the list, produces the same product_ID, and when fed a new code (so not in the list) produces a supernatural key as usual. Is it possible to force the key store to create the IDs as shown above?
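Outside TimeXtender, the behaviour described is a lookup with a generate-on-miss fallback; one practical approach may be to pre-seed the key store's table with the existing code-to-ID pairs before the first execution, so known codes resolve to their old IDs and only new codes get generated keys. A sketch of the desired behaviour, with all names and values hypothetical:

using System;
using System.Collections.Generic;

// Known product codes map to their existing IDs; unknown codes fall
// back to a generated key from a range past all existing IDs.
var known = new Dictionary<string, int> { ["P-001"] = 10, ["P-002"] = 11 };
int next = 1000;

int Resolve(string productCode) =>
    known.TryGetValue(productCode, out var id) ? id : next++;

Console.WriteLine(Resolve("P-001")); // 10   (preserved mapping)
Console.WriteLine(Resolve("P-999")); // 1000 (newly generated)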
Connect to XML
Go through this guide before you start: Add a CData data source.

Contents: XML Information · Setup · With Elements · With Attributes · Getting the data · Advanced features · Automatically getting the correct data types · Parsing Hierarchical Data · Flatten Arrays

XML Information

XML files can be structured in different ways. Here is how an XML file structured in element form looks:

<?xml version="1.0" encoding="utf-8"?>
<top100rocksongs_billboard_2013>
  <t>
    <ID>1</ID>
    <Sequence>1</Sequence>
    <Song>STAIRWAY TO HEAVEN</Song>
    <Performer>Led Zeppelin</Performer>
  </t>
  <t>
    <ID>2</ID>
    <Sequence>2</Sequence>
    <Song>BOHEMIAN RHAPSODY</Song>
    <Performer>QUEEN</Performer>
  </t>
</top100rocksongs_billboard_2013>

Here is how an XML file structured in attribute form looks:

<top100rocksongs_billboard_2013>
  <t ID="1" Sequence="1" Song="STAIRWAY TO HEAVEN" Performer="Led Zeppelin" />
</top100rocksongs_billboard_2013>
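A small sketch of how the two shapes differ when read programmatically: element form reads child elements, attribute form reads attributes on the same node. The data is identical; only where the values live differs.

using System.Xml.Linq;

var elementForm   = XDocument.Parse("<r><t><Song>STAIRWAY TO HEAVEN</Song></t></r>");
var attributeForm = XDocument.Parse("<r><t Song=\"STAIRWAY TO HEAVEN\" /></r>");

string fromElements   = (string)elementForm.Root.Element("t").Element("Song");
string fromAttributes = (string)attributeForm.Root.Element("t").Attribute("Song");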
I'm trying to connect to MongoDB and load data into TimeXtender. I have an ODBC driver set up, and the test connection works. In TimeXtender I have a generic ODBC data source set up; the test connection works and Synchronize Data Source works as well. But when I try to execute the data transfer from MongoDB into TimeXtender, I get an error.
I have used the data export tool to export to .csv files. However, I cannot do it straight from the Business Unit. Is that a feature that could be added? Or maybe it's not included in the client license? The client wants a few tables from the source to be transferred to Azure Data Lake as CSV files and to do some transformation there.
Good afternoon, when I transfer my project from our DEV to our PROD environment, it takes over 25 minutes. I also have the same problem when opening the project in PROD to deploy and execute changes. Is there anything I can do to make these times more manageable? Thanks, Chuck
Manage Data Source ODX Providers on the Desktop
This guide describes how to manage data source ODX providers in TimeXtender Desktop. Open your ODX instance > right-click on Data Sources > Manage Data Source Providers. This shows the list of available providers. Add new providers using the Add button, or add specific versions of providers. The Update button will update an existing provider. Select multiple rows to apply a bulk update or delete. Reference: How to configure TimeXtender Portal and Desktop
Hi Support, Mount Anvil has an 'Incremental' project to run the incremental load of the finance system data. Changing the incremental execution rule on a table called G/L Budget Entry from the 'Modified At' field to the 'Last Date Modified' field is preventing the execution from running. The 'Modified At' field is in a date/time format while the 'Last Date Modified' field is a date-only format, which appears to be causing the execution failure. The error messages are below:

Finished executing project 'Incremental' Execution Package 'Update Project'
Execution failed
Start Time: 24/01/2023 16:31:05
End Time: 24/01/2023 16:32:33 on server: MAV01APP01
- Execute Execution Package Update Project 'Failed'
- Execute Business Units 'Failed'
- Execute Business Unit Business Unit 'Failed'
- 'One or more errors occurred.'
- Execute JetBCStage_I 'Failed'
- 'One or more errors occurred.'
- Execute Table JetBCStage_I TEST.BC_G/L Entry (17) 'Successful'
- Execute Table JetBCStage_I TEST.RowCountGL 'Successful
Prerequisites

Ensure that the data source has been added in the TimeXtender Portal, and is mapped to an ODX instance, before attempting to add the data source in TimeXtender Desktop.

Adding a Data Source in TimeXtender Desktop

This guide describes how to add a new data source in TimeXtender Desktop. Double-click on your ODX instance. Right-click "Data Sources" > Add Data Source. Enter the Name and Description of the data source. Select the Connection from the available connections you have configured in TimeXtender Portal. Go through Table selection and define a Transfer task if needed. For additional details, refer to Tasks in an ODX Instance.

On-Demand Data Warehouse Ingestion

When the 'Data on demand' option is enabled (in TX Desktop Data Source > Advanced Settings), the data source will transfer data into ODX storage before the MDW ingests the data. This works without configuring an explicit Transfer task under the data source. Right-click on your data source > Edit. Click on Advanced Settings
Hi, I have the following setup in my Dev/Prod environments: ODX shared, all on my live environment; DSA/MDW separate for each environment. A daily refresh starts with the ODX and on success moves to the DSA. So far so good. On the ODX step, there's a usage condition for Environment (project variable) = 'Prod'. I added this because I don't want to start two refreshes of the same ODX from both Prod and Dev. However, the Dev environment goes straight into the DSA refresh because the ODX step 'starts' and finishes instantly, meaning the DSA only gets a few thousand rows from the still-refreshing ODX (the largest table has 2000 rows where I'd expect 2 million+). Is my assumption correct in using the usage condition? How can I use the same execution packages for Dev/Prod while still using the same ODX? Thanks for any help!
There's loads of documentation about AAS and Power BI Premium tabular models being mostly identical in use. Currently we are using two expensive AAS servers to host our development and testing semantic models. Transferring these two environments to a PPU environment within Power BI would save a lot of money. Within the current version of TimeXtender there is a separate option to select a Premium tabular model instead of AAS. This is something the legacy version does not have. But why would TimeXtender need to know the difference? In working with either Premium or AAS, their XMLA connection is exactly the same. Setting up a migration from AAS to Premium and then simply exchanging the links within the environment properties seems like a simple enough plan. As I found in the following link, the important part would be having an updated client library to do this data transfer. https://learn.microsoft.com/en-us/analysis-services/client-libraries?view=azure-analysis-services-current Other than this I don't see
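For what it's worth, both endpoints are reachable with the same Tabular Object Model client library; at this level only the Data Source scheme differs. A sketch with placeholder server and workspace names:

using Microsoft.AnalysisServices.Tabular;

// AAS and Power BI Premium expose the same XMLA surface; the connection
// string scheme is the only difference here.
var aas     = "Data Source=asazure://westeurope.asazure.windows.net/myserver";
var premium = "Data Source=powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace";

var server = new Server();
server.Connect(premium); // swap in aas - the TOM code that follows is identical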
I need to be able to call an API using a reference number from the ERP system (which is already in the ODX) as part of the URL. It looks like the dynamic parameters as described on the old support site could work. However, it is stated that this will only work in the old-fashioned business unit. Is there a solution that would work in the ODX, or would a Power Automate/Logic App be the only solution?
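If an external component ends up being the answer, the pattern itself is small; a sketch of the dynamic-URL loop, assuming the reference numbers can be read from ODX storage and using a placeholder endpoint:

using System.Collections.Generic;
using System.Net.Http;

var http = new HttpClient();
// In practice these would be queried from the ODX storage.
var referenceNumbers = new List<string> { "1001", "1002" };

foreach (var refNo in referenceNumbers)
{
    // One call per ERP reference number, injected into the URL path.
    var json = await http.GetStringAsync($"https://api.example.com/orders/{refNo}");
    // persist json somewhere the ODX can ingest it
}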
I'm encountering a truncation error when synchronizing a Dynamics AX adapter data source in my Business Unit.
We have an upgraded MS Dynamics 365 environment, from which we publish to an Azure SQL database. When I attempt to attach an existing project to it to synchronize the data source (using a SQL data source connector, as the project normally does), the test connection succeeds, but after building the selection tree TimeXtender throws the error "The given value of type String from the data source cannot be converted to type nvarchar of the specified target column. String or binary data would be truncated." Since it isn't yet part of project execution but rather a TimeXtender comparison operation, I'm somewhat befuddled as to how to proceed. I'm not sure if this belongs in data sources or desktop, so I took a guess on placement.
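That message is the standard ADO.NET bulk-copy error raised when an incoming string exceeds the target nvarchar(n) length; during a synchronize/comparison step that would point at some value (often an object name or description) being longer than the column that stores it. A minimal reproduction sketch with hypothetical table and column names:

using System.Data;
using System.Data.SqlClient;

// Writing a 15-character string into an nvarchar(10) column raises:
// "The given value of type String from the data source cannot be
// converted to type nvarchar of the specified target column."
var table = new DataTable();
table.Columns.Add("Name", typeof(string));
table.Rows.Add("ABCDEFGHIJKLMNO"); // 15 chars, target is nvarchar(10)

using (var conn = new SqlConnection("Server=.;Database=Repo;Integrated Security=true"))
using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.Target" })
{
    conn.Open();
    bulk.WriteToServer(table);
}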
Relates to TimeXtender 6024.1 and later versions. The On-Demand Data Warehouse Ingestion feature was released in TimeXtender 6134.1.

Contents: What are Jobs? · Execution Packages · Adding Execution Packages in Data Warehouse and Semantic Model Instances · Configuring an Execution Package · Jobs · Adding a Job · Set up a Job for a Data Warehouse or Semantic Model Instance · Set up a Job for an ODX Instance · Edit and Delete Jobs and Schedules · Adding a Notification for an Execution Package · Adding a Prioritization to an Execution Package · Adding a Usage Condition to an Execution Package · On-Demand Data Warehouse Ingestion

What are Jobs?

Jobs are used to schedule executions for ODX, Data Warehouse, and Semantic Model instances. For ODX instances, jobs are used to schedule synchronization, transfer, and storage management tasks. For Data Warehouse and Semantic Model instances, jobs are used to schedule execution packages.

Execution Packages

Adding Execution Packages in Data Warehouse and Semantic Model Instances

Bef
This guide will walk you through the steps to configure an on-prem server. If you are interested in a cloud-based deployment, or to continue with the remaining configuration steps, please navigate to the overall guide: Configure your TimeXtender Environment. Follow the steps below to configure an on-prem server.

Choose and Deploy the Desired Server Configuration

You have multiple options regarding server configuration, depending on where you wish to house the TimeXtender application, services, and SQL Server databases.

Consolidated BI Server: TimeXtender services and SQL Server databases are all installed on the same Windows server. This setup is easier to maintain and offers some performance advantages due to zero network latency.

Separate Application and Database Servers: TimeXtender services are installed on an "Application" server and the SQL databases are stored on a separate "Database" server. This setup can mitigate risk by ensuring additional applications aren't installed and users