
Community

Community Categories

  • TimeXtender Desktop Q&A: 162 topics, 453 replies
  • TimeXtender Portal Q&A: 0 topics, 0 replies
  • Data Sources Q&A: 22 topics, 100 replies
  • General Q&A: 51 topics, 148 replies
  • Learn & Share: 29 topics, 33 replies
  • Community Announcements: 5 topics, 5 replies

reiols (Visitor)
 REST APIs

Error when retrieving data

Hello, I'm using the CData ADO.NET Provider for JSON 2021 and get this error message in the log file:

[HTTP|Res: 29] HTTP/1.1 200 OK, 548950 Bytes Transferred
[HTTP|Res: 29] Request completed in 6828 ms.
[EXEC|Page  ] Page successful: 1000 results (6844 ms) NextPage: [30000]
[EXEC|Messag] Executed query: [SELECT [Actualizations_id], [ActualizationType.Code],
[META|Schema] Engine Invalid object name 'sys_resultset_close'
[MDUL|SbMDUL] Drop BulkRows table: ErrorInfo
[INFO|Connec] Executed sys_disconnect: Success: (16 ms)
[INFO|Connec] Closed JSON connection

The execution log file reports:

Executing table [JSON].[LCSOC_EC_Actualizations]: failed with error:
System.Data.SqlClient.SqlException (0x80131904): Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding. ---> System.ComponentModel.Win32Exception (0x80004005): The wait operation timed out

What does it mean, and what can I do about it?

Cheers,
Reith

1 · Community Manager · 11 hours ago
petsche (Starter)
 ODBC

Configuration of Netsuite provided odbc connector

Hi to all, in order to improve the performance of data loading from NetSuite, we were advised to use the ODBC connector provided by NetSuite and use SuiteAnalytics. I have downloaded the ODBC connector provided by NetSuite and installed it on the TimeXtender VM, and I get a successful connection to NetSuite using the ODBC test connection. However, I cannot access the ODBC driver in TimeXtender ODX, so how do I do that? Keep in mind that the NetSuite CData connector uses REST calls limited to 1000 records, so that is not feasible. Thank you in advance, Peter

9 · Christian Hauggaard (Community Manager) · 20 hours ago
Syed Yousuf (Community Manager)
 Data Sources

Data Sources

The ODX Server can connect to a data source through the following types of providers:

  • TimeXtender's own providers (with tweaks and improvements). This is usually the best choice if one is available for your data source.
  • CData: For setting up CData providers, please refer to Add a CData Data Source.
  • ADF: Providers from TimeXtender that use Azure Data Factory for transferring data.
  • ADO: Providers installed on the machine that are built on ADO.NET.
  • OLE DB: Providers installed on the machine that support the OLE DB standard.

Adding a Data Source in TimeXtender Portal: Follow the steps in Add and map a data source connection. Under Connection settings, enter the connection information required by the data source you've selected. The content of the page depends on the provider you have chosen. Below is an example of the TimeXtender SQL Server provider.

Configuring a Data Source in TimeXtender Desktop: To add a new data source, follow the steps in Configuring a data source on the desktop.

Synchronizing Obj

100 · Community Manager · 1 day ago
hugo.winkelhorst (Explorer)
 ODX

Amazon Redshift Connector SSL mode drop down missing options

For one of our clients I need to connect to a database on Amazon Redshift. The IT vendor who made the system is telling us to use SSL mode Required (https://docs.aws.amazon.com/redshift/latest/mgmt/connecting-ssl-support.html#connect-using-ssl). Using this setting in the ODBC connector from Amazon itself works fine, but the CData connector only offers a true/false dropdown. I've checked whether this was perhaps one of those fields where you can override the dropdown with manual input, but that's not the case for this one. Is there any way to get this working in the CData connector, or would our only option for now be to create a DSN and then use the ADO ODBC option in the ODX to connect with the DSN?

4 · Community Manager · 1 day ago
EvidaXrem (Starter)
 REST APIs

Connecting to Authenticated REST Endpoint

I have been trying for quite a while to connect to a REST endpoint requiring authentication. I have gone through the CData documentation and various resources on the support site. I have working setups in both Postman and an SSIS Script Task, but I am very keen on migrating to a more native TX setup (maintainability), and will be migrating several more in the time to come. The process is as follows:

1. Get an authentication token (HTTP POST). The POSTed information is sent as 'x-www-form-urlencoded' in the body (Postman). The response is a JSON object containing the token in an attribute.
2. Get the actual data using the authentication token (HTTP GET with the token as a URL parameter).

The source in this case is ArcGIS, and the documentation does not mention anything about a standardized authentication scheme. Is there a way around implementing this in any of the standardized authentication schemes, or do I need to go in the direction of RSD files? Please advise. Thanks in advance.
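The two-step flow described above (a form-encoded token POST, then a GET with the token as a URL parameter) can be sketched in plain Python. The endpoint URLs, credential field names, and the "token" attribute are placeholders for illustration, not the actual ArcGIS values:

```python
import json
import urllib.parse
import urllib.request

# Hypothetical endpoints; substitute the real ArcGIS URLs.
TOKEN_URL = "https://example.com/arcgis/tokens/generateToken"
DATA_URL = "https://example.com/arcgis/rest/services/query"

def build_token_request(url, username, password):
    """Build the x-www-form-urlencoded POST request that asks for a token."""
    body = urllib.parse.urlencode({
        "username": username,
        "password": password,
        "f": "json",  # ask the endpoint for a JSON response
    }).encode("ascii")
    return urllib.request.Request(
        url, data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

def extract_token(response_body, attribute="token"):
    """Pull the token attribute out of the JSON response body."""
    return json.loads(response_body)[attribute]

def build_data_url(url, token, params=None):
    """Append the token (and any other query parameters) to the data URL."""
    query = dict(params or {})
    query["token"] = token
    return url + "?" + urllib.parse.urlencode(query)

# The live flow would then be:
#   with urllib.request.urlopen(build_token_request(TOKEN_URL, user, pw)) as r:
#       token = extract_token(r.read())
#   data = urllib.request.urlopen(build_data_url(DATA_URL, token)).read()
```

For example, `extract_token('{"token": "abc123"}')` returns `"abc123"`, which `build_data_url` then places into the query string as `token=abc123`.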

7 · Thomas Lind (Community Manager) · 1 day ago
alain.thire (Explorer)
 Data Warehouse

Loading transactions

I have a transaction table in the ODX with the following fields (simplified): Key, Value, TransactionDate. Example:

100, V100, 2023-01-01
100, V100A, 2023-01-31 09
100, V100B, 2023-01-31 10

Each day multiple transactions are added, of course with another transaction date. Note that in the example I have only shown the date and hour part (not minutes and seconds). What I need is a table with the latest update in the DSA. I created the same table, set the primary key to Key, and added an incremental selection rule on TransactionDate. I added history settings to update the Value and TransactionDate based on the key. Where it goes wrong is if we get multiple transactions for one day:

100, V100A, 2023-01-31 09
100, V100B, 2023-01-31 10

I traced the cleansing procedure, and TX detects that the same key occurs twice. So far so good. Next it puts the ID of the latest transaction into <table>_L and only processes the IDs that are not in <table>_L. The result is that I get 100, V100A, 2023-01-31 09, which is incorrect.
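The intended outcome, keep only the latest row per key, can be sketched outside TimeXtender in a few lines of Python, using the sample rows from the post:

```python
from datetime import datetime

# Sample rows mirroring the (simplified) transaction table from the post.
rows = [
    ("100", "V100",  datetime(2023, 1, 1)),
    ("100", "V100A", datetime(2023, 1, 31, 9)),
    ("100", "V100B", datetime(2023, 1, 31, 10)),
]

def latest_per_key(rows):
    """Keep only the row with the greatest transaction date per key."""
    latest = {}
    for key, value, tx_date in rows:
        if key not in latest or tx_date > latest[key][2]:
            latest[key] = (key, value, tx_date)
    return list(latest.values())
```

For the sample data this keeps (100, V100B, 2023-01-31 10:00), i.e. the last transaction of the day, which is the result the history settings were expected to produce.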

11 · Explorer · 1 day ago
daniel (Contributor)
 Data Warehouse

Supernatural keys & Data Cleansing performance

Dear Community, I like to build my data estates with supernatural keys, but on larger datasets the data cleansing starts to take very, very long. Do you happen to have the same issues? Is there a way to make the supernatural keys load faster? Even with incremental loading it begins to be super slow. I ran a test on 435,397 records on an Azure SQL database with 10 vCores:

1. A full load on the table with 7 supernatural keys.
2. A full load on the same table without the supernatural keys.

1 has a data cleansing time of 1 second; 2 has a data cleansing time of 104 seconds!

Second, I did a test on the same tables but now with incremental loads:

1. An incremental load with 7 supernatural keys.
2. An incremental load without supernatural keys.

1 has a data cleansing time of 1.6 seconds and 2 a data cleansing time of a whopping 129 seconds!

I'm not so sure I want to keep using the supernatural keys. What do you guys do? Take care, Daniel
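For readers unfamiliar with the term: a supernatural (durable) key is, in general, a surrogate key derived deterministically from business-key fields, typically via a hash, and the per-row hashing and key-store lookups are what add cost to cleansing. A generic sketch of the idea follows; this is not TimeXtender's actual key-store algorithm, just the common hash-key pattern:

```python
import hashlib

def supernatural_key(*business_key_fields, separator="|"):
    """Derive a deterministic surrogate key by hashing the concatenated
    business-key fields. Same input always yields the same key, so the
    key survives reloads (hence 'durable'/'supernatural')."""
    raw = separator.join(str(f) for f in business_key_fields)
    return hashlib.sha1(raw.encode("utf-8")).hexdigest()

# Same business key, same surrogate key, across any number of loads:
k1 = supernatural_key("CUST-100", "NL")
k2 = supernatural_key("CUST-100", "NL")
```

Because the key is a pure function of the input, it needs no lookup table to stay stable, but computing it for every row on every load is the overhead the post is measuring.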

2 · rory.smith (Contributor) · 1 day ago
Thomas Lind (Community Manager)
 ODX

Incremental Load in an ODX Instance

Relates to TimeXtender 6024.1 and later versions. The subtract from value feature was released in TimeXtender 6024.1. This article describes how to set up incremental load in an ODX instance. For more information on setting up incremental load in a data area within a Data Warehouse instance, see Incremental load in Data Warehouse Instances.

Setup Incremental Loading in an ODX Instance: When you have created a data source in the ODX, you have the option of setting up incremental load. In there you can add a rule with the following options. You can set it up for specific schemas, tables, or specific columns. Most important is the column it will look for. In the above, I look for the ModifiedDateTime field across all tables. The Subtract from value option subtracts an offset from the field your rule applies to. The ability to apply offset incremental selection rules can be used for data sources where the modified date is a Date field, rather than a DateTime field. It can also be used for data
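The effect of an incremental rule with a subtract-from-value offset can be sketched as a simple filter: keep the rows whose modified value is later than the stored high-water mark minus the offset. The sample data and column name below are illustrative, not taken from a real source:

```python
from datetime import datetime, timedelta

# Illustrative rows with a ModifiedDateTime incremental column.
rows = [
    {"key": 1, "ModifiedDateTime": datetime(2023, 1, 30)},
    {"key": 2, "ModifiedDateTime": datetime(2023, 1, 31)},
    {"key": 3, "ModifiedDateTime": datetime(2023, 2, 1)},
]

def incremental_slice(rows, last_loaded, subtract=timedelta(days=1)):
    """Select rows changed after the stored high-water mark, widened by an
    offset. Re-reading this safety window is what makes the rule usable
    when the source column is a Date (no time part) rather than a DateTime."""
    cutoff = last_loaded - subtract
    return [r for r in rows if r["ModifiedDateTime"] > cutoff]
```

With a high-water mark of 2023-01-31 and a one-day offset, rows 2 and 3 are re-read; with no offset, only row 3 would be selected, and same-day changes to row 2 could be missed.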

1203 · rvgfox (Participant) · 1 day ago
vw-victa (Starter)
 ODX

Incorrect Job Repository: Expected 1 found 2, after downgrade

Hello, I'm working on a project (new license model) where we want to use Azure Data Factory for data movement. I've installed the TX 6143.1 version but got the message "Data Factory source is out of date". Looking at the release notes, this version of TX doesn't support ADF data movement at the moment: "Warning: The new version does not support the data source providers that move data using Azure Data Factory (e.g. "Azure Data Factory - SQL Server (11.0.0.0) 64 bit")." I've decided to downgrade to TX version 6117, but I'm getting the message shown below (screenshot). It looks like there is a repository from each of the two installations. How can I solve this and remove one of the repositories? I want to use ADF data movement, so the repository for version 6117 is needed. Thanks in advance! Vince, Victa B.V.

8 · Starter · 1 day ago
sierd.zuiderveld (Starter)
 Data Warehouse

Force Supernatural key value

Hi, I have a list that maps product_codes to product_IDs. What I would like to create is a key store that, when fed a product_code from the list, produces the matching product_ID, and when fed a new code (one not in the list) produces a supernatural key as usual. Is it possible to force the key store to create the IDs as shown above?
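Conceptually, what is being asked for is a key store pre-seeded with the known product_code to product_ID mappings, falling back to a generated key for unseen codes. A hypothetical Python sketch of that behaviour (not how TimeXtender's key store is configured internally):

```python
import hashlib

class SeededKeyStore:
    """A key store pre-seeded with forced product_code -> product_ID pairs.
    Unknown codes fall back to a generated hash-based key, and the generated
    key is remembered so the same code always maps to the same key."""

    def __init__(self, forced):
        self.mapping = dict(forced)

    def key_for(self, code):
        if code not in self.mapping:
            # Deterministic fallback key for codes not in the seed list.
            self.mapping[code] = hashlib.sha1(code.encode("utf-8")).hexdigest()[:8]
        return self.mapping[code]

# Seed with the known mappings from the list (values are illustrative):
store = SeededKeyStore({"A-100": 1, "A-200": 2})
```

`store.key_for("A-100")` returns the forced ID 1, while a new code gets a generated key that stays stable on repeated lookups.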

2 · Christian Hauggaard (Community Manager) · 3 days ago
Peter Dijkstra (Starter)
 Data Warehouse

Data selection rule is not working

If I try to add a data selection rule, my tables stay empty. I already tried the selection rule on the table, both with and without ' '. If I do the same in the query tool, it looks good. Is it a bug? Using version 6143.1.

3 · Thomas Lind (Community Manager) · 3 days ago
Kashif (Visitor)
 Jobs & Executions

Incremental Load Execution Error Due To Change Of Filter

Hi Support, Mount Anvil have an Incremental project to run the incremental load of the finance system data. Changing the incremental execution rule on a table called G/L Budget Entry from the 'Modified At' field to the 'Last Date Modified' field is preventing the execution from running. The 'Modified At' field is in a date/time format while the 'Last Date Modified' field is a date-only format, which appears to be causing the execution failure. The error messages are below:

Finished executing project 'Incremental' Execution Package 'Update Project'
Execution failed
Start Time: 24/01/2023 16:31:05
End Time: 24/01/2023 16:32:33 on server: MAV01APP01
- Execute Execution Package Update Project 'Failed'
- Execute Business Units 'Failed'
- Execute Business Unit Business Unit 'Failed'
- 'One or more errors occurred.'
- Execute JetBCStage_I 'Failed'
- 'One or more errors occurred.'
- Execute Table JetBCStage_I TEST.BC_G/L Entry (17) 'Successful'
- Execute Table JetBCStage_I TEST.RowCountGL 'Su
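The suspected mismatch, a date-only column compared against a date-time high-water mark, can be illustrated in Python: a date-only value coerces to midnight, so a strict greater-than comparison skips rows changed later the same day, while widening the comparison window (in the spirit of a subtract-from-value offset) picks them up again. Field and variable names are illustrative:

```python
from datetime import datetime, date, timedelta

# A row whose incremental column is date-only (no time part).
row = {"key": 1, "LastDateModified": date(2023, 1, 24)}

# High-water mark stored as a full date-time from the previous run.
last_loaded = datetime(2023, 1, 24, 12, 0)

# The date-only value coerces to midnight of that day:
as_datetime = datetime.combine(row["LastDateModified"], datetime.min.time())

# Strict '>' against the date-time mark silently skips the row,
# even though it changed on (or after) the day of the last load:
missed = as_datetime <= last_loaded      # True: row is NOT selected

# Widening the window by one day re-reads the whole day safely:
cutoff = last_loaded - timedelta(days=1)
selected = as_datetime > cutoff          # True: row IS selected again
```

This is why switching an incremental rule from a date-time column to a date-only column can make a previously working selection drop rows.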

1 · Community Manager · 4 days ago
LuukNTS (Visitor)
 Legacy & Upgrades

Using PowerBI Premium Tabular for TimeXtender Legacy instead of AAS

There's loads of documentation about AAS and Power BI Premium tabular models being mostly identical in use. Currently we are using 2 expensive AAS servers to host our development and testing semantic models. Transferring these 2 environments to a PPU environment within Power BI would save a lot of money. Within the current version of TimeXtender there is a separate option to select a Premium tabular model instead of AAS. This is something the legacy version does not have. But why would TimeXtender need to know the difference? When working with either Premium or AAS, the XMLA connection is exactly the same. Setting up a migration from AAS to Premium and then simply exchanging the links within the environment properties seems like a simple enough plan. As I found in the following link, the important part would be having an updated client library to do this data transfer: https://learn.microsoft.com/en-us/analysis-services/client-libraries?view=azure-analysis-services-current. Other than this I don't see

2 · Trine Stuhr (Employee) · 4 days ago
hugo.winkelhorst (Explorer)
 ODX

Using a field from an ERP to use as a Parameter for a REST call

I need to be able to call an API using a reference number from the ERP system (which is already in the ODX) as part of the URL. It looks like the dynamic parameters as described on the old support site could work. However, it is stated that this will only work in the old-fashioned business unit. Is there a solution that would work in the ODX, or would a Power Automate/Logic App be the only solution?

1 · Thomas Lind (Community Manager) · 4 days ago
rvgfox (Participant)
 Data Warehouse

Setup Pre and Post script

The setup phases are confusing, can anyone explain where exactly they run? The pre script runs before the records are inserted into the valid table, which is correct in my case, but the post script, which should run after the records are inserted, doesn't run. My script is:

3 · Christian Hauggaard (Community Manager) · 4 days ago

TimeXtender Desktop Q&A

Ask questions and find answers about the TimeXtender Desktop Application

  • ODX: 40 topics, 110 replies
  • Data Warehouse: 59 topics, 190 replies
  • Semantic Models: 16 topics, 29 replies
  • Jobs & Executions: 47 topics, 124 replies

TimeXtender Portal Q&A

Ask questions about the TimeXtender Portal

  • General: 0 topics, 0 replies

Data Sources Q&A

Ask questions about setting up data sources

  • General: 2 topics, 11 replies
  • REST APIs: 3 topics, 17 replies
  • RSD File Customization: 10 topics, 51 replies
  • SAP: 1 topic, 1 reply
  • Oracle: 3 topics, 7 replies
  • ODBC: 3 topics, 13 replies

General Q&A

Ask general questions

  • General: 12 topics, 26 replies
  • Interoperability with other Platforms & Tools: 4 topics, 13 replies
  • Legacy & Upgrades: 34 topics, 107 replies
  • Security & Compliance: 0 topics, 0 replies
  • Infrastructure & Networking: 1 topic, 2 replies

Learn & Share

Share tips and best practices, show what you've created and inspire others

  • Tips, Tricks and Best Practices: 0 topics, 0 replies
  • TimeXtender Tuesdays: 29 topics, 33 replies

Community Announcements

Everything you need to know about this community

  • Getting started in the community: 5 topics, 5 replies


Badges

  • Stefan has earned the badge Certified Solutions Architect
  • Trine Stuhr has earned the badge Certified Solutions Architect
  • Christian Hauggaard has earned the badge Data Warehouse Expert
  • rvgfox has earned the badge Innovator
  • cbutcher has earned the badge Certified Solutions Architect

Need more help?

Troubleshooting Tips

Learn about troubleshooting techniques

Find a Partner

Find a Partner that fits your needs!

Contact Support

Submit a ticket to our Support Team
