Ask questions about TimeXtender Data Warehouse Instances
- 139 Topics
- 531 Replies
I’m testing out the new version of TimeXtender. I’ve downloaded the newest version of SQL Server that was suggested in the training material. I’ve created an ODX and an MDW instance, but when I try to deploy a table in the MDW instance I get the following error: Not supported in current SQL version. Version: Microsoft SQL Server 2022 (RTM) - 16.0.1000.6 (X64) Oct 8 2022 05:58:25 Copyright (C) 2022 Microsoft Corporation Developer Edition (64-bit) on Windows 10 Enterprise 10.0 <X64> (Build 22621: ) (Hypervisor)
Hey, I face this issue quite frequently and have not found out how to solve it yet, so I’m hoping someone on here can tell me how to do this. I have a REST data source in which I make a selection of columns. I drag that table into the DSA layer, rename some columns and voila, I have my table. Now I’ve found out that I need some additional columns from the data source, so I go into my ODX and include those columns in the table selection as well. The transfer is successful and I have my data with the additional columns. Now here comes my problem: I want to sync my table with the data source and add those additional columns to my existing table. When I right-click-drag the table into the DSA layer I can sync, smart sync, and sync only with existing fields. The first option adds all columns to the bottom of my table (including the ones I already have). The other two options only sync the columns that I already have… Hope that someone can help me out, as I face this more often in development.
Hi TX Community! I get data from an external database every day. Since we aim to load the extracted data incrementally, we are using some query tables on the data source to create an incremental-load key. In doing so we are experiencing two issues: 1- TX cannot read the date formats that are extracted from the database. This is the format that we get out: Of course, we can right-click on the field and edit the data type, but we would have to do that every time we synchronize the data source, because every time we synchronize, all the date fields revert to the “unknown” format and we would have to right-click on each field and edit the data type again. We’ve tried to use the “Data type overrides”, but it doesn’t seem that we can convert from an “unknown” format. How can we solve this problem? As mentioned, the tables are query tables, so we would like to think the date formatting could be solved with a CAST or a CONVERT function. Any ideas? 2- In
Hello, I currently have a column in one of my tables containing the following file name: C:\Users\John\OneDrive - Sales Solutions\Desktop\TimeXtender\TMX-DataSamples\RLZ\BESTERS - BESTERS POINT - BESTERS IND_DEC 22_101602_0.txt. Is there a way to remove the path in the column and keep only the file name? I have a file for each month and have merged the files so all the data is in one table. For example, remove this piece: C:\Users\John\OneDrive - Sales Solutions\Desktop\TimeXtender\TMX-DataSamples\RLZ\ and only show this piece: BESTERS - BESTERS POINT - BESTERS IND_DEC 22 as the file name. I also need the month and year (DEC 22) of each file copied into a new column called “Date”. Is this at all possible? Thank you
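A sketch of one possible answer in T-SQL, assuming every value follows the same pattern `<path>\<name>_<MMM YY>_<number>_<n>.txt` as the example above. The variable is only for illustration; in TimeXtender the same expressions would go in a custom field or field transformation.

```sql
-- Hypothetical sample value, copied from the question.
DECLARE @p nvarchar(400) =
    N'C:\Users\John\OneDrive - Sales Solutions\Desktop\TimeXtender\TMX-DataSamples\RLZ\BESTERS - BESTERS POINT - BESTERS IND_DEC 22_101602_0.txt';

-- 1) Strip the directory: keep everything after the last backslash.
DECLARE @f nvarchar(400) = RIGHT(@p, CHARINDEX('\', REVERSE(@p)) - 1);

-- 2) Locate the first two underscores; the month-year sits between them.
DECLARE @u1 int = CHARINDEX('_', @f);
DECLARE @u2 int = CHARINDEX('_', @f, @u1 + 1);

SELECT LEFT(@f, @u2 - 1)                     AS [FileName],  -- ...BESTERS IND_DEC 22
       SUBSTRING(@f, @u1 + 1, @u2 - @u1 - 1) AS [Date];      -- DEC 22
```

This only works if the base file name contains exactly one underscore before the month-year segment; with a different naming pattern the underscore positions would need to be counted from the right instead.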
How can I solve extension _R tables failing to create? I am working on an old project now. This project was copied from another project repository.
Environment: Sandbox
Version: 18.104.22.168
Data Source: Business unit
Provider: SQL Server Data Source
The connection with the data source is good and it synchronizes well. When I deploy the tables to ODX storage, I get an error: An error occurred during create a table. See exception details for the failing object: Create failed for Table 'BSA.BSA_dbo_Inspection_R'. An exception occurred while executing a Transact-SQL statement or batch. The specified schema name "BSA" either does not exist or you do not have permission to use it.
Did anyone try to connect to an Azure SQL database using Azure AD Integrated Authentication? I noticed MFA was missing as an auth type, and AD Integrated was failing with an error. Let me know if anyone has resolved this error: One or more errors occurred. Could not discover endpoint for Integrate Windows Authentication. Check your ADFS settings. It should support Integrate Widows Authentication for WS-Trust 1.3 or WS-Trust 2005. Details: Could not discover endpoint for Integrate Windows Authentication. Check your ADFS settings. It should support Integrate Widows Authentication for WS-Trust 1.3 or WS-Trust 2005.
The clean procedures generated by TX seem to have RECOMPILE embedded in the procedure itself as a general rule. I think this is to keep them fresh against other changes that get introduced in a project. But do we know the exact reason why, and is there a way to remove it other than making a customized change table by table?
Dear Support, the reload in TimeXtender is giving the error: An item with the same key has already been added. It seems that one column is mapped to two different columns in the same table. Is there a quick way to find the column which is causing this error? Thanks in advance! Christian
I have 3 tables in my src data warehouse:
- src.SalesOrderItemsDelta: this table gets filled every day with a delta (changes today vs yesterday) of our order lines.
- src.SalesOrderItems: this table gets filled every Saturday night with all the order lines available.
- src.preFactSalesOrderItems: this table gets filled via a custom table insert with the following statement:
SELECT [SalesOrderItemID], [SapClient], [SalesOrderNumber], [SalesOrderItemNumber], [SalesOrganization], [DistributionChannel], [Division], [FaboryArticleNumber], [SoldToCustomerCode], [BinCode], [CreatedOnDate], [CreatedOnDateID], [ChangedOnDate], [ReasonForRejectionCode], [PromisedDeliveryDate], [PromisedDeliveryDateID], [CommunicatedDeliveryDate], [CommunicatedDeliveryDateID], [PlantCode], [SalesAmount], [SalesCurrency], [ItemCategoryCode], [OrderedQuantity], [BinQuantity], [SalesOrderCategoryCode], [SalesOfficeCode], [ConfirmedDeliveryDate], [ConfirmedDeliveryDateID], [CommittedDeliveryDate], [CommittedD
How can I conditionally fill down/flash fill NULL values with previous values based on certain criteria?
I would like to be able to fill down NULL values in my DSA table under certain conditions. In the table below I have multiple NULL values. Take for example the column ‘CardCode DUAL’. Row 2 with Company key MTW and project 1201121979 shows DB0006 for CardCode DUAL. I would like to show the value DB0006 for all other rows where company key = MTW and project = 1201121979. Same for the Route Bron column: I would like to fill down NULL values with the most recent non-blank value for that Company_Key + Project combination. I think it should be possible with a self join or self select, but I’m not sure how.
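One common fill-down pattern in T-SQL uses a running count to group each NULL row with the last non-NULL row above it. A sketch, assuming hypothetical table name dsa.[MyTable] and a hypothetical [SortOrder] column that defines what "most recent" means in this data:

```sql
-- Fill NULLs in [CardCode DUAL] with the most recent non-NULL value
-- per Company_Key + Project combination.
;WITH grp AS (
    SELECT *,
           -- COUNT(col) ignores NULLs, so this running count only increases
           -- on rows where [CardCode DUAL] is present; NULL rows therefore
           -- inherit the group number of the last non-NULL row above them.
           COUNT([CardCode DUAL]) OVER (PARTITION BY [Company_Key], [Project]
                                        ORDER BY [SortOrder]
                                        ROWS UNBOUNDED PRECEDING) AS grp_id
    FROM dsa.[MyTable]
)
SELECT *,
       -- Each group contains at most one non-NULL value, so MAX returns it.
       MAX([CardCode DUAL]) OVER (PARTITION BY [Company_Key], [Project], grp_id)
           AS [CardCode DUAL filled]
FROM grp;
```

On SQL Server 2022 the same result can be had more directly with `LAST_VALUE([CardCode DUAL]) IGNORE NULLS OVER (...)`; the running-count trick works on older versions too.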
Hi community, we are facing the problem that we create duplicates when we bring data from an ODX API source to the MDW. We are working with an overlapping sliding window of two days in the schema file (because data can change and there is no last-modified date) and we only set the primary key on the ODX source. In the MDW (dedicated SQL pool) we enabled history and set the ID as the natural key. All fields are marked as type 1 fields. The execution gives us duplicated ID values - no updates are made. Thanks for your help, Michael
We are using a Business Unit to land our data in the DW. We have a few columns where we need to obfuscate some of the data. At what point should a script action be placed to update the field in the raw table prior to the data being moved into the valid table, so our script does not need to include an update to the field in both the raw and valid tables? The script uses UPDATE table SET column
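For what it's worth, a minimal sketch of such a script, assuming hypothetical table/column names and TimeXtender's raw-table naming convention (a `_R` suffix); the exact step the script action should be attached to so it runs after transfer but before data cleansing may differ per version:

```sql
-- Overwrite the sensitive column in the raw (_R) table before cleansing
-- copies rows into the valid (_V) table. [Customer_R] and [Email] are
-- hypothetical names for illustration only.
UPDATE dbo.[Customer_R]
SET [Email] = CONVERT(varchar(64), HASHBYTES('SHA2_256', [Email]), 2)
WHERE [Email] IS NOT NULL;
```

Hashing in the raw table means the cleansed, valid table never sees the original value, so no second UPDATE is needed there.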
Hi, I get a deployment error when I try to make my table historic: Computed column 'SCD Surrogate Hash Key' in table '<mytable>' cannot be persisted because the column is non-deterministic. I think this has to do with the excessive number of columns in the table: 695 columns, all varchar(2000). What confuses me is that when I set the hashing algorithm to debug, I still get the same error. I don't have any transformations on this table; it's a straight copy from the ODX. Is there a limit on the number of fields a history table can have?
Hello, we have a data source that contains several additional/stacked connections in our Business Unit. We now wanted to add the 4th additional connection (so 5 data connections in total), but TimeXtender crashes when clicking OK after configuring the connection. I think it tries to connect to the data source and get metadata for the tables it needs to select, but somehow this fails. I found this in the Event Viewer; the errors are as follows:
1. .NET = System.OutOfMemoryException
2. Application Error = Faulting application path: C:\Program Files\TimeXtender\TimeXtender 22.214.171.124\timeXtender.exe; Faulting module path: C:\Windows\System32\KERNELBASE.dll
3. Application Error = same as 2.
We are on version 20.10.38, do you have any idea what might be causing this? Why don't we get a proper error message in the TX interface? Connection to the data source through SSMS is working without any issues, by the way. Best regards, Kaj
I’ve got the following user request but I am not sure what the most optimal solution is, as I can think of many possibilities. I’ve got 2 fact tables: DSA.Fact_Turnover and DSA.Fact_TransportOrders. Both facts contain the ‘Project’ column. The user request is to get the Turnover for all projects which are in Fact_TransportOrders. What is the most optimal way to get a check column showing whether the Project value of Fact_TransportOrders is also present in Fact_Turnover, and why? I want to use this filter column to eventually present only the Turnover for projects which are also present in Fact_TransportOrders. I’m really curious, as I already have such solutions, but I’m not sure if there are easier TimeXtender functions or tricks that I’m not using currently.
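One plain-SQL way to get such a flag (e.g. as the basis for a custom field or view) is an EXISTS check, which only tests membership and avoids row duplication if a project appears multiple times in the other fact. A sketch, flagging Fact_Turnover rows whose project also occurs in Fact_TransportOrders per the stated end goal; the flag name is made up:

```sql
-- Flag each Fact_Turnover row by whether its Project also occurs in
-- Fact_TransportOrders. EXISTS short-circuits and cannot multiply rows,
-- unlike an INNER JOIN against a fact with repeated Project values.
SELECT t.*,
       CASE WHEN EXISTS (SELECT 1
                         FROM DSA.Fact_TransportOrders o
                         WHERE o.[Project] = t.[Project])
            THEN 1 ELSE 0
       END AS [IsInTransportOrders]
FROM DSA.Fact_Turnover t;
```

The same membership test can also be modeled inside TimeXtender as a conditional lookup field returning a constant, but the EXISTS form makes the intent explicit.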
Hi community, I created a role in the MDW and added Grant permission on one specific database schema. This schema contains some tables and some stored procedures. This role needs to be able to select data in the tables and execute the stored procedures. I created my Object Security Setup using these settings. When I log in using SSMS (as a user in the role), I see the correct tables and I'm able to select data from them, but I'm not able to see the stored procedures. When I check the permissions on the ‘ExcelAddin’ database schema, only the ‘select’ permission is granted. Is it possible to grant ‘execute’ permission as well, using TimeXtender?
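If TimeXtender's Object Security Setup only emits SELECT, one workaround is to grant EXECUTE at the schema level yourself, e.g. via a script action so it is re-applied after deployments. A sketch, with a hypothetical role name:

```sql
-- Grant EXECUTE on every stored procedure in the ExcelAddin schema
-- (including ones created later) to the role. [MyReportingRole] is a
-- placeholder for the role TimeXtender created.
GRANT EXECUTE ON SCHEMA::[ExcelAddin] TO [MyReportingRole];
```

A schema-level grant covers procedures added later, so it does not need to be repeated per object.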
I have a weird issue in my new TimeXtender project, where the tables in my DWH layer won't load data from the Business Unit. I have a very simple setup to start with: one Business Unit which loads a couple of tables, configured to store data in a database on our “VMTXTD001” server. The data warehouse uses the same database as the Business Unit and should load from that Business Unit. But when I execute the DWH layer I get this message for all tables: Start Time: 4/19/2023 4:14:08 PM End Time: 4/19/2023 4:14:10 PM on server: VMTXTD001 - Execute Execution Package Default 'Failed' Invalid object name 'VMTXTD001.dbo.DM_CRM_Replica_dbo__CustomAccount_DSA_dsa_test_T'. Details: SQL Server: 'vmtxtd001' SQL Procedure: 'dsa.usp_CrDM_dsa_test_CRM_dbo__CustomAccount' SQL Line Number: 12 SQL Error Number: 208 Invalid object name 'VMTXTD001.dbo.DM_CRM_Replica_dbo__CustomAccount_DSA_dsa_test_T'. To me the issue appears to be that TimeXtend
I am setting up a table relation where the tables join based on a date range (see screenshot below). In setting this up in TimeXtender, I noticed that I can only set this up on the lookup field and not as the default relation between the tables. Does anyone know of a way this can be set up on the default table relation? Or can we add it as an enhancement request for TX? (NOTE: I have edited the table names and columns for this post)
In my DSA I have my Business Partner master data. I want to add a field named Intercompany that shows ‘Intercompany’ when BP Group Code = 104 or 109, and otherwise shows ‘Extern’. BP Group Code has a smallint data type. I added a custom field with a CASE WHEN statement; however, the extra field does not show the right results. When I execute the CASE statement in SSMS everything works fine, but when I implement it in my DSA it does not work correctly. I tried all kinds of CASE statements, e.g. with LIKE, OR, and strings, but it does not show the desired result - rather tons of NULLs. The screenshot below shows the Intercompany field with the NULLs and some test CASE statements.
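For reference, the transformation described above would look like this in plain T-SQL; the table name is hypothetical. Note that with this shape the expression itself can never produce NULL (the ELSE branch catches everything, including NULL group codes), so NULLs in the result usually point at the transformation not being applied, e.g. the custom field's transformation running in a different step or the deployment being out of date.

```sql
-- Compare against numeric literals since [BP Group Code] is smallint;
-- string literals like '104' force implicit conversions.
SELECT CASE WHEN [BP Group Code] IN (104, 109)
            THEN 'Intercompany'
            ELSE 'Extern'
       END AS [Intercompany]
FROM dsa.[BusinessPartner];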
Hi all, the NASA API data has too many parameters, split per year and month. I changed their names via an RSD file and now have only one table, like below. I have to change these columns in this dataset from this: station WS10_MAX201901 WS10_MAX201902 WS10_MAX201903 … WS10_MIN201901 WS10_MIN201902 WS10_MIN201903 … to this: station MM.YYYY WS_10_MAX WS_10_MIN. How can I do it? I tried a Custom Table Insert but it doesn't work. I also tried a Junk Dimension, but I had to clone my dataset 12 times to be able to print, for example, the 12 months of 2019 data consecutively.
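This is the classic columns-to-rows case that T-SQL's UNPIVOT handles; it could be the body of a Custom Table Insert. A sketch showing only the MAX columns for three months (extend the IN list for the rest); the table name dbo.NasaWeather is hypothetical, and the SUBSTRING offsets assume the column names follow exactly the WS10_MAXyyyymm pattern shown above:

```sql
-- Turn one column per month into one row per station/month.
SELECT station,
       -- 'WS10_MAX201901' -> '01.2019' (MM.YYYY)
       SUBSTRING(MaxCol, 13, 2) + '.' + SUBSTRING(MaxCol, 9, 4) AS [MM.YYYY],
       MaxVal AS WS_10_MAX
FROM dbo.NasaWeather
UNPIVOT (MaxVal FOR MaxCol IN ([WS10_MAX201901],
                               [WS10_MAX201902],
                               [WS10_MAX201903])) AS u;
```

The WS10_MIN columns can be unpivoted the same way and joined back on station plus the derived MM.YYYY to land both measures on one row.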
I'm following the steps explained on this page (https://legacysupport.timextender.com/hc/en-us/articles/360034695651-Connect-to-a-REST-API), but I can't figure it out. Situation: I have to get weather data for all stations from DWD. An example for station number 10184: https://s3.eu-central-1.amazonaws.com/app-prod-static.warnwetter.de/v16/forecast_mosmix_10184.json You can find the full station list here: https://www.dwd.de/DE/leistungen/met_verfahren_mosmix/mosmix_stationskatalog.cfg?view=nasPublication&nn=16102 What did I do? First I set up a dataset for the sample station 10184 and got the ‘days.rsd’ file. Then I made some changes in a ‘days_changed’ file (it doesn't work), and I couldn't generate it.