Ask questions about TimeXtender Data Warehouse Instances
- 116 Topics
- 421 Replies
Extracting data from External-database
Hi TX Community!

I get data from an external database every day. Since we aim to load the extracted data incrementally, we use query tables on the data source to create an incremental-load key. In doing so we are experiencing two issues:

1. TX cannot read the date formats that are extracted from the database. This is the format we get out: Of course, we can right-click on the field and edit the data type, but we would have to do that every time we synchronize the data source, because every synchronization resets all the date fields to the "unknown" format, so we would have to right-click each field and edit the data type again. We've tried to use the "Data type overrides", but it doesn't seem that we can convert from an "unknown" format. How can we solve this problem? As mentioned, the tables are query tables, so we would like to think the date formatting could be solved with a CAST or a CONVERT function. Any ideas?

2. In
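Since these are query tables, the conversion can indeed be pushed into the source query itself, so the columns arrive in TX with a known type. A minimal sketch, assuming a hypothetical source table dbo.Orders with text-typed date columns (the style number must match the actual source format):

```sql
-- Query-table sketch: convert the text dates in the source query itself, so
-- TimeXtender reads a proper datetime2 instead of an "unknown" type.
-- 104 = dd.mm.yyyy; replace with the style that matches your source format.
SELECT
    OrderID,
    TRY_CONVERT(datetime2, OrderDate,  104) AS OrderDate,
    TRY_CONVERT(datetime2, ModifiedOn, 104) AS IncrementalLoadKey
FROM dbo.Orders;
```

TRY_CONVERT returns NULL instead of raising an error on unparseable values, which is usually safer in a load query than plain CONVERT.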
Remove path name from File Name in table column
Hello

I currently have a column with the following file name in one of my tables:

C:\Users\John\OneDrive - Sales Solutions\Desktop\TimeXtender\TMX-DataSamples\RLZ\BESTERS - BESTERS POINT - BESTERS IND_DEC 22_101602_0.txt

Is there a way to remove the path name in the column and keep only the file name? I have a file for each month and have merged the files so all the data is in one table. For example, remove this piece:

C:\Users\John\OneDrive - Sales Solutions\Desktop\TimeXtender\TMX-DataSamples\RLZ\

and only show this piece as the file name:

BESTERS - BESTERS POINT - BESTERS IND_DEC 22

I also need the month and year (DEC 22) of each file copied into a new column called "Date". Is this at all possible?

Thank you
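Assuming the column is called [FileName] and all names follow the pattern in the example (month-year between the first and second underscore, a numeric suffix after it), a T-SQL sketch, e.g. for a custom view or custom field transformations; the table name dbo.MergedFiles is hypothetical:

```sql
-- Strip the directory path (everything up to the last backslash), then split
-- the base name into the display name and the "DEC 22" period.
SELECT
    x.BaseName,                                                -- file name without path
    LEFT(x.BaseName, y.SecondUnderscore - 1)  AS FileNameOnly, -- "...IND_DEC 22"
    SUBSTRING(x.BaseName, y.FirstUnderscore + 1,
              y.SecondUnderscore - y.FirstUnderscore - 1)
                                              AS [Date]        -- "DEC 22"
FROM dbo.MergedFiles AS f
CROSS APPLY (SELECT RIGHT(f.[FileName],
                          CHARINDEX('\', REVERSE(f.[FileName])) - 1) AS BaseName) AS x
CROSS APPLY (SELECT CHARINDEX('_', x.BaseName)                            AS FirstUnderscore,
                    CHARINDEX('_', x.BaseName,
                              CHARINDEX('_', x.BaseName) + 1)             AS SecondUnderscore) AS y;
```

If some file names contain extra underscores before the month part, the underscore positions would need adjusting; the sketch only covers the pattern shown in the question.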
Business Unit_ODX, error with _R tables
How can I solve _R tables failing to create? I am working on an old project now; it was copied from another project repository.

Environment: Sandbox
Version: 126.96.36.199
Data Source: Business unit
Provider: SQL Server Data Source

The connection with the data source is good and it synchronizes well. When I deploy the tables in ODX storage I get an error:

An error occurred during create a table. See exception details for the failing object: Create failed for Table 'BSA.BSA_dbo_Inspection_R'. An exception occurred while executing a Transact-SQL statement or batch. The specified schema name "BSA" either does not exist or you do not have permission to use it.
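That error usually means the [BSA] schema was never created in the ODX storage database (common when a project is copied to a fresh database), or that the deploying account lacks rights on it. A sketch to check and fix directly in SSMS, assuming you have sufficient privileges on the storage database:

```sql
-- Create the missing schema in the ODX storage database, then redeploy the tables.
IF NOT EXISTS (SELECT 1 FROM sys.schemas WHERE name = 'BSA')
    EXEC ('CREATE SCHEMA [BSA]');

-- If the schema already exists, the deploying account may lack rights instead
-- (login name hypothetical):
-- GRANT ALTER ON SCHEMA::[BSA] TO [TimeXtenderServiceAccount];
```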
Unable to Set up MDW connecting to Azure SQL database using Azure AD Integrated Authentication type
Did anyone try to connect to an Azure SQL database using Azure AD Integrated authentication? I noticed MFA was missing as an auth type, and AD Integrated failed with an error. Let me know if anyone has resolved it:

One or more errors occurred. Could not discover endpoint for Integrated Windows Authentication. Check your ADFS settings. It should support Integrated Windows Authentication for WS-Trust 1.3 or WS-Trust 2005.
Intent of Recompile - Default/Pro/Con
The clean procedures generated by TX seem to have RECOMPILE embedded in the procedure itself as a general rule. I think this is to keep their plans fresh against other changes introduced in a project. But do we know the exact reason why, and is there a way to remove it other than making a customized change table by table?
An item with the same key has already been added
Dear Support,

The reload in TimeXtender is giving the error: "An item with the same key has already been added." It seems that one column is mapped to two different columns in the same table. Is there a quick way to find the column that is causing this error?

Thanks in advance!
Christian
Incremental load based on history table with soft deletes
I have table A with history enabled, all columns SCD type 2 except the PK, and a record is marked as deleted when it is deleted in the source (with a separate record). In the next data area / data warehouse (DSA) I want to create an incremental table B based on table A, but I do not want deleted records from A in table B.

When following the incremental rule wizard, the incremental selection rule can be set based on the incremental timestamp, but deletes cannot be handled, because table A has no hard deletes (only IsTombstone = 1). What is the easiest way to solve this? I can only think of carrying the IsTombstone field into table B and deleting those records in a post script, but I would expect this to be possible much more easily. Suggestions?
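If the incremental rule can't filter the tombstones out, the post-script route is a workable fallback: after each incremental load of B, delete the rows whose source record in A is soft-deleted. A sketch with hypothetical table and key names:

```sql
-- Post script on table B: remove rows whose source record in table A
-- carries the history table's soft-delete flag.
DELETE b
FROM dsa.TableB AS b
WHERE EXISTS (
    SELECT 1
    FROM dsa.TableA AS a
    WHERE a.PK = b.PK          -- the shared primary/natural key
      AND a.IsTombstone = 1
);
```

This also catches records that were deleted in the source after they had already been loaded into B, which a plain data selection rule on IsTombstone would miss.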
Incremental loading on table using a custom table insert with a union
I have 3 tables in my src data warehouse:

- src.SalesOrderItemsDelta: filled every day with a delta (changes today vs. yesterday) of our order lines.
- src.SalesOrderItems: filled every Saturday night with all the order lines available.
- src.preFactSalesOrderItems: filled via a custom table insert with the following statement:

SELECT [SalesOrderItemID], [SapClient], [SalesOrderNumber], [SalesOrderItemNumber], [SalesOrganization], [DistributionChannel], [Division], [FaboryArticleNumber], [SoldToCustomerCode], [BinCode], [CreatedOnDate], [CreatedOnDateID], [ChangedOnDate], [ReasonForRejectionCode], [PromisedDeliveryDate], [PromisedDeliveryDateID], [CommunicatedDeliveryDate], [CommunicatedDeliveryDateID], [PlantCode], [SalesAmount], [SalesCurrency], [ItemCategoryCode], [OrderedQuantity], [BinQuantity], [SalesOrderCategoryCode], [SalesOfficeCode], [ConfirmedDeliveryDate], [ConfirmedDeliveryDateID], [CommittedDeliveryDate], [CommittedD
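For reference, a common shape for such a custom insert is to union the daily delta with the weekly full load and deduplicate on the key, keeping the newest version of each order line. A heavily abbreviated sketch (only a few of the columns above shown; whether ChangedOnDate is the right tiebreaker is an assumption):

```sql
SELECT SalesOrderItemID, SapClient, SalesOrderNumber, ChangedOnDate
FROM (
    SELECT SalesOrderItemID, SapClient, SalesOrderNumber, ChangedOnDate,
           ROW_NUMBER() OVER (PARTITION BY SalesOrderItemID
                              ORDER BY ChangedOnDate DESC) AS rn
    FROM (
        SELECT SalesOrderItemID, SapClient, SalesOrderNumber, ChangedOnDate
        FROM src.SalesOrderItemsDelta
        UNION ALL
        SELECT SalesOrderItemID, SapClient, SalesOrderNumber, ChangedOnDate
        FROM src.SalesOrderItems
    ) AS unioned
) AS ranked
WHERE rn = 1;   -- keep only the most recent row per order line
```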
Automatically deploy Object Security Setup after changing/deploying a view
Hi community,

A customer is using a specific role that is allowed to select data from specific views in SQL. When one of these views is deployed, the Object Security Setup isn't deployed along with it; the database role/security settings need to be redeployed manually. Is it possible to automatically deploy the Object Security Setup when a view/table that uses security settings is deployed?
How can I conditionally fill down/flash fill NULL values with previous values based on certain criteria?
I would like to be able to fill down NULL values in my DSA table under certain conditions. In the table below I have multiple NULL values. Take for example the column 'CardCode DUAL': row 2, with Company key MTW and project 1201121979, shows DB0006 for CardCode DUAL. I would like to show the value DB0006 for all other rows where Company key = MTW and project = 1201121969 as well.

Same for the Route Bron column: I would like to fill down NULL values with the most recent non-blank value for that Company_Key + Project combination. I think it should be possible with a self join or self select, but I'm not sure how.
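One self-select pattern for this is CROSS APPLY back onto the same table: for each NULL row, fetch the most recent non-NULL value sharing the Company_Key + Project combination. A sketch (table name and the ordering column are assumptions — "most recent" needs some column to order by):

```sql
-- Fill down [CardCode DUAL] per Company_Key + Project combination,
-- e.g. as a post script or custom update on the DSA table.
UPDATE t
SET [CardCode DUAL] = x.[CardCode DUAL]
FROM dsa.MyTable AS t
CROSS APPLY (
    SELECT TOP (1) s.[CardCode DUAL]
    FROM dsa.MyTable AS s
    WHERE s.Company_Key = t.Company_Key
      AND s.Project     = t.Project
      AND s.[CardCode DUAL] IS NOT NULL
    ORDER BY s.LoadDate DESC           -- hypothetical "most recent" ordering column
) AS x
WHERE t.[CardCode DUAL] IS NULL;
```

The same statement with Route Bron substituted handles the second column.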
Duplicates loading data from ODX to MDW
Hi community,

We are facing the problem that we create duplicates when we bring data from an ODX API source to the MDW. We are working with an overlapping sliding window of two days in the schema file (because data can change and there is no last modified date), and we only set the primary key on the ODX source. In the MDW (dedicated SQL pool) we enabled history and set the ID as the natural key. All fields are marked as type 1 fields. The execution gives us duplicated ID values; no updates are made.

Thanks for your help
Michael
We are using a Business Unit to land our data in the DW. We have a few columns where we need to obfuscate some of the data. At what point should a script action be placed to update the field in the raw table, prior to the data being moved into the valid table, so that our script does not need to update the field in both the raw and the valid table? The script uses UPDATE table SET column
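One approach, assuming the script runs after the raw table is populated but before the data cleansing step copies raw into valid: update only the _R table, and let cleansing carry the masked value into the _V table. Table and column names below are hypothetical, with SHA-256 as an example obfuscation:

```sql
-- Runs between raw load and cleansing, so the masked value flows
-- into the valid table automatically and only one UPDATE is needed.
UPDATE r
SET    [CustomerEmail] = CONVERT(nvarchar(64),
                                 HASHBYTES('SHA2_256', [CustomerEmail]), 2)
FROM   dbo.[Customer_R] AS r
WHERE  [CustomerEmail] IS NOT NULL;
```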
SCD Surrogate Hash Key is non-deterministic.
Hi,

I get a deployment error when I try to make my table historical: Computed column 'SCD Surrogate Hash Key' in table '<mytable>' cannot be persisted because the column is non-deterministic. I think this has to do with the excessive number of columns in the table: 695 columns, all varchar(2000). What confuses me is that when I set the hashing algorithm to debug, I still get the same error. I don't have any transformations on this table; it's a straight copy from the ODX. Is there a limit on the number of fields a history table can have?
TX crashes when adding 'Additional Connection'
Hello,

We have a data source that contains several additional/stacked connections in our Business Unit. We now wanted to add the 4th additional connection (so 5 data connections in total), but TimeXtender crashes when clicking OK after configuring the connection. I think it tries to connect to the data source and fetch metadata for the tables it needs to select, but somehow this fails. I found these errors in the Event Viewer:

1. .NET: System.OutOfMemoryException
2. Application Error: Faulting application path: C:\Program Files\TimeXtender\TimeXtender 188.8.131.52\timeXtender.exe; Faulting module path: C:\Windows\System32\KERNELBASE.dll
3. Application Error: same as 2.

We are on version 20.10.38. Do you have any idea what might be causing this? And why don't we get a proper error message in the TX interface? Connecting to the data source through SSMS works without any issues, by the way.

Best regards,
Kaj
Most optimal way to check if Value of Table A, Column A is also in Table B, Column A
I've got the following user request, but I am not sure what the most optimal solution is, as I can think of many possibilities. I've got 2 fact tables:

- DSA.Fact_Turnover
- DSA.Fact_TransportOrders

Both facts contain the 'Project' column. The user request is to get the turnover for all projects which are in Fact_TransportOrders. What is the most optimal way to get a check column showing whether the Project value of Fact_TransportOrders is also present in Fact_Turnover, and why? I want to use this filter column to eventually present only the turnover for projects which are also present in Fact_TransportOrders. I'm really curious, as I already have such solutions, but I'm not sure if there are easier TimeXtender functions or tricks that I'm not using currently.
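One straightforward option is a flag computed with an EXISTS check, e.g. in a custom view. A semi-join like this is generally what the optimizer handles best for "is this value present in the other table" questions, since it can stop at the first match and is unaffected by duplicate Project values:

```sql
SELECT
    t.*,
    CASE WHEN EXISTS (SELECT 1
                      FROM DSA.Fact_TransportOrders AS o
                      WHERE o.Project = t.Project)
         THEN 1 ELSE 0
    END AS IsInTransportOrders   -- flag column name is an assumption
FROM DSA.Fact_Turnover AS t;
```

Inside TimeXtender itself, a conditional lookup field from Fact_TransportOrders over a Project relation can achieve the same result without custom SQL.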
Object security, allow execute of stored procedures on specific database schema
Hi community,

I created a role in the MDW and added Grant permission on one specific database schema. This schema contains some tables and some stored procedures. The role needs to be able to select data in the tables and execute the stored procedures. I created my Object Security Setup using these settings: When I log in using SSMS (as a user in the role), I see the correct tables and I'm able to select data from them, but I'm not able to see the stored procedures. When I check the permissions on the 'ExcelAddin' database schema, only the 'select' permission is granted. Is it possible to grant 'execute' permission as well, using TimeXtender?
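If the Object Security Setup only emits SELECT, a pragmatic workaround is a small post-deployment script that adds EXECUTE at the schema level; the schema name is taken from the question, the role name is hypothetical:

```sql
-- Grants EXECUTE on every current and future stored procedure in the schema.
GRANT EXECUTE ON SCHEMA::[ExcelAddin] TO [MyExcelRole];
```

Since the grant is on the schema rather than on individual procedures, it survives redeployment of the procedures themselves.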
Tables in DWH won't load data from Business Unit
I have a weird issue in my new TimeXtender project, where the tables in my DWH layer won't load data from the Business Unit. I have a very simple setup to start with: one Business Unit which loads a couple of tables, configured to store data in a database on our "VMTXTD001" server. The data warehouse uses the same database as the Business Unit and should load from it. But when I execute the DWH layer I get this message for all tables:

Start Time: 4/19/2023 4:14:08 PM End Time: 4/19/2023 4:14:10 PM on server: VMTXTD001 - Execute Execution Package Default 'Failed' Invalid object name 'VMTXTD001.dbo.DM_CRM_Replica_dbo__CustomAccount_DSA_dsa_test_T'. Details: SQL Server: 'vmtxtd001' SQL Procedure: 'dsa.usp_CrDM_dsa_test_CRM_dbo__CustomAccount' SQL Line Number: 12 SQL Error Number: 208 Invalid object name 'VMTXTD001.dbo.DM_CRM_Replica_dbo__CustomAccount_DSA_dsa_test_T'.

To me the issue appears to be that TimeXtend
Table Relations using Greater than or Less than Operator
I am setting up a table relation where the tables join based on a date range (see screenshot below). In setting this up in TimeXtender, I noticed that I can only set this up on the lookup field and not as the default relation between the tables. Does anyone know of a way this can be set up on the default table relation? Or can we add it as an enhancement for TX? (NOTE: I have edited the table names and columns for this post.)
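Until range operators are supported on default relations, one workaround is to do the range join in a custom view or custom table insert. A sketch with hypothetical table and column names:

```sql
-- Range join: match each fact row to the dimension row whose
-- validity interval contains the transaction date.
SELECT f.*, d.RateKey
FROM   dsa.FactTransactions AS f
JOIN   dsa.DimRate          AS d
  ON   f.TransactionDate >= d.ValidFrom
 AND   f.TransactionDate <  d.ValidTo;
```

The half-open interval (>= / <) avoids double matches when one row's ValidTo equals the next row's ValidFrom.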
SQL Case When Statement not working correctly with TimeXtender
In my DSA I have my business partner master data. I want to add a field named Intercompany that shows 'Intercompany' when BP Group Code = 104 or 109, and otherwise shows 'Extern'. BP Group Code has a smallint data type. I added a custom field with a CASE WHEN statement; however, the extra field does not show the right results. When executing the CASE statement in SSMS everything works fine, but when I implement it in my DSA it does not work correctly. I tried all kinds of CASE statements, e.g. with LIKE, OR, and strings, but it does not show the desired result, rather tons of NULLs. The screenshot below shows the Intercompany field with the NULLs and some test CASE statements.
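For a smallint column the comparison values should be unquoted numbers, so the expression itself can be as simple as:

```sql
CASE WHEN [BP Group Code] IN (104, 109) THEN 'Intercompany' ELSE 'Extern' END
```

Since ELSE catches NULL codes too, a result full of NULLs usually points at execution order rather than the CASE: if BP Group Code is itself filled by a lookup or transformation that runs after the custom field is computed during cleansing, the CASE sees NULL at evaluation time. Checking the order of transformations on the table is worth a try.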
Hi all,

There are too many parameters in the NASA API data, available per year and month. I changed their names via the RSD file and now have only one table, like below. I have to change these columns in the dataset:

station | WS10_MAX201901 WS10_MAX201902 WS10_MAX201903 ... WS10_MIN201901 WS10_MIN201902 WS10_MIN201903 ...

to this:

station | MM.YYYY | WS10_MAX | WS10_MIN

How can I do it? I tried a Custom Table Insert, but it doesn't work. I also tried a junk dimension, but then I had to clone my dataset 12 times to be able to print, for example, 12 months of 2019 data consecutively.
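This looks like an unpivot problem: turn the per-month columns into rows, then split the measure name and the period out of the original column name. A sketch following the WS10_MAX<YYYYMM> pattern from the question (the source table name is hypothetical, and only a few of the month columns are listed):

```sql
SELECT
    Station,
    LEFT(ColName, LEN(ColName) - 6)        AS Measure,    -- 'WS10_MAX' / 'WS10_MIN'
    RIGHT(ColName, 2) + '.'
      + LEFT(RIGHT(ColName, 6), 4)         AS [MM.YYYY],  -- '201901' -> '01.2019'
    Value
FROM dbo.NasaApiData
UNPIVOT (
    Value FOR ColName IN ([WS10_MAX201901], [WS10_MAX201902],
                          [WS10_MIN201901], [WS10_MIN201902])
) AS u;
```

A second pivot on Measure (or a conditional MAX per MM.YYYY) then yields the WS10_MAX and WS10_MIN side by side per period, without cloning the dataset.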
Connecting Rest API with changed parameter in RSD file
I'm following the steps explained on this page (https://legacysupport.timextender.com/hc/en-us/articles/360034695651-Connect-to-a-REST-API), but I can't figure it out.

Situation: I have to get weather data for all stations from DWD. An example for station number 10184 looks like this: https://s3.eu-central-1.amazonaws.com/app-prod-static.warnwetter.de/v16/forecast_mosmix_10184.json You can find the full station list here: https://www.dwd.de/DE/leistungen/met_verfahren_mosmix/mosmix_stationskatalog.cfg?view=nasPublication&nn=16102

What did I do? First I set up a dataset for the sample station 10184 and got the 'days.rsd' file. Then I made some changes in the 'days_changed' file (it doesn't work), and I couldn't generate it.