Ask questions about TimeXtender Data Warehouse Instances
- 139 Topics
- 531 Replies
Hi, I am syncing my data between ODX (CSV files) and my DSA_Clean data area (data warehouse). Now I'm constantly getting an error that one of my previously inserted rows doesn't exist anymore. I've tried to sync all the possible steps (ODX, source, data area), but I keep getting the error and I don't know how to resolve this problem. I tried several steps: 1) adding the table as a new table in my data warehouse area, but I still get the same error; 2) removing and adding the column again.
I have table A with history enabled, all columns SCD type 2 except the PK, and a record is marked as deleted when it is deleted in the source (with a separate record). In the next data area / data warehouse (DSA) I want to create an incremental table B based on table A, but I do not want the deleted records from A in table B. When following the incremental rule wizard, the incremental selection rule can be set based on the incremental timestamp, but deletes cannot be handled because table A has no hard deletes (it has IsTombstone = 1 instead). What is the easiest way to solve this? I can only think of carrying the IsTombstone field along to table B and deleting the records in a post script, but I would expect this to be possible in a much simpler way. Suggestions?
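A minimal sketch of that post-script idea, assuming the IsTombstone flag is mapped from table A into table B (schema and table names are illustrative):

```sql
-- Post script on table B: remove rows that were tombstoned in table A.
-- Assumes [IsTombstone] was carried over from A; names are hypothetical.
DELETE FROM [DSA].[TableB]
WHERE [IsTombstone] = 1;
```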
Is it normal, when loading a fact table incrementally with new data, that it can take up to one hour? Are there any tips to make this incremental load perform better? It is moving records from the staging table to the fact table; the fact table is 140 million records, and a new load is around 500 thousand records with a new timestamp. Here is the execution log overview; the table is not that complex: 17 fields are moved over from staging to the fact table, 1 field is from a key store, and 3 fields have a custom value script. The table settings are as shown. We are running TX version 6346.1 on an Azure database running with 6 CPUs.
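One generic thing to check (an assumption, not a confirmed diagnosis for this case) is whether the columns used to match incremental rows, typically the primary key and the incremental timestamp, are indexed on the 140M-row target, so the load does not scan the whole table. A hypothetical example:

```sql
-- Hypothetical supporting index; the actual table, column, and index names
-- depend on the project, and TX may already generate an equivalent one.
CREATE NONCLUSTERED INDEX IX_Fact_Incremental
ON [MDW].[FactTable] ([DW_TimeStamp]);
```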
Hi, I have seen clients use supernatural key fields to create keys between the fact and dimension tables, usually when the logical key is a combination of fields. I have seen these two methods used: (1) the supernatural key field is created in both the fact and the dim table, using the same key store; (2) the supernatural key is created in the dim table and then mapped to the fact table with a conditional lookup. Is there a difference in performance here? Is there a best practice to advocate for?
Hi team, I have the following table, and I have to create this transformation on the incidentGroup column:

```sql
CASE
    WHEN [incidentTypeID] >= 100 AND [incidentTypeID] <= 199 THEN '1XX-Fire'
    WHEN [incidentTypeID] >= 200 AND [incidentTypeID] <= 299 THEN '2XX-Overpressure, Explosion, Overheat(no fire)'
    WHEN [incidentTypeID] >= 300 AND [incidentTypeID] <= 399 THEN '3XX-Rescue & Emergency Medical'
    WHEN [incidentTypeID] >= 400 AND [incidentTypeID] <= 499 THEN '4XX-Hazardous Condition(No Fire)'
    WHEN [incidentTypeID] >= 500 AND [incidentTypeID] <= 599 THEN '5XX-Service Call'
    WHEN [incidentTypeID] >= 600 AND [incidentTypeID] <= 699 THEN '6XX-Good Intent Call'
    WHEN [incidentTypeID] >= 700 AND [incidentTypeID] <= 799 THEN '7XX-False Alarm & False Call'
    WHEN [incidentTypeID] >= 800 AND [incidentTypeID] <= 899 THEN '8XX-Severe Weather & Natural'
    WHEN [incidentTypeID] >= 900 AND [incidentTy…
```

(The post is cut off here.)
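Although the post is truncated, the pattern is a bucketing by hundreds, so a more compact equivalent (a sketch, not the poster's code) can key off the hundreds digit instead of repeating the range checks:

```sql
-- Sketch: integer division by 100 yields the series number, so one branch
-- per series covers the same 100-199 ... 900-999 ranges as the long CASE.
CASE [incidentTypeID] / 100
    WHEN 1 THEN '1XX-Fire'
    WHEN 2 THEN '2XX-Overpressure, Explosion, Overheat(no fire)'
    WHEN 3 THEN '3XX-Rescue & Emergency Medical'
    WHEN 4 THEN '4XX-Hazardous Condition(No Fire)'
    WHEN 5 THEN '5XX-Service Call'
    WHEN 6 THEN '6XX-Good Intent Call'
    WHEN 7 THEN '7XX-False Alarm & False Call'
    WHEN 8 THEN '8XX-Severe Weather & Natural'
    -- the 9XX label is truncated in the original post
END
```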
Hi team, I am beginning work with a new client. I created development, UAT, and Prod environments, and after that I created the global databases DSA and MDW for now. My question is: how do I manage projects on global databases? Can I only create one project when using global databases, or can I create many? Thanks for your help. Ignacio
Hi, I have added a new field “PCD_Prev” to my table and populated this field using the LAG function (custom transformation), like so:

```sql
LAG([PCD], 1, '0') OVER (PARTITION BY [AdministrationID], [ApplicationID] ORDER BY [AcceptanceID])
```

Now I try to use this field in a CASE statement of another field, again using a custom transformation, like so:

```sql
COALESCE(SUM(CASE WHEN [PCD_Prev] IN ('0', 'ka_aa') THEN 1
                  WHEN ([PCD] = 'ka_hb' AND [PCD_Prev] = [PCD]) THEN 1
                  ELSE 0 END)
         OVER (PARTITION BY [AdministrationID], [ApplicationID]
               ORDER BY [AcceptanceID]
               ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW), 0)
```

I get an error that I can't use a windowed function inside another windowed function, although I don't do it in the same transformation. Is this standard behavior? Do you have a solution for this? In SSMS I add the new field in the FROM clause, adding it as an additional field to my existing table in the FROM clau… (post truncated)
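What probably happens (an assumption about the generated SQL, not confirmed): both field transformations are expanded into the same SELECT, so the LAG expression ends up nested inside the SUM ... OVER, which SQL Server rejects. The usual workaround is to materialize the first window result before computing the second, sketched here in plain T-SQL with the post's names:

```sql
-- Step 1: persist PCD_Prev first (in TX terms: land it in one table/view).
SELECT t.*,
       LAG([PCD], 1, '0') OVER (PARTITION BY [AdministrationID], [ApplicationID]
                                ORDER BY [AcceptanceID]) AS [PCD_Prev]
INTO   #step1
FROM   [MyTable] t;

-- Step 2: the running sum now references a plain column, so no nesting occurs.
SELECT s.*,
       COALESCE(SUM(CASE WHEN [PCD_Prev] IN ('0', 'ka_aa') THEN 1
                         WHEN [PCD] = 'ka_hb' AND [PCD_Prev] = [PCD] THEN 1
                         ELSE 0 END)
                OVER (PARTITION BY [AdministrationID], [ApplicationID]
                      ORDER BY [AcceptanceID]
                      ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW), 0) AS [MyFlag]
FROM   #step1 s;
```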
We are facing a performance issue while fetching data from D365 F&O using the CData provider. While fetching a data entity with around 700K records, it used to take around 1:20 minutes; now when we execute it, it runs for 10+ hours with no response. While executing, we noticed a SQL session with a bulk insert command from TimeXtender in a suspended state, and on the D365 database there are multiple sessions of a batch process fetching the data based on partition number. Since we witnessed multiple sessions that keep executing, we would like to understand how the batch size is defined, how we can customize the batch, or any other way to set the connection properties so that the data is pulled faster. Query for reference:

```sql
FROM Table_XXXXXXX T2
WHERE ((((((T2.PARTITION=5637144576)
    AND ((T2.PARTITION#2=5637144576) OR (T2.PARTITION#2 IS NULL)))
    AND (T2.PARTITION#3=5637144576))
    AND (T2.PARTITION#4=5637144576))
    AND (T2.PARTITION#5=5637144576))
    AND (T2.PARTITION#6=5637144576)) ) T1
WHERE ((T1.rowNumber> … (query truncated)
```
I’m importing a table which is supposed to have an integer primary key, but the data is messy and we find values like “6TEST3” or “CREDIT3” in this field. I want to read the table and use this field as an integer, but I can’t convert it because of those rows. Is there a way to remove a row if I can’t convert a field value to an integer?
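One way to express this (a sketch; TRY_CAST needs SQL Server 2012 or later, and whether it goes into a data selection rule, a view, or elsewhere depends on the setup) is to keep only the rows whose value converts cleanly:

```sql
-- TRY_CAST returns NULL instead of raising an error for values such as
-- '6TEST3' or 'CREDIT3', so this filter drops the unconvertible rows.
SELECT *
FROM   [Staging].[MyTable]
WHERE  TRY_CAST([MyKeyField] AS int) IS NOT NULL;
```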
In v20.10.43, I’ve come across a bunch of auto-generated views with a _BCV suffix. Are they used for merged (incremental?) loading in the next layer, or what is their purpose? Why do they need to be redeployed whenever I do a managed deployment? And why does the SQL Clean Up Wizard delete them while they are being used? (I haven’t tested whether this could also happen in a v6284.1 MDW.)
Hi, to my knowledge it is not possible to set up alerts for errors or warnings on validation rules. However, I read the following on the TimeXtender Sales Hub: “Set up automated alerts to notify you when data quality rules are violated, data quality issues are detected, or executions fail to complete.” (https://coda.io/d/TimeXtender-Sales-Hub_d3yl25w6y3Y/Data-Quality_sueAz#_luZ9c). Does someone know if this is possible, and if so, how? If not, it would be a nice idea though! Kind regards, Devin
Hi, I normally use a custom transformation (CASE statement) when I need to populate a field based on multiple conditions, in this case multiple “OR” conditions (logical disjunction). It was brought to my attention not to use CASE statements in custom transformations. So below I have an example using a fixed value transformation, but I want to add two conditions so the field is only populated when the value starts with ‘FL’ or ‘OL’. When I add these conditions they are on separate lines, but TX still applies “AND” logic, so instead of (LIKE 'FL%' OR LIKE 'OL%') it translates to (LIKE 'FL%' AND LIKE 'OL%'). How can I get “OR” instead of “AND”?
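For reference, the disjunction being asked for, written as the kind of CASE custom transformation the poster was told to avoid (field name and fixed value are illustrative):

```sql
-- OR logic in a single expression; multiple transformation conditions in TX
-- are combined with AND, so the OR has to live inside one custom transformation.
CASE WHEN [MyField] LIKE 'FL%' OR [MyField] LIKE 'OL%'
     THEN 'FixedValue'
     ELSE [MyField]
END
```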
Hello, I have a history-enabled table with a lookup field. The lookup field is marked as type 1 (update record). However, the record is only updated with the new values from the table the lookup field is retrieved from if the record is present in the raw instance of the history table. If the record has been updated in the meantime, the lookup field is never triggered to be updated. Is there a way to have lookup fields updated in a history-enabled table without the record being present in the raw table? I have currently created a workaround by not truncating the raw table, but this causes an unnecessary increase in the size of the raw table and produces many primary key violation errors. Looking forward to your answer!
I need to make a field transformation such that if the field value starts with “DOK”, the value is replaced with “1001”. I tried to set up this custom field transformation using something along the lines of CASE WHEN …, but didn’t manage to get it to work. How would I write this code in TX?
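A minimal sketch of such a custom transformation (the field name is hypothetical):

```sql
-- Replace values starting with 'DOK' by '1001'; keep everything else as-is.
CASE WHEN [MyField] LIKE 'DOK%' THEN '1001'
     ELSE [MyField]
END
```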
We have a historic table and we use DW_Id as a unique identifier. But when we update a record, the DW_Id stays the same, so we then have duplicate values in that field. The original DW_Id field is changing, but the transformation field isn’t. Because we also work with custom data that we want to have its own value, we work with a custom transformation. I will provide some screenshots of the issue. Can you provide a way to make the field eenheidssleutel unique over time, and equal to the DW_Id, for both current and past records of the historic table?
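One possible direction (a sketch only; the SCD date field name below is hypothetical and varies per setup) is to make the key unique per version by combining the stable DW_Id with the version's valid-from timestamp:

```sql
-- Hypothetical: DW_Id identifies the record, the valid-from timestamp
-- identifies the version, so the concatenation is unique over time.
CAST([DW_Id] AS varchar(20)) + '_' +
CONVERT(varchar(23), [SCD From DateTime], 121)
```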
v126.96.36.199. A while ago I was looking for a solution on how to conditionally fill down/flash-fill values based on certain criteria. Unfortunately, the answers in this ticket didn’t do the trick for me, as I kept getting NULL values with all the solutions I tried: How can I conditionally fill down/flash fill NULL values with previous values based on certain criteria? | Community (timextender.com). Now I seem to have it working with a self-lookup view and a lookup field (top first non-empty value). However, I have to repeat this process for almost 10 columns, and the loading of this table went from 06:00 to 30:00 minutes. The lookup field looks as follows. What it does is: if the column Route Bron is not empty, just show Route Bron; if Route Bron is empty, look up the TOP first non-empty Route Bron value with the same Source, Project, Cardcode, where Project is not empty and not 999999. Is there a way to optimize this process? Thanks in advance!
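A set-based alternative that handles all rows in one pass (a sketch in plain T-SQL, not TX-specific; the ordering column and folding in the extra Project criteria are assumptions to adapt): a running count of non-NULL values forms groups, and the maximum within each group carries the last seen value forward:

```sql
-- Fill-down via gaps-and-islands: COUNT() skips NULLs, so every non-NULL
-- [Route Bron] opens a new group; MAX() over that group repeats its value.
SELECT x.*,
       MAX([Route Bron]) OVER (PARTITION BY [Source], [Project], [Cardcode], grp)
           AS [Route Bron Filled]
FROM (
    SELECT t.*,
           COUNT([Route Bron]) OVER (PARTITION BY [Source], [Project], [Cardcode]
                                     ORDER BY [DW_Id]
                                     ROWS UNBOUNDED PRECEDING) AS grp
    FROM [MyTable] t
) x;
```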
In fact tables we are using surrogate keys from the dimension tables, via supernatural keys. The relationship between the fact table and the dimension table uses the business key and gets the surrogate key from the dimension table into the fact table. But sometimes it doesn't exist, and I'm using -1 for that value. What is the best way to include the unknown member (-1) in the dimension table?
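A common pattern (a sketch; in TimeXtender such a row would typically be entered via the table's Custom Data feature rather than a manual INSERT, and all names here are illustrative) is to add one explicit unknown-member row to the dimension:

```sql
-- Hypothetical unknown-member row so that -1 in the fact table always
-- joins to a real dimension record instead of dropping out.
INSERT INTO [MDW].[DimCustomer] ([CustomerKey], [CustomerName])
VALUES (-1, 'Unknown');
```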
In a custom data insert I have one field with the varbinary(8000) data type. When I try to insert a column I get an error:

```
The given value of type String from the data source cannot be converted to type varbinary of the specified target column.
Failed to convert parameter value from a String to a Byte.
Invalid cast from 'System.String' to 'System.Byte'.
Details: Invalid cast from 'System.String' to 'System.Byte'.
Module: mscorlib
System.InvalidCastException
at System.Convert.DefaultToType(IConvertible value, Type targetType, IFormatProvider provider)
at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType, Boolean& coercedToDataFeed, Boolean& typeChanged, Boolean allowStreaming)
Failed to convert parameter value from a String to a Byte.
Module: System.Data
System.InvalidCastException
at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType, Boolean& coercedToDataFeed, Boolean& typeChanged, Boolean allowStreaming)
```
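For context (an assumption about the cause, not a confirmed diagnosis): the custom-data value is passed as a string, and strings are not implicitly coerced to varbinary. In plain T-SQL the conversion has to be explicit, for example:

```sql
-- Two explicit ways a string can become varbinary(8000) in T-SQL:
SELECT CONVERT(varbinary(8000), 'some text') AS FromText,        -- raw bytes of the string
       CONVERT(varbinary(8000), '0x48656C6C6F', 1) AS FromHex;   -- style 1 parses a 0x hex literal
```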
Hi community, a customer is using a specific role that is allowed to select data from specific views in SQL. When one of these views is deployed, the Object Security Setup isn't deployed with it; the database role/security settings need to be redeployed manually. Is it possible to automatically deploy the Object Security Setup when a view/table that uses security settings is deployed?
Hi, I am working on a column that evaluates a date against GETDATE() and returns a 1 or 0. The table is loaded incrementally. I assumed that if I turned on "Keep field values up-to-date", this column would always be recalculated at load time, but it seems that this is not true. Is there a way to make these two work together, or do I have to give up on incremental loading? Thanks
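One workaround (a sketch, under the assumption that a post script runs on every execution even when the incremental batch touches few rows; table and column names are hypothetical):

```sql
-- Post script: recompute the flag for all rows on every load, so rows that
-- were not part of the incremental batch still reflect today's date.
UPDATE [MDW].[MyTable]
SET    [IsOverdue] = CASE WHEN [DueDate] < GETDATE() THEN 1 ELSE 0 END
WHERE  [IsOverdue] <> CASE WHEN [DueDate] < GETDATE() THEN 1 ELSE 0 END;
```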