Solved

ODX "loses" transactions


When loading regularly (every hour) using incremental load, the ODX seems to “lose” transactions from the source. On the next run of the incremental load, the ODX does not catch these missing transactions. The workaround is to full load the data, but this takes a long time, and it renders the incremental load useless as it cannot be trusted.

We are running 20.10.34. Do any of the newer 20.x versions have a fix for this?
Regards Mads


Best answer by rory.smith 5 April 2023, 16:17


18 replies

Userlevel 5
Badge +7

How are your incremental loads set up? Are the missing transactions updates of existing transactions or new PKs? Are you also handling deletes/updates?

As far as I am aware, no such problems have been reported and fixed up to 20.10.41.

Userlevel 4
Badge +5

Good questions, Rory. I've had this issue where some of the data arrived late. By this I mean that the records got skipped by TX because they were inserted with an earlier timestamp than the timestamp TX stored for the incremental load, so those records got skipped. You can set up a second load task in the ODX with an incremental setup where the incremental timestamp is modified, and run it every other hour. (You can delay the incremental timestamp by a certain amount.)
Or you could check the settings for dirty reads.
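To make that failure mode concrete, here is a minimal T-SQL sketch of the pattern; the table and column names ([G_L Entry], [Modified On]) and all values are illustrative assumptions, not the rule TimeXtender actually generates.

```sql
-- Plain high-watermark rule: only rows stamped after the stored value are read.
DECLARE @LastLoad datetime2 = '2023-04-05 10:00:00';   -- value stored by the previous run

SELECT *
FROM dbo.[G_L Entry]
WHERE [Modified On] > @LastLoad;
-- A row committed at 10:02 but stamped 09:58 is never returned, because the
-- next run has already moved the watermark past 09:58.

-- Daniel's second load task, expressed as a delayed (lagging) watermark:
DECLARE @DelayMinutes int = 120;   -- overlap; tune to how late your data arrives
SELECT *
FROM dbo.[G_L Entry]
WHERE [Modified On] > DATEADD(MINUTE, -@DelayMinutes, @LastLoad);
```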

Hope this helps.

Userlevel 5
Badge +7

If @daniel's suggestion is what is happening and the extraction is very important, you could also consider running this incremental load only through the BU approach and applying subtractions to create a wide enough rolling window to catch the pre-dated records.

The new release features subtractions for incremental load in the ODX Server, but that would most likely require the upgrade tool to be available for you.
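As a rough sketch of that rolling-window idea (again with assumed table and column names, not TimeXtender's generated SQL): the subtraction widens the extraction window so pre-dated rows are picked up on a later run, and because rows are matched on the primary key downstream, the overlap only results in updates rather than duplicates.

```sql
-- Highest value actually loaded so far (assumed staging schema/table).
DECLARE @MaxLoaded datetime2 =
    (SELECT MAX([Modified On]) FROM stage.[G_L Entry]);

-- Subtract a fixed window (1 day here is purely an example) before extracting again.
DECLARE @WindowStart datetime2 = DATEADD(DAY, -1, @MaxLoaded);

SELECT *
FROM dbo.[G_L Entry]
WHERE [Modified On] > @WindowStart;
```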

Userlevel 4
Badge +5

@rory.smith is right. This option is not available in the ‘old’ 20.10 versions.
 

Hopefully the upgrade tool will be released soon.

Userlevel 6
Badge +5

Hi Mads

I agree with the others.

Your incremental value in the ODX needs to be precise, with both a date and a time, to be able to update every hour. If it only contains the date, you will lose rows applied later on. In essence, to avoid this you would have to run it only once a day, just before or after midnight, or it would potentially lose rows.
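A small sketch of that scenario, assuming a date-only field such as [Posting Date] (an illustrative name):

```sql
-- Watermark stored after this morning's run, with date precision only.
DECLARE @LastLoad date = '2023-04-05';

SELECT *
FROM dbo.[G_L Entry]
WHERE [Posting Date] > @LastLoad;
-- A row posted later the same day still has [Posting Date] = 2023-04-05, which
-- is not greater than the watermark, so every later run that day skips it.
```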

Is that the case for you? I mean, does it only contain a date, or is the datetime field somehow not being kept up to date on each update or addition?

Thank you all for the comments.

 @rory.smith 

The missing transactions are new PKs.

The incremental load handles updates (not deletes).

We use a timestamp for incremental load. The source is an on-prem SQL Server for Business Central.

 

@daniel's suggestion seems to be what happens. My theory is that while the ODX is reading from the SQL source, new transactions are written to the source. The timing matches the load times to the ODX and the DateTime of new entries in the source.

 

@Thomas Lind 

Do you agree that we could use a BU for incremental load for a selected group of entry tables, and still use the ODX for the rest?

 

Userlevel 6
Badge +5

Hi Mads

Yes, you could use the subtraction option in a BU. If you wanted to keep using the ODX, you would have to stop using incremental load and use query tables with dynamic date ranges in them. That would be a lot of work to keep working.

Regarding the issue and the question to Daniel: it is supposed to add the rows added during the execution and only store the max date from the rows applied. So it should not be possible to store a max datetime value higher than what is in the rows applied. It would have to be updated rows that somehow do not get a higher date than what was there initially, or something similar.
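In other words, the stored incremental value should come from the rows actually applied, not from the time the load ran. A sketch of that distinction, with assumed names:

```sql
-- Watermark taken from the max value of the rows just applied (as intended).
DECLARE @NewWatermark datetime2 =
    (SELECT MAX([Modified On]) FROM stage.[G_L Entry]);

-- Risky alternative: using the load time itself would skip rows that are
-- committed after extraction starts but carry an earlier [Modified On].
-- DECLARE @NewWatermark datetime2 = SYSDATETIME();
```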

@Thomas Lind 

I have tried to set up the BC connector using the MS SQL provider: https://legacysupport.timextender.com/hc/en-us/articles/360051368132-Microsoft-Dynamics-Business-Central-NAV-Adapter

I do not seem to get any of the data tables. 

Can you give a hint as to what could be wrong?

 

 

Userlevel 6
Badge +5

Hi Mads

Is it actually a NAV database you connect to? How did you get around the issue with all the accounts generating multiple versions of all the tables when you used the ODX to connect to it?

What do you mean by data tables?

Here is a setup I got for NAV 2018:
 

 

No, it is Business Central. The database is on an Azure VM.

I created an ODX connection for each account and used the DW_Sourcecode as PK.

By data tables I mean tables like Dimension, G_L Entry, etc.

I figured it out regarding the connector (the checkbox at SIFT) 🙄

Userlevel 6
Badge +5

Hi Mads

For BC sources stored in a SQL database, each table will usually have a timestamp field. This is not a date, but a specific timestamp data type. That is the field I always use as an incremental rule in all BC sources where it is available.

So my point with this is: if you already have the setup as explained, I would use the timestamp field as my incremental rule and then not worry about the issue with missing rows, as the timestamp field gets updated with a higher number every time a row is added or changed.

You should also use the timestamp field in the BU BC data source. It should be available for all non-online versions.
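A sketch of what such an incremental rule boils down to; in BC tables the column is literally named timestamp (the SQL Server rowversion type), while the table name and watermark value below are just examples.

```sql
-- rowversion/timestamp is an 8-byte counter that increases on every insert and
-- update, so the rule is a simple "greater than the highest value loaded so far".
DECLARE @LastRowversion binary(8) = 0x00000000004A3B21;   -- example stored watermark

SELECT *
FROM dbo.[G_L Entry]
WHERE [timestamp] > @LastRowversion;

-- After the load, store the highest value from the rows just extracted:
SELECT MAX(CAST([timestamp] AS bigint)) AS NewWatermark
FROM stage.[G_L Entry];
```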

I was maybe a bit unclear.

We are already using the timestamp. See the screenshot.

If we use the timestamp in a Business Unit, will we be able to subtract a value from the latest timestamp, so we are sure not to run into the same issue as with the ODX? And what is the value to subtract to go, say, 1 day back?

Or should we use a date field instead for this?

I noticed in your screenshot that the BU BC connector has an option to merge extensions. We do not have that in our version 20.10.34.64. Is this a feature in 20.10.39?

 

Userlevel 6
Badge +5

Hi Mads
You can add a number to subtract in the setup instead of a date.

It is odd that it does not work with timestamp though, I don’t remember seeing that before.

What version of BC is it? It will usually have a number that explains the edition, so 11 is 2018 and 20 is 2020.

The merge feature is new, yes. It only applies to BC version 2020 (20.0).

Version: DK Business Central 20.4 (platform 20.0.44266.44359 + program 20.4.44313.44365)

The number to subtract is what? Milliseconds? 

Userlevel 6
Badge +5

No, it is a number that is subtracted for a timestamp field, as the timestamp is just a numeric value that increases on changes. If it were a datetime field, the subtraction would be calculated into seconds, which would then be removed.
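A sketch of what the subtraction amounts to for each data type (all names and values below are examples, not recommendations):

```sql
-- timestamp/rowversion: the watermark is just a number, so a fixed amount is
-- subtracted from it before the next extraction.
DECLARE @LastRowversion bigint = 4870945;     -- stored watermark, cast to bigint
DECLARE @Subtract bigint = 100000;            -- re-reads roughly the last N changes

SELECT *
FROM dbo.[G_L Entry]
WHERE CAST([timestamp] AS bigint) > @LastRowversion - @Subtract;

-- datetime: the subtraction is expressed in seconds.
DECLARE @LastLoad datetime2 = '2023-04-05 10:00:00';
DECLARE @SubtractSeconds int = 86400;         -- one day

SELECT *
FROM dbo.[G_L Entry]
WHERE [Modified On] > DATEADD(SECOND, -@SubtractSeconds, @LastLoad);
```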

If you are using that new version, you should be sure to upgrade so you have the merge extensions option available.

I also think that could explain the issues with the timestamp fields. I think the extension tables at times do not get updated in the same way as normal tables.

Userlevel 6
Badge +5

Hi @mads.nielsen, has the issue been resolved? If so, please help us by marking the best answer above. Please let us know if you have any follow-up questions.

Hi @Christian Hauggaard

Yes, I have used the BU approach as a workaround.
Thanks for all the input.

Regards Mads
