When loading regularly (every hour) using incremental load, the ODX seems to “lose” transactions from the source. On the next run of the incremental load, the ODX does not catch these missing transactions. The workaround is to full load the data, but this takes a long time, and it renders the incremental load useless since it cannot be trusted.
We are running 20.10.34. Do any of the newer 20.x versions include a fix for this? Regards, Mads
Best answer by rory.smith
If @daniel's suggestion is what is happening and the extraction is very important, you could also consider running this incremental load only through the BU approach and applying subtractions to create a rolling window wide enough to catch the pre-dated records.
The new release features subtractions for incremental load in the ODX Server, but it would most likely require the upgrade tool to be available for you.
Good questions, Rory. I've had this issue where some of the data arrived late. By this I mean that records got skipped by TX because they were inserted with an earlier timestamp than the timestamp TX created for the incremental load, so those records were skipped. You can set up a second load task in the ODX with an incremental setup where the incremental timestamp can be modified (you can delay the incremental timestamp by a certain amount) and run it every other hour. Or you can check the settings for dirty reads.
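To make the rolling-window idea above concrete, here is a minimal sketch of an hourly extraction that pulls the saved watermark back by a fixed lag so pre-dated records are still caught by a later run. The table dbo.G_L_Entry, the ModifiedAt column and the two-hour lag are invented for illustration; this is the general pattern, not how the ODX implements incremental load internally.

```python
from datetime import datetime, timedelta

# Watermark stored by the previous incremental run (assumed value for illustration).
last_watermark = datetime(2024, 5, 1, 10, 0, 0)

# Pull the window back by a fixed lag so rows committed with an earlier
# DateTime than the watermark are still caught by a later run.
LAG = timedelta(hours=2)
window_start = last_watermark - LAG

# Parameterized T-SQL for the hourly extraction (table and column names are hypothetical).
sql = """
SELECT *
FROM dbo.[G_L_Entry]
WHERE [ModifiedAt] > ?
"""

print("Extracting rows modified after", window_start)
# In practice the query would run against the BC database, e.g. with pyodbc:
# rows = cursor.execute(sql, window_start).fetchall()
```

Because the window overlaps the previous run, the same rows can come back twice, so the downstream merge has to update on the primary key rather than insert blindly.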
Your incremental value in the ODX needs to be very precise, with both a date and a time, to be able to update every hour. If it only contains the date, you will lose rows applied later the same day. To avoid this you would essentially have to run it only once a day, just before or after midnight, or it would potentially lose rows.
Is that the case for you? I mean, does it only contain a date, or is the datetime field somehow not being kept up to date on each update or addition?
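As a toy illustration of the granularity point (rows and values are invented): with a date-only incremental column, rows applied later on the same day compare equal to the watermark and are silently skipped by a strict "greater than" rule.

```python
from datetime import date, datetime

# Two rows from the same day; the second one arrives after the 09:00 extraction (invented data).
rows = [
    {"id": 1, "modified": datetime(2024, 5, 1, 8, 0)},
    {"id": 2, "modified": datetime(2024, 5, 1, 14, 0)},
]

# Date-only watermark taken at the 09:00 run: the whole day collapses to 2024-05-01,
# so neither row passes a strict "greater than" filter on any later run that day.
date_watermark = date(2024, 5, 1)
print([r["id"] for r in rows if r["modified"].date() > date_watermark])   # []

# A datetime watermark taken at the same run still picks up the 14:00 row next time.
dt_watermark = datetime(2024, 5, 1, 9, 0)
print([r["id"] for r in rows if r["modified"] > dt_watermark])            # [2]
```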
We use a timestamp for incremental load. The source is an on-premise SQL Server for Business Central.
@daniel's suggestion seems to be what is happening. My theory is that while the ODX is reading from the SQL source, new transactions are being written to the source. The timing matches: the ODX load times line up with the DateTime of the new entries in the source.
Yes, you could use the subtraction option in a BU. If you wanted to keep using the ODX, you would have to stop using incremental load and create query tables with dynamic date ranges in them. That would be a lot of work to maintain.
Regarding the issue and the question to Daniel: it is supposed to add the rows added during the execution and only store the max date from the rows applied. So it should not be possible to record a max datetime value higher than what is in the rows applied. It would have to be updated rows that somehow do not get a higher date than what was there initially, or something similar.
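For reference, here is a toy simulation of the race Mads describes, with invented timestamps: even when the watermark is the max DateTime of the rows actually applied, a transaction that commits during the extraction with an earlier DateTime can never satisfy a strict greater-than rule on the next run.

```python
from datetime import datetime

# DateTimes of the rows visible to the extraction query when it ran (invented data).
extracted = [datetime(2024, 5, 1, 9, 58, 0), datetime(2024, 5, 1, 9, 59, 0)]

# Watermark = max DateTime of the rows actually applied, as described above.
watermark = max(extracted)

# A transaction that commits while the extraction is still running, stamped
# slightly earlier than the watermark (written at 09:58:30, committed at 10:01).
late_row = datetime(2024, 5, 1, 9, 58, 30)

# The next incremental run only asks for rows strictly newer than the watermark,
# so the late row is never selected again.
print("Late row caught by the next run?", late_row > watermark)   # False
```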
Is it actually a NAV database you connect to? How did you get around the issue with all the accounts generating multiple versions of all the tables when you used the ODX to connect to it?
For BC sources stored in a SQL database, each table will usually have a timestamp field. This is not a date, but a specific timestamp data type. That is the field I always use as the incremental rule in all BC sources where it is available.
So my point with this is: if you already have the setup as explained, I would use the timestamp field as my incremental rule and then not worry about the issue with missing rows, as the timestamp field gets a higher number every time a row is added or an existing row is changed.
You should also use the timestamp field in the BU BC data source. It should be available for all non-online versions.
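To illustrate why the timestamp field sidesteps the DateTime race, here is a minimal sketch of an incremental filter on SQL Server's rowversion type (the legacy alias is timestamp): an 8-byte, database-wide counter that increases on every insert or update. The table name and the stored watermark are assumptions, and this sketches the general pattern rather than the ODX's internal behaviour.

```python
# Last rowversion seen by the previous run, stored as a 64-bit integer (assumed value).
last_rowversion = 0x00000000001A2B3C

# [timestamp] here is the BC rowversion column. It is compared as a number: a row
# that is inserted or updated always receives a new, higher rowversion, regardless
# of what its date columns contain. Table name is hypothetical.
sql = """
SELECT *
FROM dbo.[G_L_Entry]
WHERE CONVERT(BIGINT, [timestamp]) > ?
ORDER BY [timestamp]
"""

print(sql.strip())
print("Extracting rows with rowversion >", last_rowversion)
# Executed against the BC database with e.g. pyodbc:
# rows = cursor.execute(sql, last_rowversion).fetchall()
# The new watermark is then the highest rowversion among the extracted rows.
```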
If we use the timestamp in the Business Unit, will we be able to subtract a value from the latest timestamp, so we are sure not to run into the same issue as with the ODX? And what value would we subtract to go, say, one day back?
Or should we use a date field instead for this?
I noticed in your screenshot that the BU BC connector has an option to merge extensions. We do not have that in our version, 20.10.34.64. Is this a feature of 20.10.39?
No, for a timestamp field it is a subtracted number, as it is just a numeric value that increases on changes. If it were a datetime field, the subtraction would be calculated in seconds, which are then deducted from the latest value.
If you are able to move to that new version, you should be sure to upgrade so you have the merge extensions option available.
I also think that could explain the issues with the timestamp fields. I think the extension tables at times do not get updated in the same way as normal tables.
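A small sketch of the arithmetic described above, with assumed watermark values: for a datetime rule the subtraction is expressed in seconds (86,400 seconds for one day back), while for a rowversion-style timestamp the subtracted amount is just a number of changes and does not map to a fixed length of time.

```python
from datetime import datetime, timedelta

# Datetime-based rule: one day back is a subtraction of 86,400 seconds.
last_dt_watermark = datetime(2024, 5, 2, 0, 15)          # assumed value
print(last_dt_watermark - timedelta(seconds=86_400))     # 2024-05-01 00:15:00

# Rowversion-based rule: the watermark is only a change counter, so subtracting
# e.g. 10,000 means "re-read the last 10,000 modifications", not a fixed period.
last_rowversion = 1_750_000                               # assumed value
print(last_rowversion - 10_000)                           # 1740000
```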
Hi @mads.nielsen, has the issue been resolved? If so, please help us by marking the best answer above. Please let us know if you have any follow-up questions.