Solved

Data export in batch size

  • 11 December 2020
  • 3 replies
  • 49 views

We are currently facing a problem with memory resources when doing a data export. When writing a huge file (4 GB), the TX process allocates more memory than we would like (> 22 GB of RAM). Is there any way to break the data export up into partitions so that the process uses less RAM? I would expect some kind of option like the batch data cleansing in the table settings, but I can't seem to find it anywhere.

For the data export we use the TimeXtender File Export 2.0.0.0 provider.


Best answer by Thomas Lind 16 December 2020, 16:15


3 replies


Hi Ben

There isn't an option to do batches.
I assume you always export all the data in the tables you push to the txt files.

If that is not necessary, meaning the older data already exists in earlier files, you can make a view that contains only the subset of data for the day and export that instead. Remember that you will need to run some program to rename the file afterwards, as our program doesn't do that either.
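
For the renaming step, a small scheduled script is enough. Here is a minimal sketch in Python, assuming the provider always writes to the same fixed path (the folder and file name below are placeholders, not anything the provider dictates):

    import datetime
    import pathlib

    # Placeholder export location -- adjust to wherever the File Export provider writes.
    SOURCE_FILE = pathlib.Path(r"D:\Exports\DailyExport.txt")

    def rename_with_date(source: pathlib.Path) -> pathlib.Path:
        """Append today's date to the exported file so each run keeps its own copy."""
        stamp = datetime.date.today().strftime("%Y%m%d")
        target = source.with_name(f"{source.stem}_{stamp}{source.suffix}")
        source.rename(target)
        return target

    rename_with_date(SOURCE_FILE)

You could run this as a post-execution step right after the export has finished.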

Hi Thomas,

Unfortunately, it's necessary to export all the data, since the data can still change.


Where do you then use the CSV files afterwards?

How far back do you imagine the data could change? If it is less than a year, you could do yearly or monthly files instead of daily ones.

Also, the DW_TimeStamp should always be updated if there are changes in the tables, no matter how far back they are.
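
If the export has to include everything but memory is the constraint, a workaround outside the File Export provider is to stream the table out yourself in fixed-size batches, for example one file per month. A rough sketch in Python with pyodbc, where the connection string, table name and date column are placeholders for your own setup:

    import csv
    import pyodbc

    # Placeholders -- replace with your own server, database, table and date column.
    CONN_STR = ("Driver={ODBC Driver 17 for SQL Server};"
                "Server=localhost;Database=DWH;Trusted_Connection=yes;")
    TABLE = "dbo.FactSales"
    DATE_COLUMN = "TransactionDate"
    BATCH_SIZE = 50_000

    def export_month(year: int, month: int, out_path: str) -> None:
        """Stream one month of rows to CSV without holding the whole table in memory."""
        query = (f"SELECT * FROM {TABLE} "
                 f"WHERE YEAR({DATE_COLUMN}) = ? AND MONTH({DATE_COLUMN}) = ?")
        with pyodbc.connect(CONN_STR) as conn, \
             open(out_path, "w", newline="", encoding="utf-8") as f:
            cursor = conn.cursor()
            cursor.execute(query, year, month)
            writer = csv.writer(f)
            writer.writerow([col[0] for col in cursor.description])
            while True:
                rows = cursor.fetchmany(BATCH_SIZE)   # keeps memory bounded to one batch
                if not rows:
                    break
                writer.writerows(rows)

    export_month(2020, 11, r"D:\Exports\FactSales_2020_11.csv")

Because DW_TimeStamp is updated on changes, you could also check MAX(DW_TimeStamp) per month first and only regenerate the monthly files that have actually changed since the last export.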
