
Currently we are facing a problem with memory resources when trying to do a data export. When writing a huge file (4 GB), the memory allocated to this TX process takes up more resources than we would like (> 22 GB of RAM). Is there any possibility of breaking the data export up into partitions, so that the process takes up less RAM? I would have expected some kind of option like the data batch cleansing in the table settings, but I can't seem to find it anywhere.


Within the data export we use the TimeXtender File Export 2.0.0.0 provider. 

Hi Ben


There isn't an option to do batches.
I assume you always export all the data in the tables you push to txt files.


If that is not necessary, meaning the older data already exists in earlier files, then you can make a view that contains only that day's subset of the data and export just that. Remember that you will need to run some program to rename the file afterwards, as our program doesn't do that either.
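The renaming step mentioned above could look something like the following: a minimal Python sketch that stamps today's date into the exported file's name. The file name, location, and date format are assumptions for illustration, not TimeXtender behaviour.

```python
from datetime import date
from pathlib import Path


def rename_with_date(export_path: str) -> Path:
    """Rename a freshly exported file to include today's date,
    e.g. export.txt -> export_2024-05-01.txt (hypothetical names)."""
    src = Path(export_path)
    # Build the new name from the original stem, today's date, and the suffix.
    dst = src.with_name(f"{src.stem}_{date.today():%Y-%m-%d}{src.suffix}")
    src.rename(dst)
    return dst
```

You would run something like this as a post-export step (for example from a scheduled task) so each daily file keeps a unique name instead of being overwritten.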


Hi Thomas,


Unfortunately, we need to export all the data, since the data can still change.


Where do you then use the CSV files afterwards?


How far back do you imagine the data could be changed? If it is less than a year, you can do yearly or monthly files instead of daily ones.
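One way to get monthly files without a single huge export is to split on a date column after the fact. A minimal Python sketch, assuming a CSV with an ISO-formatted date column (the column and file names here are made up for illustration):

```python
import csv
from collections import defaultdict
from pathlib import Path


def split_csv_by_month(src_csv: str, date_column: str, out_dir: str) -> list:
    """Write one CSV per year-month, bucketing rows on an ISO date column."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    buckets = defaultdict(list)
    with open(src_csv, newline="") as f:
        reader = csv.DictReader(f)
        fields = reader.fieldnames
        for row in reader:
            # 'yyyy-mm-dd...' -> 'yyyy-mm' bucket key.
            buckets[row[date_column][:7]].append(row)
    written = []
    for month, rows in sorted(buckets.items()):
        path = out / f"export_{month}.csv"
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=fields)
            writer.writeheader()
            writer.writerows(rows)
        written.append(path)
    return written
```

Note this still reads the full file once; the point is that downstream consumers only need to re-fetch the months that can actually change.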


Also, the DW_TimeStamp field should always be updated if there are changes in the tables, no matter how far back they are.
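Because DW_TimeStamp is refreshed on every change, it can drive an incremental export: only rows touched since the last run need to be re-written. A minimal in-memory sketch of that idea (the row structure and the idea of tracking a "last export" cutoff are assumptions for illustration):

```python
from datetime import datetime


def rows_changed_since(rows, last_export: datetime):
    """Keep only rows whose DW_TimeStamp is newer than the last export run."""
    return [r for r in rows if r["DW_TimeStamp"] > last_export]
```

In practice the same filter would live in the view's WHERE clause rather than in memory, so only the changed rows are ever materialised.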

