
We are trying to get a 10 GB CSV file into TimeXtender, but loading takes very long and sometimes we get an error (system out of memory). Is there a solution or something else we can try to get big CSV files into TimeXtender, or can we change the CSV file itself so it loads into TimeXtender? We are currently using the CData adapter and have already tried the multiple/single file adapter.
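Since the question mentions changing the CSV file and the multiple/single file adapter, one option is to pre-split the source file into smaller pieces before TimeXtender reads it. A minimal Python sketch, assuming a single header row; the file path, output naming pattern, and 500,000-row chunk size are placeholders, not TimeXtender settings:

```python
# Split a large CSV into smaller files, repeating the header in each part.
# Paths, naming pattern, and chunk size below are illustrative assumptions.
import csv

def split_csv(src_path, rows_per_file=500_000):
    with open(src_path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)            # keep the header for every output file
        part, out, writer = 0, None, None
        for i, row in enumerate(reader):
            if i % rows_per_file == 0:   # start a new output file
                if out:
                    out.close()
                part += 1
                out = open(f"{src_path}.part{part:03}.csv", "w",
                           newline="", encoding="utf-8")
                writer = csv.writer(out)
                writer.writerow(header)
            writer.writerow(row)
        if out:
            out.close()

split_csv("bigfile.csv")
```

The resulting smaller files could then be picked up by the multiple-file adapter instead of one 10 GB load.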

Thanks for the response!

Hi Rob... A few things to try.

  1. Make sure the SQL Server database is set to the simple recovery model (a minimal sketch of this follows the list).
  2. Decrease the batch size on the staging database.
  3. Enable batch data cleansing in the table settings.
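For the first point, a minimal sketch of checking and switching the recovery model from Python, assuming pyodbc and the Microsoft ODBC driver are installed; the server and the StagingDB name are placeholders, and the same ALTER DATABASE statement can of course be run directly in SSMS:

```python
import pyodbc

# Connect to the SQL Server instance hosting the staging database.
# Server name, driver version, and database name are assumptions.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=master;Trusted_Connection=yes",
    autocommit=True,  # ALTER DATABASE cannot run inside a user transaction
)
cur = conn.cursor()

# Check the current recovery model of the staging database
cur.execute(
    "SELECT recovery_model_desc FROM sys.databases WHERE name = ?",
    "StagingDB",
)
print(cur.fetchone())

# Switch to the simple recovery model so the log does not grow
# unbounded during large loads
cur.execute("ALTER DATABASE [StagingDB] SET RECOVERY SIMPLE")
```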


Tried all the options, but unfortunately no solution yet... thanks for the quick response!


You may also want to try this: 

https://legacysupport.timextender.com/hc/en-us/articles/360022596691-How-to-configure-segmented-table-executions-Chunking-

