Big tables = slow

Topics: Developer Forum
Jun 8, 2010 at 10:06 AM

Hello,

First of all, I want to say thanks for this great library. It couldn't be easier to use.

The only problem that I, and apparently others, are having is that once a worksheet gets fairly large, the library becomes very slow.

It takes me over 3 minutes to create an Excel file with 4 worksheets (worksheet 1 = 2000*4 cells, worksheets 2-4 = 200*4).

Is there a way to make it faster on our end?


Greets

Robin


Jul 1, 2010 at 10:47 AM

Yep, it is that slow. The problem is that it uses XPath to access/modify cells, which tends to be extremely slow when there is a lot of data.

To be honest, I ended up writing my own routine, which may not be perfect code, but it works very fast. I can create 0.5 million rows and it still only takes a few seconds.

It was all done with StringBuilder: building the XML strings for the individual package parts and then injecting them into the package. It was tricky.

Unfortunately I cannot publish it, as it is tightly integrated with our custom solution :( but I think EP should go this way ASAP.
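I can't post the real code, but here is a minimal sketch of the idea in plain C# using System.IO.Packaging. Everything here is illustrative: it assumes a template.xlsx that already contains a worksheet part at /xl/worksheets/sheet1.xml, and it just fills that sheet with dummy numbers.

using System;
using System.IO;
using System.IO.Packaging;  // in WindowsBase.dll
using System.Text;

class FastSheetWriter
{
    static void Main()
    {
        // Open a prebuilt template workbook and overwrite its first worksheet part.
        using (var package = Package.Open("template.xlsx", FileMode.Open))
        {
            var partUri = new Uri("/xl/worksheets/sheet1.xml", UriKind.Relative);
            var part = package.GetPart(partUri);

            using (var stream = part.GetStream(FileMode.Create))
            using (var writer = new StreamWriter(stream, Encoding.UTF8))
            {
                writer.Write("<worksheet xmlns=\"http://schemas.openxmlformats.org/spreadsheetml/2006/main\"><sheetData>");

                // Build each row's XML with a StringBuilder and stream it out,
                // instead of touching cells through a DOM/XPath layer.
                var sb = new StringBuilder();
                for (int row = 1; row <= 500000; row++)
                {
                    sb.Length = 0;  // reset the builder for this row
                    sb.AppendFormat("<row r=\"{0}\">", row);
                    for (int col = 0; col < 4; col++)
                    {
                        // Numeric cells with references A1, B1, C1, D1, ...
                        sb.AppendFormat("<c r=\"{0}{1}\"><v>{2}</v></c>",
                            (char)('A' + col), row, row * 10 + col);
                    }
                    sb.Append("</row>");
                    writer.Write(sb.ToString());
                }

                writer.Write("</sheetData></worksheet>");
            }
        }
    }
}

The catch is that you become responsible for emitting valid SpreadsheetML yourself (cell references, escaping, shared strings, styles), which is exactly why it gets tricky.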


Cheers,

Mike

Jul 19, 2010 at 9:52 PM

I am having a similar problem. Did you guys find a solution for the slowness when writing to an Excel file? Please share your solution.

Sep 24, 2010 at 11:54 PM

I too was having performance problems reading large .xlsx files. My approach was to spin up several threads to do the actual reading: each thread asks what row to read, reads/processes that row, and repeats until all rows have been read. Collect the output from each thread (remembering the row number it came from), sort the results by the Excel row number, and construct the output. I found that using n reader threads, where n = the number of cores, gave me the best performance. Using 4 threads to do the reading really did cut the time to about a fourth of using a single thread.
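In case it helps, here is a minimal sketch of that pattern in C#. ReadRow is a hypothetical stand-in for however you read/process one Excel row; whatever it actually does must be safe to call from multiple threads (e.g. each thread opening its own reader over the file).

using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class ParallelRowReader
{
    // Hypothetical per-row reader: replace with your real row-reading logic.
    static string ReadRow(int row) => "row " + row;

    static void Main()
    {
        int totalRows = 100000;
        int nextRow = 0;  // shared "what row should I read next?" counter
        var results = new ConcurrentBag<(int Row, string Data)>();

        int n = Environment.ProcessorCount;  // n reader threads, n = number of cores
        var readers = Enumerable.Range(0, n)
            .Select(_ => Task.Run(() =>
            {
                while (true)
                {
                    // Ask for the next unclaimed row; stop once every row is taken.
                    int row = Interlocked.Increment(ref nextRow) - 1;
                    if (row >= totalRows) break;

                    // Read/process the row, remembering where it came from.
                    results.Add((row, ReadRow(row)));
                }
            }))
            .ToArray();

        Task.WaitAll(readers);

        // Sort by the original row number and construct the output in order.
        var ordered = results.OrderBy(r => r.Row).Select(r => r.Data).ToList();
        Console.WriteLine("{0} rows read with {1} threads.", ordered.Count, n);
    }
}

Letting the threads dump results into a ConcurrentBag and sorting once at the end keeps them from having to coordinate on output order while reading.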