I'm trying to parse a very large CSV file with FileHelpers (http://www.filehelpers.net/). The file is 1GB zipped and about 20GB unzipped.
This isn't a complete answer, but if you have a 20GB CSV file, you'll need 20GB+ of memory to hold the whole thing at once, unless your reader keeps everything compressed in memory (unlikely). You need to read the file in chunks; loading everything into an array, as you're doing, won't work unless you have a huge amount of RAM.
You need a loop that reads and processes one row at a time, more like this (pseudocode; CsvReader and CSVItem stand in for whatever your parsing library provides):

    CsvReader reader = new CsvReader(filePath);
    CSVItem item = reader.ReadNextItem();
    while (item != null)
    {
        DoWhatINeedWithCsvRow(item);
        item = reader.ReadNextItem();
    }
The .NET garbage collector will then reclaim the old CSVItems as you move past them, provided you don't keep references to them hanging around.
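FileHelpers itself supports this pattern through its FileHelperAsyncEngine, which streams one record at a time instead of materializing the whole file. A minimal sketch, assuming a hypothetical MyRecord class mapped to your CSV layout (the field names are placeholders):

    using FileHelpers;

    [DelimitedRecord(",")]
    public class MyRecord
    {
        public string Name;   // placeholder fields; match them to your columns
        public int Value;
    }

    public class Program
    {
        public static void Main()
        {
            var engine = new FileHelperAsyncEngine<MyRecord>();

            // BeginReadFile opens the file for streaming; each pass of the
            // foreach pulls the next record off disk, so memory use stays flat.
            using (engine.BeginReadFile("huge.csv"))
            {
                foreach (MyRecord record in engine)
                {
                    DoWhatINeedWithCsvRow(record);
                }
            }
        }

        static void DoWhatINeedWithCsvRow(MyRecord record)
        {
            // per-row processing goes here
        }
    }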
A better version would read a chunk from the CSV (e.g. 10,000 rows), process all of those, then read another chunk, or create a task for DoWhatINeedWithCsvRow if you don't care about processing order; see the sketch below.
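A sketch of that batched variant, reusing the hypothetical MyRecord class from above; BatchSize and ProcessBatch are illustrative names, and the Task.Run line is the option for when order doesn't matter:

    using System.Collections.Generic;
    using System.Threading.Tasks;
    using FileHelpers;

    public class BatchedReader
    {
        const int BatchSize = 10000; // rows per chunk; tune for your workload

        public static void Main()
        {
            var engine = new FileHelperAsyncEngine<MyRecord>();
            var batch = new List<MyRecord>(BatchSize);
            var pending = new List<Task>();

            using (engine.BeginReadFile("huge.csv"))
            {
                foreach (MyRecord record in engine)
                {
                    batch.Add(record);
                    if (batch.Count == BatchSize)
                    {
                        List<MyRecord> chunk = batch; // capture before reassigning
                        pending.Add(Task.Run(() => ProcessBatch(chunk)));
                        batch = new List<MyRecord>(BatchSize);
                    }
                }
            }

            if (batch.Count > 0)
                ProcessBatch(batch); // flush the final partial chunk

            Task.WaitAll(pending.ToArray()); // only needed for the Task.Run variant
        }

        static void ProcessBatch(List<MyRecord> rows)
        {
            // apply DoWhatINeedWithCsvRow (or bulk work) to each row in the chunk
        }
    }

Note that each in-flight task holds its chunk in memory, so if processing can't keep up with reading you may want to cap the number of pending tasks.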