Question
I'm reading huge CSV files (about 350K lines per file) this way:
StreamReader readFile = new StreamReader(fi);
string line;
string[] row;
readFile.ReadLine(); // skip the header line
while ((line = readFile.ReadLine()) != null)
{
    row = line.Split(';');
    x = row[1];
    y = row[2];
    // More code and assignments here...
}
readFile.Close();
The point here is that reading a huge file line by line, for every day of the month, may be slow, and I think there must be another, faster way to do it.
Answer 1:
Method 1
By using LINQ:
var Lines = File.ReadLines("FilePath").Select(a => a.Split(';'));
or, in query syntax:
var CSV = from line in File.ReadLines("FilePath")
          select line.Split(';');
Method 2
As Jay Riggs stated here
Here's an excellent class that will copy CSV data into a DataTable, using the structure of the data to create it:
A portable and efficient generic parser for flat files
It's easy to configure and easy to use. I urge you to take a look.
Method 3
Rolling your own CSV reader is a waste of time unless the files that you're reading are guaranteed to be very simple. Use a pre-existing, tried-and-tested implementation instead.
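One such tried-and-tested implementation ships with the .NET Framework itself: Microsoft.VisualBasic.FileIO.TextFieldParser (it requires a reference to the Microsoft.VisualBasic assembly). A minimal sketch, with the file path as a placeholder; unlike a bare string.Split, it handles fields enclosed in quotes:

```csharp
using System;
using Microsoft.VisualBasic.FileIO; // add a reference to Microsoft.VisualBasic

class Program
{
    static void Main()
    {
        // "FilePath.csv" is a placeholder; substitute your own file.
        using (var parser = new TextFieldParser("FilePath.csv"))
        {
            parser.TextFieldType = FieldType.Delimited;
            parser.SetDelimiters(";");
            parser.HasFieldsEnclosedInQuotes = true; // copes with '"'-quoted fields

            parser.ReadFields(); // skip the header, as in the question's loop
            while (!parser.EndOfData)
            {
                string[] row = parser.ReadFields();
                // row[1], row[2], etc., as in the question
            }
        }
    }
}
```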
Answer 2:
In a simple case (there are no quotation marks, i.e. no '"' characters, within the file), when you expect only a partial read, you may find this useful:
var source = File
.ReadLines(fileName)
.Select(line => line.Split(';'));
For instance, if you want to find out whether there's a line in the CSV whose third column value equals "0":
var result = source
.Any(items => items[2] == "0");
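Because File.ReadLines streams the file lazily, Any() stops reading as soon as the first match is found, so the rest of the file is never touched. A runnable sketch (the file path is a placeholder, and the header-skip mirrors the question's loop):

```csharp
using System;
using System.IO;
using System.Linq;

class Program
{
    static void Main()
    {
        // "FilePath.csv" is a placeholder; substitute your own file.
        bool hasZero = File.ReadLines("FilePath.csv")
            .Skip(1)                           // skip the header line
            .Select(line => line.Split(';'))   // lazy: splits one line at a time
            .Any(items => items[2] == "0");    // stops at the first match

        Console.WriteLine(hasZero);
    }
}
```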
Source: https://stackoverflow.com/questions/35437313/read-a-csv-file-in-c-sharp-efficiently