With Node.js I want to parse a .csv file of 10,000 records and do some operation on each row. I tried using http://www.adaltas.com/projects/node-csv, but I couldn't get this to pause at each row.
The node-csv project that you are referencing is entirely sufficient for the task of transforming each row of a large body of CSV data. From the docs at http://csv.adaltas.com/transform/:
    var csv = require('csv');

    csv()
    .from('82,Preisner,Zbigniew\n94,Gainsbourg,Serge')
    .to(console.log)
    .transform(function(row, index, callback){
      // Hand the transformed row back asynchronously on the next tick
      process.nextTick(function(){
        callback(null, row.reverse());
      });
    });
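Since your input is a 10,000-record file rather than an inline string, the same pattern can read from a file path. The following is only a sketch based on the legacy node-csv 0.x API (from.path() plus the end and error events); data.csv and the per-row work are placeholders:

    var csv = require('csv');

    csv()
    .from.path(__dirname + '/data.csv', { delimiter: ',' })
    .transform(function(row, index, callback){
      // Placeholder for the real per-row operation; calling back on the
      // next tick keeps the per-row work asynchronous
      process.nextTick(function(){
        callback(null, row);
      });
    })
    .on('end', function(count){
      console.log('Processed ' + count + ' rows');
    })
    .on('error', function(error){
      console.error(error.message);
    });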
In my experience it is also a rather fast implementation: I have been working with it on data sets of close to 10k records, and the processing times for the whole set were at a reasonable tens-of-milliseconds level.
Regarding jurka's suggestion of a stream-based solution: node-csv IS stream based and follows the Node.js streaming API.
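As a small illustration of that point, a readable stream can be handed directly to from() and a writable stream to to(). This is again only a sketch under the legacy 0.x API, with in.csv and out.csv as placeholder file names:

    var fs = require('fs');
    var csv = require('csv');

    // Because the parser follows the Node.js stream API, it can be wired
    // directly to a readable source and a writable destination.
    csv()
    .from(fs.createReadStream(__dirname + '/in.csv'))
    .to(fs.createWriteStream(__dirname + '/out.csv'))
    .transform(function(row, index, callback){
      // Pass rows through unchanged; the real per-row operation would go here
      callback(null, row);
    });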