Is there a way to speed up inserts to an mdb?
using (StreamReader sr = new StreamReader(_localDir + "\\" + _filename))
    while ((line = sr.ReadLine()) != null)
Another change that might speed it up a little more is to prepare the command one time and create all the parameters. Then in the loop, just assign the parameter values and execute it each time. That may avoid the parsing and semantic checking of the statement each iteration and should improve the time some. However, I don't think it would be a significant improvement. The statement parsing should be a relatively small portion of the total cost even if it is parsed every time.
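For example, a rough sketch of that pattern (the table, column names, and parameter types here are only placeholders, and conn is assumed to be an already-open OleDbConnection):

// Build the command and its parameters once, before the read loop.
OleDbCommand cmd = new OleDbCommand(
    "INSERT INTO MyTable (Field1, Field2) VALUES (?, ?)", conn);
cmd.Parameters.Add("@p0", OleDbType.VarWChar, 255);
cmd.Parameters.Add("@p1", OleDbType.VarWChar, 255);
cmd.Prepare();

// Then inside the question's while loop, only assign values and execute.
string[] parts = line.Split(',');
cmd.Parameters[0].Value = parts[0];
cmd.Parameters[1].Value = parts[1];
cmd.ExecuteNonQuery();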
I found a very good solution here: Writing large number of records (bulk insert) to Access in .NET/C#. Instead of using OleDb, use DAO.
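The core of that DAO approach looks roughly like this (a sketch only: it assumes a reference to the DAO interop assembly, and the database path, table, and field names are made up):

// Requires a project reference to the DAO interop assembly
// (e.g. Microsoft.Office.Interop.Access.Dao) and, at the top of the file:
// using DAO = Microsoft.Office.Interop.Access.Dao;

var dbEngine = new DAO.DBEngine();
DAO.Database db = dbEngine.OpenDatabase(@"C:\Docs\MyDatabase.mdb");
DAO.Recordset rs = db.OpenRecordset("MyTable");

string line;
while ((line = sr.ReadLine()) != null)
{
    string[] parts = line.Split(',');
    rs.AddNew();
    rs.Fields["Field1"].Value = parts[0];
    rs.Fields["Field2"].Value = parts[1];
    rs.Update();
}

rs.Close();
db.Close();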
Is it possible for you to use a query that inserts directly from csv? For example:
SELECT ID,Field1 INTO NewTable
FROM [Text;HDR=YES;FMT=Delimited;IMEX=2;DATABASE=C:\Docs\].Some.CSV
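That statement can be executed from .NET against the mdb itself, something like this (the provider string, database path, and file name are just examples):

// Jet reads the CSV directly, so there are no per-row round trips from the client.
using (OleDbConnection conn = new OleDbConnection(
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Docs\MyDatabase.mdb"))
using (OleDbCommand cmd = new OleDbCommand(
    @"SELECT ID, Field1 INTO NewTable " +
    @"FROM [Text;HDR=YES;FMT=Delimited;IMEX=2;DATABASE=C:\Docs\].Some.CSV", conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}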
You can use something similar with non-standard delimiters, but you will need a Schema.ini file in the same directory as the file to be imported. It need only contain:
[tempImportfile.csv]
TextDelimiter='
You will have to alter the connect string slightly; this seems to work:
Text;HDR=YES;FMT=Delimited;DATABASE=C:\Docs\
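So with that Schema.ini sitting next to the file, the import query would look something like this (the file name matches the section header above, and the path is only an example):

SELECT ID, Field1 INTO NewTable
FROM [Text;HDR=YES;FMT=Delimited;DATABASE=C:\Docs\].tempImportfile.csv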
Microsoft Jet is slow at SQL parsing (INSERT/UPDATE) in general. In other words, you may have the most efficient code possible, but the choke point is Jet. Keep in mind that in your original posting you're connecting (open file, create lock, seek file, insert row, dispose lock, close file, dispose object) for every line. You need to connect ONCE (outside the while), read the lines, generate the SQL (OleDbCommand), and then execute.
You'd probably realize some performance benefits by moving the loop inside of the using blocks. Create 1 connection/command and execute it N times instead of creating N connections/commands.
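Applied to the code in the question, the restructured loop would look roughly like this (the connection string, table name, and column handling are placeholders):

// One connection and one command, reused for every line of the file.
using (OleDbConnection conn = new OleDbConnection(_connectionString))
using (OleDbCommand cmd = new OleDbCommand(
    "INSERT INTO MyTable (Field1, Field2) VALUES (?, ?)", conn))
using (StreamReader sr = new StreamReader(_localDir + "\\" + _filename))
{
    conn.Open();
    cmd.Parameters.Add("@p0", OleDbType.VarWChar, 255);
    cmd.Parameters.Add("@p1", OleDbType.VarWChar, 255);

    string line;
    while ((line = sr.ReadLine()) != null)
    {
        string[] parts = line.Split(',');
        cmd.Parameters[0].Value = parts[0];
        cmd.Parameters[1].Value = parts[1];
        cmd.ExecuteNonQuery();
    }
}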