Improve large data import performance into SQLite with C#

Backend · Unresolved · 4 answers · 1786 views
礼貌的吻别, asked 2021-02-07 05:12

I am using C# to import a CSV file with 6-8 million rows.

My table looks like this:

CREATE TABLE [Data] ([ID] VARCHAR(100)  NULL,[Raw] VARCHAR(200)  NULL)
4 Answers
  •  Happy的楠姐
    2021-02-07 05:58

    You can save a significant amount of time by binding your parameters in the following way:

    ...
    string insertText = "INSERT INTO Data (ID, Raw) VALUES (?, ?)";  // (1)

    SQLiteTransaction trans = conn.BeginTransaction();
    command.Transaction = trans;
    command.CommandText = insertText;

    // (2) Create the parameters once and add them to the command outside the loop
    SQLiteParameter p0 = new SQLiteParameter();
    SQLiteParameter p1 = new SQLiteParameter();
    command.Parameters.Add(p0);
    command.Parameters.Add(p1);

    Stopwatch sw = new Stopwatch();
    sw.Start();
    using (CsvReader csv = new CsvReader(new StreamReader(@"C:\Data.txt"), false))
    {
        var f = csv.Select(x => new Data() { IDData = x[27], RawData = String.Join(",", x.Take(24)) });

        foreach (var item in f)
        {
            // (3) Only update the parameter values on each iteration
            p0.Value = item.IDData;
            p1.Value = item.RawData;
            command.ExecuteNonQuery();
        }
    }
    trans.Commit();
    ...
    

    The key changes are at points (1), (2), and (3): the parameters are created and added to the command once, and only their values are updated inside the loop. This avoids rebuilding the parameter collection on every insert, which saves a noticeable amount of time when you have many parameters and millions of rows.
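    For reference, here is a self-contained sketch of the same pattern with the connection and command setup the snippet above elides. It assumes the System.Data.SQLite package, that the Data table already exists, and uses a naive comma split in place of a real CSV reader (the file path and column layout are illustrative, not from the question):

    ```csharp
    using System;
    using System.Data.SQLite;
    using System.IO;

    class BulkImport
    {
        static void Main()
        {
            using (var conn = new SQLiteConnection("Data Source=data.db"))
            {
                conn.Open();
                using (var trans = conn.BeginTransaction())
                using (var command = conn.CreateCommand())
                {
                    command.Transaction = trans;
                    command.CommandText = "INSERT INTO Data (ID, Raw) VALUES (?, ?)";

                    // Create the parameters once and reuse them for every row.
                    var p0 = new SQLiteParameter();
                    var p1 = new SQLiteParameter();
                    command.Parameters.Add(p0);
                    command.Parameters.Add(p1);

                    foreach (var line in File.ReadLines(@"C:\Data.txt"))
                    {
                        // Naive split; use a proper CSV parser for quoted fields.
                        var fields = line.Split(',');
                        p0.Value = fields[0];
                        p1.Value = line;
                        command.ExecuteNonQuery();
                    }

                    // Commit once for the whole batch: a single transaction is the
                    // biggest win, since SQLite otherwise syncs to disk per INSERT.
                    trans.Commit();
                }
            }
        }
    }
    ```

    Wrapping all the inserts in one transaction matters as much as the parameter reuse: without it, SQLite treats every INSERT as its own transaction and pays a disk sync for each row.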
