Load very big CSV files into a SQL Server database

Asked 2020-12-06 15:52 by 渐次进展

Is there a performant way to load very large CSV files (several gigabytes each) into a SQL Server 2008 database with .NET?

3 Answers
  • 2020-12-06 16:15

    Use SqlBulkCopy: http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx

  • 2020-12-06 16:19

    I would combine this CSV reader with SqlBulkCopy; i.e.

    // CsvReader is the linked fast CSV reader; it implements IDataReader
    using (var file = new StreamReader(path))
    using (var csv = new CsvReader(file, true)) // true = has header row
    using (var bcp = new SqlBulkCopy(connection)) {
        bcp.DestinationTableName = "TableName";
        bcp.WriteToServer(csv); // streams rows straight into the bulk-copy API
    }
    
    

    This uses the bulk-copy API to do the inserts, while using a fully managed (and fast) IDataReader implementation that, crucially, streams the data rather than loading it all at once.
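    For multi-gigabyte files it can also help to tune the bulk-copy options. A minimal sketch of the same approach with explicit settings; the connection string, table name, and column names are placeholders, and the `CsvReader` namespace is assumed from the LumenWorks package:

    ```csharp
    using System.Data.SqlClient;
    using System.IO;
    using LumenWorks.Framework.IO.Csv; // namespace of the fast CSV reader (assumed)

    class CsvBulkLoader
    {
        // Streams a large CSV into SQL Server without buffering it in memory.
        static void Load(string csvPath, string connectionString)
        {
            using (var file = new StreamReader(csvPath))
            using (var csv = new CsvReader(file, true)) // true = file has a header row
            using (var bcp = new SqlBulkCopy(connectionString))
            {
                bcp.DestinationTableName = "dbo.TargetTable"; // placeholder
                bcp.BatchSize = 10000;       // commit every 10k rows instead of one huge batch
                bcp.BulkCopyTimeout = 0;     // no timeout; multi-GB loads can run long
                bcp.EnableStreaming = true;  // read the IDataReader lazily (.NET 4.5+)

                // Map CSV header names to table columns explicitly,
                // in case the file's column order differs from the table's.
                bcp.ColumnMappings.Add("Id", "Id");
                bcp.ColumnMappings.Add("Name", "Name");

                bcp.WriteToServer(csv); // CsvReader implements IDataReader
            }
        }
    }
    ```

    A moderate `BatchSize` keeps individual transactions (and the transaction log) smaller, which matters once the file no longer fits comfortably in memory.
    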

  • 2020-12-06 16:29

    Look into using the SqlBulkCopy class.
