How to export more than 1 million rows from SQL Server table to CSV in C# web app?

Submitted by 戏子无情 on 2020-02-06 03:38:39

Question


I am trying to export a SQL Server table with 1 million rows and 45 columns to a .csv file for the user to download via the web interface, but it takes so long that I eventually have to stop the process manually.

I use a SqlDataReader and write into the file as the reader reads, to avoid memory problems. The code works for small tables (fewer than 3,000 rows), but with the large one it keeps running and the destination file stays at 0 KB.

using (spContentConn)
using (var sdr = sqlcmd.ExecuteReader())
using (CsvfileWriter)
{
    // Collect the column names in a DataTable (used only for the header).
    DataTable Tablecolumns = new DataTable();

    for (int i = 0; i < sdr.FieldCount; i++)
    {
        Tablecolumns.Columns.Add(sdr.GetName(i));
    }

    // Header row, delimited with '~'.
    CsvfileWriter.WriteLine(string.Join("~", Tablecolumns.Columns.Cast<DataColumn>().Select(csvfile => csvfile.ColumnName)));

    // Walk each row's columns in reverse; when j == 1 the line is ended
    // (note this writes only a newline, so the last column's value is never emitted).
    while (sdr.Read())
    {
        for (int j = Tablecolumns.Columns.Count; j > 0; j--)
        {
            if (j == 1)
                CsvfileWriter.WriteLine("");
            else
                CsvfileWriter.Write(sdr[Tablecolumns.Columns.Count - j].ToString() + "~");
        }
    }
}

I tried the approach recommended in this thread, but it still doesn't work: export large datatable data to .csv file in c# windows applications. Please help.


Answer 1:


It is not clear from the .NET documentation whether your file writer has efficient buffering, so I always use a BufferedStream instead when I need to read or write large volumes of data. With a stream you have to write byte data instead of strings, but that requires only a minor adaptation of your code.
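
For example, here is a minimal sketch of that setup. This is not code from the question; the output path and the 64 KB buffer size are illustrative values.

using System;
using System.IO;
using System.Text;

class BufferedStreamDemo
{
    static void Main()
    {
        // Wrap the FileStream in a BufferedStream before writing.
        using (var fs = new FileStream(@"C:\temp\export.csv", FileMode.Create, FileAccess.Write))
        using (var buffered = new BufferedStream(fs, 1 << 16))
        {
            // A raw stream takes bytes, not strings, so encode each line first.
            byte[] line = Encoding.UTF8.GetBytes("Name~Value" + Environment.NewLine);
            buffered.Write(line, 0, line.Length);
        }
    }
}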

It also looks like you are reading and writing the columns through a DataTable inside the loop, which hurts performance. Since the number and order of the columns do not change during an export operation, use the positional index to access the column values instead. It would also be better to write one row at a time instead of one column at a time, as shown in the sketch below.

Finally, you are already using a data reader, which should provide the best throughput from your SQL Server (limited by your server and bandwidth, obviously). That suggests the bottleneck is in how your data is being written to the file.
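
Putting those points together, a rough sketch of the export loop could look like the following. This is not the original poster's code: CsvExport, ExportToCsv, and WriteAndClear are hypothetical names, and the connection string, query, and path are supplied by the caller.

using System.Data.SqlClient;
using System.IO;
using System.Text;

static class CsvExport
{
    // Hypothetical helper: stream rows straight from the SqlDataReader into a
    // buffered file stream, one row per write, accessing columns by ordinal.
    public static void ExportToCsv(string connectionString, string query, string path)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(query, conn))
        {
            conn.Open();
            using (var rdr = cmd.ExecuteReader())
            using (var fs = new FileStream(path, FileMode.Create, FileAccess.Write))
            using (var buffered = new BufferedStream(fs, 1 << 16))
            {
                int fieldCount = rdr.FieldCount;
                var sb = new StringBuilder(4096);

                // Header row, using the same '~' delimiter as the question.
                for (int i = 0; i < fieldCount; i++)
                {
                    if (i > 0) sb.Append('~');
                    sb.Append(rdr.GetName(i));
                }
                sb.AppendLine();
                WriteAndClear(buffered, sb);

                // Data rows: build each line by ordinal index, write it, move on.
                while (rdr.Read())
                {
                    for (int i = 0; i < fieldCount; i++)
                    {
                        if (i > 0) sb.Append('~');
                        if (!rdr.IsDBNull(i))
                            sb.Append(rdr[i]);   // positional access, no DataTable
                    }
                    sb.AppendLine();
                    WriteAndClear(buffered, sb);
                }
            }
        }
    }

    static void WriteAndClear(Stream output, StringBuilder sb)
    {
        byte[] bytes = Encoding.UTF8.GetBytes(sb.ToString());
        output.Write(bytes, 0, bytes.Length);
        sb.Clear();
    }
}

A caller would then invoke something like CsvExport.ExportToCsv(connString, "SELECT ...", @"C:\temp\export.csv"), and the result set streams to disk one row at a time without ever being materialized in memory.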

For comparison, I just wrote 1,000,000 rows of 45 columns to a text file in under 60 seconds. Granted, my code does not read from a database, but it should still give you a good baseline.



Source: https://stackoverflow.com/questions/57242159/how-to-export-more-than-1-million-rows-from-sql-server-table-to-csv-in-c-sharp-w
