Handling large SQL select queries / Read sql data in chunks

Submitted by 强颜欢笑 on 2019-12-18 12:29:19

Question


I'm using .Net 4.0 and SQL server 2008 R2.

I'm running a big SQL SELECT query that returns millions of results and takes a long time to run to completion.

Does anyone know how I can read only some of the results without having to wait for the whole query to complete?

In other words, I want to read the results in chunks of 10,000 records while the query is still running, then fetch the next chunk.


Answer 1:


It depends in part on whether the query itself is streaming, or whether it does lots of work in temporary tables then (finally) starts returning data. You can't do much in the second scenario except re-write the query; however, in the first case an iterator block would usually help, i.e.

public IEnumerable<Foo> GetData() {
    // not shown: building the connection and command (cmd)
    using (var reader = cmd.ExecuteReader()) {
        while (reader.Read()) {
            Foo foo = MaterializeFoo(reader); // not shown: map the current row to a Foo
            yield return foo;
        }
    }
}

This is now a streaming iterator - you can foreach over it and it will retrieve records live from the incoming TDS data without buffering all the data first.
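To show how the elided pieces might fit together, here is a minimal self-contained sketch; the Foo class and its members, the table d, the columns a and b (and their types), and the connStr connection string are all assumptions for illustration, not part of the original answer:

using System.Collections.Generic;
using System.Data.SqlClient;

public class Foo {
    public int A { get; set; }
    public string B { get; set; }
}

public class FooReader {
    private readonly string connStr;   // hypothetical connection string
    public FooReader(string connStr) { this.connStr = connStr; }

    // Streams rows one at a time; nothing beyond the current row is buffered in memory.
    public IEnumerable<Foo> GetData() {
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("SELECT a, b FROM d", conn)) {
            conn.Open();
            using (var reader = cmd.ExecuteReader()) {
                while (reader.Read()) {
                    yield return new Foo {
                        A = reader.GetInt32(0),   // assumes column a is an int
                        B = reader.GetString(1)   // assumes column b is a string
                    };
                }
            }
        }
    }
}

Consuming it with foreach (var foo in new FooReader(connStr).GetData()) { ... } starts processing as soon as the first row arrives over the wire.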

If you (perhaps wisely) don't want to write your own materialization code, there are tools that will do this for you - for example, LINQ-to-SQL's ExecuteQuery<T>(tsql, args) will do the above pain-free.
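For instance, a minimal sketch of that approach, reusing the hypothetical Foo class and connStr from the sketch above (System.Data.Linq must be referenced):

using (var ctx = new System.Data.Linq.DataContext(connStr)) {
    // rows are mapped onto Foo and yielded as they are read, without buffering the full result
    foreach (var foo in ctx.ExecuteQuery<Foo>("SELECT a AS A, b AS B FROM d")) {
        // process foo here
    }
}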




Answer 2:


You'd need to use data paging.

SQL Server has the TOP clause (SELECT TOP 10 a, b, c FROM d) and BETWEEN in a WHERE clause:

SELECT TOP 10000 a, b, c FROM d WHERE a BETWEEN X AND Y

With this, I guess you'd be able to retrieve N rows, do some partial processing, then load the next N rows, and so on.

This can be combined with a multithreaded solution: one thread retrieves the next page of results while another processes the data that has already been loaded.
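One way such a paging loop might look, as a minimal sketch inside some method; the key column a (assumed to be an indexed bigint), the table d, and connStr are hypothetical, and the page is keyed on the last value seen because SQL Server 2008 R2 has no OFFSET/FETCH:

long lastKey = long.MinValue;
int rowsRead;
do {
    rowsRead = 0;
    using (var conn = new SqlConnection(connStr))
    using (var cmd = new SqlCommand(
        "SELECT TOP 10000 a, b, c FROM d WHERE a > @lastKey ORDER BY a", conn)) {
        cmd.Parameters.AddWithValue("@lastKey", lastKey);
        conn.Open();
        using (var reader = cmd.ExecuteReader()) {
            while (reader.Read()) {
                lastKey = reader.GetInt64(0);
                // process columns b and c here, or hand the page off to a worker thread
                rowsRead++;
            }
        }
    }
} while (rowsRead == 10000);   // a short (or empty) page means the table is exhausted

Paging on the key column rather than a ROW_NUMBER window keeps each page query cheap, since every round is a simple index seek.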




Answer 3:


If you really have to process millions of records, why don't you load 10,000 per round, process them, and then load the next 10,000? If not, consider using the DBMS to filter the data before loading it, since filtering performs much better in the database than in your logic layer.

Or follow a lazy-load approach: load only the IDs up front, and fetch the actual data only when you need it. A sketch of that idea follows.
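A minimal sketch of the ID-first approach; the numeric key column a, the table d, connStr, and idsThatAreActuallyNeeded are all hypothetical names for illustration:

// 1. Load only the keys up front; this streams quickly and uses little memory.
var ids = new List<long>();
using (var conn = new SqlConnection(connStr))
using (var cmd = new SqlCommand("SELECT a FROM d", conn)) {
    conn.Open();
    using (var reader = cmd.ExecuteReader()) {
        while (reader.Read()) {
            ids.Add(reader.GetInt64(0));
        }
    }
}

// 2. Later, fetch the full row only for the ids that are actually needed.
foreach (long id in idsThatAreActuallyNeeded) {   // hypothetical subset of ids
    using (var conn = new SqlConnection(connStr))
    using (var cmd = new SqlCommand("SELECT a, b, c FROM d WHERE a = @id", conn)) {
        cmd.Parameters.AddWithValue("@id", id);
        conn.Open();
        using (var reader = cmd.ExecuteReader()) {
            if (reader.Read()) {
                // process the single row here
            }
        }
    }
}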



Source: https://stackoverflow.com/questions/5726551/handling-large-sql-select-queries-read-sql-data-in-chunks
