How to process data from large ResultSet without loading them all to memory?

Submitted by 末鹿安然 on 2019-12-22 01:19:03

Question


My database is hosted on a MySQL server, and I'm using Java to analyze the data.

My issue: executing a SELECT query returns a 2.5 GB result set. I don't want to load all the data into memory. Is there a way to retrieve and process the data continuously?

'Limit by rows' is not an option, because this 2.5 GB of data is joined and retrieved from 4 tables, so limiting by rows would greatly increase my total runtime.

I've tried statement.setFetchSize(50), but it didn't work as I expected.

Any suggestions would be really appreciated! Thanks!


Answer 1:


Statement stmt = readOnlyConn.createStatement(java.sql.ResultSet.TYPE_FORWARD_ONLY, java.sql.ResultSet.CONCUR_READ_ONLY);

stmt.setFetchSize(Integer.MIN_VALUE);

The code above solved my issues. Thanks for the help!
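For context, here is a fuller sketch of this streaming approach. The connection URL, credentials, table, and column names are illustrative assumptions, not from the original question; the key lines are the forward-only, read-only statement and the Integer.MIN_VALUE fetch size, which tell MySQL Connector/J to stream rows one at a time instead of buffering the entire result set in memory.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class StreamingExample {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details; replace with your own.
        String url = "jdbc:mysql://localhost:3306/mydb";
        try (Connection readOnlyConn = DriverManager.getConnection(url, "user", "password")) {
            // TYPE_FORWARD_ONLY + CONCUR_READ_ONLY + fetch size Integer.MIN_VALUE
            // is Connector/J's signal to stream rows row-by-row.
            Statement stmt = readOnlyConn.createStatement(
                    ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
            stmt.setFetchSize(Integer.MIN_VALUE);

            try (ResultSet rs = stmt.executeQuery("SELECT id, payload FROM big_table")) {
                while (rs.next()) {
                    // Process one row at a time; only the current row is in memory.
                    process(rs.getLong("id"), rs.getString("payload"));
                }
            }
        }
    }

    private static void process(long id, String payload) {
        // Placeholder for the actual analysis logic.
    }
}
```

One caveat worth knowing: while a streaming result set is open, Connector/J will not let you issue other statements on the same connection until that result set is fully read or closed.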




Answer 2:


> Statement stmt = readOnlyConn.createStatement(java.sql.ResultSet.TYPE_FORWARD_ONLY, java.sql.ResultSet.CONCUR_READ_ONLY); stmt.setFetchSize(Integer.MIN_VALUE); The code above solved my issues. Thanks for the help!

Yes, BUT if you are using PostgreSQL, you also have to turn autocommit OFF! (Realised after 2 hours of work =D)

See the PostgreSQL JDBC docs here.
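A minimal sketch of the PostgreSQL variant, with illustrative connection details and table names: with the pgJDBC driver, cursor-based fetching requires autocommit off and a positive fetch size (a negative value like Integer.MIN_VALUE is not accepted there); otherwise the driver fetches the entire result set at once.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class PgStreamingExample {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details; replace with your own.
        String url = "jdbc:postgresql://localhost:5432/mydb";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            // Autocommit must be OFF, otherwise pgJDBC ignores the fetch
            // size and materializes the whole result set in memory.
            conn.setAutoCommit(false);

            Statement stmt = conn.createStatement(
                    ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
            // A positive fetch size makes pgJDBC use a server-side cursor
            // and pull rows in batches of this many.
            stmt.setFetchSize(50);

            try (ResultSet rs = stmt.executeQuery("SELECT id, payload FROM big_table")) {
                while (rs.next()) {
                    // Only the current batch of rows is held in memory.
                }
            }
            conn.commit();
        }
    }
}
```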



Source: https://stackoverflow.com/questions/16860677/how-to-process-data-from-large-resultset-without-loading-them-all-to-memory
