Camel Sql Consumer Performance for Large DataSets


Question


I am trying to cache some static data in an Ignite cache so it can be queried faster, so I need to read the data from the database and insert it into the cache cluster.

But the table has roughly 3 million rows, which regularly causes an OutOfMemoryError, because the SqlComponent tries to process the whole result set as one unit and collects all the rows at once.

Is there any way to split the result set while reading it (for example, 1000 rows per Exchange)?


Answer 1:


You can add a limit clause to the SQL query; the exact syntax depends on which SQL database you use.

Or you can try setting jdbcTemplate.maxRows=1000 to use that option. But it depends on the JDBC driver whether it supports limiting rows that way or not.
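For example, a minimal Java DSL sketch of both variants (the orders table and the cacheLoader bean are hypothetical placeholders, not from the original question):

    import org.apache.camel.builder.RouteBuilder;

    public class LimitedSqlConsumerRoute extends RouteBuilder {
        @Override
        public void configure() {
            // Variant 1: limit in the SQL itself. LIMIT is MySQL/PostgreSQL
            // syntax; other databases use FETCH FIRST or TOP instead.
            from("sql:select * from orders limit 1000")
                .to("bean:cacheLoader");

            // Variant 2: cap the rows via the JdbcTemplate option instead.
            // Whether the cap is enforced depends on the JDBC driver.
            // from("sql:select * from orders?jdbcTemplate.maxRows=1000")
            //     .to("bean:cacheLoader");
        }
    }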

And also mind that you need some way to mark or delete the rows after processing, so they are not selected again by the next query, such as by using the onConsume option.
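A hedged sketch of that approach, assuming a processed flag column on the same hypothetical orders table (the :#id named parameter is bound from the id column of each consumed row):

    import org.apache.camel.builder.RouteBuilder;

    public class OnConsumeSqlRoute extends RouteBuilder {
        @Override
        public void configure() {
            // Select only unprocessed rows; after each row has been routed,
            // onConsume marks it so the next poll does not pick it up again.
            from("sql:select * from orders where processed = 0 limit 1000"
                    + "?onConsume=update orders set processed = 1 where id = :#id")
                .to("bean:cacheLoader");
        }
    }

This way each poll consumes a bounded 1000-row batch and advances through the table until all rows have been loaded into the cache.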

You can look in the unit tests to find some examples with onConsume etc: https://github.com/apache/camel/tree/master/components/camel-sql/src/test



Source: https://stackoverflow.com/questions/38764814/camel-sql-consumer-performance-for-large-datasets
