Question:
I am trying to cache some static data in an Ignite cache so it can be queried faster, so I need to read the data from the database and insert it into the cache cluster.
However, the table holds about 3 million rows, which normally causes an OutOfMemoryError because the Camel SQL component tries to process the whole result set at once, collecting all rows in one go.
Is there any way to split the result set while reading it (for example, 1000 rows per Exchange)?
Answer 1:
You can add a LIMIT clause to the SQL query; the exact syntax depends on which SQL database you use.
Alternatively, you can try setting jdbcTemplate.maxRows=1000
on the endpoint. Whether limiting works via that option depends on the JDBC driver.
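As a minimal sketch, a consumer route capping the batch size might look like the following. The table name `orders`, the `processed` column, and the `igniteCacheLoader` bean are hypothetical; the jdbcTemplate.maxRows option is only effective if the JDBC driver honors a maximum-rows setting on the statement.

```java
import org.apache.camel.builder.RouteBuilder;

public class BatchedSqlRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Poll at most 1000 rows per batch (hypothetical table/bean names).
        // Whether maxRows actually caps the result set depends on the driver.
        from("sql:select * from orders where processed = 0"
                + "?jdbcTemplate.maxRows=1000")
            .to("bean:igniteCacheLoader");
    }
}
```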
Also mind that you need some way to mark or delete the rows after processing, so they are not selected again by the next query, for example by using the onConsume
option.
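Combining the two, a sketch with onConsume could look like this (again, the table, column, and bean names are hypothetical). The onConsume statement runs for each processed row, and :#id is bound from that row's column, so subsequent polls skip rows already loaded into the cache.

```java
import org.apache.camel.builder.RouteBuilder;

public class OnConsumeRoute extends RouteBuilder {
    @Override
    public void configure() {
        // After each row is routed, onConsume marks it as processed so the
        // next poll's WHERE clause no longer selects it.
        from("sql:select * from orders where processed = 0"
                + "?onConsume=update orders set processed = 1 where id = :#id")
            .to("bean:igniteCacheLoader");
    }
}
```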
You can look at the unit tests to find some examples with onConsume etc.: https://github.com/apache/camel/tree/master/components/camel-sql/src/test
Source: https://stackoverflow.com/questions/38764814/camel-sql-consumer-performance-for-large-datasets