camel jdbc out of memory exception

Submitted by 无人久伴 on 2019-12-23 04:45:52

Question


I am trying to ingest data from Postgres into another DB using the camel-jdbc component. The table is large, so I want to read a few rows at a time instead of the whole table at once. My route looks like this (for testing purposes only):

    from(fromUri)
        .setBody("select * from table limit 10")
        .to("jdbc://myDataSource?resetAutoCommit=false&statement.fetchSize=2")
        .split(body()).streaming()
        .process(test)

As shown above, I am only selecting 10 rows for testing, and I have set fetchSize to 2 so that only 2 rows are fetched at a time. However, I still receive all 10 rows at once. When I remove the "limit 10" from the query, I get an Out of Memory error before the split step, which tells me the component is trying to load the entire result set into memory.

What am I missing here or what am I doing wrong?

Thanks for the help.


Answer 1:


I think fetchSize is more of a hint to the JDBC driver. You can use the maxRows option to really limit the number of rows on the statement, e.g. statement.maxRows=2. You can read more about these options in the JDBC Statement javadoc:

https://docs.oracle.com/javase/7/docs/api/java/sql/Statement.html
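For illustration, a minimal sketch of a route using statement.maxRows (assuming the same myDataSource bean and a simple log endpoint in place of the question's test processor; the timer trigger is only there to make the example self-contained):

    import org.apache.camel.builder.RouteBuilder;

    public class JdbcMaxRowsRoute extends RouteBuilder {
        @Override
        public void configure() {
            from("timer:ingest?repeatCount=1")
                // the query body is set as a constant, as in the question
                .setBody(constant("select * from table"))
                // statement.maxRows caps how many rows the Statement returns,
                // while statement.fetchSize remains only a hint to the driver
                .to("jdbc:myDataSource?resetAutoCommit=false"
                    + "&statement.maxRows=2&statement.fetchSize=2")
                .split(body()).streaming()
                .to("log:row");
        }
    }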



Source: https://stackoverflow.com/questions/37395970/camel-jdbc-out-of-memory-exception
