Question
I have a Spring Batch project that reads a huge zip file containing more than 100,000 XML files.
I am using MultiResourcePartitioner, and I have a memory issue: my batch fails with
`java.lang.OutOfMemoryError: GC overhead limit exceeded`.
It seems as if all the XML files are loaded into memory and not garbage-collected after processing.
Is there a performant way to do this?
Thanks.
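For context, a minimal sketch of the kind of MultiResourcePartitioner wiring described above (the asker's actual code is not shown, and the `/tmp/extracted` path and `PartitionConfig` class name are illustrative assumptions). Each matched resource becomes one partition, i.e. one `ExecutionContext` tracked by the job, which is why a ~100,000-file archive produces a very large number of partitions:

```java
// Hypothetical sketch of a typical MultiResourcePartitioner setup,
// assuming the zip has already been unpacked to /tmp/extracted.
import java.io.IOException;

import org.springframework.batch.core.partition.support.MultiResourcePartitioner;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

@Configuration
public class PartitionConfig {

    @Bean
    public Partitioner partitioner() throws IOException {
        MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
        // One ExecutionContext is created per matched resource; the
        // resource URL is exposed under the key "fileName" for a
        // step-scoped reader to pick up.
        partitioner.setResources(
                new PathMatchingResourcePatternResolver()
                        .getResources("file:/tmp/extracted/*.xml"));
        return partitioner;
    }
}
```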
Source: https://stackoverflow.com/questions/38793243/performance-issue-with-multiresourcepartitioner-in-spring-batch