Question
I am successfully using ZipSplitter() to process files inside a zip file. I would like to use parallel processing if possible, but calling parallelProcessing() results in the stream being closed prematurely, which in turn causes an IOException when the stream is being cached by DefaultStreamCachingStrategy.
I note that when parallel processing is enabled, ZipIterator#checkNullAnswer(Message) is called which closes the ZipInputStream. Curiously, everything is dandy if I loiter on this method in my debugger, which suggests that the iterator is being closed before processing has completed. Is this a bug or have I messed up something?
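The failure mode described above can be reproduced with plain java.util.zip, without Camel: once a ZipInputStream has been closed, any consumer still trying to read the current entry gets an IOException ("Stream closed"), which is what a parallel task would see if the iterator closes the stream before that task finishes. This is a minimal sketch of the mechanism, not Camel's actual code path:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ZipCloseDemo {
    // Build a one-entry zip in memory so the demo is self-contained.
    static byte[] buildZip() throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(bos)) {
            zos.putNextEntry(new ZipEntry("entry.txt"));
            zos.write("hello".getBytes());
            zos.closeEntry();
        }
        return bos.toByteArray();
    }

    // Returns true if reading after close throws, mimicking what a
    // still-running parallel task would observe.
    public static boolean readAfterCloseFails() throws IOException {
        ZipInputStream zis = new ZipInputStream(new ByteArrayInputStream(buildZip()));
        zis.getNextEntry();   // position on the first entry
        zis.close();          // the iterator closes the stream early
        try {
            zis.read();       // a consumer reading afterwards...
            return false;
        } catch (IOException expected) {
            return true;      // ...gets "Stream closed"
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("read after close throws: " + readAfterCloseFails());
    }
}
```

Pausing in the debugger hides the problem because it delays the close long enough for the in-flight reads to complete, which is consistent with a race rather than deterministic misuse.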
A simplified version of my route which exhibits this behaviour is:
from("file:myDirectory")
    .split(new ZipSplitter()).streaming().parallelProcessing()
    .log("Validating filename ${file:name}")
    .end();
This is using Camel 2.13.1.
Answer 1:
Can you try applying the CAMEL-7415 patch to the Camel 2.13.1 branch?
I'm not quite sure whether it will fix your issue, but it's worth a shot.
Source: https://stackoverflow.com/questions/23986515/apache-camel-zipinputstream-closed-with-parallel-processing