Is there a good way to extract chunks of data from a Java 8 stream?
In an ETL process I'm retrieving a lot of entities from a Spring Data Repository. I'm then using a parallel stream to map the entities to different ones. I can either use a consumer to store those new entities in another repository one by one, or collect them into a List and store them in a single bulk operation. The first is costly while the latter might exceed the available memory. Is there a good way to collect a certain number of elements in the stream (like limit does), consume that chunk, and then continue with the rest of the stream until all elements are processed?
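For context, a minimal sketch of the two options described above. The repository and entity names (sourceRepository, targetRepository, SourceEntity, TargetEntity, transform) are placeholders for illustration, not the actual code:

```java
import java.util.List;
import java.util.stream.Collectors;

// Option 1: store each mapped entity individually (one write per element, costly).
void storeOneByOne() {
    sourceRepository.findAll().parallelStream()
            .map(this::transform)
            .forEach(targetRepository::save);
}

// Option 2: collect everything first, then a single bulk write
// (risks exceeding available memory for large data sets).
void storeInOneBulk() {
    List<TargetEntity> mapped = sourceRepository.findAll().parallelStream()
            .map(this::transform)
            .collect(Collectors.toList());
    targetRepository.save(mapped); // saveAll(...) in newer Spring Data versions
}
```

What I'm looking for is something in between: consume the stream in fixed-size chunks, calling the bulk save once per chunk.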