Question
There are some SQL statements in my program that contain IN-clauses with given Ids. The problem is that in some cases there might be more than 1000 Ids, which causes Oracle to reject the query with ORA-01795: maximum number of expressions in a list is 1000.
So I want to divide this list into multiple sub-lists.
Example: I have 2403 Ids
The result would be three lists:
- 0 - 999
- 1000 - 1999
- 2000 - 2402
I have written a piece of code that works, but looks terrible. Is there a better solution for this problem? Maybe something with Collectors and groupingBy, or anything like that?
My code:
Map<Integer, List<Long>> result = new HashMap<>();
ArrayList<Long> asList = new ArrayList<Long>(listOfIds);
IntStream.range(0, (listOfIds.size() / 1000) + 1)
        .forEach(partGroup -> result.put(partGroup,
                asList.subList(partGroup * 1000,
                        (partGroup * 1000) + Math.min(1000, asList.size() - partGroup * 1000))));
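(For reference, the Collectors.groupingBy idea mentioned above could look roughly like this: group the indices by i / 1000 and map them back to the ids. This is only a sketch based on the snippet above; asList is the same ArrayList, and it needs java.util.stream.Collectors and IntStream.)
Map<Integer, List<Long>> result = IntStream.range(0, asList.size())
        .boxed()
        .collect(Collectors.groupingBy(i -> i / 1000,
                Collectors.mapping(asList::get, Collectors.toList())));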
Answer 1:
Short of using a third party library I don't think you can do much better. I personally use this utility function, which is close to what you've done:
public static <T> Stream<List<T>> splitListStream(List<T> input, int batchSize) {
    if (batchSize <= 0)
        throw new IllegalArgumentException("batchSize must be positive (" + batchSize + ")");
    if (input.size() <= batchSize)
        return Stream.of(input);
    return IntStream.range(0, (input.size() + batchSize - 1) / batchSize)
            .mapToObj(i -> {
                int from = i * batchSize;
                int to = Math.min((i + 1) * batchSize, input.size());
                return input.subList(from, to);
            });
}
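Applied to the original problem, usage could look like the following sketch; runQueryWithIds is a hypothetical placeholder for whatever executes the IN-clause query for one batch:
splitListStream(new ArrayList<>(listOfIds), 1000)
        .forEach(batch -> runQueryWithIds(batch)); // one query per batch of at most 1000 ids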
Answer 2:
You can write your own collector for this; it parallelizes efficiently, and you can put it into a utility method as well.
private static <T> Collector<T, ?, List<List<T>>> partitioning(int size) {
    class Acc {
        int count = 0;
        List<List<T>> list = new ArrayList<>();

        void add(T elem) {
            // index of the chunk this element belongs to
            int index = count++ / size;
            if (index == list.size()) {
                list.add(new ArrayList<>());
            }
            list.get(index).add(elem);
        }

        Acc merge(Acc right) {
            List<T> lastLeftList = list.get(list.size() - 1);
            List<T> firstRightList = right.list.get(0);
            int lastLeftSize = lastLeftList.size();
            int firstRightSize = firstRightList.size();

            // both boundary chunks are full, simply addAll will work
            if (lastLeftSize + firstRightSize == 2 * size) {
                list.addAll(right.list);
                count += right.count; // keep count equal to the total number of elements
                return this;
            }

            // the last left chunk and the first right chunk merge "perfectly" into one full chunk
            if (lastLeftSize + firstRightSize == size) {
                lastLeftList.addAll(firstRightList);
                list.addAll(right.list.subList(1, right.list.size()));
                count += right.count; // keep count equal to the total number of elements
                return this;
            }

            // otherwise fall back to re-adding the right side element by element
            right.list.stream().flatMap(List::stream).forEach(this::add);
            return this;
        }

        public List<List<T>> finisher() {
            return list;
        }
    }
    return Collector.of(Acc::new, Acc::add, Acc::merge, Acc::finisher);
}
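The combiner's two fast paths cover the cases where the chunks at the merge boundary are already full, or line up into exactly one full chunk, so the right-hand lists can be spliced in wholesale; only otherwise does it fall back to re-adding the right side element by element, which preserves the invariant that every chunk except the last is full.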
And usage would be:
List<List<Integer>> list = Arrays.asList(1, 3, 4, 5, 9, 8, 7)
        .stream()
        .parallel()
        .collect(partitioning(3));
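For this input the result should be [[1, 3, 4], [5, 9, 8], [7]]; encounter order is preserved even for the parallel stream, since the collector does not declare the UNORDERED characteristic.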
Answer 3:
As an alternative to rolling your own, you could consider jOOL or Guava (Iterators.partition(stream.iterator(), batchSize)).
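A minimal sketch of the Guava route, assuming com.google.common.collect.Iterators is on the classpath (the loop body is a placeholder):
Iterator<List<Long>> batches = Iterators.partition(listOfIds.iterator(), 1000);
while (batches.hasNext()) {
    List<Long> batch = batches.next();
    // build and run the IN-clause query for this batch
}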
Source: https://stackoverflow.com/questions/45078134/divide-longstream-into-substreams-with-maximal-length