Question
How can I execute an action after the last item of an ordered stream is processed but before the stream is closed? This action should be able to inject zero or more items into the stream pipeline.
Context
I've got a very large file of the form:
MASTER_REF1
 SUBREF1
 SUBREF2
 SUBREF3
MASTER_REF2
MASTER_REF3
 SUBREF1
...
Where SUBREF (if any) is applicable to MASTER_REF and both are complex objects (you can imagine it somewhat like JSON).
At first I tried something like:
public void process(Path path) throws IOException {
    MyBuilder builder = new MyBuilder();
    Files.lines(path)
        .map(line -> {
            if (line.charAt(0) == ' ') {
                builder.parseSubRef(line);
                return null;
            } else {
                Result result = builder.build();
                builder.parseMasterRef(line);
                return result;
            }
        })
        //eliminate null
        .filter(Objects::nonNull)
        //some processing on results
        .map(Utils::doSomething)
        //terminal op
        .forEachOrdered(System.out::println);
}
[EDIT] Using forEach here was a bad idea ... the right way is to use forEachOrdered.
But, for obvious reasons, the last item is never appended to the stream: it is still being built when the lines run out.
Therefore I'm wondering how to flush it into the stream at the end of line processing.
Answer 1:
Your question sounds confusing. The Stream is closed when the close() method is called explicitly or when a try-with-resources construct is used. In your code sample the stream is not closed at all. To perform a custom action before the stream is closed, you can simply write it at the end of the try-with-resources statement.
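For illustration only, a minimal sketch of that last point (not part of the original answer; the println terminal operation is just a stand-in for the real pipeline):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public void process(Path path) throws IOException {
    try (Stream<String> lines = Files.lines(path)) {
        lines.forEachOrdered(System.out::println);
        // the stream is still open here: any "after the last element" action
        // can simply go at the end of the try block, before close() runs
    }
}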
In your case it seems that you want to concatenate some bogus entry to the stream. There's a Stream.concat() method to do this:
Stream.concat(Files.lines(path), Stream.of("MASTER"))
      .map(...) // do all your other steps
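Applied to the pipeline from the question, that could look like this (a sketch only, reusing the asker's MyBuilder, Result and Utils types; it assumes that building from an empty builder, which happens on the very first master line and is removed by the nonNull filter, yields null):

public void process(Path path) throws IOException {
    MyBuilder builder = new MyBuilder();
    Stream.concat(Files.lines(path), Stream.of("MASTER")) // bogus trailing master
        .map(line -> {
            if (line.charAt(0) == ' ') {
                builder.parseSubRef(line);
                return null;
            } else {
                // the bogus "MASTER" line lands here and forces the last
                // real record to be built and emitted
                Result result = builder.build();
                builder.parseMasterRef(line);
                return result;
            }
        })
        .filter(Objects::nonNull)
        .map(Utils::doSomething)
        .forEachOrdered(System.out::println);
}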
Finally, note that my StreamEx library, which enhances the Stream API, provides partial reduction methods that are well suited to parsing multi-line entries. The same thing can be done using StreamEx.groupRuns(), which combines adjacent elements into intermediate lists according to a given BiPredicate:
public void process(Path path) throws IOException {
    StreamEx.of(Files.lines(path))
        .groupRuns((line1, line2) -> line2.charAt(0) == ' ')
        // Now stream elements are List<String> starting with MASTER and having
        // all subref strings after that
        .map(record -> {
            MyBuilder builder = new MyBuilder();
            builder.parseMasterRef(record.get(0));
            record.subList(1, record.size()).forEach(builder::parseSubRef);
            return builder.build();
        })
        //eliminate null
        .filter(Objects::nonNull)
        //some processing on results
        .map(Utils::doSomething)
        //terminal op
        .forEach(System.out::println);
}
Now you don't need to use side-effect operations.
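To see what groupRuns() produces on the sample input, here is a quick sketch (it assumes StreamEx's one.util.streamex package and simply inlines the file excerpt from the question):

import one.util.streamex.StreamEx;
import java.util.List;

List<List<String>> records = StreamEx.of(
        "MASTER_REF1", " SUBREF1", " SUBREF2", " SUBREF3",
        "MASTER_REF2",
        "MASTER_REF3", " SUBREF1")
    .groupRuns((l1, l2) -> l2.charAt(0) == ' ')
    .toList();
// records = [[MASTER_REF1,  SUBREF1,  SUBREF2,  SUBREF3],
//            [MASTER_REF2],
//            [MASTER_REF3,  SUBREF1]]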
Answer 2:
The primary problem here is that you are effectively streaming two types of records, and this makes things difficult to manage because streams are primarily for amorphous data.
I would pre-process the file data and collect it into MasterAndSub records. You can then use groupingBy on these by the master field.
class MasterAndSub {
    final String master;
    final String sub;

    public MasterAndSub(String master, String sub) {
        this.master = master;
        this.sub = sub;
    }
}

/**
 * Allows me to use a final Holder of a mutable value.
 *
 * @param <T>
 */
class Holder<T> {
    T it;

    public T getIt() {
        return it;
    }

    public T setIt(T it) {
        return this.it = it;
    }
}
public void process(Path path) throws IOException {
    final Holder<String> currentMaster = new Holder<>();
    Files.lines(path)
        .map(line -> {
            if (line.charAt(0) == ' ') {
                return new MasterAndSub(currentMaster.getIt(), line);
            } else {
                return new MasterAndSub(currentMaster.setIt(line), null);
            }
        })
        ...
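The answer is truncated here. Purely as a sketch of the groupingBy step it mentions (the byMaster name and the collector choice are assumptions of mine, not the original author's), the mapped stream could be collected like this:

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// "mappedStream" stands for the Stream<MasterAndSub> built by the map step above
Map<String, List<MasterAndSub>> byMaster = mappedStream
        .collect(Collectors.groupingBy(ms -> ms.master));
// each MASTER line now maps to the list of records carrying its subs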
Source: https://stackoverflow.com/questions/33868014/java8-stream-lines-and-aggregate-with-action-on-terminal-line