Replacing traditional newForLoop with Java 8 Streams

Submitted by 不羁岁月 on 2019-12-05 04:13:48

You can use the stream API to compose a function out of your list of functions.

static List<Function<? super Double, Double>> myList
    = Arrays.asList(d -> d + 4, d -> d * 2, d -> d - 3);

static Function<Double, Double> total = myList.stream()
    .map(f -> (Function<Double, Double>) d -> d + f.apply(d))
    .reduce(Function::andThen).orElse(Function.identity());

static double calculate(double val) {
    return total.apply(val);
}

public static void main(String[] args) {
    System.out.println(calculate(10));
}

The stream operation that produces the composed function does not have the associativity problem and could, in theory, even run in parallel (though there is no benefit here). The result is a single function that is inherently sequential and is never dissolved into parts that would need to be associative.
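To see that the composition step really is order-safe, the same reduction can be run from a parallel stream; since andThen is associative, the composed function, and hence the final result, comes out the same. A minimal self-contained sketch (the class name ComposeDemo is illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;

public class ComposeDemo {
    static List<Function<? super Double, Double>> myList =
        Arrays.asList(d -> d + 4, d -> d * 2, d -> d - 3);

    // andThen is associative, so reducing even in parallel
    // yields the same composed function as a sequential stream
    static Function<Double, Double> total = myList.parallelStream()
        .map(f -> (Function<Double, Double>) d -> d + f.apply(d))
        .reduce(Function::andThen)
        .orElse(Function.identity());

    static double calculate(double val) {
        return total.apply(val);
    }

    public static void main(String[] args) {
        System.out.println(calculate(10)); // prints 141.0
    }
}
```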

Yes, you can use a stream solution by performing a reduction:

private double calculate(double val) {
    return myList.stream().reduce(val, (d, f) -> d + f.apply(d), (a, b) -> a + b);
}

A reduction takes each element and aggregates (reduces) it into one value. There are three flavours of the reduce() method; the one used here, taking an identity, an accumulator and a combiner, does the trick.
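For reference, the three overloads of reduce() can be sketched side by side (the class name ReduceFlavours is illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

public class ReduceFlavours {
    public static void main(String[] args) {
        List<Double> nums = Arrays.asList(1.0, 2.0, 3.0);

        // 1) reduce(BinaryOperator): no identity, result wrapped in Optional
        Optional<Double> sum1 = nums.stream().reduce((a, b) -> a + b);

        // 2) reduce(identity, BinaryOperator): identity value, plain result
        double sum2 = nums.stream().reduce(0.0, (a, b) -> a + b);

        // 3) reduce(identity, BiFunction, BinaryOperator): the accumulator
        //    may change the element type; the combiner merges partial results
        int count = nums.stream().reduce(0, (n, d) -> n + 1, Integer::sum);

        System.out.println(sum1.get() + " " + sum2 + " " + count); // 6.0 6.0 3
    }
}
```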


Some test code:

static Function<? super Double, Double> a = (d) -> d + 4;
static Function<? super Double, Double> b = (d) -> d * 2;
static Function<? super Double, Double> c = (d) -> d - 3;
static List<Function<? super Double, Double>> myList = Arrays.asList(a, b, c);

static double calculate(double val) {
    return myList.stream().reduce(val, (d, f) -> d + f.apply(d), (a, b) -> a + b);
}

public static void main(String[] args) {
    System.out.println(calculate(10));
}

Output:

141.0

This particular example is very problematic for Java 8 streams, which are designed for operations in which order does not matter.

Function application is not associative. To explain, let's take a simpler example, in which one wants to take a number and divide it by a list of numbers:

static List<Double> dividers = Arrays.asList( 3.5, 7.0, 0.5, 19.0 );

public double divideByList( double a ) {
    for ( Double d : dividers ) {
        a /= d;
    }
    return a;
}
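The effect of regrouping can be checked numerically: the two groupings below, which a parallel split could legitimately produce if division were associative, give visibly different results (the class name is illustrative):

```java
public class DivisionAssoc {
    public static void main(String[] args) {
        double a = 100.0;
        // Left-to-right, what the loop computes:
        double sequential = ((a / 3.5) / 7.0) / 0.5 / 19.0;
        // A different grouping, as a parallel split might produce:
        double regrouped = (a / 3.5 / 7.0) / (0.5 / 19.0);
        System.out.println(sequential + " vs " + regrouped);
    }
}
```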

So, what you get is

a ÷ 3.5 ÷ 7.0 ÷ 0.5 ÷ 19.0

The arithmetic is simple - division is left-associative, meaning that this is equivalent to

a ÷ ( 3.5 × 7.0 × 0.5 × 19.0)

not

a ÷ ( 3.5 ÷ 7.0 ÷ 0.5 ÷ 19.0 )

and not

( a ÷ 3.5 ÷ 7.0 ) ÷ ( 0.5 ÷ 19.0 )

The stream operations that are based on reduce/collectors require the "reducing" operation to be associative. This is because they want to allow the operation to be parallelized, so that several threads can each perform some of the operations and the partial results can then be combined. Now, if our operator were multiplication rather than division, this would not be a problem, because

a × 3.5 × 7.0 × 0.5 × 19.0

is the same as

(a × 3.5 × 7.0 ) × (0.5 × 19)

which means one thread could do the a × 3.5 × 7.0, another could do the 0.5 × 19.0, and then you could multiply the two partial results and get the same thing as in the sequential calculation. But for division, that doesn't work.
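For comparison, the multiplicative version really is safe to parallelize, because multiplication is associative and 1.0 is a true identity for it (the class name is illustrative):

```java
import java.util.Arrays;
import java.util.List;

public class MultiplyParallel {
    public static void main(String[] args) {
        List<Double> factors = Arrays.asList(3.5, 7.0, 0.5, 19.0);
        // Multiplication is associative: any split gives the same product
        double product = factors.parallelStream().reduce(1.0, (x, y) -> x * y);
        System.out.println(10.0 * product); // prints 2327.5
    }
}
```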

Function application is also non-associative, just like division. That is, if you have functions f, g and h, and you run your sequential calculation, you'll end up with:

result = val + f(val) + g(val + f(val)) + h(val + f(val) + g(val + f(val)))

Now, if you had two intermediate threads, one applying f and g and the other applying h, and you wanted to combine the results, there would be no way to get the correct input into h in the first place, since that input depends on the results of f and g.
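Plugging the example functions f = d + 4, g = d * 2, h = d - 3 and val = 10 into that expansion makes the dependency explicit: every step needs every previous step's result, so there is no independent chunk a second thread could compute (a worked trace; the class name is illustrative):

```java
import java.util.function.Function;

public class ExpansionTrace {
    public static void main(String[] args) {
        Function<Double, Double> f = d -> d + 4;
        Function<Double, Double> g = d -> d * 2;
        Function<Double, Double> h = d -> d - 3;

        double val = 10;
        double s1 = val + f.apply(val); // 10 + 14 = 24
        double s2 = s1 + g.apply(s1);   // 24 + 48 = 72
        double s3 = s2 + h.apply(s2);   // 72 + 69 = 141
        System.out.println(s3);         // prints 141.0
    }
}
```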


You may be tempted to try this with a method like Stream.reduce, as @Bohemian suggested. But the documentation warns you against this:

<U> U reduce(U identity,
             BiFunction<U,? super T,U> accumulator,
             BinaryOperator<U> combiner)

...

The identity value must be an identity for the combiner function. This means that for all u, combiner(identity, u) is equal to u. Additionally, the combiner function must be compatible with the accumulator function; for all u and t, the following must hold:

combiner.apply(u, accumulator.apply(identity, t)) == accumulator.apply(u, t)

For an operation like +, the identity is 0. For *, the identity is 1. So it is against the documentation to use your val as the identity. And the second condition is even more problematic.

Although the current implementation for a non-parallel stream does not use the combiner part, which makes both conditions irrelevant in practice, this behaviour is not documented, and a future implementation, or a different JRE's implementation, may decide to create intermediate results and use the combiner to join them, perhaps to improve performance or for some other reason.
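The violation of the combiner compatibility rule can even be checked directly. Using val = 10 as the "identity", a sample u = 5 and t = the first function d -> d + 4, the two sides of the documented equality come out different (the class name and the sample values are illustrative):

```java
import java.util.function.BiFunction;
import java.util.function.BinaryOperator;
import java.util.function.Function;

public class CombinerContractCheck {
    public static void main(String[] args) {
        double identity = 10.0;                  // the 'val' used as identity
        Function<Double, Double> t = d -> d + 4; // one element of the list
        BiFunction<Double, Function<Double, Double>, Double> accumulator =
            (d, f) -> d + f.apply(d);
        BinaryOperator<Double> combiner = (a, b) -> a + b;

        double u = 5.0;
        // The contract requires these two to be equal for all u and t:
        double lhs = combiner.apply(u, accumulator.apply(identity, t)); // 5 + 24 = 29
        double rhs = accumulator.apply(u, t);                           // 5 + 9  = 14
        System.out.println(lhs + " vs " + rhs); // contract violated
    }
}
```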

So, despite the temptation, one should not use Stream.reduce to try to imitate the original sequential processing.


There is a way to do it that doesn't actually break the documentation. It involves keeping a mutable object that holds the result (it has to be an object so that the variable is effectively final while its state stays mutable), and using Stream.forEachOrdered, which guarantees that the actions are performed in encounter order if the stream is ordered, and a list's stream has a defined order. This works even when you use myList.stream().parallel().

public static double streamedCalculate(double val) {
    class MutableDouble {
        double currVal;
        MutableDouble(double initVal) {
            currVal = initVal;
        }
    }

    final MutableDouble accumulator = new MutableDouble(val);

    myList.stream().forEachOrdered((x) -> accumulator.currVal += x.apply(accumulator.currVal));
    return accumulator.currVal;
}

Personally, I find that your original loop is more readable than this, so there is really no advantage to using streams here.

Based on a comment by @Tagir Valeev, a foldLeft operation is planned for a future Java version. This may look more elegant when that lands.
