I am unsure whether this is a bug. If you do something like this:
// d: org.apache.spark.rdd.RDD[String]
d.distinct().map(x => d.filter(_.equals(x)))
you will get an exception: SPARK-5063 added a check that RDD transformations and actions can only be invoked by the driver, not inside of other transformations, and here the filter on d is invoked inside the closure passed to map.
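If the intent of that snippet is to group equal values together, the nested computation can usually be rewritten so that only a single RDD is involved, which sidesteps the SPARK-5063 check entirely. A minimal sketch, assuming d: RDD[String] as above:

import org.apache.spark.rdd.RDD

// one shuffle groups every occurrence of each distinct value;
// no RDD is referenced inside another RDD's closure
val grouped: RDD[(String, Iterable[String])] = d.groupBy(identity)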
What about the windowing example provided in the Spark 1.3.0 Streaming Programming Guide?
val dataset: RDD[(String, String)] = ...
val windowedStream = stream.window(Seconds(20))...
val joinedStream = windowedStream.transform { rdd => rdd.join(dataset) }
SPARK-5063 appears to cause this example to fail as well, since the join is being called from within the transform method on an RDD.
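For reference, a minimal self-contained version of the guide's snippet looks roughly like this; the StreamingContext setup, the socket source, the sample dataset contents, and the keying of the stream are assumptions filled in for illustration, not part of the guide:

import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("WindowedJoin").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(1))

// static pair RDD to join against (contents are illustrative)
val dataset: RDD[(String, String)] =
  ssc.sparkContext.parallelize(Seq(("k1", "v1")))

// assumed source: each incoming line is used as its own key
val stream = ssc.socketTextStream("localhost", 9999).map(line => (line, line))

// join each 20-second window of the stream against the static dataset
val windowedStream = stream.window(Seconds(20))
val joinedStream = windowedStream.transform { rdd => rdd.join(dataset) }

joinedStream.print()
ssc.start()
ssc.awaitTermination()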