java.io.NotSerializableException in Spark Streaming with enabled checkpointing

Submitted by 孤者浪人 on 2019-12-01 08:12:39

Question


I have the code below:

import scala.collection.mutable.ListBuffer

import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.dstream.ConstantInputDStream

object Test {
  def main(args: Array[String]) {
    val sc = new SparkContext
    val sec = Seconds(3)
    val ssc = new StreamingContext(sc, sec)
    ssc.checkpoint("./checkpoint")
    val rdd = ssc.sparkContext.parallelize(Seq("a", "b", "c"))
    val inputDStream = new ConstantInputDStream(ssc, rdd)

    inputDStream.transform(rdd => {
      val buf = ListBuffer[String]()
      buf += "1"
      buf += "2"
      buf += "3"
      val other_rdd = ssc.sparkContext.parallelize(buf) // create a new RDD; note that this references ssc
      rdd.union(other_rdd)
    }).print()

    ssc.start()
    ssc.awaitTermination()
  }
}

which throws this exception:

java.io.NotSerializableException: DStream checkpointing has been enabled but the DStreams with their functions are not serializable
org.apache.spark.streaming.StreamingContext
Serialization stack:
    - object not serializable (class: org.apache.spark.streaming.StreamingContext, value: org.apache.spark.streaming.StreamingContext@5626e185)
    - field (class: com.mirrtalk.Test$$anonfun$main$1, name: ssc$1, type: class org.apache.spark.streaming.StreamingContext)
    - object (class com.mirrtalk.Test$$anonfun$main$1, <function1>)
    - field (class: org.apache.spark.streaming.dstream.DStream$$anonfun$transform$1$$anonfun$apply$21, name: cleanedF$2, type: interface scala.Function1)
    - object (class org.apache.spark.streaming.dstream.DStream$$anonfun$transform$1$$anonfun$apply$21, <function2>)
    - field (class: org.apache.spark.streaming.dstream.DStream$$anonfun$transform$2$$anonfun$5, name: cleanedF$3, type: interface scala.Function2)
    - object (class org.apache.spark.streaming.dstream.DStream$$anonfun$transform$2$$anonfun$5, <function2>)
    - field (class: org.apache.spark.streaming.dstream.TransformedDStream, name: transformFunc, type: interface scala.Function2)

When I remove the line ssc.checkpoint("./checkpoint"), the application works fine, but I need checkpointing enabled.

How can I fix this issue with checkpointing enabled?


Answer 1:


You can move context initialization and configuration outside main. With checkpointing enabled, Spark has to serialize the DStream graph together with the functions registered on it; your transform closure captures ssc (visible in the serialization stack as the field ssc$1), and StreamingContext is not serializable. When ssc is a field of the enclosing object rather than a local variable of main, the closure reaches it through the singleton object instead of capturing it, so nothing non-serializable ends up in the serialized closure:

import scala.collection.mutable.ListBuffer

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.dstream.ConstantInputDStream

object App {
  val sc = new SparkContext(new SparkConf().setAppName("foo").setMaster("local"))
  val sec = Seconds(3)
  val ssc = new StreamingContext(sc, sec)
  ssc.checkpoint("./checkpoint") // enable checkpointing

  def main(args: Array[String]) {
    val rdd = ssc.sparkContext.parallelize(Seq("a", "b", "c"))
    val inputDStream = new ConstantInputDStream(ssc, rdd)

    inputDStream.transform(rdd => {
      val buf = ListBuffer[String]()
      buf += "1"
      buf += "2"
      buf += "3"
      val other_rdd = ssc.sparkContext.parallelize(buf)
      rdd.union(other_rdd) // union with the other RDD
    }).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
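Alternatively, you can avoid capturing the StreamingContext in the closure altogether. The transform function runs on the driver, and the RDD handed to it already carries a reference to the SparkContext, so rdd.sparkContext can stand in for ssc.sparkContext. A minimal sketch of that variant (same job, only the closure changed):

import scala.collection.mutable.ListBuffer

import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.dstream.ConstantInputDStream

object App {
  def main(args: Array[String]) {
    val ssc = new StreamingContext(new SparkContext, Seconds(3))
    ssc.checkpoint("./checkpoint")
    val rdd = ssc.sparkContext.parallelize(Seq("a", "b", "c"))
    val inputDStream = new ConstantInputDStream(ssc, rdd)

    inputDStream.transform { rdd =>
      val buf = ListBuffer("1", "2", "3")
      // rdd.sparkContext reaches the context through the incoming RDD,
      // so the closure never captures the non-serializable ssc
      val other_rdd = rdd.sparkContext.parallelize(buf)
      rdd.union(other_rdd)
    }.print()

    ssc.start()
    ssc.awaitTermination()
  }
}

Note that enabling checkpointing this way only writes checkpoint data. If you also want the job to recover its state from "./checkpoint" after a driver restart, the usual pattern is StreamingContext.getOrCreate, which rebuilds the context from the checkpoint directory when one exists. A sketch using the same imports as above; the object and factory-function names here (RecoverableApp, createContext) are chosen for illustration:

object RecoverableApp {
  def createContext(): StreamingContext = {
    val ssc = new StreamingContext(new SparkContext, Seconds(3))
    ssc.checkpoint("./checkpoint")
    // ... define the DStream graph here, as above ...
    ssc
  }

  def main(args: Array[String]) {
    // reuse the checkpointed context if one exists, otherwise build a fresh one
    val ssc = StreamingContext.getOrCreate("./checkpoint", createContext _)
    ssc.start()
    ssc.awaitTermination()
  }
}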


Source: https://stackoverflow.com/questions/38522618/java-io-notserializableexception-in-spark-streaming-with-enabled-checkpointing
