Spark Streaming: Could not compute split, block not found

感动是毒 2021-01-01 23:59

I am trying to use Spark Streaming with Kafka (version 1.1.0) but the Spark job keeps crashing due to this error:

14/11/21 12:39:23 ERROR TaskSetManager: Tas
3 Answers
  •  萌比男神i
    2021-01-02 00:30

    Check the following.

    1) Did you create the streaming context correctly, as in:

    def functionToCreateContext(): StreamingContext = {
        val ssc = new StreamingContext(...)   // new context
        val lines = ssc.socketTextStream(...) // create DStreams
        ...
        ssc.checkpoint(checkpointDirectory)   // set checkpoint directory
        ssc
    }
    
    // Get StreamingContext from checkpoint data or create a new one
    val context = StreamingContext.getOrCreate(checkpointDirectory, functionToCreateContext _)
    
    // Do additional setup on context that needs to be done,
    // irrespective of whether it is being started or restarted
    context. ...
    
    // Start the context
    context.start()
    context.awaitTermination()
    

    If your initialization does not follow this pattern, recovery from the checkpoint will not reconstruct the DStreams, and recovered batches can fail with exactly this "block not found" error.

    For a complete example, see the RecoverableNetworkWordCount application that ships with the Spark examples.

    2) Have you enabled the write-ahead log by setting the property "spark.streaming.receiver.writeAheadLog.enable" to true? Without it, received blocks live only in executor memory and can be lost before the batch that needs them runs.
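    A minimal sketch of how that property can be set when building the context (the app name, batch interval, and checkpoint path below are placeholders, not from the original post; the write-ahead log also requires a checkpoint directory to be set):

    ```scala
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Enable the receiver write-ahead log so received blocks are persisted
    // to the checkpoint directory and can be replayed after a failure.
    val conf = new SparkConf()
      .setAppName("KafkaStreamingApp") // hypothetical app name
      .set("spark.streaming.receiver.writeAheadLog.enable", "true")

    val ssc = new StreamingContext(conf, Seconds(10))  // 10s batch interval, an assumption
    ssc.checkpoint("hdfs:///checkpoints/kafka-app")    // hypothetical checkpoint path; WAL needs this
    ```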

    3) Check the stability of the job in the Streaming UI: the processing time must stay below the batch interval. If it doesn't, batches queue up, old blocks get evicted from executor memory before they are processed, and you get "Could not compute split, block not found".
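    If processing time keeps exceeding the batch interval, one common mitigation with receiver-based input (a sketch; the rate value is an assumption you would tune for your own load) is to cap the ingestion rate so each batch stays small enough to finish on time:

    ```scala
    import org.apache.spark.SparkConf

    // Limit how many records per second each receiver ingests, so batch
    // processing time stays below the batch interval and blocks are not
    // evicted before they are consumed.
    val conf = new SparkConf()
      .set("spark.streaming.receiver.maxRate", "1000") // records/sec per receiver; placeholder value
    ```

    Alternatively, increase the batch interval or add executors until the Streaming UI shows scheduling delay holding steady near zero.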
