In Spark Streaming, is there a way to detect when a batch has finished?

北海茫月 2020-12-19 13:22

I use Spark 1.6.0 with Cloudera 5.8.3.
I have a DStream object with plenty of transformations defined on top of it:

val stream = KafkaUtils.c         
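
A typical Spark 1.6 direct-stream setup looks like the following sketch (kafkaParams and topics are placeholder values, not taken from the original question):

import kafka.serializer.StringDecoder
import org.apache.spark.streaming.kafka.KafkaUtils

// Hypothetical reconstruction of the truncated snippet: create a direct
// Kafka stream; kafkaParams and topics are placeholders.
val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
  ssc, kafkaParams, topics)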


        
2 Answers
  •  臣服心动
    2020-12-19 13:57

    Using streaming listeners should solve the problem for you:

    (Sorry, it's a Java example.)

    import org.apache.spark.streaming.scheduler.StreamingListener;
    import org.apache.spark.streaming.scheduler.StreamingListenerBatchCompleted;

    // Register the listener before starting the streaming context.
    ssc.addStreamingListener(new JobListener());

    // ...

    class JobListener implements StreamingListener {

        @Override
        public void onBatchCompleted(StreamingListenerBatchCompleted batchCompleted) {
            // totalDelay() is a Scala Option; it is defined once the batch has finished.
            System.out.println("Batch completed, total delay: "
                    + batchCompleted.batchInfo().totalDelay().get().toString() + " ms");
        }

        /* snipped other methods: the remaining StreamingListener callbacks
           need (empty) implementations too */
    }

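    Since the question uses Scala, here is a minimal Scala equivalent (a sketch assuming your StreamingContext is named ssc):

    import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

    // Anonymous listener: in Scala, StreamingListener is a trait with no-op
    // defaults, so only onBatchCompleted needs to be overridden.
    ssc.addStreamingListener(new StreamingListener {
      override def onBatchCompleted(batchCompleted: StreamingListenerBatchCompleted): Unit = {
        // totalDelay is an Option[Long] in milliseconds.
        val delayMs = batchCompleted.batchInfo.totalDelay.getOrElse(-1L)
        println(s"Batch completed, total delay: $delayMs ms")
      }
    })
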

    https://gist.github.com/akhld/b10dc491aad1a2007183

    https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-streaming/spark-streaming-streaminglisteners.html

    http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.streaming.scheduler.StreamingListener
