How to implement custom job listener/tracker in Spark?

Asked by 忘掉有多难 on 2020-12-05 05:50

I have a class like the one below, and when I run it from the command line I want to see progress status, something like:

10% completed...
30% completed...
100% completed
3 Answers
  •  温柔的废话
    2020-12-05 06:09

    If you are using Spark with Scala, this code shows how to add a Spark listener.

    Create your SparkContext:

    val sc = new SparkContext(sparkConf)

    Now you can add your Spark listener to the context. SparkListener and the event classes live in org.apache.spark.scheduler:

    import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationEnd, SparkListenerApplicationStart}

    sc.addSparkListener(new SparkListener() {
      override def onApplicationStart(applicationStart: SparkListenerApplicationStart): Unit = {
        println("Spark ApplicationStart: " + applicationStart.appName)
      }

      override def onApplicationEnd(applicationEnd: SparkListenerApplicationEnd): Unit = {
        println("Spark ApplicationEnd: " + applicationEnd.time)
      }
    })

    SparkListener is the interface for listening to events from the Spark scheduler; it exposes many more callbacks (onJobStart, onStageCompleted, onTaskEnd, and so on) for finer-grained progress events.
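
    The application start/end callbacks above are too coarse for the percentage output the question asks for. A minimal sketch of a progress-style listener, assuming the names `ProgressListener` and `percentComplete` (both hypothetical, not part of Spark): it accumulates the task count of each submitted stage and prints a running percentage as tasks finish. Task counts grow as new stages are submitted, so the percentage is only an approximation of overall job progress.

    ```scala
    import java.util.concurrent.atomic.AtomicInteger
    import org.apache.spark.scheduler.{SparkListener, SparkListenerStageSubmitted, SparkListenerTaskEnd}

    object Progress {
      // Pure helper: integer percentage of finished tasks (hypothetical name).
      def percentComplete(done: Int, total: Int): Int =
        if (total == 0) 0 else done * 100 / total
    }

    // Sketch: count tasks as stages are submitted, print a percentage per finished task.
    class ProgressListener extends SparkListener {
      private val totalTasks = new AtomicInteger(0)
      private val doneTasks  = new AtomicInteger(0)

      override def onStageSubmitted(stageSubmitted: SparkListenerStageSubmitted): Unit =
        totalTasks.addAndGet(stageSubmitted.stageInfo.numTasks)

      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
        val pct = Progress.percentComplete(doneTasks.incrementAndGet(), totalTasks.get())
        println(s"$pct% completed...")
      }
    }
    ```

    Register it the same way as the listener above: `sc.addSparkListener(new ProgressListener)`.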
