Getting app run id for a Spark job

忘掉有多难 2020-12-03 21:05

From the machine where I launch a Spark job, I'd like to get the unique ID of that job.

Via the Spark master node web UI, I can see that ID. It's something like:
4 Answers
  •  感情败类
    2020-12-03 21:32

    It depends on which language you are using.

    Scala

    https://spark.apache.org/docs/1.6.1/api/scala/index.html#org.apache.spark.SparkContext

    sc.applicationId
    

    Java

    https://spark.apache.org/docs/1.6.2/api/java/org/apache/spark/api/java/JavaSparkContext.html

    sparkContext.sc().applicationId();
    

    Python

    http://spark.apache.org/docs/1.6.2/api/python/pyspark.html#pyspark.SparkContext

    sc.applicationId
    

    The exact call can also depend on your Spark version.
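
    For example, here is a minimal self-contained sketch of the Scala approach (assuming a Spark 1.6-era setup; the object name AppIdExample and the application name are only illustrative):

    import org.apache.spark.{SparkConf, SparkContext}

    object AppIdExample {
      def main(args: Array[String]): Unit = {
        // Create a SparkContext for this application.
        val conf = new SparkConf().setAppName("app-id-example")
        val sc = new SparkContext(conf)

        // applicationId is the unique ID assigned by the cluster manager,
        // e.g. "app-..." on a standalone master or "application_..." on YARN.
        println(s"Application ID: ${sc.applicationId}")

        sc.stop()
      }
    }

    When submitted with spark-submit, the printed value should match the ID shown on the Spark master web UI.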
