I'd like to get the unique ID of a Spark job from the place where I run it.
Via the Spark master node's web UI, I can see that ID. It's something like:
It depends on which language you are using.
Scala
https://spark.apache.org/docs/1.6.1/api/scala/index.html#org.apache.spark.SparkContext
sc.applicationId
Java
https://spark.apache.org/docs/1.6.2/api/java/org/apache/spark/api/java/JavaSparkContext.html
sparkContext.sc().applicationId();
Python
http://spark.apache.org/docs/1.6.2/api/python/pyspark.html#pyspark.SparkContext
sc.applicationId
The exact API can also vary between Spark versions, so check the docs for the version you're running.