I'd like, from where I run a Spark job, to get the unique ID of that job.
Via the Spark master web UI, I can see that ID. It's something like:
For those using PySpark, see this nearly identical question: How to extract application ID from the PySpark context
The answer from @vvladymyrov worked for me when running PySpark in yarn-client mode.
    >>> sc._jsc.sc().applicationId()
    u'application_1433865536131_34483'
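If it helps, here's a minimal, self-contained sketch of both ways to read the ID. The `local[*]` master and the app name are just assumptions for demonstration; the `sc.applicationId` property is also an assumption to verify against your version's docs (it was only added to PySpark around Spark 1.5), while the private `_jsc` route is the one from the answer above.

    from pyspark import SparkContext

    # Assumed setup for demonstration: a local master and a throwaway app name.
    sc = SparkContext(master="local[*]", appName="app-id-demo")

    # The route from the answer above: go through the private Py4J gateway
    # (_jsc) to the underlying Scala SparkContext and ask it for the ID.
    print(sc._jsc.sc().applicationId())

    # Newer PySpark versions also expose this as a public property; treat
    # its availability as version-dependent (assumed ~Spark 1.5+).
    print(sc.applicationId)

    sc.stop()

In yarn-client mode both calls return the YARN application ID, in the `application_<clusterTimestamp>_<sequenceNumber>` format shown in the output above.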