Getting app run id for a Spark job

Backend · Open · 4 answers · 1286 views

忘掉有多难 · 2020-12-03 21:05

I'd like, from where I run a Spark job, to get the unique ID of that job.

Via the Spark master node website, I can see that ID. It's something like:
4 Answers

广开言路 · 2020-12-03 21:40

For those using pyspark, see this nearly identical question: How to extract application ID from the PySpark context

The answer from @vvladymyrov worked for me running pyspark in yarn-client mode.

    >>> sc._jsc.sc().applicationId()
    u'application_1433865536131_34483'
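As a side note (not part of the original answer): the returned ID follows YARN's `application_<clusterTimestamp>_<sequenceNumber>` naming pattern. If you need the pieces separately, a minimal sketch of splitting it apart might look like this; the helper name `parse_application_id` is hypothetical, not a Spark API.

```python
def parse_application_id(app_id: str) -> tuple[int, int]:
    """Split a YARN application ID like 'application_1433865536131_34483'
    into its (cluster timestamp, sequence number) parts.
    Hypothetical helper for illustration only, not part of Spark."""
    prefix, timestamp, sequence = app_id.split("_")
    if prefix != "application":
        raise ValueError(f"unexpected application ID: {app_id!r}")
    return int(timestamp), int(sequence)

ts, seq = parse_application_id("application_1433865536131_34483")
print(ts, seq)  # -> 1433865536131 34483
```

Also worth knowing: recent PySpark versions expose the ID directly as the `sc.applicationId` property, so the `sc._jsc` workaround is only needed on older releases.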
    
