How to extract application ID from the PySpark context

广开言路 2020-12-17 16:10

A previous question recommends `sc.applicationId`, but it is not present in PySpark, only in Scala.

So, how do I figure out the application ID from the PySpark context?

3 Answers
  •  暖寄归人
    2020-12-17 16:57

    For PySpark 2.0.0+

    from pyspark.sql import SparkSession

    spark_session = SparkSession \
        .builder \
        .enableHiveSupport() \
        .getOrCreate()

    # _sc is the session's underlying SparkContext (a private attribute);
    # spark_session.sparkContext is the public equivalent in Spark 2.0+.
    app_id = spark_session._sc.applicationId
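
    Since `_sc` is a private attribute, a safer sketch uses the public `sparkContext` accessor instead. This is a minimal demonstration assuming a local Spark install; the `master` and `appName` values here are illustrative, and in a real deployment they would come from your submit configuration:

    ```python
    from pyspark.sql import SparkSession

    # Illustrative local session; in a cluster, master/appName come
    # from spark-submit or your environment.
    spark = SparkSession.builder \
        .master("local[1]") \
        .appName("app-id-demo") \
        .getOrCreate()

    # Public accessor, available in Spark 2.0+; avoids the private _sc.
    app_id = spark.sparkContext.applicationId
    print(app_id)  # local-mode IDs look like "local-1608192000000"

    spark.stop()
    ```

    In local mode the returned ID is prefixed with `local-`; on YARN it is the familiar `application_...` string, which is handy for correlating driver logs with the resource manager UI.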
    
