How to know the deploy mode of a PySpark application?

Asked by 佛祖请我去吃肉 on 2020-12-18 01:28

I am trying to fix an issue with running out of memory, and I want to know whether I need to change these settings in the default configuration file (spark-defaults.conf). To decide, I need to know which deploy mode my PySpark application is running in.
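For reference, the entries in question would look something like this in spark-defaults.conf; the keys are standard Spark options, but the values below are purely illustrative:

    spark.driver.memory      4g
    spark.executor.memory    8g
    spark.memory.fraction    0.6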

3 Answers
  •  太阳男子
    2020-12-18 02:18

    Use sc.deployMode (shown here on the Scala side, with Spark 2.1.0-SNAPSHOT):

    scala> sc.deployMode
    res0: String = client

    scala> sc.version
    res1: String = 2.1.0-SNAPSHOT

    Since sc.deployMode is not available in PySpark, you can check the
    spark.submit.deployMode configuration entry instead:

    >>> sc.getConf().get("spark.submit.deployMode")
    'client'
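    As a rough sketch, this is how that lookup could be used inside a PySpark
    application to connect it back to the memory question. It assumes Spark 2.x;
    the "client" fallback default and the branching advice are illustrative
    additions, not part of the answer above:

    from pyspark import SparkConf, SparkContext

    # Reuse an existing SparkContext if one is running (e.g. in the pyspark shell).
    sc = SparkContext.getOrCreate(SparkConf().setAppName("deploy-mode-check"))

    # spark-submit sets spark.submit.deployMode; fall back to "client" if unset.
    mode = sc.getConf().get("spark.submit.deployMode", "client")
    print("Deploy mode:", mode)  # "client" or "cluster"

    if mode == "client":
        # In client mode the driver JVM is already running before user code
        # executes, so spark.driver.memory must come from spark-defaults.conf
        # or spark-submit --driver-memory, not from SparkConf inside the job.
        print("Set driver memory in spark-defaults.conf or via --driver-memory")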
    
