I am trying to fix an issue with running out of memory, and I want to know whether I need to change these settings in the default configuration file (spark-defaults.conf).
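For reference, these are the kind of memory-related entries I mean in spark-defaults.conf (the property names are standard Spark settings, but the values shown are purely illustrative):

# illustrative values only
spark.driver.memory      4g
spark.executor.memory    4g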
Since sc.deployMode is not available in PySpark, you can check the spark.submit.deployMode configuration property instead:
scala> sc.getConf.get("spark.submit.deployMode")
res0: String = client
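In PySpark, a minimal equivalent sketch (assuming the shell has already created sc as the SparkContext) is to read the same property through the context's SparkConf:

>>> sc.getConf().get("spark.submit.deployMode")
'client'

The value is 'client' or 'cluster', depending on how the application was submitted.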
In the Scala shell (spark-shell), use sc.deployMode directly:
scala> sc.deployMode
res0: String = client
scala> sc.version
res1: String = 2.1.0-SNAPSHOT