Is it possible to get the current spark context settings in PySpark?

Asked by 孤街浪徒 on 2021-01-29 22:05

I'm trying to get the path to spark.worker.dir for the current SparkContext.

If I explicitly set it as a config param, I can read it back, but is there a way to see the complete configuration (including all the defaults) for the current SparkContext?
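
For example, a minimal sketch of the "explicitly set and read back" case (the /tmp/spark-work path is just a placeholder):

    from pyspark import SparkConf, SparkContext

    # Explicitly set spark.worker.dir, then read it back from the conf
    conf = SparkConf().setAppName("conf-demo").set("spark.worker.dir", "/tmp/spark-work")
    sc = SparkContext(conf=conf)
    print(sc.getConf().get("spark.worker.dir"))  # -> /tmp/spark-work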

13 Answers

Answer from 你的背包, 2021-01-29 22:31:

    For a complete overview of your Spark environment and configuration, I found the following code snippets useful:

    SparkContext:

    # Dump every setting in the SparkConf backing the active SparkContext
    for item in sorted(sc._conf.getAll()):
        print(item)
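
    If you'd rather not touch the private _conf attribute, the public SparkContext.getConf() method returns a copy of the same configuration; a minimal sketch, assuming an active context sc:

    # getConf() returns a copy of the SparkConf, so this is read-only
    for key, value in sorted(sc.getConf().getAll()):
        print(key, "=", value)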
    

    Hadoop Configuration:

    hadoopConf = {}
    # Walk the Java Hadoop Configuration (a py4j proxy) entry by entry
    iterator = sc._jsc.hadoopConfiguration().iterator()
    while iterator.hasNext():
        prop = iterator.next()
        hadoopConf[prop.getKey()] = prop.getValue()
    for item in sorted(hadoopConf.items()):
        print(item)
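
    To look up a single Hadoop property instead of dumping everything, you can call get() on the same object (fs.defaultFS here is just an example key; get() returns None if the key is not set):

    # Read one property directly from the Hadoop Configuration (py4j proxy)
    print(sc._jsc.hadoopConfiguration().get("fs.defaultFS"))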
    

    Environment variables:

    import os
    for item in sorted(os.environ.items()): print(item)
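
    If the full environment is too noisy, filtering to Spark-related variables (names starting with SPARK_ or PYSPARK_, an assumption about your setup) keeps the output manageable:

    # Show only variables that look Spark-related
    for name, value in sorted(os.environ.items()):
        if name.startswith(("SPARK_", "PYSPARK_")):
            print(name, "=", value)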
    
