Is it possible to get the current spark context settings in PySpark?

Asked by 孤街浪徒, 2021-01-29 22:05

I'm trying to get the path to spark.worker.dir for the current SparkContext.

If I explicitly set it as a config param, I can read it back out of the SparkConf, but is there a way to access the complete config (including all defaults) from PySpark?

13 answers
  •  独厮守ぢ
     2021-01-29 22:28

    Just for the record, the analogous Java version:

    // SparkConf.getAll() returns every explicitly set entry as a Tuple2<String, String>
    Tuple2<String, String>[] entries = sparkConf.getAll();
    for (Tuple2<String, String> entry : entries) {
        System.out.println(entry._1() + " = " + entry._2());
    }
    
