Pyspark append executor environment variable

再見小時候 2020-12-06 03:22

Is it possible to append a value to the PYTHONPATH of a worker in spark?

I know it is possible to go to each worker node and configure the spark-env.sh file to do it, but

1 Answer
  •  日久生厌
    2020-12-06 03:48

    I figured it out myself...

    The problem was not with Spark, but with ConfigParser.

    Based on this answer, I fixed ConfigParser to always preserve the case of option names.
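
    The case-preserving fix can be sketched as follows; the [executor_env] section name and the path are hypothetical, stand-ins for whatever config file the variables are read from:

    ```python
    # Minimal sketch of a case-preserving ConfigParser. The
    # [executor_env] section and the path are hypothetical.
    from configparser import ConfigParser

    parser = ConfigParser()
    # By default ConfigParser lower-cases option names
    # (PYTHONPATH -> pythonpath); overriding optionxform with
    # str preserves the original case.
    parser.optionxform = str

    parser.read_string(
        "[executor_env]\n"
        "PYTHONPATH = /custom_dir_that_I_want_to_append/\n"
    )
    print(dict(parser["executor_env"]))
    # -> {'PYTHONPATH': '/custom_dir_that_I_want_to_append/'}
    ```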

    After this, I found out that Spark's default behavior is to append the value to an existing worker environment variable if a variable with the same name already exists.

    So it is not necessary to reference the existing value with $PYTHONPATH (the dollar-sign form) in the string:

    .setExecutorEnv('PYTHONPATH', '/custom_dir_that_I_want_to_append/')
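
    In context, the call above can be sketched like this; the app name and directory are placeholders, and creating the SparkContext itself is commented out since it requires a running Spark installation:

    ```python
    # Hedged sketch of setting an executor env variable via SparkConf.
    # The app name and path are placeholders.
    from pyspark import SparkConf  # , SparkContext

    conf = (
        SparkConf()
        .setAppName("append-pythonpath-demo")  # hypothetical app name
        .setExecutorEnv("PYTHONPATH", "/custom_dir_that_I_want_to_append/")
    )

    # setExecutorEnv stores the value under the spark.executorEnv.* prefix:
    print(conf.get("spark.executorEnv.PYTHONPATH"))
    # -> /custom_dir_that_I_want_to_append/

    # sc = SparkContext(conf=conf)
    ```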
    
