How to pass environment variables to spark driver in cluster mode with spark-submit

有刺的猬 2021-01-01 18:33

spark-submit lets you configure executor environment variables with --conf spark.executorEnv.FOO=bar, and the Spark standalone REST API lets you pass some environment variables through the environmentVariables field of the submission request. However, I couldn't find anything similar for setting environment variables on the driver in cluster mode with spark-submit. Is there a way to do this?
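
For context, the REST submission request mentioned above looks roughly like the sketch below; the endpoint, paths, main class, and version string are illustrative, not taken from the question:

    POST http://<master-host>:6066/v1/submissions/create
    {
      "action": "CreateSubmissionRequest",
      "appResource": "hdfs:///apps/app.jar",
      "mainClass": "com.example.Main",
      "appArgs": [],
      "clientSparkVersion": "2.4.0",
      "environmentVariables": { "FOO": "bar" },
      "sparkProperties": {
        "spark.master": "spark://master-host:6066",
        "spark.submit.deployMode": "cluster",
        "spark.app.name": "app"
      }
    }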

5 Answers
  •  天涯浪人
    2021-01-01 18:56

    On YARN in cluster mode, this worked by adding the environment variables to the spark-submit command with --conf, as below. spark.yarn.appMasterEnv.* reaches the driver because, in cluster mode, the driver runs inside the YARN ApplicationMaster; spark.executorEnv.* covers the executors as usual.

    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --num-executors 15 \
      --executor-memory 52g \
      --executor-cores 7 \
      --driver-memory 52g \
      --conf "spark.yarn.appMasterEnv.FOO=/Path/foo" \
      --conf "spark.executorEnv.FOO2=/path/foo2" \
      app.jar
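
    To confirm that both variables land where expected, a small check job can print them from each side. The sketch below is a minimal Scala job of the kind app.jar might contain; the EnvCheck name and the <unset> fallback are illustrative, not from the answer:

    import org.apache.spark.sql.SparkSession

    object EnvCheck {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder.appName("EnvCheck").getOrCreate()
        // Driver side: FOO was set via spark.yarn.appMasterEnv.FOO, and in
        // cluster mode this code runs inside the YARN ApplicationMaster.
        println(s"driver FOO=${sys.env.getOrElse("FOO", "<unset>")}")
        // Executor side: FOO2 was set via spark.executorEnv.FOO2.
        val fromExecutors = spark.sparkContext
          .parallelize(Seq(0))
          .map(_ => sys.env.getOrElse("FOO2", "<unset>"))
          .collect()
        println(s"executor FOO2=${fromExecutors.mkString(",")}")
        spark.stop()
      }
    }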

    Alternatively, you can set the same properties in the conf/spark-defaults.conf file so they apply to every submission.
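
    The equivalent spark-defaults.conf entries would look like this (same illustrative paths as above):

    spark.yarn.appMasterEnv.FOO   /Path/foo
    spark.executorEnv.FOO2        /path/foo2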
