When running with master 'yarn' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment

小鲜肉 2021-01-12 14:37

I am trying to run Spark using yarn and I am running into this error:

    Exception in thread "main" java.lang.Exception: When running with master 'yarn' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment.
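
For reference, this error appears with any submission that targets YARN before the Hadoop configuration directory is exported; a minimal example looks like the following (the example class and jar path are illustrative, not taken from the original post):

    spark-submit --master yarn --deploy-mode client \
      --class org.apache.spark.examples.SparkPi \
      "$SPARK_HOME"/examples/jars/spark-examples_*.jar 10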

2 Answers
  •  不思量自难忘°
    2021-01-12 15:17

    Just an update to the answer by Shubhangi:

     cd $SPARK_HOME/bin
     sudo nano load-spark-env.sh
    

    Add the lines below, then save and exit:

     export SPARK_LOCAL_IP="127.0.0.1"
     export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"
     export YARN_CONF_DIR="$HADOOP_HOME/etc/hadoop"
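
    If you prefer not to edit load-spark-env.sh, the same variables can be exported in your shell (or in ~/.bashrc) before submitting. This is a minimal sketch assuming the standard $HADOOP_HOME/etc/hadoop layout, i.e. the directory containing core-site.xml and yarn-site.xml, and the stock spark-examples jar:

     # Tell Spark where the Hadoop/YARN client configuration lives
     export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"
     export YARN_CONF_DIR="$HADOOP_HOME/etc/hadoop"

     # Confirm the variable is visible, then submit against YARN again
     echo "$HADOOP_CONF_DIR"
     spark-submit --master yarn --deploy-mode client \
       --class org.apache.spark.examples.SparkPi \
       "$SPARK_HOME"/examples/jars/spark-examples_*.jar 10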
