Spark Shell - __spark_libs__.zip does not exist

感动是毒 2020-12-16 23:00

I'm new to Spark and I'm busy setting up a Spark cluster with HA enabled.

When starting a spark shell for testing via: bash spark-shell --master yarn --deploy
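(The command breaks off above; for reference, a complete invocation of this form, assuming client deploy mode since the text is cut off, would be:)

    # assumed completion of the truncated command above
    spark-shell --master yarn --deploy-mode client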

3 Answers
  •  清酒与你 2020-12-16 23:43

    I do not see any errors in your logs; there are only warnings, which you can avoid by adding these environment variables:

    # point Hadoop at its native libraries (avoids the native-library warnings)
    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
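
    To confirm the warnings are gone, you can ask Hadoop which native components it can load (a quick sanity check; assumes the exports above were added to your shell profile):

    # reload the profile, then list the native libraries Hadoop finds
    source ~/.bashrc
    hadoop checknative -a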
    

    For the exception: try setting the Spark configuration for YARN manually, as described here: http://badrit.com/blog/2015/2/29/running-spark-on-yarn#.WD_e66IrJsM

    # upload the Spark assembly jar to HDFS so YARN containers can fetch it
    hdfs dfs -mkdir -p /user/spark/share/lib
    hdfs dfs -put $SPARK_HOME/assembly/lib/spark-assembly_*.jar /user/spark/share/lib/spark-assembly.jar
    # tell Spark where to find it; replace your-server:port with your NameNode address
    export SPARK_JAR=hdfs://your-server:port/user/spark/share/lib/spark-assembly.jar
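
    Note that SPARK_JAR and the assembly jar only exist on Spark 1.x. On Spark 2.x the file that goes missing is the auto-generated __spark_libs__.zip from the title, and the equivalent fix is to publish the jars yourself and point spark.yarn.archive at them. A minimal sketch, assuming the same HDFS paths as above:

    # Spark 2.x: package $SPARK_HOME/jars and publish it on HDFS so Spark
    # no longer builds and uploads __spark_libs__.zip on every submit
    zip -j spark-libs.zip $SPARK_HOME/jars/*
    hdfs dfs -mkdir -p /user/spark/share/lib
    hdfs dfs -put spark-libs.zip /user/spark/share/lib/
    # then in $SPARK_HOME/conf/spark-defaults.conf:
    #   spark.yarn.archive hdfs://your-server:port/user/spark/share/lib/spark-libs.zip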

    Hope this helps.
