> I'm new to Spark and I'm busy setting up a Spark cluster with HA enabled. When starting a spark-shell for testing via:
>
> ```bash
> spark-shell --master yarn --deploy
> ```
I don't see any errors in your logs, only warnings, which you can avoid by adding these environment variables:
```bash
# Point Hadoop at its native libraries to silence the
# "unable to load native-hadoop library" warning
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
```
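To make these survive new sessions, you can persist them in your shell profile and then check that Hadoop actually picks up the native libraries. A minimal sketch, assuming a single-user install with `$HADOOP_HOME` already set (adjust the profile file to your setup):

```bash
# Persist the variables for future shells (assumes $HADOOP_HOME is set in ~/.bashrc)
cat >> ~/.bashrc <<'EOF'
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
EOF
source ~/.bashrc

# Verify that the native libraries (zlib, snappy, etc.) are now found
hadoop checknative -a
```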
For the exception, try setting the Spark-on-YARN configuration manually (see http://badrit.com/blog/2015/2/29/running-spark-on-yarn#.WD_e66IrJsM):
```bash
# Create a shared location on HDFS for the Spark assembly jar
hdfs dfs -mkdir -p /user/spark/share/lib

# Upload the assembly jar so YARN containers can fetch it from HDFS
hdfs dfs -put $SPARK_HOME/assembly/lib/spark-assembly_*.jar /user/spark/share/lib/spark-assembly.jar

# Tell Spark where to find the jar (replace your-server:port with your NameNode address)
export SPARK_JAR=hdfs://your-server:port/user/spark/share/lib/spark-assembly.jar
```
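Note that on newer Spark 1.x releases the `SPARK_JAR` environment variable is deprecated in favor of the `spark.yarn.jar` property in `spark-defaults.conf`. A minimal sketch of the equivalent setup, assuming the same HDFS path as above:

```bash
# Version-safe alternative: set spark.yarn.jar instead of SPARK_JAR
# (replace your-server:port with your NameNode address)
echo "spark.yarn.jar hdfs://your-server:port/user/spark/share/lib/spark-assembly.jar" \
  >> $SPARK_HOME/conf/spark-defaults.conf

# Relaunch the shell against YARN to confirm the exception is gone
spark-shell --master yarn --deploy-mode client
```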
Hope this helps.