My knowledge of Spark is limited, and you will sense it after reading this question. I have just one node, and Spark, Hadoop and YARN are all installed on it.
I was abl
You could also use the spark.yarn.archive option and set it to the location of an archive (which you create) containing all the JARs from the $SPARK_HOME/jars/ folder, placed at the root level of the archive. For example:
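# Build an uncompressed archive of everything under $SPARK_HOME/jars/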
jar cv0f spark-libs.jar -C $SPARK_HOME/jars/ .
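# Copy the archive into HDFS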
hdfs dfs -put spark-libs.jar /some/path/
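# Raise the archive's replication factor (here to 10) and wait for it to take effect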
hdfs dfs -setrep -w 10 hdfs:///some/path/spark-libs.jar
(Adjust the replication factor in proportion to the total number of NodeManagers.) Then set spark.yarn.archive to hdfs:///some/path/spark-libs.jar.
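In case it is unclear where that setting goes, here is a minimal sketch of two common ways to apply it; hdfs:///some/path/ is the placeholder path from above, and the application JAR name is hypothetical:

# Option 1: persist it in spark-defaults.conf so every job picks it up
echo "spark.yarn.archive hdfs:///some/path/spark-libs.jar" >> $SPARK_HOME/conf/spark-defaults.conf

# Option 2: pass it per job on the spark-submit command line
spark-submit \
  --master yarn \
  --conf spark.yarn.archive=hdfs:///some/path/spark-libs.jar \
  my-app.jar   # my-app.jar is a hypothetical application JAR

Either way, YARN localizes the single archive from HDFS instead of uploading the contents of $SPARK_HOME/jars/ on every spark-submit, which is the point of the higher replication factor.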