I want to set up a single-node cluster in Apache Spark. I installed Java and Scala, downloaded Spark pre-built for Apache Hadoop 2.6, and unpacked it. I'm trying to run spark-shell b
The answer above did not work for me. Instead, I followed the steps in:
How to start Spark applications on Windows (aka Why Spark fails with NullPointerException)?
and changed the HADOOP_HOME environment variable under System variables. That worked for me.
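For reference, the linked question boils down to pointing HADOOP_HOME at a directory whose `bin` subfolder contains `winutils.exe`, which Spark needs on Windows. A minimal sketch in Git Bash / POSIX shell syntax (the path `/c/hadoop`, i.e. `C:\hadoop`, is an assumption; adjust it to wherever you placed `winutils.exe`):

```shell
# Assumption: winutils.exe has been placed in /c/hadoop/bin (C:\hadoop\bin on Windows).
export HADOOP_HOME=/c/hadoop

# Put its bin folder on PATH so winutils.exe can be found.
export PATH="$HADOOP_HOME/bin:$PATH"

# Verify the variable is set before launching spark-shell.
echo "HADOOP_HOME=$HADOOP_HOME"
```

If you set the variable through the Windows System Properties dialog instead, remember to open a fresh terminal afterwards, since running shells do not pick up the change.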