Apache Spark error on startup

Backend · Unresolved · 3 replies · 1368 views
有刺的猬 2020-12-15 10:32

I want to set up a single-node cluster in Apache Spark. I installed Java and Scala, downloaded the Spark build for Apache Hadoop 2.6, and unpacked it. I'm trying to run the spark-shell b
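The setup described above can be sketched as follows. This is a minimal illustration, not the poster's exact commands; the archive name is an assumption based on the standard naming of Spark builds pre-packaged for Hadoop 2.6.

```shell
# Hypothetical archive name -- adjust to the version actually downloaded.
tar -xzf spark-2.4.8-bin-hadoop2.6.tgz
cd spark-2.4.8-bin-hadoop2.6

# "local[*]" runs Spark in single-node local mode using all available cores,
# which is the usual way to try spark-shell without a real cluster:
./bin/spark-shell --master "local[*]"
```

In local mode no standalone master or workers need to be started; the shell embeds the driver and executors in one JVM.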

3 answers
  •  不知归路
    2020-12-15 10:59

    The above solution did not work for me. I followed the steps in: How to start Spark applications on Windows (aka Why Spark fails with NullPointerException)?

    and changed the HADOOP_HOME environment variable in the system variables. That worked for me.
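    The fix above can be sketched as follows. This is a minimal, hedged example assuming Hadoop's winutils.exe was unpacked under C:\hadoop (written here in Git-Bash style paths; adjust to your actual location). On Windows, spark-shell fails with a NullPointerException at startup when HADOOP_HOME does not point at a directory containing bin\winutils.exe.

    ```shell
    # Assumed location of winutils.exe -- change /c/hadoop to your own path.
    export HADOOP_HOME="${HADOOP_HOME:-/c/hadoop}"
    export PATH="$HADOOP_HOME/bin:$PATH"

    # Sanity check before launching spark-shell:
    if [ -e "$HADOOP_HOME/bin/winutils.exe" ]; then
      echo "HADOOP_HOME looks OK: $HADOOP_HOME"
    else
      echo "winutils.exe not found under $HADOOP_HOME/bin" >&2
    fi
    ```

    Setting the variable in the Windows system environment (rather than only in the shell) has the same effect persistently, which is what the answer above refers to.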
