Why does spark-shell fail with a NullPointerException?

Asked by 没有蜡笔的小新 on 2020-12-02 07:45

I am trying to run spark-shell on Windows 10, but I keep getting this error every time I run it.

I tried both the latest release and spark-1.5.0-bin-hadoop2.4.

10 Answers
  •  既然无缘
     2020-12-02 08:27

    I used Spark 1.5.2 with Hadoop 2.6 and had similar problems. I solved them with the following steps:

    1. Download winutils.exe from the repository to some local folder, e.g. C:\hadoop\bin.

    2. Set HADOOP_HOME to C:\hadoop.

    3. Create c:\tmp\hive directory (using Windows Explorer or any other tool).

    4. Open command prompt with admin rights.

    5. Run C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive

    With that, I still get some warnings, but no errors, and I can run Spark applications just fine.
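    The steps above can be collected into a short batch script. This is only a sketch: the paths C:\hadoop and c:\tmp\hive come from the answer, it assumes winutils.exe has already been downloaded into C:\hadoop\bin, and it must be run from a command prompt opened with admin rights.

    ```shell
    :: Sketch of the setup steps from the answer (Windows cmd, run as Administrator).
    :: Assumes winutils.exe was already placed in C:\hadoop\bin (step 1).

    :: Step 2: point Spark's Hadoop layer at the folder containing bin\winutils.exe
    setx HADOOP_HOME C:\hadoop

    :: Step 3: create the scratch directory that Hive uses for temporary data
    mkdir c:\tmp\hive

    :: Step 5: grant the permissions that Hive's startup check expects;
    :: winutils resolves /tmp/hive against the current drive (here, C:)
    C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive
    ```

    Note that `setx` writes the variable for future sessions only, so open a new command prompt before launching spark-shell.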
