Why does spark-shell fail with a NullPointerException?

没有蜡笔的小新 2020-12-02 07:45

I am trying to run spark-shell on Windows 10, but I keep getting this error every time I launch it.

I have tried both the latest version and spark-1.5.0-bin-hadoop2.4.

10 Answers
  •  死守一世寂寞
    2020-12-02 08:15

    You need to grant permissions on the /tmp/hive directory to resolve this exception.

    This assumes you already have winutils.exe and have set the HADOOP_HOME environment variable. Then open a command prompt and run the following command as administrator:

    If winutils.exe is located at D:\winutils\bin and \tmp\hive is also on the D drive:

    D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive
    

    For more details, you can refer to the following links:

    Frequent Issues occurred during Spark Development
    How to run Apache Spark on Windows7 in standalone mode
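
    The steps above can be collected into one sequence. This is a minimal sketch for an elevated (administrator) Command Prompt, assuming winutils.exe lives in D:\winutils\bin; adjust the paths for your machine:

    ```shell
    :: Point HADOOP_HOME at the directory that contains bin\winutils.exe
    :: (setx persists the variable; reopen the prompt for it to take effect)
    setx HADOOP_HOME D:\winutils

    :: Create the Hive scratch directory if it does not exist yet
    mkdir D:\tmp\hive

    :: Grant full permissions on \tmp\hive so Spark's HiveContext can use it
    D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive

    :: Verify: ls should now report rwx permissions for the directory
    D:\winutils\bin\winutils.exe ls D:\tmp\hive
    ```

    After running these, open a fresh command prompt (so HADOOP_HOME is picked up) and start spark-shell again.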
