Why does spark-shell fail with a NullPointerException?

没有蜡笔的小新 2020-12-02 07:45

I am trying to run spark-shell on Windows 10, but I keep getting this error every time I run it.

I tried both the latest version and spark-1.5.0-bin-hadoop2.4.

10 answers
  • 2020-12-02 08:23

    You can resolve this issue by placing the MySQL connector jar in the spark-1.6.0/libs folder and restarting. It works.

    The important thing is that instead of running plain spark-shell, you should run:

    spark-shell --driver-class-path /home/username/spark-1.6.0-libs-mysqlconnector.jar
    

    Hope it works.

  • 2020-12-02 08:26

    My guess is that you're running into https://issues.apache.org/jira/browse/SPARK-10528. I was seeing the same issue running on Windows 7. Initially I was getting the NullPointerException as you did. When I put winutils into the bin directory and set HADOOP_HOME to point to the Spark directory, I got the error described in the JIRA issue.

  • 2020-12-02 08:27

    I used Spark 1.5.2 with Hadoop 2.6 and had similar problems. Solved by doing the following steps:

    1. Download winutils.exe from the repository to some local folder, e.g. C:\hadoop\bin.

    2. Set HADOOP_HOME to C:\hadoop.

    3. Create c:\tmp\hive directory (using Windows Explorer or any other tool).

    4. Open command prompt with admin rights.

    5. Run C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive

    With that, I still get some warnings, but no ERRORs, and I can run Spark applications just fine.
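
    The steps above can be collected into one elevated Command Prompt session. This is a sketch, not an executable script: C:\hadoop is just the example location from the steps, and winutils.exe must already have been downloaded into C:\hadoop\bin by hand:

    ```
    :: run from a Command Prompt opened "as Administrator"
    :: (assumes winutils.exe was already copied to C:\hadoop\bin)
    setx HADOOP_HOME C:\hadoop       :: persists for future sessions
    set HADOOP_HOME=C:\hadoop        :: also set it for this session
    mkdir C:\tmp\hive
    C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive
    ```

    Note that setx alone does not affect the current window, which is why the plain set is repeated for the session you are about to launch spark-shell from.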

  • 2020-12-02 08:27

    I was facing a similar issue and got it resolved by putting winutils.exe inside the bin folder. HADOOP_HOME should be set to C:\Winutils, and winutils.exe should be placed in C:\Winutils\bin.

    Windows 10 64-bit winutils binaries are available at https://github.com/steveloughran/winutils/tree/master/hadoop-2.6.0/bin

    Also ensure that the command line has administrative access.

    Refer to https://wiki.apache.org/hadoop/WindowsProblems
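
    As a quick cross-check before launching spark-shell, the layout described in the answers above can be verified with a small script. This is only a sketch; the check_hadoop_setup helper is illustrative and not part of Spark or Hadoop:

    ```python
    import os
    from pathlib import Path


    def check_hadoop_setup(hadoop_home):
        """Return a list of problems with a winutils-style HADOOP_HOME layout.

        Illustrative helper: it only mirrors the checklist from the answers
        above (HADOOP_HOME points at an existing folder, and that folder
        contains bin/winutils.exe).
        """
        problems = []
        home = Path(hadoop_home) if hadoop_home else None
        if home is None or not home.is_dir():
            problems.append("HADOOP_HOME is unset or does not exist: %r" % hadoop_home)
        elif not (home / "bin" / "winutils.exe").is_file():
            problems.append("bin/winutils.exe is missing under HADOOP_HOME")
        return problems


    if __name__ == "__main__":
        # Diagnose the current environment before starting spark-shell.
        for problem in check_hadoop_setup(os.environ.get("HADOOP_HOME", "")):
            print("WARNING:", problem)
    ```

    An empty result means the winutils layout looks right; any warning points at the step that still needs doing.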
