Why does spark-shell fail with a NullPointerException?

没有蜡笔的小新 2020-12-02 07:45

I am trying to run spark-shell on Windows 10, but I keep getting this error every time I run it.

I have tried both the latest release and spark-1.5.0-bin-hadoop2.4.

10 Answers
  •  再見小時候
    2020-12-02 08:23

    You can resolve this issue by placing the MySQL connector JAR in the spark-1.6.0/libs folder and restarting spark-shell. It works.

    The important thing here is that instead of running spark-shell on its own, you should run:

    spark-shell --driver-class-path /home/username/spark-1.6.0-libs-mysqlconnector.jar
    

    Hope this works.
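    Since the original question is about Windows 10, the same flag would take a Windows-style path from a Command Prompt. A minimal sketch; the JAR path below is a hypothetical placeholder, so substitute wherever your connector JAR actually lives:

    ```shell
    REM Windows Command Prompt equivalent of the Linux command above.
    REM C:\spark\jars\mysql-connector-java.jar is a placeholder path,
    REM not a real location on your machine.
    spark-shell --driver-class-path C:\spark\jars\mysql-connector-java.jar
    ```

    The --driver-class-path option prepends the JAR to the driver's classpath at launch, which is why adding the JAR alone is not enough without it.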
