I am trying to run spark-shell on Windows 10, but I keep getting an error every time I run it.
spark-shell
I have tried both the latest version and spark-1.5.0-bin-hadoop2.4.
You can resolve this issue by placing the MySQL connector jar in the spark-1.6.0/libs folder and restarting spark-shell. The important point is that instead of running spark-shell by itself, you should pass the jar on the driver classpath:
spark-shell --driver-class-path /home/username/spark-1.6.0-libs-mysqlconnector.jar
Hope this helps.
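As a sketch, if you also need the connector available on the executors (for example when reading from MySQL via JDBC in a cluster), you can pass the same jar through --jars in addition to --driver-class-path. The jar path below is illustrative; substitute the actual location of your connector jar:

```shell
# Hypothetical path -- adjust to wherever your MySQL connector jar lives.
MYSQL_JAR=/home/username/spark-1.6.0-libs-mysqlconnector.jar

# --driver-class-path puts the jar on the driver's classpath;
# --jars additionally ships it to the executors.
spark-shell \
  --driver-class-path "$MYSQL_JAR" \
  --jars "$MYSQL_JAR"
```

For local-mode experimentation, --driver-class-path alone is usually enough, which is why the command above works.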