I am trying to execute spark-shell on Windows 10, but I keep getting this error every time I run it:
spark-shell
I tried both the latest version and spark-1.5.0-bin-hadoop2.4.
Or perhaps this link below will be easier to follow:
https://wiki.apache.org/hadoop/WindowsProblems
Basically, download winutils.exe and copy it to your spark\bin folder, then re-run spark-shell.
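A minimal sketch of that setup, assuming Spark is unpacked at C:\spark (adjust the path to your own installation) and that winutils.exe has been placed in C:\spark\bin:

    REM point HADOOP_HOME at the folder whose bin\ contains winutils.exe
    set HADOOP_HOME=C:\spark
    set PATH=%PATH%;%HADOOP_HOME%\bin
    REM then start the shell again
    spark-shell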
If you have not made /tmp/hive writable, please do so.
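A sketch of one common way to do that with winutils.exe, assuming the same C:\spark path as above and that the \tmp\hive directory already exists on the drive you run Spark from:

    REM grant full permissions on the Hive scratch directory
    C:\spark\bin\winutils.exe chmod 777 \tmp\hive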