Spark on Windows 10 not working

Submitted by 假如想象 on 2019-12-12 23:20:20

Question


I'm trying to get Spark working on Windows 10. When I try to run spark-shell I get this error:

'Spark\spark-2.0.0-bin-hadoop2.7\bin..\jars""\ is not recognized as an internal or external command, operable program or batch file.

Failed to find Spark jars directory. You need to build Spark before running this program.

I am using a pre-built Spark for Hadoop 2.7 or later. I have installed Java 8, Eclipse Neon, Python 2.7, and Scala 2.11, and I have the winutils binary for Hadoop 2.7.1, and I still get this error.

When I downloaded Spark it came as a tgz; when extracted there was another archive inside, so I extracted that as well and then got the bin folder and everything else. I need to access spark-shell. Can anyone help?

EDIT: The solution I ended up using:

1) VirtualBox

2) Linux Mint


Answer 1:


I got the same error while building Spark. You can move the extracted folder to C:\ so that the path contains no spaces.
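For example, a minimal sketch assuming the archive was extracted to the Downloads folder (substitute your own source path):

    :: move the extracted Spark folder to the root of C: so the path has no spaces
    move "%USERPROFILE%\Downloads\spark-2.0.0-bin-hadoop2.7" C:\

After the move, Spark lives at C:\spark-2.0.0-bin-hadoop2.7 and can be launched from its bin folder.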

Refer to this: http://techgobi.blogspot.in/2016/08/configure-spark-on-windows-some-error.html




Answer 2:


You are probably giving the wrong folder path to the Spark bin directory.

Just open the command prompt and change directory to the bin folder inside the Spark folder.

Type spark-shell to check.
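For example, assuming Spark was extracted to C:\Spark\spark-2.0.0-bin-hadoop2.7 (adjust to your own extraction path):

    :: run spark-shell from inside the bin folder itself
    cd C:\Spark\spark-2.0.0-bin-hadoop2.7\bin
    spark-shell

If the scala> prompt appears, the earlier error was just a wrong working directory or PATH entry.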

Refer to: Spark on win 10




Answer 3:


"On Windows, I found that if it is installed in a directory that has a space in the path (C:\Program Files\Spark) the installation will fail. Move it to the root or another directory with no spaces." OR If you have installed Spark under “C:\Program Files (x86)..” replace 'Program Files (x86)' with Progra~2 in the PATH env variable and SPARK_HOME user variable.



Source: https://stackoverflow.com/questions/39296802/spark-on-windows-10-not-working
