Spark on Windows 10 not working
Question: I'm trying to get Spark working on Windows 10. When I try to run spark-shell I get this error:

'Spark\spark-2.0.0-bin-hadoop2.7\bin..\jars""\ is not recognized as an internal or external command, operable program or batch file.
Failed to find Spark jars directory. You need to build Spark before running this program.

I am using a pre-built Spark for Hadoop 2.7 or later. I have installed Java 8, Eclipse Neon, Python 2.7, and Scala 2.11, and I have the winutils binary for Hadoop 2.7.1, and I still get this error. When I
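For reference, a typical setup that avoids this error looks like the sketch below. The specific paths (C:\Spark, C:\Hadoop) are hypothetical placeholders, not the asker's actual layout; a common cause of the garbled ""\ ... is not recognized" message is unpacking Spark into a directory whose path contains spaces (e.g. under Program Files), which breaks the unquoted path handling in the launcher batch scripts.

```
:: Hypothetical layout - adjust to your own machine.
:: Spark is unpacked under C:\Spark so the path contains no spaces.
set SPARK_HOME=C:\Spark\spark-2.0.0-bin-hadoop2.7
set HADOOP_HOME=C:\Hadoop
:: winutils.exe is expected at %HADOOP_HOME%\bin\winutils.exe
set PATH=%SPARK_HOME%\bin;%HADOOP_HOME%\bin;%PATH%

:: Sanity check: the jars directory the launcher script fails to find
dir "%SPARK_HOME%\jars"

:: Launch the shell
spark-shell
```

If `dir "%SPARK_HOME%\jars"` lists the Spark jar files but spark-shell still fails, re-check that the path stored in SPARK_HOME contains no spaces or stray quotes.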