Same problem as "Failed to start master for spark in windows 10", which is also not solved.
My Spark installation itself works fine: I have tested it with both pyspark.cmd and spark-shell.cmd.
If you are looking to start the master and the workers (slaves), this should work for you; it works for me.
spark-class org.apache.spark.deploy.master.Master
Point your browser to http://localhost:8080/. If you get a "server not found" error, refresh the page. This page shows your unique master URL, which looks like this: spark://192.xxx.xx.xxx:7077
spark-class org.apache.spark.deploy.worker.Worker spark://ip:port
The spark://ip:port part is the URL obtained in step 1. Refresh the browser tab opened in step 1 to check that the worker has started.
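Putting the two steps together, a minimal sketch of the two command windows might look like this (spark://192.168.1.10:7077 is only a placeholder; use whatever URL your own Web UI shows):

```
:: Window 1 -- start the master (assumes %SPARK_HOME%\bin is on PATH)
spark-class org.apache.spark.deploy.master.Master

:: Window 2 -- start a worker, pointing it at the master URL shown
:: on http://localhost:8080/ (spark://192.168.1.10:7077 here is a
:: placeholder for your own master URL)
spark-class org.apache.spark.deploy.worker.Worker spark://192.168.1.10:7077
```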
NOTE: JDK 1.9 is not supported
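If you are unsure which JDK the spark-class script will pick up, a quick check from the same command prompt looks like this (the version string shown is just an example of a supported 1.8 build):

```
:: Check the Java version Spark will use
java -version
:: A supported build prints something like:
::   java version "1.8.0_131"
:: Per the note above, a "9" / "1.9" version here is not supported.
echo %JAVA_HOME%
```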
A little trick should help: I changed the JAVA_HOME path to its DOS (8.3 short) form, for instance c:\Progra~1\Java\jre1.8.0_131, then rebooted. After this I was able to run the spark-class org.apache... command mentioned above.
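If you would rather look up the 8.3 short form of your own Java directory than guess at it, cmd can print it for you. A sketch, reusing the jre1.8.0_131 path from above as an example:

```
:: Print the DOS 8.3 short form of the Java install directory
:: (use %%I instead of %I if you put this in a .bat file)
for %I in ("C:\Program Files\Java\jre1.8.0_131") do @echo %~sI

:: Point JAVA_HOME at the short form. setx only affects new shells,
:: so reopen the command prompt -- or reboot, as above -- before
:: retrying spark-class.
setx JAVA_HOME "C:\Progra~1\Java\jre1.8.0_131"
```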
The launch scripts located at %SPARK_HOME%\sbin do not support Windows. You need to run the master and worker manually, as outlined below:

1. Go to the %SPARK_HOME%\bin folder in a command prompt.
2. Run spark-class org.apache.spark.deploy.master.Master to run the master. This will give you a URL of the form spark://ip:port.
3. Run spark-class org.apache.spark.deploy.worker.Worker spark://ip:port to run the worker. Make sure you use the URL you obtained in step 2.
4. Run spark-shell --master spark://ip:port to connect an application to the newly created cluster (see the sanity-check sketch after this list).
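As a quick sanity check that the shell from step 4 really attached to the cluster (the URL is again a placeholder; substitute your own):

```
:: Connect an application to the standalone master started above
:: (spark://192.168.1.10:7077 is a placeholder for your URL)
spark-shell --master spark://192.168.1.10:7077

:: Back in the Web UI at http://localhost:8080/, the shell should
:: now be listed under "Running Applications".
```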
After executing spark-class org.apache.spark.deploy.master.Master, just go to http://localhost:8080 to get the ip:port. Then open another command shell and execute spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT
Just found the answer here: https://spark.apache.org/docs/1.2.0/spark-standalone.html
"Note: The launch scripts do not currently support Windows. To run a Spark cluster on Windows, start the master and workers by hand."