Same problem as "Failed to start master for spark in windows 10", which is also not solved.
My Spark installation itself works fine: both pyspark.cmd and spark-shell.cmd run correctly.
If you are looking to start the master and the worker (slave) processes, the following works for me. First, start the master:
spark-class org.apache.spark.deploy.master.Master
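(Optional) If port 8080 is already taken or you want the master bound to a specific address, the Master class accepts host/port flags. A minimal sketch, where the host value is a placeholder you should replace with your machine's IP:

spark-class org.apache.spark.deploy.master.Master --host 192.xxx.xx.xxx --port 7077 --webui-port 8080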
Then point your browser to http://localhost:8080/. If you get a "server not found" error, refresh the page. This page shows the master's unique URL, which looks like spark://192.xxx.xx.xxx:7077. Next, start a worker and attach it to the master:
spark-class org.apache.spark.deploy.worker.Worker spark://ip:port
Here spark://ip:port is the master URL obtained in the previous step. Refresh the browser tab you opened earlier to confirm that the worker has registered with the master.
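To verify the cluster end to end, you can run a trivial job against the master URL. A minimal sketch in PySpark, assuming pyspark is installed; the master URL below is a placeholder, replace it with the one shown in your web UI:

from pyspark.sql import SparkSession

# spark://192.xxx.xx.xxx:7077 is a placeholder; use the URL from your master's web UI
spark = SparkSession.builder \
    .master("spark://192.xxx.xx.xxx:7077") \
    .appName("standalone-smoke-test") \
    .getOrCreate()

# A trivial job: sum the numbers 0..99 on the cluster
print(spark.sparkContext.parallelize(range(100)).sum())
spark.stop()

If the job completes, the application should also appear under "Running Applications" (and then "Completed Applications") in the master's web UI.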
NOTE: JDK 1.9 is not supported.