Failed to start master for Spark in Windows

Asked by 旧时难觅i · 2020-12-08 11:28

Same problem as "Failed to start master for spark in windows 10", which is also unsolved.

Spark itself works fine: both pyspark.cmd and spark-shell.cmd run successfully.

5 Answers
  • 2020-12-08 12:08

    If you want to start the master and worker processes, the following works for me.

    1. To start the master, open a Windows command prompt in the %SPARK_HOME%\bin directory, then run:
    spark-class org.apache.spark.deploy.master.Master

    Point your browser to http://localhost:8080/. If you get a "server not found" error, refresh the page. This page shows your master's unique URL, which looks like spark://192.xxx.xx.xxx:7077.

    2. Open a new terminal, go to %SPARK_HOME%\bin, and run:
    spark-class org.apache.spark.deploy.worker.Worker spark://ip:port

    Here spark://ip:port is the URL obtained in step 1. Refresh the browser tab opened in step 1 to confirm the worker has started. (A combined sketch of both terminals follows below.)

    NOTE: JDK 9 is not supported.
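
    Putting both terminals together, a minimal sketch (spark://ip:port is a placeholder for the URL shown on your master's web UI at http://localhost:8080/):

    rem --- Terminal 1: start the master (run from %SPARK_HOME%\bin) ---
    spark-class org.apache.spark.deploy.master.Master

    rem --- Terminal 2: start a worker against that master ---
    rem Replace spark://ip:port with the URL from the master's web UI.
    spark-class org.apache.spark.deploy.worker.Worker spark://ip:port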

  • 2020-12-08 12:10

    A little trick helped here: I changed the JAVA_HOME path to its DOS 8.3 short form, e.g. c:\Progra~1\Java\jre1.8.0_131, then rebooted. After that I was able to run the spark-class org.apache... command mentioned above.
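
    For example, a sketch of that trick (dir /x lists the 8.3 short names, and setx stores the variable for new shells; the JRE path is the one from this answer and will differ on your machine):

    rem Show the 8.3 short name for "Program Files" (usually PROGRA~1)
    dir /x C:\

    rem Point JAVA_HOME at the space-free short path, then open a new prompt
    setx JAVA_HOME "C:\Progra~1\Java\jre1.8.0_131"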

  • 2020-12-08 12:11

    The launch scripts located at %SPARK_HOME%\sbin do not support Windows. You need to manually run the master and worker as outlined below.

    1. Go to the %SPARK_HOME%\bin folder in a command prompt.

    2. Run spark-class org.apache.spark.deploy.master.Master to run the master. This will give you a URL of the form spark://ip:port

    3. Run spark-class org.apache.spark.deploy.worker.Worker spark://ip:port to run the worker. Make sure you use the URL you obtained in step 2.

    4. Run spark-shell --master spark://ip:port to connect an application to the newly created cluster. (A consolidated sketch of all three commands follows below.)
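
    As a consolidated sketch of steps 1-4 (each block runs in its own command prompt; spark://ip:port stands for the URL printed by the master in step 2):

    rem Terminal 1: the master
    cd /d %SPARK_HOME%\bin
    spark-class org.apache.spark.deploy.master.Master

    rem Terminal 2: a worker (substitute the real master URL)
    cd /d %SPARK_HOME%\bin
    spark-class org.apache.spark.deploy.worker.Worker spark://ip:port

    rem Terminal 3: connect an application to the cluster
    cd /d %SPARK_HOME%\bin
    spark-shell --master spark://ip:port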

  • 2020-12-08 12:15

    After executing spark-class org.apache.spark.deploy.master.Master, just go to http://localhost:8080 to get the ip:port, then open another command shell and execute spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT.

  • 2020-12-08 12:22

    Just found the answer here: https://spark.apache.org/docs/1.2.0/spark-standalone.html

    "Note: The launch scripts do not currently support Windows. To run a Spark cluster on Windows, start the master and workers by hand."
