Failed to start master for Spark in Windows

旧时难觅i 2020-12-08 11:28

Same problem as "Failed to start master for spark in windows 10", which is also unsolved.

My Spark installation itself works: both pyspark.cmd and spark-shell.cmd run fine.

5 Answers
  •  臣服心动
    2020-12-08 12:08

    If you are looking to start the master and worker processes, this should work for you; it works for me.

    1. To start the master, open a Windows command prompt in the spark/bin directory, then copy and paste this command and hit Enter:
    spark-class org.apache.spark.deploy.master.Master

    Then point your browser to http://localhost:8080/. If you get a "server not found" error, refresh the page. This page shows the master's unique URL, which looks like: spark://192.xxx.xx.xxx:7077

    2. Open a new terminal, go to %SPARK_HOME%/bin, then copy and paste this line and hit Enter:
    spark-class org.apache.spark.deploy.worker.Worker spark://ip:port

    Here,

    spark://ip:port
    is the master URL obtained in step 1. Refresh the browser tab opened in step 1 to verify the worker has registered.
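    The two steps above can be summarized as the following terminal transcript. The master URL shown in the second command is a placeholder; substitute the one your own web UI reports at http://localhost:8080/.

    ```
    :: Terminal 1 -- start the master (run from %SPARK_HOME%\bin)
    spark-class org.apache.spark.deploy.master.Master

    :: Terminal 2 -- start a worker and register it with the master,
    :: using the spark://host:port URL copied from the master's web UI
    spark-class org.apache.spark.deploy.worker.Worker spark://192.xxx.xx.xxx:7077
    ```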

    NOTE: JDK 1.9 is not supported
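    One common cause of the master failing to start is running under an unsupported JDK, per the note above (this answer dates from the Spark 2.x era, when only Java 8 was supported). As a minimal sketch, the helper below parses a `java -version` style string and flags incompatible versions; the function names and the Java-8-only assumption are illustrative, not part of Spark itself:

    ```python
    import re

    def java_major_version(version_string):
        """Return the Java major version from a version string.

        Handles both the legacy "1.x" scheme (Java 8 and earlier,
        e.g. "1.8.0_281") and the modern scheme (Java 9+, e.g. "9.0.4").
        """
        match = re.match(r"(\d+)(?:\.(\d+))?", version_string)
        if not match:
            raise ValueError(f"unrecognized version string: {version_string!r}")
        first = int(match.group(1))
        if first == 1 and match.group(2) is not None:
            return int(match.group(2))  # "1.8.0_281" -> 8
        return first                    # "9.0.4" -> 9, "11.0.2" -> 11

    def spark2_supports(version_string):
        # Assumption: Spark 2.x runs on Java 8 only; JDK 9 ("1.9")
        # is not supported, matching the NOTE in the answer above.
        return java_major_version(version_string) == 8
    ```

    Checking the JDK on your PATH (`java -version`) against this before launching the master can save a confusing startup failure.
    
    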
