Failed to start master for Spark in Windows

Backend · Unresolved · 5 answers · 451 views

旧时难觅i 2020-12-08 11:28

This is the same problem as "Failed to start master for spark in windows 10", which also remains unsolved.

Spark itself is working: both pyspark.cmd and spark-shell.cmd run successfully.

5 Answers
  •  情话喂你
    2020-12-08 12:11

    The launch scripts located at %SPARK_HOME%\sbin do not support Windows. You need to manually run the master and worker as outlined below.

    1. Open a command prompt and go to the %SPARK_HOME%\bin folder

    2. Run spark-class org.apache.spark.deploy.master.Master to start the master. It will print a URL of the form spark://ip:port

    3. Run spark-class org.apache.spark.deploy.worker.Worker spark://ip:port to run the worker. Make sure you use the URL you obtained in step 2.

    4. Run spark-shell --master spark://ip:port to connect an application to the newly created cluster.
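
    The sequence above can be sketched as follows for a Windows command prompt. The host and port (192.168.1.10:7077) are placeholders; substitute the spark:// URL your master actually prints. Each long-running process needs its own prompt window.

    ```shell
    :: Sketch of the manual start-up sequence described above.
    :: Assumes %SPARK_HOME% is set; run each command in a separate prompt.

    cd /d %SPARK_HOME%\bin

    :: 1. Start the master; note the "spark://host:port" URL in its log output.
    spark-class org.apache.spark.deploy.master.Master

    :: 2. In a second prompt, attach a worker to the URL from step 1
    ::    (192.168.1.10:7077 is a placeholder).
    spark-class org.apache.spark.deploy.worker.Worker spark://192.168.1.10:7077

    :: 3. In a third prompt, connect an application to the new cluster.
    spark-shell --master spark://192.168.1.10:7077
    ```

    You can confirm the cluster is up by opening the master's web UI (by default http://localhost:8080), which should list the registered worker.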
