Spark Standalone Cluster - Slave not connecting to Master

Asked by 孤独总比滥情好, 2020-12-08 07:59

I am trying to set up a Spark standalone cluster following the official documentation.

My master is on a local VM running Ubuntu, and I also have one worker running in

4 Answers
  •  暗喜 (OP)
     2020-12-08 08:22

    I encountered the exact same problem as you and just figured out how to get it to work.

    The problem is that your Spark master is listening on a hostname (in your example, spark). A worker on the same host can therefore register successfully, but a worker started from another machine with the command start-slave.sh spark://spark:7077 fails, because that hostname does not resolve to the master's reachable address from the remote machine.

    The solution is to make sure SPARK_MASTER_IP is set to the master's IP address in the file conf/spark-env.sh:

        SPARK_MASTER_IP=<your-master-ip>
    

    on your master node, and then start your Spark master as normal. You can open the web UI to confirm that your Spark master appears as spark://YOUR_HOST_IP:7077 after the start. Then, on the other machine, the command start-slave.sh spark://<your-master-ip>:7077 should start the worker and register it with the master successfully.
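    The steps above can be sketched as a short shell sequence. This is a minimal sketch, not a verbatim transcript: it assumes SPARK_HOME points at your Spark installation, and 192.168.1.10 is a hypothetical stand-in for your master's actual IP.

    ```shell
    # On the master node: pin the master to a reachable IP address.
    # (192.168.1.10 is a placeholder; use your master's real IP.
    # Note: Spark 2.x renamed SPARK_MASTER_IP to SPARK_MASTER_HOST.)
    echo "SPARK_MASTER_IP=192.168.1.10" >> "$SPARK_HOME/conf/spark-env.sh"
    "$SPARK_HOME/sbin/start-master.sh"

    # The master web UI (default http://192.168.1.10:8080) should now show
    # the URL as spark://192.168.1.10:7077 instead of spark://spark:7077.

    # On the worker machine: register against the IP, not the hostname.
    "$SPARK_HOME/sbin/start-slave.sh" spark://192.168.1.10:7077
    ```

    Pinning the IP avoids depending on hostname resolution being consistent across machines, which is exactly what broke the remote worker's registration.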

    Hope this helps you.
