Spark Standalone Cluster - Slave not connecting to Master

孤独总比滥情好 2020-12-08 07:59

I am trying to setup a Spark standalone cluster following the official documentation.

My master is on a local vm running ubuntu and I also have one worker running in

4 Answers
  •  孤街浪徒
    2020-12-08 08:21

    It depends on your Spark version; each needs different configuration. If your Spark version is 1.6, add this line to conf/spark-env.sh so another machine can connect to the master:

    SPARK_MASTER_IP=your_host_ip

    And if your Spark version is 2.x, add these lines to your conf/spark-env.sh:

    SPARK_MASTER_HOST=your_host_ip

    SPARK_LOCAL_IP=your_host_ip
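Putting the 2.x settings together, conf/spark-env.sh might look like the sketch below. The IP 192.168.1.10 is a placeholder for illustration; SPARK_MASTER_PORT and SPARK_MASTER_WEBUI_PORT are optional and shown with their defaults:

```shell
# conf/spark-env.sh -- example values only; replace 192.168.1.10 with
# the master machine's real IP (never localhost or 127.0.0.1)
export SPARK_MASTER_HOST=192.168.1.10    # address the master binds to
export SPARK_LOCAL_IP=192.168.1.10       # address Spark uses on this machine
export SPARK_MASTER_PORT=7077            # master RPC port (default)
export SPARK_MASTER_WEBUI_PORT=8080      # master web UI port (default)
```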

    After adding these lines, start Spark:

    ./sbin/start-all.sh

    If you did it right, you can open your_host_ip:8080 in a browser and see that the Spark master URL is spark://your_host_ip:7077.

    Be careful: your_host_ip must not be localhost; it has to be exactly the host IP you set in conf/spark-env.sh.

    After that, you can connect another machine to the master with the command below:

    ./sbin/start-slave.sh spark://your_host_ip:7077
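If the worker still fails to register, it is worth confirming that the worker machine can reach the master's port 7077 at all. A minimal reachability probe using bash's /dev/tcp feature (192.0.2.10 is a placeholder address for illustration; substitute your master's real IP):

```shell
# Probe the Spark master port from the worker machine.
# 192.0.2.10 is a documentation placeholder -- use your real master IP.
MASTER_IP=192.0.2.10
MASTER_PORT=7077

# /dev/tcp is a bash feature; timeout aborts the attempt after 3 seconds.
if timeout 3 bash -c "cat < /dev/null > /dev/tcp/${MASTER_IP}/${MASTER_PORT}" 2>/dev/null; then
  status="reachable"
else
  status="unreachable"
fi
echo "master ${MASTER_IP}:${MASTER_PORT} is ${status}"
```

If the port is unreachable, check firewalls between the machines, and check on the master that the process actually bound to the IP you configured rather than to loopback.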
