I am trying to set up a Spark standalone cluster following the official documentation. My master is on a local VM running Ubuntu, and I also have one worker running on the same machine.
It depends on your Spark version; different versions need different configuration.
If your Spark version is 1.6, add this line to conf/spark-env.sh so another machine can connect to the master:
SPARK_MASTER_IP=your_host_ip
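For illustration, assuming a made-up master address of 192.168.1.10, the 1.6 conf/spark-env.sh would contain:
# 192.168.1.10 is an example address only; replace it with your master's real IP
SPARK_MASTER_IP=192.168.1.10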
If your Spark version is 2.x, add these lines to your conf/spark-env.sh instead:
SPARK_MASTER_HOST=your_host_ip
SPARK_LOCAL_IP=your_host_ip
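A minimal sketch of the 2.x conf/spark-env.sh, again assuming the made-up address 192.168.1.10:
# example address only; replace with your master's real IP
SPARK_MASTER_HOST=192.168.1.10
SPARK_LOCAL_IP=192.168.1.10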
After adding these lines, start Spark:
./sbin/start-all.sh
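A quick way to confirm the daemons came up is jps, which ships with the JDK that Spark already requires; the master JVM shows up under the name Master (and a local worker, if one was started, as Worker):
jps
# example output; PIDs will differ on your machine
# 12345 Master
# 12400 Worker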
If everything is set up correctly, you can see in the master web UI (http://your_host_ip:8080 by default) that the Spark master URL is spark://your_host_ip:7077.
Be careful: your_host_ip should not be localhost; it must be exactly the host IP that you set in conf/spark-env.sh.
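If you are not sure which address to use, Ubuntu can list the machine's IPs for you; pick the one that is reachable from the worker machine:
hostname -I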
After all of that, you can connect another machine to the master with the command below:
./sbin/start-slave.sh spark://your_host_ip:7077
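To check that the worker actually registered, open the master web UI at http://your_host_ip:8080 and look at the Workers table, or point a shell at the cluster from the worker machine:
./bin/spark-shell --master spark://your_host_ip:7077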