Failed to bind to: spark-master, using a remote cluster with two workers

滥情空心 · 2020-12-14 17:53

I managed to get everything working with a local master and two remote workers. Now I want to connect to a remote master that has the same remote workers. I have tried …
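For context, a minimal sketch of how the shell is normally pointed at a remote standalone master (the host name and the default port 7077 below are placeholders, not values taken from the question):

    # Hypothetical example: launch spark-shell against a remote standalone master.
    # Replace <master-host> with the master's address; 7077 is Spark's default master port.
    $SPARK_HOME/bin/spark-shell --master spark://<master-host>:7077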

4 Answers
  •  眼角桃花
    2020-12-14 18:41

    I had Spark working on my EC2 instance. I started a new web server, and to meet its requirements I had to change the hostname to the EC2 public DNS name, i.e.

    hostname ec2-54-xxx-xxx-xxx.compute-1.amazonaws.com
    

    After that, Spark could not start and showed the error below:

    16/09/20 21:02:22 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
    16/09/20 21:02:22 ERROR SparkContext: Error initializing SparkContext.

    I solved it by setting SPARK_LOCAL_IP as below:

    export SPARK_LOCAL_IP="localhost"
    

    then just launched the Spark shell as below:

    $SPARK_HOME/bin/spark-shell
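    A more persistent variant of the same fix is a sketch under the assumption of a standard Spark layout (the file path and value below are not from the answer): set the variable in conf/spark-env.sh so every launch picks it up.

    # Hypothetical: $SPARK_HOME/conf/spark-env.sh
    # Bind the driver to a resolvable local address instead of the changed hostname.
    export SPARK_LOCAL_IP=localhost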
    
