My Spark Worker cannot connect to the Master. Something wrong with Akka?

萌比男神i 2020-12-15 23:46

I want to install Spark in standalone mode on a cluster made of my two virtual machines.
With spark-0.9.1-bin-hadoop1, I can run spark-shell successfully on each machine.

7 Answers
  •  遥遥无期
    2020-12-16 00:11

    Basically, your ports are blocked, so communication from the master to the worker is cut off. Check the networking section of the Spark configuration docs: https://spark.apache.org/docs/latest/configuration.html#networking
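
    To confirm that a port really is blocked before changing any configuration, you can probe it with a plain TCP connect. This is a minimal sketch using only the JVM standard library; `isReachable` and the master host name are illustrative, not part of Spark:

    ```scala
    import java.net.{InetSocketAddress, Socket}

    // Hypothetical helper: returns true if a TCP connection to
    // host:port succeeds within timeoutMs, false otherwise.
    def isReachable(host: String, port: Int, timeoutMs: Int = 2000): Boolean = {
      val socket = new Socket()
      try {
        socket.connect(new InetSocketAddress(host, port), timeoutMs)
        true
      } catch {
        case _: java.io.IOException => false
      } finally {
        socket.close()
      }
    }

    // Example: from a worker node, probe the standalone master's
    // default port (7077). A false result points to a firewall issue.
    // isReachable("spark-master-host", 7077)
    ```

    Run this from the worker toward the master (and vice versa, since the random executor ports mean connections are opened in both directions).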

    In the "Networking" section, you can see that several of the ports are random by default. You can pin them to fixed values of your choice, like below:

    // Requires: import org.apache.spark.SparkConf
    val conf = new SparkConf()
        .setMaster(master)
        .setAppName("namexxx")
        .set("spark.driver.port", "51810")
        .set("spark.fileserver.port", "51811")
        .set("spark.broadcast.port", "51812")
        .set("spark.replClassServer.port", "51813")
        .set("spark.blockManager.port", "51814")
        .set("spark.executor.port", "51815")
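
    If you would rather not hard-code ports in application code, the same properties can be set in `conf/spark-defaults.conf` on the machine that submits the job. This is a sketch reusing the property names from the answer above; the port numbers are just examples, and you still have to open these ports in the firewall on both machines:

        spark.driver.port          51810
        spark.fileserver.port      51811
        spark.broadcast.port       51812
        spark.replClassServer.port 51813
        spark.blockManager.port    51814
        spark.executor.port        51815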
