org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.lookupTimeout

渐次进展 2020-12-31 21:59

Getting the below error with respect to the container while submitting a Spark application to YARN. The Hadoop (2.7.3) / Spark (2.1) environment is running in pseudo-distributed mode.
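
For reference, the timeout named in the error is an ordinary Spark property and can be raised while diagnosing (the 600s value below is only an illustration, not a recommended setting):

    spark-submit \
      --conf spark.rpc.lookupTimeout=600s \
      --conf spark.network.timeout=600s \
      ... rest of the submit command ...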

2 Answers
  •  再見小時候
    2020-12-31 22:36

    For me it was the firewall settings in the Spark cluster that prevented the executors from connecting correctly. I couldn't figure this out promptly because the Spark UI showed all workers connected to the master, but other connections were being blocked by my firewall. After setting the following ports and allowing them through the firewall, the problem was solved. (Please note that Spark uses random ports for these settings by default.)

    spark.driver.port                    
    spark.blockManager.port              
    
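    A minimal sketch of how these could be pinned to fixed values (the port numbers 40000 and 40001 below are only examples; pick any free ports and open exactly those in the firewall):

        # spark-defaults.conf (example values)
        spark.driver.port          40000
        spark.blockManager.port    40001

        # or equivalently at submit time
        spark-submit --conf spark.driver.port=40000 \
                     --conf spark.blockManager.port=40001 ...
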
