How to solve “Can't assign requested address: Service 'sparkDriver' failed after 16 retries” when running Spark code?

Submitted by 半世苍凉 on 2021-02-07 07:49:50

Question


I am learning Spark + Scala with IntelliJ, and started with the small piece of code below:

import org.apache.spark.{SparkConf, SparkContext}

object ActionsTransformations {

  def main(args: Array[String]): Unit = {
    //Create a SparkContext to initialize Spark
    val conf = new SparkConf()
    conf.setMaster("local")
    conf.setAppName("Word Count")
    val sc = new SparkContext(conf)

    val numbersList = sc.parallelize(1.to(10000).toList)

    println(numbersList)
  }

}

When trying to run it, I get the exception below:

Exception in thread "main" java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)

Process finished with exit code 1

Can anyone suggest what to do?


Answer 1:


It seems like you're using an old version of Spark. In that case, try adding this line:

conf.set("spark.driver.bindAddress", "127.0.0.1")

If you use Spark 2.0+, the following should do the trick:

import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder()
  .appName("Word Count")
  .master("local[*]")
  .config("spark.driver.bindAddress", "127.0.0.1")
  .getOrCreate()
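
For context: spark.driver.bindAddress controls the local address the driver's services bind to, while spark.driver.host is the address advertised to executors. In local mode, pinning the bind address to 127.0.0.1 avoids the case where the machine's hostname resolves to an address Spark cannot bind.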



Answer 2:


Add SPARK_LOCAL_IP to the load-spark-env.sh file located in the spark/bin directory:

export SPARK_LOCAL_IP="127.0.0.1"




Answer 3:


I think setMaster and setAppName return a new SparkConf object, and the line conf.setMaster("local") will have no effect on the conf variable. So you should try:

val conf = new SparkConf()
    .setMaster("local[*]")
    .setAppName("Word Count")



Answer 4:


It seems like the ports Spark is trying to bind to are already in use. Did this issue start happening after you had run Spark successfully a few times? You may want to check whether those previously run Spark processes are still alive and holding onto some ports (a simple jps / ps -ef should tell you that). If so, kill those processes and try again.
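
As a quick check (a minimal sketch of my own, not part of the original answer), you can try the same bind operation that fails in the stack trace above; a BindException here confirms a local binding problem independent of Spark:

import java.net.{InetAddress, ServerSocket}

object BindCheck {
  def main(args: Array[String]): Unit = {
    // Attempt to bind a server socket on loopback; port 0 asks the OS for
    // any free port. Replace 0 with a specific port to test that port.
    val address = InetAddress.getByName("127.0.0.1")
    val socket = new ServerSocket(0, 1, address)
    println(s"Bound successfully on ${socket.getLocalSocketAddress}")
    socket.close()
  }
}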




Answer 5:


Adding the bind address worked for me:

conf.set("spark.driver.bindAddress", "127.0.0.1")




Answer 6:


This worked for me for the same error with PySpark:

from pyspark import SparkContext, SparkConf

# Pin the driver host to the loopback address
conf_spark = SparkConf().set("spark.driver.host", "127.0.0.1")
sc = SparkContext(conf=conf_spark)
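
Note that spark.driver.host has to be set on the SparkConf before the SparkContext is created; setting it on an already running context has no effect.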



Answer 7:


Sometimes the problem is related to a connected VPN or something similar. Disconnect your VPN, or any other tool that may affect your networking, and then try again.
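
To see whether this applies to you (a hypothetical diagnostic of my own, not from the original answer), you can print what your machine's hostname resolves to; with a VPN up, this is often an address that cannot be bound locally, which is exactly what the 'sparkDriver' BindException complains about:

import java.net.InetAddress

object HostCheck {
  def main(args: Array[String]): Unit = {
    // Resolve the local hostname; a VPN can change what this returns.
    val localHost = InetAddress.getLocalHost
    println(s"Hostname: ${localHost.getHostName}")
    println(s"Resolves to: ${localHost.getHostAddress}")
  }
}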



Source: https://stackoverflow.com/questions/52133731/how-to-solve-cant-assign-requested-address-service-sparkdriver-failed-after
