Error when trying to write to HDFS: Server IPC version 9 cannot communicate with client version 4

日久生厌 · 2020-12-19 02:44

I am trying to write a file to HDFS using Scala and I keep getting the following error:

Caused by: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
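For context, the failing write presumably looks something like the following minimal sketch against the Hadoop FileSystem API (the NameNode address and file path are placeholders, not taken from the question):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object HdfsWrite {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    // Placeholder NameNode address; in practice this comes from core-site.xml.
    conf.set("fs.defaultFS", "hdfs://localhost:8020")

    // The first RPC to the NameNode is where the mismatch surfaces:
    // a 1.x client speaks IPC version 4, a 2.x NameNode expects version 9.
    val fs = FileSystem.get(conf)
    val out = fs.create(new Path("/tmp/example.txt"))
    out.writeBytes("hello hdfs\n")
    out.close()
    fs.close()
  }
}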
3 Answers
  • 2020-12-19 03:06

    As the error message Server IPC version 9 cannot communicate with client version 4 says, your server runs a newer version than your client. You have to either downgrade your Hadoop cluster (most likely not an option) or upgrade your client library from 1.2.1 to a 2.x version; a quick way to check which version your application actually bundles is sketched below.
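
    To confirm which Hadoop client version actually ends up on the classpath after the change, a one-line check works; a minimal sketch using VersionInfo from hadoop-common:

    // Prints the Hadoop version bundled with the application,
    // e.g. "1.2.1" before the upgrade and the 2.x version afterwards.
    println(org.apache.hadoop.util.VersionInfo.getVersion)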

  • 2020-12-19 03:18

    I had the same problem using Hadoop 2.3 and solved it by adding the following lines to my build.sbt file:

    libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.3.0"
    
    libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.3.0"
    

    So I think in your case you can use the 2.4.0 version (see the sketch below).

    PS: It also worked on your code sample. I hope it helps.
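
    For reference, a sketch of the same two lines bumped to 2.4.0, assuming the cluster actually runs Hadoop 2.4.x:

    libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"

    libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.4.0"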

  • 2020-12-19 03:21

    Hadoop and Spark versions should be in sync. (In my case, I am working with Spark 1.2.0 and Hadoop 2.2.0.)

    STEP 1 - Go to $SPARK_HOME.

    STEP 2 - Build Spark with Maven against the Hadoop client version you want:

    mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package
    

    STEP 3 - The Spark project's build.sbt should also declare the matching Spark and Hadoop versions:

    name := "smartad-spark-songplaycount"
    
    version := "1.0"
    
    scalaVersion := "2.10.4"
    
    //libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1"
    libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.2.0"
    
    libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.2.0"
    
    libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.2.0"
    
    resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
    
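
    Once sbt package has produced the application jar, it can be submitted against the freshly built Spark. A sketch (the main class name is a placeholder for your entry point, and yarn-client is just one possible master):

    # Jar path follows from the name/version/scalaVersion in build.sbt above.
    $SPARK_HOME/bin/spark-submit \
      --class SongPlayCount \
      --master yarn-client \
      target/scala-2.10/smartad-spark-songplaycount_2.10-1.0.jar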

    References

    Building Apache Spark with mvn
