I am trying to write a file to HDFS using Scala and I keep getting the following error:
Caused by: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
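For context, the write I am attempting is roughly along these lines (a minimal sketch assuming the standard Hadoop FileSystem API; the namenode URI and output path are placeholders, not my real ones):
import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object HdfsWrite {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    // Placeholder namenode address; replace with your cluster's URI
    val fs = FileSystem.get(new URI("hdfs://localhost:9000"), conf)
    // Placeholder output path
    val out = fs.create(new Path("/tmp/example.txt"))
    out.write("hello hdfs\n".getBytes("UTF-8"))
    out.close()
    fs.close()
  }
}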
As the error message says:
Server IPC version 9 cannot communicate with client version 4
your server is running a newer Hadoop version than your client. You have to either downgrade your Hadoop cluster (most likely not an option) or upgrade your client library from 1.2.1 to a 2.x version.
I had the same problem using Hadoop 2.3, and I solved it by adding the following lines to my build.sbt file:
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.3.0"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.3.0"
So I think in your case you can use the 2.4.0 version.
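For example, assuming your cluster runs Hadoop 2.4.0 (adjust the version to whatever your cluster actually runs), the two lines would become:
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.4.0"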
PS: It also worked on your code sample. I hope it helps.
Hadoop and Spark versions should be in sync. (In my case, I am working with spark-1.2.0 and hadoop 2.2.0.)
STEP 1 - Go to $SPARK_HOME
STEP 2 - Simply build Spark with mvn against the version of the Hadoop client you want:
mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package
STEP 3 - Also, your Spark project should have the matching Spark and Hadoop versions in its build.sbt:
name := "smartad-spark-songplaycount"
version := "1.0"
scalaVersion := "2.10.4"
//libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.2.0"
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.2.0"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.2.0"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
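After changing build.sbt, rebuild the project so the new client libraries are actually picked up (a typical sbt invocation; use sbt assembly instead if you build a fat jar):
sbt clean package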
Building Apache Spark with mvn