I'm using Scala to create and run a Spark application locally.
My build.sbt:
name : \"SparkDemo\"
version : \"1.0\"
scalaVersion : \"2.10.4\"
librar
try running a simple program without the hadoop and hbase dependency
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0" excludeAll(ExclusionRule(organization = "org.eclipse.jetty"))
libraryDependencies += "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"
Try running a simple program without the Hadoop and HBase dependencies first. There is probably a mismatch between the dependencies; also make sure you use the same versions of the jars when you compile and when you run. One way to keep them in sync is to define each version once in build.sbt, as in the sketch below.
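Something like this, reusing the exact artifacts and versions from your build.sbt (the val names are just illustrative):

// Single source of truth for each version, so compile time and run time cannot drift apart.
val hadoopVersion = "2.6.0"
val hbaseVersion  = "0.98.4-hadoop2"

libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-common" % hadoopVersion excludeAll(ExclusionRule(organization = "org.eclipse.jetty")),
  "org.apache.hadoop" % "hadoop-mapreduce-client-core" % hadoopVersion,
  "org.apache.hbase" % "hbase-client" % hbaseVersion,
  "org.apache.hbase" % "hbase-server" % hbaseVersion,
  "org.apache.hbase" % "hbase-common" % hbaseVersion
)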
Also, is it possible to run the code in spark-shell to reproduce the problem? Then I will be able to help better.
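For the simple-program test, I mean something along these lines. This is only a sketch (the object name is made up), and it assumes spark-core is among your dependencies, even though the build.sbt shown above does not list it:

import org.apache.spark.{SparkConf, SparkContext}

// Trivial standalone app that touches only Spark itself, no Hadoop or HBase classes.
object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SimpleApp").setMaster("local[*]")
    val sc   = new SparkContext(conf)
    // Count the even numbers in 1..1000 to confirm the local Spark runtime works.
    val evens = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()
    println(s"Even count: $evens")
    sc.stop()
  }
}

In spark-shell a SparkContext is already available as sc, so you can paste just the parallelize line there to check Spark on its own.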