Spark application throws javax.servlet.FilterRegistration

Backend · Unresolved · 7 answers · 1081 views
别跟我提以往 · 2020-12-30 01:57

I'm using Scala to create and run a Spark application locally.

My build.sbt:

name := "SparkDemo"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += ...


        
7 Answers
  •  无人及你 · 2020-12-30 02:42

    Try running a simple program without the Hadoop and HBase dependencies first. If that works, declare them with the conflicting Jetty artifacts excluded, for example:

    libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0" excludeAll(ExclusionRule(organization = "org.eclipse.jetty"))

    libraryDependencies += "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0"

    libraryDependencies += "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2"

    libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2"

    libraryDependencies += "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"
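    If excluding Jetty from hadoop-common alone does not clear the error, the conflicting javax.servlet classes are often also pulled in transitively by the HBase artifacts. A minimal sketch of the same dependency block with the exclusions widened (the extra organizations named here, org.mortbay.jetty and javax.servlet, are an assumption based on the usual transitive dependencies of these artifacts, not something stated in the answer):

    libraryDependencies ++= Seq(
      "org.apache.hadoop" % "hadoop-common"                % "2.6.0",
      "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0",
      "org.apache.hbase"  % "hbase-client"                 % "0.98.4-hadoop2",
      "org.apache.hbase"  % "hbase-server"                 % "0.98.4-hadoop2",
      "org.apache.hbase"  % "hbase-common"                 % "0.98.4-hadoop2"
    ).map(_.excludeAll(
      ExclusionRule(organization = "org.eclipse.jetty"), // Jetty pulled in by Hadoop
      ExclusionRule(organization = "org.mortbay.jetty"), // older Jetty pulled in by HBase
      ExclusionRule(organization = "javax.servlet")      // conflicting servlet-api jar
    ))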
    

    There is probably a mismatch between the dependencies. Also make sure you use the same versions of the jars at compile time and at run time.
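    One quick way to see which jar a conflicting class is actually loaded from at run time is to ask the class for its code source. A small sketch (the class used here, javax.servlet.FilterRegistration, is the one named in the error; run this on the same classpath your application uses):

    // Prints the jar that provides javax.servlet.FilterRegistration on this classpath.
    // If it is not the servlet/Jetty jar that ships with Spark, that is the mismatch.
    println(classOf[javax.servlet.FilterRegistration]
      .getProtectionDomain.getCodeSource.getLocation)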

    Also, is it possible to reproduce this by running the code in the spark-shell? Then I will be able to help better.
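    For reference, a minimal sketch of a program that reproduces this particular error without touching HBase code at all, since the SecurityException is usually thrown when the SparkContext starts its Jetty-based web UI (the object name and the local[*] master are illustrative assumptions, not taken from the question):

    import org.apache.spark.{SparkConf, SparkContext}

    object FilterRegistrationRepro {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("FilterRegistrationRepro").setMaster("local[*]")
        // Creating the SparkContext starts the web UI; if a conflicting
        // javax.servlet jar is on the classpath, the SecurityException surfaces here.
        val sc = new SparkContext(conf)
        println(sc.parallelize(1 to 10).sum())
        sc.stop()
      }
    }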
