Apache Spark Exception in thread “main” java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class

Submitted anonymously (unverified) on 2019-12-03 02:03:01

Question:

Scala version: 2.11.7 (I had to upgrade the Scala version so that case classes could accept more than 22 parameters). Spark version: 1.6.1. Please find my pom.xml below.
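For context on that upgrade: Scala 2.10 rejected case classes with more than 22 fields at compile time, a restriction Scala 2.11 removed. A minimal illustration (not from the original question):

    // Compiles on Scala 2.11+; on 2.10 it fails with "Implementation restriction:
    // case classes cannot have more than 22 parameters."
    case class Wide(
      f1: Int, f2: Int, f3: Int, f4: Int, f5: Int, f6: Int, f7: Int, f8: Int,
      f9: Int, f10: Int, f11: Int, f12: Int, f13: Int, f14: Int, f15: Int, f16: Int,
      f17: Int, f18: Int, f19: Int, f20: Int, f21: Int, f22: Int, f23: Int
    )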

I get the error below when trying to set up Spark in the IntelliJ IDE:

16/03/16 18:36:44 INFO spark.SparkContext: Running Spark version 1.6.1
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at org.apache.spark.util.TimeStampedWeakValueHashMap.<init>(TimeStampedWeakValueHashMap.scala:42)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:298)
    at com.examples.testSparkPost$.main(testSparkPost.scala:27)
    at com.examples.testSparkPost.main(testSparkPost.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 9 more

pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>StreamProcess</groupId>
  <artifactId>StreamProcess</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>${project.artifactId}</name>
  <description>This is a boilerplate maven project to start using Spark in Scala</description>
  <inceptionYear>2010</inceptionYear>

  <properties>
    <maven.compiler.source>1.6</maven.compiler.source>
    <maven.compiler.target>1.6</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.tools.version>2.10</scala.tools.version>
    <scala.version>2.11.7</scala.version>
  </properties>

  <repositories>
    <repository>
      <id>cloudera-repo-releases</id>
      <url>https://repository.cloudera.com/artifactory/repo/</url>
    </repository>
  </repositories>

  <build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
      </plugin>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.2</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
            <configuration>
              <args>
                <arg>-dependencyfile</arg>
                <arg>${project.build.directory}/.scala_dependencies</arg>
              </args>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.4.1</version>
        <configuration>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <dependency>
      <groupId>org.mongodb.mongo-hadoop</groupId>
      <artifactId>mongo-hadoop-core</artifactId>
      <version>1.4.2</version>
      <exclusions>
        <exclusion>
          <groupId>javax.servlet</groupId>
          <artifactId>servlet-api</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.mongodb</groupId>
      <artifactId>mongodb-driver</artifactId>
      <version>3.2.2</version>
      <exclusions>
        <exclusion>
          <groupId>javax.servlet</groupId>
          <artifactId>servlet-api</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.mongodb</groupId>
      <artifactId>mongodb-driver</artifactId>
      <version>3.2.2</version>
      <exclusions>
        <exclusion>
          <groupId>javax.servlet</groupId>
          <artifactId>servlet-api</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.10</artifactId>
      <version>1.6.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.6.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>1.6.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.6.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-auth</artifactId>
      <version>2.6.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.6.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-core</artifactId>
      <version>1.2.1</version>
    </dependency>
  </dependencies>
</project>

I'd like to know what needs to be changed in the pom to get things going.

Answer 1:

In the POM you have Scala version 2.11.7, but further down in the dependencies you declare Spark artifacts compiled against Scala 2.10:

    spark-streaming_2.10
    spark-core_2.10
    spark-sql_2.10

You have to change them to:

    spark-streaming_2.11
    spark-core_2.11
    spark-sql_2.11
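To keep the three artifacts from drifting out of sync again, one option is to drive the suffix from the Scala binary-version property the pom already declares. A minimal sketch, assuming scala.tools.version is bumped from 2.10 to 2.11:

    <properties>
      <scala.tools.version>2.11</scala.tools.version>
      <scala.version>2.11.7</scala.version>
    </properties>

    <!-- In the dependencies section, let the property pick the binary suffix. -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${scala.tools.version}</artifactId>
      <version>1.6.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_${scala.tools.version}</artifactId>
      <version>1.6.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_${scala.tools.version}</artifactId>
      <version>1.6.1</version>
    </dependency>

After editing the pom, re-import the Maven project in IntelliJ so the stale 2.10 jars are removed from the run configuration's classpath.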


Answer 2:

For Scala 2.12.0, add the following dependency to your pom.xml file, and that's it:

    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.12.0-M1</version>
    </dependency>
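Whichever Scala version you settle on, the _2.10/_2.11/_2.12 suffix of every Spark artifact must match the major version of the scala-library jar on the classpath (and note that Spark 1.6.x was only published for Scala 2.10 and 2.11). A quick runtime check, sketched here rather than taken from the original answers:

    // Prints the Scala library version actually loaded at runtime,
    // e.g. "version 2.11.7". If this disagrees with the _2.x suffix of
    // your Spark artifacts, a NoClassDefFoundError for scala.* classes
    // (such as GenTraversableOnce$class) is the typical symptom.
    object ScalaVersionCheck {
      def main(args: Array[String]): Unit =
        println(scala.util.Properties.versionString)
    }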

