Spark - Scala init error on build

Submitted by 陌路散爱 on 2019-12-11 00:31:34

Question


I'm trying to use Spark in my Scala application.

This is the Spark dependency I'm using:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.0.0</version>
</dependency>

Then, in my code:

import org.apache.spark.SparkConf
val sparkConf = new SparkConf()

There is no error in my Eclipse IDE, but the build (mvn package exec:java) fails with the following error:

error: class file needed by SparkConf is missing.
[ERROR] reference type Cloneable of package scala refers to nonexisting symbol.
[ERROR] val sparkConf = new SparkConf()
[ERROR]                     ^
[ERROR] one error found

How can I handle this?


Answer 1:


As @massag mentioned, it was a Scala version mismatch:

spark-core_2.10 is built against Scala 2.10.x, so the project itself must compile with Scala 2.10.x as well. Compiling against a different Scala major version (e.g. 2.11) leaves standard-library classes such as scala.Cloneable unresolvable at that binary version, which is what the "refers to nonexisting symbol" error is reporting.
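
A minimal sketch of the relevant pom.xml sections with the versions aligned. The scala.version property name and the 2.10.4 patch release are illustrative; any Scala 2.10.x release matching the _2.10 artifact suffix should work:

    <properties>
      <!-- Keep this in sync with the _2.10 suffix of spark-core -->
      <scala.version>2.10.4</scala.version>
    </properties>

    <dependencies>
      <!-- Scala standard library at the same major version as Spark -->
      <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.0.0</version>
      </dependency>
    </dependencies>

If you use the scala-maven-plugin (or maven-scala-plugin) to compile, make sure it is also configured to use the same 2.10.x compiler version; a mismatch there reproduces the same error even with a correct dependency list.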



Source: https://stackoverflow.com/questions/24303314/spark-scala-init-error-on-build
