Question
I'm trying to use Spark in my Scala application.
This is the Spark dependency I'm using:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.0.0</version>
</dependency>
Then in my code
import org.apache.spark.SparkConf
val sparkConf = new SparkConf()
There is no error in my Eclipse IDE, but the Maven build (mvn package exec:java) fails with the following error:
error: class file needed by SparkConf is missing.
[ERROR] reference type Cloneable of package scala refers to nonexisting symbol.
[ERROR] val sparkConf = new SparkConf()
[ERROR] ^
[ERROR] one error found
How can I fix this?
Answer 1:
As @massag mentioned, this is a Scala version mismatch:
spark-core_2.10 is built against Scala 2.10.x, so the project itself must also be compiled with Scala 2.10.x (the "reference type Cloneable of package scala refers to nonexisting symbol" error is typical when the compiler's Scala version does not match the binary version of a dependency).
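As a minimal sketch of how the POM could be aligned (the exact Scala patch version, 2.10.4, and the properties layout are assumptions, not taken from the original post), make the scala-library version and the Spark artifact suffix agree:

<properties>
  <!-- assumed Scala version; any 2.10.x release matches spark-core_2.10 -->
  <scala.version>2.10.4</scala.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <!-- the _2.10 suffix must match the Scala version above -->
    <artifactId>spark-core_2.10</artifactId>
    <version>1.0.0</version>
  </dependency>
</dependencies>

Alternatively, if the project has to stay on a different Scala version, choose the spark-core artifact whose suffix matches it (for example spark-core_2.11, published for later Spark releases).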
Source: https://stackoverflow.com/questions/24303314/spark-scala-init-error-on-build