spark build path is cross-compiled with an incompatible version of Scala (2.10.0)

渐次进展 · 2020-12-29 08:57

When I try to execute Spark SQL code in Scala IDE I get the error below. Could anyone help me sort this out, please?

    spark build path is cross-compiled with an incompatible version of Scala (2.10.0)

4 Answers
  •  南笙 (OP)
     2020-12-29 09:25

    In your project you are mixing jars built against different Scala versions; indeed, from the log:

    • Scala IDE uses Scala 2.11.7
    • Apache Spark 1.5.2 built with Scala 2.10

    You need to align the jar versions. The Spark 1.5.2 pre-built distribution (downloadable from here) is built with Scala 2.10 for compatibility reasons (see here). The website carries this note:

    Note: Scala 2.11 users should download the Spark source package and build with Scala 2.11 support.
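    If you would rather stay on Scala 2.11, the Spark 1.5 "Building Spark" documentation describes roughly this procedure (a sketch; the Maven profile flags such as `-Pyarn`/`-Phadoop-2.4` depend on your cluster setup):

        # Switch the build to Scala 2.11, then build from the Spark source package
        ./dev/change-scala-version.sh 2.11
        mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package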

    To solve the issue, use in your project the same Scala version that was used to build Spark.

    I suggest switching to Scala 2.10 in your Scala IDE; that will solve the issue.
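    To double-check which Scala library your project actually runs against, a minimal sketch using the standard library's `scala.util.Properties` (the object name `VersionCheck` is just an example):

        // Print the Scala library version found on the runtime classpath.
        // If this prints 2.11.x while your Spark jars are *_2.10, you have the mismatch above.
        object VersionCheck extends App {
          println(s"Scala library version: ${scala.util.Properties.versionNumberString}")
        }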

    Eclipse + Maven

    In my case, I'm using Eclipse with Scala IDE and Maven so I updated the Maven dependencies in this way:

        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.10.6</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.5.2</version>
        </dependency>
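    For readers using sbt rather than Maven, the equivalent alignment would look roughly like this (a sketch of a `build.sbt` fragment; the `%%` operator appends the Scala binary suffix, e.g. `_2.10`, automatically):

        // build.sbt (sketch): keep scalaVersion and the Spark artifact suffix in sync
        scalaVersion := "2.10.6"

        // %% resolves to spark-core_2.10 because scalaVersion is 2.10.x
        libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2"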

    Then I changed the Scala version in the IDE: right-click the project -> Scala -> Set Scala Installation, or right-click the Scala Library Container -> Properties, and choose Scala 2.10.
