ClassNotFoundException scala.runtime.LambdaDeserialize when spark-submit

南笙 2020-12-19 00:07

I followed the Scala tutorial at https://spark.apache.org/docs/2.1.0/quick-start.html

My Scala file:

    /* SimpleApp.scala */
    import org.apache.spark.sql.SparkSession


        
4 Answers
  •  粉色の甜心
    2020-12-19 00:55

    There is definitely some confusion regarding the Scala version to be used for Spark 2.4.3. As of today (Nov 25, 2019) the doc home page for spark 2.4.3 states:

    Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.3 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x).

    Note that support for Java 7, Python 2.6 and old Hadoop versions before 2.6.5 were removed as of Spark 2.2.0. Support for Scala 2.10 was removed as of 2.3.0. Support for Scala 2.11 is deprecated as of Spark 2.4.1 and will be removed in Spark 3.0.

    Accordingly, the Scala version is supposed to be 2.12.
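
    In practice that means the project's build definition must pin Scala 2.12 so that sbt pulls the `_2.12` Spark artifacts. A minimal `build.sbt` sketch (the exact patch versions here are illustrative):

    ```scala
    // build.sbt -- minimal sketch; patch versions are illustrative
    name := "SimpleApp"
    version := "1.0"

    // Must match the Scala line Spark 2.4.3 was built with (2.12.x).
    scalaVersion := "2.12.8"

    // %% appends the Scala binary suffix (_2.12) to the artifact name;
    // "provided" because spark-submit supplies Spark on the classpath.
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.3" % "provided"
    ```

    Compiling with one Scala line and running against a Spark build for the other is what surfaces `ClassNotFoundException: scala.runtime.LambdaDeserialize`, since that runtime class only exists in the Scala 2.12 library.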
