SparkSQL MissingRequirementError when registering table

礼貌的吻别 2021-01-05 02:34

I'm a newbie to Scala and Apache Spark and I'm trying to use Spark SQL. After cloning the repo I started the Spark shell by typing bin/spark-shell and ran the
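
The question is cut off above, but for context, registering a table in the Spark 1.2 shell typically follows a pattern like the sketch below (Person and people.txt are placeholder names, not the asker's actual code); the MissingRequirementError is usually reported at the registerTempTable step.

    // Sketch of a typical Spark 1.2 spark-shell session; sc is provided by the shell
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.createSchemaRDD   // implicit RDD -> SchemaRDD conversion in Spark 1.2

    case class Person(name: String, age: Int)   // placeholder schema

    val people = sc.textFile("people.txt").     // placeholder input path
      map(_.split(",")).
      map(p => Person(p(0), p(1).trim.toInt))

    // The step where scala.reflect.internal.MissingRequirementError is typically thrown
    people.registerTempTable("people")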

2 Answers
  • 2021-01-05 02:48

    Try adding "org.apache.spark" % "spark-catalyst_2.10" % "1.2.0" to your libraryDependencies (although I feel this should already be pulled in transitively); a sketch of the resulting build file is shown below.

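    A sketch of what the build.sbt might look like with that line added (the project name, Scala patch version, and the spark-core/spark-sql entries are assumptions, not part of the answer):

        // build.sbt sketch -- name and companion dependencies are assumptions
        name := "spark-sql-example"

        scalaVersion := "2.10.4"

        libraryDependencies ++= Seq(
          "org.apache.spark" %% "spark-core" % "1.2.0",
          "org.apache.spark" %% "spark-sql"  % "1.2.0",
          // explicit catalyst dependency suggested in the answer above
          "org.apache.spark" % "spark-catalyst_2.10" % "1.2.0"
        )
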
  • 2021-01-05 03:00

    This problem can be fixed by adding fork := true to the sbt project settings; a sketch of the setting is shown after the links below.

    See: http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-1-2-0-MissingRequirementError-td10123.html

    Other useful settings can be found in the referenced project file:

    https://github.com/deanwampler/spark-workshop/blob/master/project/Build.scala

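    A sketch of how that setting might look in build.sbt (the javaOptions line is an extra assumption, not mentioned in the linked thread):

        // build.sbt sketch -- fork a separate JVM for run/test, as the linked thread recommends
        fork := true

        // Assumption: give the forked JVM extra heap; only takes effect when fork := true
        javaOptions ++= Seq("-Xmx2g")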