sbt unresolved dependency for spark-cassandra-connector 2.0.2

Submitted by 拥有回忆 on 2019-12-02 14:59:52

Question


build.sbt:

val sparkVersion = "2.1.1";

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided";

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2";
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion;

output:

[error] (myproject/*:update) sbt.ResolveException: unresolved dependency: com.datastax.spark#spark-cassandra-connector;2.0.2: not found

Any ideas? I am new to sbt and Spark. Thanks.


Answer 1:


This is caused by "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2" missing the Scala version suffix in the artifact ID. The connector is published as spark-cassandra-connector_2.11; see the Maven repo:

http://search.maven.org/#artifactdetails%7Ccom.datastax.spark%7Cspark-cassandra-connector_2.11%7C2.0.2%7Cjar

There are two solutions:

  1. "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2" — explicitly set the Scala version in the artifact ID.
  2. "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2" — use %% with the artifact ID; sbt then appends your project's Scala binary version automatically, expanding to solution 1.
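With the second fix applied, a minimal sketch of the corrected build.sbt (assuming the project's scalaVersion is 2.11.x, which matches the connector's published _2.11 artifacts):

```scala
// build.sbt (sketch; assumes Scala 2.11.x so %% resolves to *_2.11 artifacts)
scalaVersion := "2.11.11"

val sparkVersion = "2.1.1"

libraryDependencies += "org.apache.spark" %% "spark-core"      % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql"       % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"

// %% appends the Scala binary version, so this resolves spark-cassandra-connector_2.11
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"

// This Kafka artifact already hard-codes _2.11, so a single % is correct here
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion
```

Note that mixing the two styles on the same artifact (e.g. %% together with an _2.11 suffix) would double the suffix and fail to resolve, so pick one style per dependency.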


Source: https://stackoverflow.com/questions/44461789/sbt-unresolved-dependency-for-spark-cassandra-connector-2-0-2
