spark sbt compile error libraryDependencies

Submitted by 冷暖自知 on 2019-12-02 09:05:01

Question


My Spark version is 1.2.0-bin-hadoop2.4 and my Scala version is 2.11.7. I am getting an error, so I can't use sbt.

~/sparksample$ sbt

Starting sbt: invoke with -help for other options
[info] Set current project to Spark Sample (in build file:/home/beyhan/sparksample/)

> sbt compile

[info] Updating {file:/home/beyhan/sparksample/}default-f390c8...
[info] Resolving org.scala-lang#scala-library;2.11.7 ...
[info] Resolving org.apache.spark#spark-core_2.11.7;1.2.0 ...
[warn] module not found: org.apache.spark#spark-core_2.11.7;1.2.0
[warn] ==== local: tried
[warn]   /home/beyhan/.ivy2/local/org.apache.spark/spark-core_2.11.7/1.2.0/ivys/ivy.xml
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11.7/1.2.0/spark-core_2.11.7-1.2.0.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.11.7;1.2.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/beyhan/sparksample/}default-f390c8/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11.7;1.2.0: not found
[error] Total time: 2 s, completed Oct 15, 2015 11:30:47 AM

Any suggestions? Thanks


Answer 1:


There is no spark-core_2.11.7 jar file. You have to drop the maintenance version number .7 from the Spark dependency, because the published artifact is spark-core_2.11. All Scala versions in the 2.11 line are binary compatible, so a single _2.11 artifact serves them all.

Update

A minimal sbt build file could look like this:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
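Note that the double `%%` tells sbt to append the Scala *binary* version (2.11, not 2.11.7) to the artifact name, which is exactly the mistake the error log shows. A sketch of the two equivalent ways to declare the dependency, assuming the scalaVersion setting above:

```scala
// With scalaVersion := "2.11.7", these two declarations resolve to the
// same artifact, spark-core_2.11 — %% appends the binary version suffix:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.1"
```

Preferring `%%` means the suffix stays correct if you later change scalaVersion within a binary-compatible line.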



Answer 2:


As @Till Rohrmann suggested, there is no such thing as spark-core_2.11.7, yet your build.sbt appears to reference that library.

I suggest you edit the file /home/beyhan/sparksample/build.sbt and remove the reference to that library.

The correct reference is:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"

Remember that spark-core is not the only artifact without a _2.11.7 variant: any other Spark libraries you might be using are likewise published only with the _2.11 suffix.
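If the build pulls in several Spark modules, the same rule applies to each of them. A minimal sketch, assuming spark-sql is one of the extra modules (substitute whichever modules your project actually uses):

```scala
// Every Spark module must use the Scala *binary* version suffix (_2.11),
// never the full version (_2.11.7); %% appends the suffix automatically.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.0",
  "org.apache.spark" %% "spark-sql"  % "1.2.0"  // hypothetical extra module
)
```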





Source: https://stackoverflow.com/questions/33143665/spark-sbt-compile-error-librarydependencies
