Question
I want to override a dependency on a project for a certain task. I have an sbt multi-project build that uses Spark.
lazy val core = // Some Project

val sparkLibs = Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1"
)

val sparkLibsProvided = Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1" % "provided"
)
lazy val main = Project(
  id = "main",
  base = file("main-project"),
  settings = sharedSettings
).settings(
  name := "main",
  libraryDependencies ++= sparkLibs,
  dependencyOverrides ++= Set(
    "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"
  )
).dependsOn(core)
When I build a fat jar to submit to my YARN cluster, I use the assembly task from https://github.com/sbt/sbt-assembly. But in that case I want to use sparkLibsProvided instead of sparkLibs,
something like:
lazy val sparkProvided = (project in assembly).settings(
  dependencyOverrides ++= sparkLibsProvided.toSet
)
How can I properly override this dependency?
Answer 1:
You can create a new project dedicated to building your Spark uber jar, with the Spark dependencies marked as provided:
lazy val sparkUberJar = (project in file("spark-project"))
  .settings(sharedSettings: _*)
  .settings(
    libraryDependencies ++= sparkLibsProvided,
    dependencyOverrides ++= Set(
      "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"
    )
  )
And when you assemble, switch to that project first in the sbt shell:

project sparkUberJar
assembly
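If you prefer a single invocation instead of switching projects interactively, the task can also be scoped to the subproject on the command line (a usage sketch, assuming sbt 1.x and the project id sparkUberJar from above):

sbt "project sparkUberJar" assembly

or, with the slash syntax,

sbt sparkUberJar/assembly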
Answer 2:
This can also be achieved easily by using the key sbt-assembly provides specifically for this purpose:
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp filter {
    _.data.getName == "spark-core-1.6.1.jar"
  }
}
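Note that filtering on the exact file name only drops spark-core itself and leaves its transitive dependencies in the fat jar. A sketch that widens the filter to every jar whose name starts with "spark-" (the prefix is an assumption, adjust it to whatever actually lands on your classpath):

assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  // Exclude every jar on the assembly classpath whose file name starts
  // with "spark-"; the prefix is an assumption, not from the original answer.
  cp filter { _.data.getName.startsWith("spark-") }
}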
This approach is considered hacky, however, and it would be better to split your configuration into subprojects, as the official documentation also warns:
If you need to tell sbt-assembly to ignore JARs, you're probably doing it wrong. assembly task grabs deps JARs from your project's classpath. Try fixing the classpath first.
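"Fixing the classpath" here usually means marking the Spark dependencies as provided in the project you assemble, and re-adding them only where you still need them, for example for sbt run. A minimal sketch of that common pattern, following the setting shown in the sbt-assembly README (treat it as an assumption for your particular build):

libraryDependencies ++= sparkLibsProvided

// Keep "provided" dependencies on the classpath when running from sbt,
// even though they are excluded from the assembled jar.
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated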
Source: https://stackoverflow.com/questions/37523745/how-to-override-dependency-on-certain-task-in-sbt