How to add “provided” dependencies back to run/test tasks' classpath?

Posted on 2019-11-26 21:43:49

For a similar case, I used the following in assembly.sbt:

run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)) 

and now the 'run' task uses all the libraries, including the ones marked with "provided". No further change was necessary.
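For context, here is a minimal build.sbt sketch of the setup this answer assumes: a dependency marked "provided" (the Spark artifact and version below are only illustrative) plus the run-task override so that `sbt run` still sees it.

```scala
// build.sbt -- minimal sketch; the Spark artifact/version are illustrative
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.8" % "provided"

// Rebuild the run task against the full compile classpath,
// which (unlike the runtime classpath) still contains "provided" deps
run in Compile <<= Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
)
```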

Update:

@rob's solution seems to be the only one that works on the latest SBT version; just add the following to the settings in build.sbt:

run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in(Compile, run)).evaluated

Adding to @douglaz' answer,

runMain in Compile <<= Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run))

is the corresponding fix for the runMain task.
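On sbt 1.x, where `<<=` and the `in` scoping syntax are deprecated, the same two overrides can be written with slash syntax — a sketch of the equivalent settings:

```scala
// build.sbt -- sbt 1.x slash-syntax equivalent of the run/runMain overrides
Compile / run := Defaults.runTask(
  Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated

Compile / runMain := Defaults.runMainTask(
  Compile / fullClasspath,
  Compile / run / runner
).evaluated
```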

If you use sbt-revolver plugin, here is a solution for its "reStart" task:

fullClasspath in Revolver.reStart <<= fullClasspath in Compile

Update: for sbt 1.0 you can use the new assignment form:

fullClasspath in Revolver.reStart := (fullClasspath in Compile).value

Another option is to create separate sbt projects for assembly vs. run/test. This lets you run sbt assemblyProj/assembly to build a fat jar for deploying with spark-submit, as well as sbt runTestProj/run to run directly via sbt with Spark embedded. As added benefits, runTestProj works without modification in IntelliJ, and a separate main class can be defined for each project, e.g. to specify the Spark master in code when running with sbt.

val sparkDep = "org.apache.spark" %% "spark-core" % sparkVersion

val commonSettings = Seq(
  name := "Project",
  libraryDependencies ++= Seq(...) // Common deps
)

// Project for running via spark-submit
lazy val assemblyProj = (project in file("proj-dir"))
  .settings(
    commonSettings,
    assembly / mainClass := Some("com.example.Main"),
    libraryDependencies += sparkDep % "provided"
  )

// Project for running via sbt with embedded spark
lazy val runTestProj = (project in file("proj-dir"))
  .settings(
    // Projects' target dirs can't overlap
    target := target.value.toPath.resolveSibling("target-runtest").toFile,
    commonSettings,
    // If separate main file needed, e.g. for specifying spark master in code
    Compile / run / mainClass := Some("com.example.RunMain"),
    libraryDependencies += sparkDep
  )