How to add “provided” dependencies back to run/test tasks' classpath?

Asked by 说谎 on 2020-11-28 22:22

Here's an example build.sbt:

import AssemblyKeys._

assemblySettings

buildInfoSettings

net.virtualvoid.sbt.graph.Plugin.graphSettings

name := ...

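For context: sbt's `run` task uses the runtime classpath, which excludes "provided" dependencies, while the compile-scoped classpath still contains them. A minimal sketch of the standard override for this (sbt 1.x slash syntax; the snippet above uses older 0.13-style settings):

    // Run with the compile classpath so "provided" deps are visible again
    Compile / run := Defaults.runTask(
      Compile / fullClasspath,
      Compile / run / mainClass,
      Compile / run / runner
    ).evaluated
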
4 Answers
  •  误落风尘 · 2020-11-28 23:08

    Another option is to create separate sbt projects for the assembly vs. run/test tasks. This lets you run `sbt assemblyProj/assembly` to build a fat jar for deploying with spark-submit, and `sbt runTestProj/run` to run directly via sbt with Spark embedded. As added benefits, runTestProj works without modification in IntelliJ, and each project can define its own main class, e.g. to set the Spark master in code when running with sbt (a sketch of such a main class follows the build definition below).

    // Example version value; pin this to the Spark release you deploy against
    val sparkVersion = "3.3.0"
    val sparkDep = "org.apache.spark" %% "spark-core" % sparkVersion
    
    val commonSettings = Seq(
      name := "Project",
      libraryDependencies ++= Seq(...) // Common deps
    )
    
    // Project for running via spark-submit
    lazy val assemblyProj = (project in file("proj-dir"))
      .settings(
        commonSettings,
        assembly / mainClass := Some("com.example.Main"),
        libraryDependencies += sparkDep % "provided"
      )
    
    // Project for running via sbt with embedded spark
    lazy val runTestProj = (project in file("proj-dir"))
      .settings(
        // Projects' target dirs can't overlap
        target := target.value.toPath.resolveSibling("target-runtest").toFile,
        commonSettings,
        // If separate main file needed, e.g. for specifying spark master in code
        Compile / run / mainClass := Some("com.example.RunMain"),
        libraryDependencies += sparkDep
      )
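
    A minimal sketch of what the separate main class could look like (the name com.example.RunMain matches the setting above; the body is illustrative, assuming spark-core as in sparkDep). It sets a local master in code, which is what lets `sbt runTestProj/run` work without spark-submit, while com.example.Main leaves the master to be supplied by spark-submit:

    package com.example
    
    import org.apache.spark.{SparkConf, SparkContext}
    
    object RunMain {
      def main(args: Array[String]): Unit = {
        // Setting the master here is what makes `sbt runTestProj/run`
        // (and running in IntelliJ) work without spark-submit
        val conf = new SparkConf()
          .setAppName("Project")
          .setMaster("local[*]")
        val sc = new SparkContext(conf)
        try {
          // ... shared application logic, same as in com.example.Main ...
          println(s"Running with master ${sc.master}")
        } finally {
          sc.stop()
        }
      }
    }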
    
