How to work efficiently with SBT, Spark and “provided” dependencies?

野性不改 2021-01-31 02:47

I'm building an Apache Spark application in Scala and I'm using SBT to build it. Here is the thing:

  1. when I'm developing under IntelliJ IDEA, I want the Spark dependencies to be included in the classpath, so I can launch the application directly from the IDE;
  2. when I package the application for deployment, the Spark dependencies should be in "provided" scope, excluded from the JAR, because the cluster supplies them at runtime.
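For reference, the natural way to express the packaging requirement alone is to mark the dependency as "provided"; a minimal sketch, where the artifact and the version number are placeholders:

    val sparkVersion = "2.4.8" // placeholder
    libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"

But "provided" dependencies are also dropped from the runtime classpath (both for sbt run and, by default, for IntelliJ run configurations), which is exactly what breaks running the application from the IDE.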
8 Answers
  •  暗喜 (OP)
     2021-01-31 03:38

    A solution based on creating another subproject for running the project locally is described here.

    Basically, you would need to modify your build.sbt file as follows:

    // Spark version shared across the build; the value here is a placeholder
    val sparkVersion = "2.4.8"

    lazy val sparkDependencies = Seq(
      "org.apache.spark" %% "spark-streaming" % sparkVersion
    )

    // Root project: "provided" keeps Spark out of the packaged JAR
    libraryDependencies ++= sparkDependencies.map(_ % "provided")

    // Subproject that wraps the root project and re-adds Spark in "compile"
    // scope, giving the IDE a full classpath to run against
    lazy val localRunner = project.in(file("mainRunner")).dependsOn(RootProject(file("."))).settings(
      libraryDependencies ++= sparkDependencies.map(_ % "compile")
    )
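    With this split, packaging is unaffected: a fat JAR built from the root project still excludes Spark, because there the dependencies remain in "provided" scope; only the mainRunner subproject compiles them in.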
    

    Then run the new subproject locally by selecting "Use classpath of module: localRunner" in the IntelliJ Run Configuration.
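    For completeness, a minimal sketch of a main class that this setup would run from the IDE; the object name, the socket source, and the batch interval are placeholders, and setMaster("local[*]") keeps the job on the local machine:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object Main {
      def main(args: Array[String]): Unit = {
        // local[*] runs Spark inside the IDE process; no cluster needed
        val conf = new SparkConf().setAppName("example").setMaster("local[*]")
        val ssc = new StreamingContext(conf, Seconds(1))

        // Placeholder job: read lines from a local socket and print them
        val lines = ssc.socketTextStream("localhost", 9999)
        lines.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }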
