I'm building an Apache Spark application in Scala, and I'm using SBT to build it. Here is the thing:
A solution based on creating another subproject for running the project locally is described here.
Basically, you would need to modify the build.sbt file with the following:
// sparkVersion is assumed to be defined elsewhere in the build
lazy val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-streaming" % sparkVersion
)

// Mark Spark as "provided" in the main project so it is excluded
// from the assembled jar (the cluster supplies it at runtime)
libraryDependencies ++= sparkDependencies.map(_ % "provided")

// Separate subproject that re-adds Spark at "compile" scope,
// so it is on the classpath when running locally from the IDE
lazy val localRunner = project.in(file("mainRunner"))
  .dependsOn(RootProject(file(".")))
  .settings(
    libraryDependencies ++= sparkDependencies.map(_ % "compile")
  )
And then run the new subproject locally by selecting Use classpath of module: localRunner in the Run Configuration.
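To have something to run from that configuration, the mainRunner subproject needs its own entry point. Below is a minimal sketch of such a main object (the name LocalRunner and the app details are my own assumptions, not from the original setup); it starts Spark Streaming with a local master so everything runs inside the IDE's JVM:

```scala
// Hypothetical entry point, e.g. mainRunner/src/main/scala/LocalRunner.scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object LocalRunner {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("local-dev")
      .setMaster("local[*]") // use all local cores instead of a cluster

    // 1-second micro-batches; adjust to taste
    val ssc = new StreamingContext(conf, Seconds(1))

    // ... wire up your streaming job here, e.g. ssc.socketTextStream(...) ...

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Because the Spark dependencies are at "compile" scope only in localRunner, this runs fine locally while the main project's assembly still treats them as "provided". You can also launch it without the IDE via sbt localRunner/run.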