I'm using Spark (via the Java API) and require a single jar that can be pushed to the cluster, however the jar itself should not include Spark. The app that deploys the jobs should, of course, include Spark.
For beginners like me: simply mark the Spark dependencies as % Provided to exclude them from the uber-jar:
```scala
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0" % Provided
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.4.0" % Provided
```
in build.sbt.
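If you're producing the uber-jar with sbt-assembly, a minimal build could look like the sketch below. The project name, Scala version, and sbt-assembly plugin version are illustrative assumptions, not part of the original answer; the key point is that Provided dependencies are kept on the compile classpath but left out of the assembled jar.

```scala
// build.sbt — minimal sketch; name and versions are illustrative assumptions
name := "my-spark-job"      // hypothetical project name
version := "0.1.0"
scalaVersion := "2.10.5"    // a Scala version compatible with Spark 1.4.0

// Spark is marked Provided so sbt-assembly leaves it out of the uber-jar;
// the cluster supplies these classes at runtime.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.0" % Provided,
  "org.apache.spark" %% "spark-sql"  % "1.4.0" % Provided
)
```

```scala
// project/plugins.sbt — enables the `assembly` task (plugin version is an assumption)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
```

Then run `sbt assembly` and submit the resulting jar with spark-submit; the cluster provides the Spark classes that were marked Provided.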