sbt-assembly

spark + sbt-assembly: “deduplicate: different file contents found in the following”

Posted by 感情迁移 on 2019-11-28 10:05:07
I run a Spark application and want to pack the test classes into the fat jar. What is weird is that "sbt assembly" succeeds, but "sbt test:assembly" fails. I tried the approach from "sbt-assembly : including test classes", but it didn't work in my case. SBT version: 0.13.8. build.sbt: import sbtassembly.AssemblyPlugin._ name := "assembly-test" version := "1.0" scalaVersion := "2.10.5" libraryDependencies ++= Seq( ("org.apache.spark" % "spark-core_2.10" % "1.3.1" % Provided) .exclude("org.mortbay.jetty", "servlet-api"). exclude("commons-beanutils", "commons-beanutils-core"). exclude("commons-collections
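A minimal sketch of one common fix, assuming sbt 0.13.x and an sbt-assembly version that exposes baseAssemblySettings (key and import names vary across plugin versions): scope the assembly settings into the Test configuration and give test:assembly a merge strategy of its own, since it resolves additional artifacts that can collide.

```scala
// build.sbt — hypothetical sketch for sbt 0.13.x; adjust names to your
// sbt-assembly version.
import sbtassembly.AssemblyPlugin._
import sbtassembly.{MergeStrategy, PathList}

// Make test:assembly available by loading the assembly settings into Test.
Project.inConfig(Test)(baseAssemblySettings)

// Resolve "deduplicate: different file contents" by discarding duplicate
// META-INF metadata and keeping the first copy of everything else.
assemblyMergeStrategy in (Test, assembly) := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}
```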

Proper way to make a Spark Fat Jar using SBT

Posted by 时光毁灭记忆、已成空白 on 2019-11-28 08:11:30
Question: I need a Fat Jar with Spark because I'm creating a custom node for Knime. Basically it's a self-contained jar executed inside Knime, and I assume a Fat Jar is the only way to spawn a local Spark job. Eventually we will move on to submitting jobs to a remote cluster, but for now I need it to spawn this way. That said, I made a Fat Jar using this: https://github.com/sbt/sbt-assembly I made an empty sbt project, included Spark-core in the dependencies and assembled the jar. I added it to the manifest
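For a jar that must bundle Spark itself (unlike the usual spark-submit setup where Spark is marked Provided), a minimal build.sbt sketch might look like the following — the project name and versions are illustrative, not from the original post:

```scala
// build.sbt — illustrative sketch; assumes sbt-assembly in project/plugins.sbt
name := "knime-spark-node"   // hypothetical project name
scalaVersion := "2.10.5"

// Spark is bundled, NOT "provided", because the jar must run standalone
// inside Knime rather than on a Spark cluster.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

// Spark's transitive dependencies collide on metadata files, so a merge
// strategy is almost always required when bundling Spark.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}
```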

SBT Assembly - Deduplicate error & Exclude error

Posted by 你离开我真会死。 on 2019-11-28 07:50:35
Question: Hey guys, I am trying to build a JAR with dependencies using sbt-assembly, but I keep running into this error again and again. I have tried multiple different things but always end up here. I am pretty new to SBT and wanted to get some help on this one. Here are the build.sbt and assembly.sbt files. build.sbt: seq(assemblySettings: _*) name := "StreamTest" version := "1.0" scalaVersion := "2.10.4" libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" libraryDependencies += "org.apache
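Two levers help with deduplicate errors: a merge strategy for colliding files, or excluding an offending jar entirely. A sketch of the exclusion approach, assuming sbt-assembly 0.14.x key names; the jar name shown is a placeholder, not from the original question:

```scala
// build.sbt — exclude a specific jar from the assembly instead of merging it.
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  // Match whichever jar the deduplicate error reports as conflicting;
  // this name is purely illustrative.
  cp.filter(_.data.getName == "servlet-api-2.5-20110124.jar")
}
```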

sbt-assembly : including test classes

Posted by 被刻印的时光 ゝ on 2019-11-28 02:50:45
Question: As part of sbt-assembly I want to include both src and test class files in the jar. sbt-assembly includes only src files with dependencies. Is there any way I can also include the test classes in the same jar? Answer 1: I wrote sbt-assembly so that the settings can be loaded into configurations other than the default Runtime. Put the following in assembly.sbt and it should add a test:assembly task: import AssemblyKeys._ Project.inConfig(Test)(baseAssemblySettings) jarName in (Test, assembly) := s
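The answer's recipe, expanded into a complete assembly.sbt sketch (sbt 0.13-era syntax; the jar-naming scheme shown is illustrative, since the original snippet is truncated):

```scala
// assembly.sbt — sketch based on the answer above (older sbt-assembly syntax)
import AssemblyKeys._

// Load the base assembly settings into the Test configuration so that
// `sbt test:assembly` exists and also packages compiled test classes.
Project.inConfig(Test)(baseAssemblySettings)

// Name the test assembly distinctly (hypothetical naming scheme).
jarName in (Test, assembly) := name.value + "-test-assembly.jar"
```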

Standalone deployment of Scalatra servlet

Posted by 跟風遠走 on 2019-11-27 21:49:32
Question: I implemented a Scalatra servlet and now want to create an executable jar, just as described in this tutorial: http://www.scalatra.org/2.2/guides/deployment/standalone.html I use IntelliJ IDEA with the Scala plugin for development and sbt to build and run my servlet (I used sbt-idea to generate the project files). My problem is that the jetty packages that the JettyLauncher in the tutorial uses cannot be found when I try to compile my project. UPDATE: Using Matt's answer I was able to
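The tutorial's JettyLauncher needs embedded Jetty on the compile classpath, which a webapp-only Scalatra build typically lacks. A sketch of the missing dependencies — the versions here are guesses appropriate to the Scalatra 2.2 era, so check the tutorial for the exact ones:

```scala
// build.sbt — embedded Jetty for a standalone Scalatra servlet
libraryDependencies ++= Seq(
  // jetty-webapp pulls in jetty-server and the classes JettyLauncher uses
  "org.eclipse.jetty" % "jetty-webapp" % "8.1.8.v20121106" % "compile",
  // servlet API, needed at compile time
  "org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016" % "compile"
)
```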

What are key differences between sbt-pack and sbt-assembly?

Posted by 本秂侑毒 on 2019-11-27 09:51:09
Question: I've just stumbled upon the sbt-pack plugin. Its development seems steady. That surprised me, as I believed the only plugin for (quoting sbt-pack's headline) "creating distributable Scala packages" was sbt-assembly (among its other features). What are the key differences between the plugins? When should I use one over the other? Answer 1: (Disclaimer: I maintain sbt-assembly) sbt-assembly creates a fat JAR - a single JAR file containing all class files from your code
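To illustrate the contrast: sbt-pack keeps dependency jars separate and generates launch scripts, while sbt-assembly merges everything into one jar. A minimal sbt-pack setup sketch (the plugin version is illustrative):

```scala
// project/plugins.sbt — enable sbt-pack (version illustrative)
addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "0.9.3")

// In build.sbt, enable the plugin on the project; `sbt pack` then
// produces target/pack/ with lib/ (unmerged jars) and bin/ (launch scripts):
// enablePlugins(PackPlugin)
```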

Is it possible to use json4s 3.2.11 with Spark 1.3.0?

Posted by 谁说我不能喝 on 2019-11-27 07:02:26
Question: Spark has a dependency on json4s 3.2.10, but that version has several bugs and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency to build.sbt and everything compiled fine. But when I spark-submit my JAR it provides me with 3.2.10. build.sbt: import sbt.Keys._ name := "sparkapp" version := "1.0" scalaVersion := "2.10.4" libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided" libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11" plugins.sbt
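Because spark-submit puts Spark's own json4s 3.2.10 earlier on the classpath, simply bundling 3.2.11 is not enough. One approach, assuming an sbt-assembly version (0.14.x+) that supports shading, is to rename the application's json4s packages so the two versions cannot collide — the rename pattern below is illustrative:

```scala
// build.sbt — shade the app's json4s copy so Spark's 3.2.10 can't shadow it
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.json4s.**" -> "shaded.json4s.@1").inAll
)
```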

How to add “provided” dependencies back to run/test tasks' classpath?

Posted by ≯℡__Kan透↙ on 2019-11-26 21:43:49
Here's an example build.sbt: import AssemblyKeys._ assemblySettings buildInfoSettings net.virtualvoid.sbt.graph.Plugin.graphSettings name := "scala-app-template" version := "0.1" scalaVersion := "2.9.3" val FunnyRuntime = config("funnyruntime") extend(Compile) libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "provided" sourceGenerators in Compile <+= buildInfo buildInfoPackage := "com.psnively" buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, target) assembleArtifact in packageScala := false val root = project.in(file(".")). configs(FunnyRuntime). settings
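The standard recipe for this, documented in the sbt-assembly README (shown here in modern slash syntax, which differs from the 0.13-era build above), re-adds Provided dependencies to the run task's classpath while keeping them out of the assembly:

```scala
// build.sbt — run sees "provided" deps, assembly still excludes them
Compile / run := Defaults.runTask(
  Compile / fullClasspath,    // the full classpath includes Provided jars
  Compile / run / mainClass,
  Compile / run / runner
).evaluated
```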

sbt-assembly: deduplication found error

Posted by 蹲街弑〆低调 on 2019-11-26 21:31:08
I am not sure whether a merge strategy or excluding jars is the best option here. Any help on how to proceed with this error would be great! [sameert@pzxdcc0151 approxstrmatch]$ sbt assembly [info] Loading project definition from /apps/sameert/software/approxstrmatch/project [info] Set current project to approxstrmatch (in build file:/apps/sameert/software/approxstrmatch/) [info] Including from cache: scala-library.jar [info] Checking every *.class/*.jar file's SHA-1. [info] Merging files... [info] Including from cache: curator-client-2.4.0.jar [info] Including from cache: secondstring
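For deduplicate errors like this, a merge strategy is usually the first thing to try; excluding jars only helps when an entire artifact is redundant. A common starting point, assuming sbt-assembly 0.14.x, discards colliding metadata and defers everything else to the default strategy:

```scala
// build.sbt — discard duplicate META-INF metadata, defer the rest to defaults
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```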