sbt-assembly

Why does a Spark application fail with “ClassNotFoundException: Failed to find data source: kafka” when run as an uber-jar built with sbt-assembly?

你离开我真会死。 Submitted on 2019-11-30 12:37:02
Question: I'm trying to run a sample like https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/sql/streaming/StructuredKafkaWordCount.scala. I started with the Spark Structured Streaming Programming Guide at http://spark.apache.org/docs/latest/structured-streaming-programming-guide.html. My code begins: package io.boontadata.spark.job1 import org.apache.spark.sql.SparkSession object DirectKafkaAggregateEvents { val FIELD_MESSAGE_ID = 0 val FIELD_DEVICE_ID = 1 val …
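The usual culprit with an uber-jar is that sbt-assembly's default merge strategy discards META-INF/services, which is where Spark's DataSourceRegister entry for the kafka source lives. A minimal sketch of a fix, assuming sbt-assembly 0.14.x key names and that spark-sql-kafka is included in the jar (not "provided"):

```scala
// build.sbt -- sketch: keep service-loader registrations when building the uber-jar
assemblyMergeStrategy in assembly := {
  // Concatenate service files so org.apache.spark.sql.sources.DataSourceRegister
  // keeps the entry that registers the "kafka" format.
  case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
  case x =>
    // Fall back to the plugin's default strategy for everything else.
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```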

How to run sbt-assembly tasks from within IntelliJ IDEA?

痴心易碎 Submitted on 2019-11-30 08:50:58
Is it possible to run sbt-assembly from within IntelliJ IDEA? I also read in the docs that one can add tasks within the SBT Tool window, but as far as I can tell it only lets you view your project, not its tasks; I cannot add any tasks there. How does the Tool window work exactly? I have the latest version of IntelliJ IDEA. This answer is now out of date: IntelliJ now allows you to create a run configuration for an SBT task. You create the run configuration by choosing "Edit configurations" from the "Run" menu (or the toolbar popup), clicking the "+" button to add a configuration and selecting "SBT Task" as …

How do I publish a fat JAR (JAR with dependencies) using sbt and sbt-release?

心已入冬 Submitted on 2019-11-30 04:58:41
I need to build a single jar, including dependencies, for one of my sub-projects so that it can be used as a javaagent. I have a multi-module sbt project and this particular module is the lowest-level one (it's also pure Java). Can I (e.g. with sbt-onejar, sbt-proguard or sbt-assembly) override how the lowest-level module is packaged? It looks like these tools are really designed as a post-publish step, but I really need a (replacement or additional) published artefact to include the dependencies (but only for this one module). UPDATE: The Publishing section of the sbt-assembly documentation gives instructions for a …
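For reference, a sketch of what the sbt-assembly "Publishing" instructions boil down to for that one module (sbt 1.x / sbt-assembly 0.14.x key names assumed): the fat JAR is attached as an extra artifact with an "assembly" classifier, which sbt-release's default publish step then publishes alongside the regular one.

```scala
// build.sbt settings for the lowest-level module -- a sketch, not a drop-in solution

// Give the fat JAR its own classifier so it does not clash with the normal artifact.
artifact in (Compile, assembly) := {
  val art = (artifact in (Compile, assembly)).value
  art.withClassifier(Some("assembly"))
}

// Publish the output of the assembly task together with the regular artifacts.
addArtifact(artifact in (Compile, assembly), assembly)
```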

Proper way to make a Spark Fat Jar using SBT

社会主义新天地 Submitted on 2019-11-29 14:42:41
I need a fat JAR with Spark because I'm creating a custom node for Knime. Basically it's a self-contained jar executed inside Knime, and I assume a fat JAR is the only way to spawn a local Spark job. Eventually we will move on to submitting jobs to a remote cluster, but for now I need it to spawn this way. That said, I made a fat JAR using this: https://github.com/sbt/sbt-assembly I made an empty sbt project, included spark-core in the dependencies and assembled the jar. I added it to the manifest of my custom Knime node and tried to spawn a simple job (parallelize a collection, collect it and print …
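For context, a minimal sketch of such a build (the Scala and Spark versions below are placeholders, not taken from the question). The merge strategy matters because Spark's transitive dependencies ship overlapping metadata files, and Akka needs its reference.conf entries concatenated rather than picked:

```scala
// build.sbt -- sketch of a minimal Spark fat-JAR build (versions are placeholders)
scalaVersion := "2.11.12"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"

assemblyMergeStrategy in assembly := {
  case "reference.conf"              => MergeStrategy.concat  // Akka config must be merged, not picked
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard // signatures, manifests, etc.
  case _                             => MergeStrategy.first
}
```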

sbt-assembly: including test classes

眉间皱痕 Submitted on 2019-11-29 09:27:07
Using sbt-assembly I want to include both the main (src) and test class files in the jar; sbt-assembly includes only the main classes plus dependencies. Is there any way to include the test classes in the same jar? I wrote sbt-assembly so that the settings can be loaded into configurations other than the default Runtime. Put the following in assembly.sbt and it should add a test:assembly task: import AssemblyKeys._ Project.inConfig(Test)(baseAssemblySettings) jarName in (Test, assembly) := s"${name.value}-test-${version.value}.jar" Like the way the jarName setting is scoped here, substitute xxx in assembly …
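The answer's snippet, laid out as a block for readability (this is the older sbt-assembly 0.x syntax; newer plugin versions use slash syntax such as Test / assembly / assemblyJarName):

```scala
// assembly.sbt -- the answer's settings, sbt-assembly 0.x style
import AssemblyKeys._

// Load the assembly settings into the Test configuration as well.
Project.inConfig(Test)(baseAssemblySettings)

// Give the test jar a name distinct from the main assembly.
jarName in (Test, assembly) := s"${name.value}-test-${version.value}.jar"
```

With that in place, running `sbt test:assembly` should build a jar that also contains the test classes.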

How can a duplicate class be excluded from sbt assembly?

蹲街弑〆低调 Submitted on 2019-11-29 08:37:18
Question: We have a situation in which two dependencies have exactly the same class (because one of the dependencies copied it and included it in its own source). This is causing sbt assembly to fail its deduplication checks. How can I exclude a class from a particular jar? Answer 1: You need a merge strategy that will take one of the files: mergeStrategy in assembly := { case PathList("path", "to", "your", "DuplicatedClass.class") => MergeStrategy.first case x => (mergeStrategy in assembly).value(x) }
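The answer's merge strategy, as a block; "path", "to", "your" stand in for the package path of the duplicated class (the key is called assemblyMergeStrategy in newer plugin versions):

```scala
// build.sbt -- take the first copy of the duplicated class, defer everything else
// to the existing strategy. The path segments below are placeholders.
mergeStrategy in assembly := {
  case PathList("path", "to", "your", "DuplicatedClass.class") => MergeStrategy.first
  case x => (mergeStrategy in assembly).value(x)
}
```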

Deduplication error with SBT assembly plugin

二次信任 Submitted on 2019-11-29 00:07:08
I am trying to create an executable jar using the SBT assembly plugin. I end up with the error below:
[error] (app/*:assembly) deduplicate: different file contents found in the following:
[error] /Users/rajeevprasanna/.ivy2/cache/org.eclipse.jetty.orbit/javax.servlet/orbits/javax.servlet-3.0.0.v201112011016.jar:about.html
[error] /Users/rajeevprasanna/.ivy2/cache/org.eclipse.jetty/jetty-continuation/jars/jetty-continuation-8.1.8.v20121106.jar:about.html
[error] /Users/rajeevprasanna/.ivy2/cache/org.eclipse.jetty/jetty-http/jars/jetty-http-8.1.8.v20121106.jar:about.html
[error] /Users …
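The conflicting about.html files are just Eclipse/Jetty license pages, so any strategy that stops treating them as a conflict works. A sketch, assuming sbt-assembly 0.14.x key names:

```scala
// build.sbt -- sketch: about.html carries no runtime meaning, so picking one copy
// (MergeStrategy.first), renaming, or discarding it are all safe choices here.
assemblyMergeStrategy in assembly := {
  case "about.html" => MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```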

Multiple executable jar files with different external dependencies from a single project with sbt-assembly

生来就可爱ヽ(ⅴ<●) Submitted on 2019-11-28 20:24:58
I have a single Scala project built with sbt using a build.scala file. I can use sbt-assembly to generate a single executable jar file from this project without problems. Now I want to generate multiple executable jar files, where each file includes a common set of internal and external base dependencies along with different external dependencies. Is this possible with the current version of sbt-assembly? In Maven this is easy, as one can define multiple profiles in the pom, each generating a separate jar, but in sbt-assembly you pass the assembly settings to your project and I haven't figured …
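One way to do this with sbt-assembly as it stands is to split the build into a shared core project plus one sub-project per jar, each adding its own external dependencies and main class; assembly is then run per sub-project. A sketch (all project, dependency and class names below are made up):

```scala
// build.sbt -- sketch of one fat JAR per sub-project; every name here is hypothetical
val commonDeps = Seq("com.example" %% "shared-lib" % "1.0")

lazy val core = (project in file("core"))
  .settings(libraryDependencies ++= commonDeps)

lazy val jobA = (project in file("job-a"))
  .dependsOn(core)
  .settings(
    libraryDependencies += "com.example" %% "only-for-a" % "1.0",
    mainClass in assembly := Some("com.example.JobA"),
    assemblyJarName in assembly := "job-a.jar"
  )

lazy val jobB = (project in file("job-b"))
  .dependsOn(core)
  .settings(
    libraryDependencies += "com.example" %% "only-for-b" % "1.0",
    mainClass in assembly := Some("com.example.JobB"),
    assemblyJarName in assembly := "job-b.jar"
  )
```

Running `sbt jobA/assembly jobB/assembly` then produces the two jars.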

What are key differences between sbt-pack and sbt-assembly?

旧城冷巷雨未停 Submitted on 2019-11-28 16:41:42
I've just stumbled upon the sbt-pack plugin. Its development seems steady. That surprised me, as I had believed the only plugin for (quoting sbt-pack's headline) "creating distributable Scala packages" was sbt-assembly (among its other features). What are the key differences between the plugins? When should I use one over the other? (Disclaimer: I maintain sbt-assembly.) sbt-assembly creates a fat JAR - a single JAR file containing all class files from your code and libraries. By evolution, it also contains ways of resolving conflicts when multiple JARs provide the same …
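For a concrete sense of the difference, both plugins are one line in project/plugins.sbt (version numbers below are placeholders): sbt-assembly's assembly task emits a single fat JAR under target/, while sbt-pack's pack task lays out a target/pack directory containing the original dependency jars plus launch scripts.

```scala
// project/plugins.sbt -- versions are placeholders, pick current releases
addSbtPlugin("com.eed3si9n"   % "sbt-assembly" % "0.14.10") // `sbt assembly` -> one fat JAR
addSbtPlugin("org.xerial.sbt" % "sbt-pack"     % "0.12")    // `sbt pack`     -> target/pack with jars + scripts
```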