sbt-assembly

sbt-assembly: How do I include the static files in src/main/webapp

Submitted by 我是研究僧i on 2019-12-19 07:50:18
Question: I am using sbt-assembly from https://github.com/sbt/sbt-assembly with this merge strategy:

    mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) => {
      case PathList("javax", "servlet", xs @ _*)          => MergeStrategy.first
      case PathList(ps @ _*) if ps.last endsWith ".html"  => MergeStrategy.first
      case "application.conf"                             => MergeStrategy.concat
      case "unwanted.txt"                                  => MergeStrategy.discard
      case x                                               => old(x)
    } }

For some reason my static content is not being included in the executable jar, but
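
The question is cut off here. One sketch of a possible fix, not taken from the original thread and assuming the webapp files simply need to end up on the jar's classpath: register src/main/webapp as an unmanaged resource directory so sbt copies its contents into the jar before assembly runs.

    // Hedged sketch (sbt 0.13 syntax): treat src/main/webapp as main resources
    // so sbt-assembly packages the static files into the uber-jar.
    unmanagedResourceDirectories in Compile += baseDirectory.value / "src" / "main" / "webapp"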

scala sbt assembly “no main manifest attribute”

Submitted by 半腔热情 on 2019-12-19 06:03:28
Question: I use the assembly plugin in sbt to assemble my project, but running the result with "java -jar xx.jar" fails with "no main manifest attribute". I think it's because there are two files in my src/main/scala/ directory, each with an object extending Application, which means there are two main entry points in the project. But I need two applications: one is the server and the other is the test client. How do I handle this two-main-entry problem in Scala/sbt? Thanks in advance. Answer 1: In your SBT build file, define
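
The answer is truncated. A minimal sketch of the usual approach (the class name is a placeholder, not from the question): tell sbt-assembly which of the two entry points to record in the jar manifest.

    // Hedged sketch: pick the server object as the jar's Main-Class.
    // "com.example.Server" is hypothetical.
    mainClass in assembly := Some("com.example.Server")

The other entry point can then still be started explicitly, e.g. with java -cp xx.jar and the fully qualified name of the test client object.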

Shading over third party classes

Submitted by 岁酱吖の on 2019-12-19 02:42:52
Question: I'm currently facing a problem with deploying an uber-jar for a Spark Streaming application, where the same JAR is present in different versions, causing Spark to throw run-time exceptions. The library in question is Typesafe Config. After attempting many things, my solution was to resort to shading the provided dependency so it won't clash with the JAR provided by Spark at run-time. Hence, I went to the documentation for sbt-assembly and, under shading, I saw the following example:
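
The quoted example is cut off. A sketch of the kind of shade rule that would apply here (the target prefix is an arbitrary choice, not from the thread): rename the com.typesafe.config package inside the assembled jar so it can no longer collide with the copy that ships with Spark.

    // Hedged sketch: shade Typesafe Config inside the uber-jar.
    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("com.typesafe.config.**" -> "shaded.com.typesafe.config.@1").inAll
    )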

DeDuplication error with SBT assembly plugin

Submitted by 泄露秘密 on 2019-12-18 02:51:40
Question: I am trying to create an executable jar using the SBT assembly plugin. I am ending up with the error below:

    [error] (app/*:assembly) deduplicate: different file contents found in the following:
    [error] /Users/rajeevprasanna/.ivy2/cache/org.eclipse.jetty.orbit/javax.servlet/orbits/javax.servlet-3.0.0.v201112011016.jar:about.html
    [error] /Users/rajeevprasanna/.ivy2/cache/org.eclipse.jetty/jetty-continuation/jars/jetty-continuation-8.1.8.v20121106.jar:about.html
    [error] /Users/rajeevprasanna/.ivy2
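
The log is truncated. A minimal sketch of a common remedy for clashes on files like about.html that do not matter at run time (a generic pattern, not necessarily the accepted answer from this thread): handle them explicitly and fall back to the default strategy for everything else.

    // Hedged sketch (sbt 0.13 / sbt-assembly 0.14 style):
    assemblyMergeStrategy in assembly := {
      case PathList(ps @ _*) if ps.last == "about.html" => MergeStrategy.first   // or MergeStrategy.discard
      case x =>
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(x)
    }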

How can I make a task depend on another task?

Submitted by 生来就可爱ヽ(ⅴ<●) on 2019-12-13 03:55:53
Question: I'm new to sbt and I am trying to create a script that either deploys my application or deploys and runs it. What already works for me is sbt deploy, which successfully deploys the final .jar file to the remote location. However, I don't know how to make deployAndRunTask depend on deployTask. I've tried several things, but none of them has worked so far. My last hope was deployAndRunTask := { val d = deployTask.value }. However, this does not seem to work. This is the script that I'm
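
The build file is cut off. A small sketch of how two custom tasks can be chained (the task keys follow the names in the question; the bodies are placeholders): referring to another task with .value inside := makes it a dependency, so it has finished before the body runs.

    // Hedged sketch: deployAndRunTask depends on deployTask via .value.
    lazy val deployTask = taskKey[Unit]("Deploys the assembled jar")
    lazy val deployAndRunTask = taskKey[Unit]("Deploys the jar, then runs it")

    deployTask := {
      // ... copy the jar to the remote location ...
    }

    deployAndRunTask := {
      deployTask.value   // evaluated (and completed) before this body executes
      // ... start the application ...
    }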

assemblyMergeStrategy causing scala.MatchError when compiling

Submitted by 若如初见. on 2019-12-12 18:28:32
Question: I'm new to sbt/assembly. I'm trying to resolve some dependency problems, and it seems the only way to do it is through a custom merge strategy. However, whenever I try to add a merge strategy I get a seemingly random MatchError when building:

    [error] (*:assembly) scala.MatchError: org/apache/spark/streaming/kafka/KafkaUtilsPythonHelper$$anonfun$13.class (of class java.lang.String)

I'm showing this match error for the kafka library, but if I take out that library altogether, I get a MatchError
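
The question is truncated. A scala.MatchError on a plain String during assembly usually means the merge-strategy partial function has no catch-all case, so a hedged sketch of a complete definition looks like this (the specific cases are illustrative):

    assemblyMergeStrategy in assembly := {
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case x =>
        // Fall back to the plugin's default strategy for every other path,
        // which avoids the MatchError on unmatched file names.
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(x)
    }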

Rename file using sbt-assembly

Submitted by 混江龙づ霸主 on 2019-12-12 16:18:42
Question: I have a Scala project that uses ConfigFactory to set up the application configuration. For building I use sbt (together with sbt-assembly). Depending on whether I create an assembly with sbt-assembly or just run the project, I would like to use different config files (application.conf when running the project, assembly.conf when running the assembly of the project). I thought of using assemblyMergeStrategy for this purpose: when assembling the jar, I would discard the
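
The question breaks off here. A hedged sketch along the line the asker describes, with the run-time selection via config.resource being an assumption rather than part of the question: drop application.conf from the uber-jar so only assembly.conf travels with it.

    // Hedged sketch: keep application.conf out of the assembled jar; assembly.conf
    // stays in and can be selected with -Dconfig.resource=assembly.conf (or
    // ConfigFactory.load("assembly")) when the uber-jar is started.
    assemblyMergeStrategy in assembly := {
      case "application.conf" => MergeStrategy.discard
      case x =>
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(x)
    }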

How to run Spark application assembled with Spark 2.1 on cluster with Spark 1.6?

Submitted by 爱⌒轻易说出口 on 2019-12-12 15:39:53
Question: I've been told that I can build a Spark application with one version of Spark and, as long as I use sbt assembly to build it, run it with spark-submit on any Spark cluster. So I've built my simple application with Spark 2.1.1; you can see my build.sbt file below. Then I start it on my cluster with:

    cd spark-1.6.0-bin-hadoop2.6/bin/
    spark-submit --class App --master local[*] /home/oracle/spark_test/db-synchronizer.jar

So, as you see, I'm executing it with Spark 1.6.0, and
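
The question and the asker's build.sbt are truncated. A hedged sketch of the usual way to keep such a jar portable, which is not the asker's actual file: compile against the same Spark line the cluster runs and mark Spark itself as provided, so the uber-jar does not carry a second, incompatible copy. The versions below are illustrative.

    // Hedged sketch of a build.sbt targeting a Spark 1.6 cluster.
    scalaVersion := "2.10.6"
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"      % "1.6.0" % "provided",
      "org.apache.spark" %% "spark-streaming" % "1.6.0" % "provided"
    )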

Parboiled2 causes “missing or invalid dependency detected while loading class file 'Prepender.class'”

Submitted by 非 Y 不嫁゛ on 2019-12-12 11:12:05
Question: So I've been trying to use parboiled2 for the last few weeks now; it is possibly the most difficult dependency to add to a build that I have come across in my entire life. My current error is a compile (sbt assembly) error:

    [error] missing or invalid dependency detected while loading class file 'Prepender.class'.
    [error] Could not access type PrependAux in package shapeless,
    [error] because it (or its dependencies) are missing. Check your build definition for
    [error] missing or conflicting
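
The error log is cut off. Prepender comes from shapeless, on which parboiled2 depends, and this kind of error typically points at a shapeless version on the classpath that is binary incompatible with the one parboiled2 was built against. A hedged sketch of one way to align them (both version numbers are illustrative, not from the question):

    // Hedged sketch: pin a shapeless version compatible with the parboiled2 release in use.
    libraryDependencies ++= Seq(
      "org.parboiled" %% "parboiled" % "2.1.8",
      "com.chuusai"   %% "shapeless" % "2.3.3"
    )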