sbt-assembly

Spark 1.5.1 + Scala 2.10 + Kafka + Cassandra = java.lang.NoSuchMethodError:

Question: I want to connect Kafka and Cassandra to Spark 1.5.1. The library versions are:

    scalaVersion := "2.10.6"

    libraryDependencies ++= Seq(
      "org.apache.spark"   % "spark-streaming_2.10"           % "1.5.1",
      "org.apache.spark"   % "spark-streaming-kafka_2.10"     % "1.5.1",
      "com.datastax.spark" % "spark-cassandra-connector_2.10" % "1.5.0-M2"
    )

The initialization and use in the app:

    val sparkConf = new SparkConf(true)
      .setMaster("local[2]")
      .setAppName("KafkaStreamToCassandraApp")
      .set("spark.executor.memory", …
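
Before touching the build, it can help to confirm exactly which jar the offending class is loaded from at runtime; a NoSuchMethodError in this stack almost always means two versions of the same library ended up on the classpath. A minimal diagnostic sketch (the class name here is only an example, not taken from the question):

    // Print the jar that a suspect class was actually loaded from.
    val cls = Class.forName("com.google.common.base.Preconditions")
    println(cls.getProtectionDomain.getCodeSource.getLocation)

Comparing that location against the versions declared in libraryDependencies usually points at the conflicting dependency to exclude or align.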

How to make a SBT task depend on a module defined in the same SBT project?

Question: I have module A and module B in a multi-module SBT project. I want to write a resource generator task for module B that invokes code from module A. One way to do this is to pull all of module A's code under project/, but that is unfeasible because module A is massive and I would like to keep it where it is (see https://stackoverflow.com/a/47323703/471136). How do I do this in SBT? Secondly, is it possible to get rid of module B altogether, i.e. I want the resource generator task for module A …
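
One way this is commonly approached (a sketch, not the accepted answer: the module names, the generator main class com.example.a.GenerateResources, and the sbt 1.x slash syntax are all assumptions) is to keep module A where it is and have module B's resource generator fork a JVM with module A's runtime classpath:

    lazy val moduleA = project in file("module-a")

    lazy val moduleB = (project in file("module-b"))
      .settings(
        Compile / resourceGenerators += Def.task {
          val out = (Compile / resourceManaged).value / "generated-by-module-a.txt"
          val cp  = (moduleA / Runtime / fullClasspath).value.files
          // com.example.a.GenerateResources is a hypothetical main class in module A
          val exit = Fork.java(
            ForkOptions(),
            Seq("-cp", cp.mkString(java.io.File.pathSeparator),
                "com.example.a.GenerateResources", out.getAbsolutePath))
          if (exit != 0) sys.error(s"resource generation failed with exit code $exit")
          Seq(out)
        }.taskValue
      )

Because the task only depends on module A's classpath, module B stays a thin wrapper; whether it can be dropped entirely (the second part of the question) depends on whether module A can depend on its own generated resources without creating a cycle.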

sbt switch dependencies for runtime

Question: I am developing a Spark application that uses xgboost4j (https://github.com/dmlc/xgboost/tree/master/jvm-packages). This package has to be compiled for the local architecture because the jar carries native C dependencies, but the cluster has a different architecture than the development laptop. How can I substitute the cluster's build of the package when running sbt assembly? Or would you suggest solving this via % "provided"?

Answer 1: Use a suffix for (provided/compile) libs, like: val …
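
A minimal sketch of the "% provided" route mentioned in the answer (the group id and version below are assumptions, not taken from the question): mark the natively-built artifacts as provided so sbt-assembly leaves them out of the fat jar, and supply the cluster's own build at submit time (for example via spark-submit --jars):

    libraryDependencies ++= Seq(
      "ml.dmlc" % "xgboost4j"       % "0.90" % "provided",
      "ml.dmlc" % "xgboost4j-spark" % "0.90" % "provided"
    )

This keeps compilation on the laptop working against the local build while the assembled jar stays architecture-neutral.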

Should sbt-assembly perform a “maven-shade-plugin”-like relocation of classes?

Question: The description of the sbt-assembly merge strategy called rename sounded like it might permit something similar to the shading operation of the maven-shade-plugin, which relocates classes and their references so that incompatible versions of libraries can be managed side by side. Would it be appropriate for sbt-assembly to perform that function? I used the following merge strategy to attempt to use rename as a relocation mechanism, but while it matches all the files, it passes them straight through …
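
For reference, later sbt-assembly releases (0.14.x and newer) expose relocation directly through shade rules rather than through a merge strategy; a minimal sketch, with the package names chosen only as an example:

    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("org.apache.commons.io.**" -> "shadeio.@1").inAll
    )

This rewrites both the class files and the bytecode references to them, which is the part a rename merge strategy cannot do on its own.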

java.lang.NoSuchMethodError: akka.actor.ActorCell.addFunctionRef

Question: I am trying to set up a simple akka-http 2.4.2 project to test it out, but I am failing to do so. My build.sbt:

    import NativePackagerHelper._

    lazy val akkaVersion = "2.4.2"

    lazy val root = (project in file(".")).
      settings(
        name := "akkTest",
        version := "0.1",
        scalaVersion := "2.11.7")

    libraryDependencies ++= Seq(
      "com.typesafe.akka" %% "akka-actor" % akkaVersion,
      "com.typesafe.akka" %% "akka-http-spray-json-experimental" % akkaVersion
    )

    enablePlugins(JavaServerAppPackaging)

My code snippet in …
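
A frequent cause of this particular NoSuchMethodError is a mix of Akka artifact versions on the classpath (for example an older akka-actor evicted by, or in favour of, another module). A hedged sketch of one way to check and pin it; whether this applies here depends on the full dependency tree:

    // Force every resolution of akka-actor to the version used elsewhere in the build.
    dependencyOverrides += "com.typesafe.akka" %% "akka-actor" % akkaVersion

Running sbt evicted (or a dependency-graph plugin) shows whether any Akka module is still being pulled in at a different version.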

sbt-assembly include test classes

Question: I followed "sbt-assembly: including test classes" using a config described in https://github.com/sbt/sbt-assembly, and plain assembly works fine. But when I load sbt I get:

    assembly.sbt:5: error: reference to jarName is ambiguous;
    it is imported twice in the same scope by
    import sbtassembly.AssemblyKeys._
    and import _root_.sbtassembly.AssemblyPlugin.autoImport._
    jarName in (Test, assembly) := s"${name.value}-test-${version.value}.jar"
    ^

So I commented out the import line and ran sbt assembly, but that began the …
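
A minimal sketch of the documented test-classes setup, assuming sbt-assembly 0.14.x: because the plugin is an AutoPlugin, its keys are already in scope in build.sbt, so the explicit import sbtassembly.AssemblyKeys._ is what makes jarName ambiguous. Dropping that import and using the newer key name avoids the clash:

    Project.inConfig(Test)(baseAssemblySettings)

    assemblyJarName in (Test, assembly) := s"${name.value}-test-${version.value}.jar"

The test fat jar is then produced with sbt test:assembly.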

sbt assembly shading to create fat jar to run on spark

Question: I'm using sbt assembly to create a fat jar that can run on Spark. I have dependencies on grpc-netty. The Guava version on Spark is older than the one required by grpc-netty, and I run into this error: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument. I was able to resolve this by setting userClassPathFirst to true on Spark, but that leads to other errors. Correct me if I am wrong, but from what I understand, I shouldn't have to set userClassPathFirst to true if I do …
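
A hedged sketch of the shading approach for this exact conflict (the relocated package prefix is an example name): relocating Guava inside the fat jar lets grpc-netty call the newer Preconditions while Spark keeps its own older Guava, without touching userClassPathFirst:

    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("com.google.common.**" -> "shaded.com.google.common.@1").inAll
    )

Because .inAll also rewrites the project's own classes, every reference in the assembled jar points at the shaded copy rather than at whatever Guava Spark ships.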

How can I add unmanaged JARs in sbt-assembly to the final fat JAR?

Question: My project depends on a JAR file that isn't in Ivy; how can I include it directly in the final JAR output by sbt-assembly?

Answer 1: I figured out I just have to add them explicitly as unmanaged dependencies in Build.scala; they are not automatically pulled in from the lib folder. Adding this line to the settings worked:

    unmanagedJars in Compile += file("lib/vertica_jdk_5.jar")

Answer 2: For a single-project setup, putting jars into lib should work. If you have a multi-project setup, the lib directory …
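
Answer 2 is cut off above; for the multi-project case it is describing, one commonly used arrangement (a sketch, assuming the jar lives in a shared top-level lib/ directory rather than inside each sub-project) is to add those jars to the sub-project's unmanaged classpath explicitly:

    lazy val core = (project in file("core"))
      .settings(
        // pick up jars from the build's top-level lib/ instead of core/lib/
        unmanagedJars in Compile ++= ((baseDirectory.value / ".." / "lib") * "*.jar").classpath
      )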

Scala SBT Assembly cannot merge due to de-duplication error in StaticLoggerBinder.class

Question: My problem is that I can no longer use the sbt-assembly plugin, because some kind of dependency merge problem crept in between a couple of people working on this project. The problem when I run sbt assembly:

    [error] 3 errors were encountered during merge
    java.lang.RuntimeException: deduplicate: different file contents found in the following:
    /Users/aris.vlasakakis/.ivy2/cache/ch.qos.logback/logback-classic/jars/logback-classic-1.1.2.jar:org/slf4j/impl/StaticLoggerBinder.class
    /Users/aris …
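
The error means two different artifacts both ship org/slf4j/impl/StaticLoggerBinder.class (typically two SLF4J backends, e.g. logback-classic plus another binding pulled in transitively). The cleaner fix is to exclude one of the backends from the dependency graph; a blunter sketch, using the plugin's documented merge-strategy pattern, simply keeps the first copy it sees:

    assemblyMergeStrategy in assembly := {
      case PathList("org", "slf4j", "impl", xs @ _*) => MergeStrategy.first
      case x =>
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(x)
    }

Keeping the first copy only works if the competing binders are genuinely interchangeable for this application; otherwise excluding the unwanted logging backend is the safer route.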

sbt-assembly: How do I include the static files in src/main/webapp

Question: I am using sbt-assembly from https://github.com/sbt/sbt-assembly with this merge strategy:

    mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) => {
        case PathList("javax", "servlet", xs @ _*)         => MergeStrategy.first
        case PathList(ps @ _*) if ps.last endsWith ".html" => MergeStrategy.first
        case "application.conf"                            => MergeStrategy.concat
        case "unwanted.txt"                                => MergeStrategy.discard
        case x                                             => old(x)
      }
    }

For some reason my static content is not being included in the executable jar, but …
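
One thing worth checking before the merge strategy: sbt-assembly can only package what sbt puts on the classpath, and src/main/webapp is not a resource directory by default. A hedged sketch of adding it as one (sbt 0.13-style syntax to match the build above):

    unmanagedResourceDirectories in Compile += baseDirectory.value / "src" / "main" / "webapp"

With that in place, files under src/main/webapp end up in the fat jar just like anything under src/main/resources.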