sbt-assembly

sbt-assembly: skip specific test

Question: I would like to configure sbt-assembly to skip a specific test class. Is there any way to do this? If it helps, I tagged the test using the ScalaTest @Network tag. Answer 1: See "Additional test configurations with shared sources". This lets you define an alternative "test" task in a FunTest configuration while reusing your test sources. Once fun:test works with whatever filter you define via testOptions in FunTest := Seq(Tests.Filter(itFilter)), you can then rewire test in assembly
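
A minimal sketch of the idea, assuming the tag's fully qualified name is org.scalatest.tags.Network (check the name your @Network tag actually uses); ScalaTest's -l runner flag excludes tagged tests, and emptying test in assembly skips tests during assembly altogether:

```scala
// build.sbt -- sketch; the tag name is an assumption, match it to your @Network tag
testOptions in Test += Tests.Argument(
  TestFrameworks.ScalaTest,
  "-l", "org.scalatest.tags.Network"  // -l excludes tests carrying this tag
)

// alternatively, skip running tests from the assembly task entirely
test in assembly := {}
```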

Why isn't guava being shaded properly in my build.sbt?

Question: tl;dr: Here's a repo containing the problem. Cassandra and HDFS both use guava internally, but neither of them shades the dependency, for various reasons. Because the guava versions aren't binary compatible, I'm hitting NoSuchMethodErrors at runtime. I've tried to shade guava myself in my build.sbt : val HadoopVersion = "2.6.0-cdh5.11.0" // ... val hadoopHdfs = "org.apache.hadoop" % "hadoop-hdfs" % HadoopVersion val hadoopCommon = "org.apache.hadoop" % "hadoop-common" % HadoopVersion val
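
For reference, a minimal sketch of shading guava with sbt-assembly's ShadeRule (the shaded.* prefix is an arbitrary choice, not from the question):

```scala
// build.sbt -- sketch: rewrite guava's packages inside the fat jar
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "shaded.com.google.common.@1")
    .inAll  // apply to every dependency jar and to the project's own classes
)
```

Note that shading only rewrites what goes into the assembled jar; classes supplied by the runtime (for example a cluster's own guava) are unaffected.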

Setting up sbt to publish to artifactory based on git branch

Question: I would like to set up an sbt project so that it publishes to the proper Artifactory repository based on the current git branch. The solution proposed for this question suggests hardcoding the repository in the build.sbt file. Instead, I would like the master branch to publish to "releases" and any other branch to publish to "snapshots", using the same build.sbt file. Ideally, something like the following: val gitBranch = taskKey[String]("Determines current git branch") gitBranch := Process("git
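
A sketch of one way to wire this up (sbt 1.x syntax; the Artifactory URL and repository names are placeholders, and the branch is read once at project load):

```scala
// build.sbt -- sketch; URL and repo names are assumptions
import scala.sys.process._

val gitBranch = settingKey[String]("Current git branch")
gitBranch := ("git rev-parse --abbrev-ref HEAD".!!).trim

publishTo := {
  val artifactory = "https://artifactory.example.com/artifactory/"
  if (gitBranch.value == "master")
    Some("releases" at artifactory + "libs-release-local")
  else
    Some("snapshots" at artifactory + "libs-snapshot-local")
}
```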

Deduplication error in build.sbt when building a fat jar using sbt assembly

I'm trying to create a fat jar combining Spark and the Restlet framework, but I keep getting this deduplication error when running sbt assembly :

java.lang.RuntimeException: deduplicate: different file contents found in the following:
/Users/ccd/.ivy2/cache/org.eclipse.jetty.orbit/javax.transaction/orbits/javax.transaction-1.1.1.v201105210645.jar:META-INF/ECLIPSEF.RSA
/Users/ccd/.ivy2/cache/org.eclipse.jetty.orbit/javax.servlet/orbits/javax.servlet-3.0.0.v201112011016.jar:META-INF/ECLIPSEF.RSA
/Users/ccd/.ivy2/cache/org.eclipse.jetty.orbit/javax.mail.glassfish/orbits
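
The usual remedy (a sketch, following the pattern documented in the sbt-assembly README) is to discard the colliding jar-signature files, which are invalid in a repackaged jar anyway:

```scala
// build.sbt -- discard signature files that collide under META-INF
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*)
      if xs.nonEmpty && Seq(".RSA", ".SF", ".DSA").exists(xs.last.endsWith) =>
    MergeStrategy.discard  // jar signatures don't survive repackaging anyway
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```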

How to properly manage logback configurations in development and production using SBT & Scala?

Question: I have a fairly standard Scalatra project using Logback for logging. Following the Logback manual, I added a logback-test.xml for my development configuration (verbose debug logging), while keeping a production logback.xml . However, in development, while using the xsbt-web-plugin to run a container with code reloading, my app only seems to pick up logback.xml . How do I get the desired behavior? In development mode ( ./sbt container:start ) the app uses logback-test.xml ; when assembled
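
One hedged sketch: Logback prefers logback-test.xml over logback.xml when both are on the classpath, so it can be enough to keep logback-test.xml out of the assembled artifact:

```scala
// build.sbt -- keep the development config out of the fat jar
assemblyMergeStrategy in assembly := {
  case "logback-test.xml" => MergeStrategy.discard  // production then falls back to logback.xml
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```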

Strip ScalaSignature annotation from resulting binaries

Question: How can you strip Java or Scala annotations programmatically? One of the deliverables for my current project is my code, plus whatever dependencies my code relies on, as an uber-jar. We're building an SDK, so any included dependencies need to be renamed so as not to interfere with the SDK client's dependencies (meaning if we're using Apache Commons version X and they're using version Y, there aren't any conflicts). We used sbt-assembly to rename the packages our code relies on. For
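
For context, package renaming of that kind looks roughly like this in sbt-assembly (the module coordinates and shaded prefix below are illustrative, not from the question):

```scala
// build.sbt -- sketch: rename a bundled dependency's packages
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.apache.commons.**" -> "mysdk.shaded.org.apache.commons.@1")
    .inLibrary("org.apache.commons" % "commons-lang3" % "3.5")  // rewrite this artifact
    .inProject  // and update references in our own classes
)
```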

sbt assembly fails due to different file contents found

Question: I am trying to build a project in GitLab. In gitlab-ci.yml I run sbt assembly and hit this annoying exception:

[error] (soda/*:assembly) deduplicate: different file contents found in the following:
[error] /root/.ivy2/cache/io.netty/netty-buffer/jars/netty-buffer-4.0.42.Final.jar:META-INF/io.netty.versions.properties
[error] /root/.ivy2/cache/io.netty/netty-common/jars/netty-common-4.0.42.Final.jar:META-INF/io.netty.versions.properties
[error] /root/.ivy2/cache/io.netty/netty-codec-http
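
Since io.netty.versions.properties is purely informational, one common sketch is to keep the first copy encountered:

```scala
// build.sbt -- the netty version-properties files differ per module; any copy will do
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", "io.netty.versions.properties") => MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
```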

error while running sbt assembly: sbt deduplication error

Question: I am facing the exact issue described in the post below, and the suggested answer is not helping: sbt-assembly: deduplication found error

[error] (*:assembly) deduplicate: different file contents found in the following:
[error] C:\Users\xxx\.ivy2\cache\org.eclipse.jetty.orbit\javax.transaction\orbits\javax.transaction-1.1.1.v201105210645.jar:META-INF/ECLIPSEF.RSA
[error] C:\Users\xxx\.ivy2\cache\org.eclipse.jetty.orbit\javax.servlet\orbits\javax.servlet-3.0.0.v201112011016.jar:META-INF
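
If discarding the signature files (as in the entry above) does not take effect, another sketch is to drop the conflicting Jetty orbit jars from the assembly classpath entirely, assuming the servlet API is provided elsewhere:

```scala
// build.sbt -- sketch: exclude the Eclipse orbit jars from the fat jar
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp.filter { entry =>
    val name = entry.data.getName
    name.startsWith("javax.transaction-") || name.startsWith("javax.servlet-")
  }
}
```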

How to “package” some modules to jars and others to wars in a multi-module build with a single task?

Question: I use the package task (from xsbt-web-plugin) to package a project as a war, and the assembly task (from sbt-assembly) to package a project as a jar. I have a multi-module build in which some modules are packaged into wars and some into jars. I'd like to set up the build so that executing the assembly task means: jar modules are packaged into jar files, and war modules are packaged into war files. How do I execute the package task for the war projects while executing the assembly task? Answer 1: Both the package task and the assembly task
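
A hedged sketch of one way to wire this up: rewire assembly in the war modules to delegate to the war-producing package task, so a single aggregated assembly run yields the right artifact per module (module names here are illustrative):

```scala
// build.sbt -- sketch: in war modules, "assembly" just runs the war packaging
lazy val warModule = (project in file("war-module"))
  .settings(
    assembly := (Keys.`package` in Compile).value  // xsbt-web-plugin makes this a war
  )

lazy val jarModule = (project in file("jar-module"))
  // jar modules keep sbt-assembly's default fat-jar behavior

lazy val root = (project in file("."))
  .aggregate(warModule, jarModule)  // running "assembly" on root covers both
```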

Spark 1.5.1 + Scala 2.10 + Kafka + Cassandra = java.lang.NoSuchMethodError:

I want to connect Kafka and Cassandra to Spark 1.5.1. The library versions:

scalaVersion := "2.10.6"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-streaming_2.10" % "1.5.1",
  "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.5.1",
  "com.datastax.spark" % "spark-cassandra-connector_2.10" % "1.5.0-M2"
)

The initialization and use in the app:

val sparkConf = new SparkConf(true)
  .setMaster("local[2]")
  .setAppName("KafkaStreamToCassandraApp")
  .set("spark.executor.memory", "1g")
  .set("spark.cores.max", "1")
  .set("spark.cassandra.connection.host", "127.0.0.1")

Creates schema
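
A NoSuchMethodError in this stack is classically a guava version clash between the Cassandra connector and Spark/Hadoop. As a first probe (a sketch; the version number is an assumption), forcing a single guava version can confirm the diagnosis, with shading guava as shown earlier being the more robust fix:

```scala
// build.sbt -- sketch: pin guava to one version to test the binary-conflict theory
dependencyOverrides += "com.google.guava" % "guava" % "16.0.1"
```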