sbt-assembly

How to exclude a jar from the final sbt-assembly jar

夙愿已清, submitted on 2020-01-13 04:29:25
Question: I need to exclude the Spark and test dependencies from my final assembly jar. I tried marking them as provided, but it did not work: libraryDependencies ++= Seq("org.apache.spark" % "spark-core_2.11" % "2.0.1" % "provided"), then ran sbt assembly. Please help me resolve this issue. Answer 1: Use the exclude option of the assembly plugin, filtering by exact name or with contains: assemblyExcludedJars in assembly := { val cp = (fullClasspath in assembly).value; cp filter { f => f.data.getName.contains("spark-core") | …
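The answer's snippet is cut off above; a complete build.sbt fragment in the same spirit might look like the following (sbt 0.13-era syntax; the "scalatest" pattern is an illustrative stand-in for the asker's test dependencies):

```scala
// build.sbt — keep Spark and test jars out of the assembled fat jar.
// Any classpath entry whose file name matches is excluded from the output.
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp.filter { entry =>
    val name = entry.data.getName
    name.contains("spark-core") || name.startsWith("scalatest")
  }
}
```

Note that "provided" alone should normally keep a dependency out of the assembly too, so if it "was not working" it is worth checking that the scope was applied to the same module that actually ends up on the classpath (e.g. not pulled in transitively under a different name).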

libraryDependencies for com.eed3si9n#sbt-assembly;0.13.0: not found

北战南征, submitted on 2020-01-01 03:04:10
Question: I am building an sbt plugin and want to reference the assembly task of the sbt-assembly plugin (so my task can depend on it). To do this I need to reference it as a library (as opposed to a plugin), but somehow sbt is not able to resolve it via libraryDependencies. This is what my build looks like: sbtPlugin := true; name := "my-sbt-plugin"; scalaVersion := "2.10.6"; sbtVersion := "0.13.0"; resolvers ++= Seq(Resolver.sbtPluginRepo("releases"), Resolver.sbtPluginRepo("snapshots")); libraryDependencies ++= …
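The "not found" error is typical when a plugin artifact is requested with plain Maven coordinates: sbt plugins are published with sbt and Scala cross-version suffixes, which a bare libraryDependencies entry does not add. A sketch of the usual fix, assuming the versions shown (which are illustrative, not the asker's):

```scala
// build.sbt of the plugin project — depend on sbt-assembly as a library.
// Defaults.sbtPluginExtra appends the sbt and Scala binary-version suffixes
// that plugin artifacts are published with.
libraryDependencies += Defaults.sbtPluginExtra(
  "com.eed3si9n" % "sbt-assembly" % "0.14.5", // plugin coordinates
  "0.13",                                     // sbt binary version
  "2.10"                                      // Scala binary version
)
```

In a project that already has sbtPlugin := true, the shorthand addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5") does the same suffixing.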

How to shade before compiling with SBT?

自古美人都是妖i, submitted on 2019-12-30 08:19:40
Question: Our project consists mainly of two parts: Build.scala, where the root project lives, and BuildShaded.scala, where some external dependencies are shaded with sbt-assembly. The shaded jars are depended upon by sub-projects under the root project through the unmanagedJars setting. The question is how to assemble the shaded project before compiling the root project; otherwise the root project fails to compile, since the classes in the shaded jars are not yet available. Answer 1: As I said in the comments, I …
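One way to wire this up is to point unmanagedJars at the shaded project's assembly output and make the root compile task depend on that assembly. A build.sbt-style sketch of the idea (the question uses Build.scala, but the wiring is the same; project names are illustrative):

```scala
// Root build sketch: the shaded sub-project is assembled first, and its
// fat jar is picked up by the root project as an unmanaged dependency.
lazy val shaded = project

lazy val root = (project in file("."))
  .settings(
    // Use the shaded fat jar on the root compile classpath...
    unmanagedJars in Compile += (assemblyOutputPath in (shaded, assembly)).value,
    // ...and make sure it has been built before root compiles.
    compile in Compile := ((compile in Compile) dependsOn (assembly in shaded)).value
  )
```

The key point is the dependsOn: unmanagedJars alone only declares where the jar lives; it does not cause sbt to produce it.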

What is a classifier in SBT

回眸只為那壹抹淺笑, submitted on 2019-12-30 08:10:09
Question: What is meant by the term classifier? Does it come from jars? For example, in the sbt-assembly plugin: artifact in (Compile, assembly) ~= { art => art.copy(`classifier` = Some("assembly")) } Answer 1: classifier is defined by Maven as the fifth element of a project coordinate, after groupId, artifactId, version and packaging. More specifically (from the Maven documentation, emphasis mine): The classifier allows to distinguish artifacts that were built from the same POM but differ in their …
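The snippet in the question is part of sbt-assembly's documented recipe for publishing the fat jar alongside the regular jar, distinguished exactly by that classifier (producing e.g. my-app_2.11-1.0-assembly.jar next to my-app_2.11-1.0.jar):

```scala
// build.sbt — publish the assembly jar as an extra artifact with the
// "assembly" classifier, next to the normal (unclassified) jar.
artifact in (Compile, assembly) := {
  val art = (artifact in (Compile, assembly)).value
  art.copy(`classifier` = Some("assembly"))
}

addArtifact(artifact in (Compile, assembly), assembly)
```

Downstream builds can then select the fat jar by classifier, just as Maven consumers would with a &lt;classifier&gt; element.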

Multiple executable jar files with different external dependencies from a single project with sbt-assembly

痞子三分冷, submitted on 2019-12-29 03:53:09
Question: I have a single Scala project built with sbt using a build.scala file. I can use sbt-assembly to generate a single executable jar file from this project without problems. Now I want to generate multiple executable jar files, where each file includes a common set of internal and external base dependencies along with different external dependencies. Is this possible with the current version of sbt-assembly? In Maven this is easy, as one can define multiple profiles in the POM, each generating a …
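sbt's usual answer to Maven profiles here is sub-projects: one sub-project per executable, each depending on a shared core and adding its own external dependencies, so that sbt assembly produces one fat jar per sub-project. A sketch, with all names and versions illustrative:

```scala
// build.sbt — one fat jar per sub-project, sharing a common core.
lazy val core = project.settings(
  libraryDependencies += "com.typesafe" % "config" % "1.3.0"
)

lazy val appA = project.dependsOn(core).settings(
  mainClass in assembly := Some("com.example.AppA"),
  libraryDependencies += "org.postgresql" % "postgresql" % "9.4.1212"
)

lazy val appB = project.dependsOn(core).settings(
  mainClass in assembly := Some("com.example.AppB"),
  libraryDependencies += "redis.clients" % "jedis" % "2.9.0"
)
```

Running appA/assembly and appB/assembly (or assembly on an aggregating root) then yields two executable jars with different dependency sets.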

Sbt assembly hangs on my Mac

点点圈, submitted on 2019-12-24 13:33:13
Question: I have had this problem since the day I started using assembly: sbt assembly (for any project) never completes on my Mac; it just hangs at the last step. For this one reason I transfer my code to a Linux box and build there. Has anyone else experienced this? Any ideas on how I can go about debugging it? I had Lion with Java 1.6.0_37 and have now upgraded to Mountain Lion. Answer 1: I had the same problem some time ago. Do you have anti-virus software installed? In my case it was Sophos, which tried …
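If an on-access virus scanner is intercepting the jar reads, one workaround (beyond excluding the build directories from scanning) is to turn off assembly's content-hash caching, which is the file-heavy last step. A sketch for sbt-assembly 0.14.x; this trades slower rebuilds for fewer file accesses and is not a root-cause fix:

```scala
// build.sbt — disable sbt-assembly's unzip/output caches so the final step
// touches far fewer files (useful when an on-access scanner slows every read).
assemblyOption in assembly := (assemblyOption in assembly).value
  .copy(cacheUnzip = false, cacheOutput = false)
```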

Spark 2.0.0 streaming job packed with sbt-assembly lacks Scala runtime methods

£可爱£侵袭症+, submitted on 2019-12-24 07:19:10
Question: When using -> in Spark Streaming 2.0.0 jobs, or using spark-streaming-kafka-0-8_2.11 v2.0.0, and submitting with spark-submit, I get the following error: Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 72.0 failed 1 times, most recent failure: Lost task 0.0 in stage 72.0 (TID 37, localhost): java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object; I put a brief illustration of this phenomenon …
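A NoSuchMethodError on scala.Predef$.ArrowAssoc (the method behind ->) is the classic symptom of a Scala binary-version mismatch: the fat jar was compiled against one Scala version while Spark runs another, or a second Scala runtime was bundled into the assembly. A hedged build.sbt sketch of the usual guard rails:

```scala
// build.sbt — keep the Scala binary version aligned with the _2.11 artifact
// suffix, and mark Spark "provided" so the cluster's own copy (and its Scala
// runtime) is the only one on the classpath at run time.
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
)
```

Using %% rather than a hard-coded _2.11 suffix lets sbt keep the suffix in sync with scalaVersion automatically.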

SBT: How to package an instance of a class as a JAR?

血红的双手。, submitted on 2019-12-23 09:01:03
Question: I have code which essentially looks like this: class FoodTrainer(images: S3Path) { // data is a >100 GB file living in S3; def train(): FoodClassifier // very expensive - takes ~5 hours! } class FoodClassifier { // light-weight API class; def isHotDog(input: Image): Boolean } At JAR-assembly (sbt assembly) time I want to invoke val classifier = new FoodTrainer(s3Dir).train() and publish the JAR with the classifier instance instantly available to downstream library users. What is the …
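One way to approach this is to run the expensive training during the build and bake the serialized result into the jar's resources, deserializing it lazily at run time. A sketch only: FoodTrainer/FoodClassifier and s3Dir are the question's own hypothetical types, the training code must be visible to the build (e.g. in project/ or a subproject the build depends on), and FoodClassifier is assumed Serializable:

```scala
// build.sbt — generate a serialized classifier as a managed resource, so it
// ships inside the jar produced by sbt assembly.
resourceGenerators in Compile += Def.task {
  val out = (resourceManaged in Compile).value / "food-classifier.bin"
  val model = new FoodTrainer(s3Dir).train() // the ~5 h step, run at build time
  val oos = new java.io.ObjectOutputStream(new java.io.FileOutputStream(out))
  try oos.writeObject(model) finally oos.close()
  Seq(out)
}.taskValue
```

Downstream code can then read getClass.getResourceAsStream("/food-classifier.bin") once and cache the deserialized instance. The trade-off is a 5-hour build whenever the resource is regenerated, so caching the artifact between builds matters.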

Deduplication error in build.sbt when building a fat jar using sbt assembly

对着背影说爱祢, submitted on 2019-12-23 02:42:20
Question: I'm trying to create a fat jar combining Spark and the Restlet framework, but I keep getting this deduplication error when running sbt assembly: java.lang.RuntimeException: deduplicate: different file contents found in the following: /Users/ccd/.ivy2/cache/org.eclipse.jetty.orbit/javax.transaction/orbits/javax.transaction-1.1.1.v201105210645.jar:META-INF/ECLIPSEF.RSA /Users/ccd/.ivy2/cache/org.eclipse.jetty.orbit/javax.servlet/orbits/javax.servlet-3.0.0.v201112011016 …
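Deduplication errors on META-INF signature files (ECLIPSEF.RSA and friends) are resolved with a custom merge strategy; the signatures are invalid in a repacked fat jar anyway, so discarding them is safe. A common build.sbt sketch:

```scala
// build.sbt — discard META-INF entries (jar signatures, duplicate manifests)
// and take the first copy of any other colliding file.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x                             => MergeStrategy.first
}
```

The blanket MergeStrategy.first is a blunt default: files that must be merged rather than picked, such as reference.conf, deserve their own case (MergeStrategy.concat) above the catch-all.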