sbt

Unable to run unit tests (ScalaTest) on Spark 2.2.0 / Scala 2.11.8

我的梦境 Submitted on 2020-01-24 00:51:06
Question: Unable to run ScalaTest with context on Spark 2.2.0. StackTrace: An exception or error caused a run to abort: org.apache.spark.sql.test.SharedSQLContext.eventually(Lorg/scalatest/concurrent/PatienceConfiguration$Timeout;Lscala/Function0;Lorg/scalatest/concurrent/AbstractPatienceConfiguration$PatienceConfig;)Ljava/lang/Object; java.lang.NoSuchMethodError: org.apache.spark.sql.test.SharedSQLContext.eventually(Lorg/scalatest/concurrent/PatienceConfiguration$Timeout;Lscala/Function0;Lorg
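A NoSuchMethodError on SharedSQLContext.eventually usually means the ScalaTest version on the test classpath differs from the one Spark's spark-sql test jar was compiled against. A minimal build.sbt sketch that pins the versions together; the ScalaTest 3.0.3 version is an assumption here and should be verified against Spark 2.2.0's pom:

```scala
// build.sbt (sketch): align ScalaTest with spark-sql's test jar.
// "3.0.3" is an assumption -- check Spark 2.2.0's pom for the exact version.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"  % "2.2.0",
  // the test-classified jar that provides SharedSQLContext
  "org.apache.spark" %% "spark-sql"  % "2.2.0" % Test classifier "tests",
  "org.apache.spark" %% "spark-core" % "2.2.0" % Test classifier "tests",
  "org.scalatest"    %% "scalatest"  % "3.0.3" % Test
)
```

If a transitive dependency drags in a different ScalaTest, `dependencyOverrides` (or inspecting `sbt evicted`) can force the matching version.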

Scala and SBT install on Debian with Java 8

﹥>﹥吖頭↗ Submitted on 2020-01-23 18:54:08
Question: I have the Java 8 server runtime on Debian. Tried: dpkg -i scala-2.11.7.deb, which reports: scala depends on java6-runtime-headless; however: Package java6-runtime-headless is not installed. But this is ok: scala -version gives Scala code runner version 2.11.7 -- Copyright 2002-2013, LAMP/EPFL. Now, installing SBT from the bintray repo deactivates my Java 8 and installs openjdk-7-* ... I had to "apt-get purge openjdk-7-*" (the wrong way). EDIT: Problem corrected with: update-alternatives --config java Answer 1: Here is the list of
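The asker's own fix, the interactive `update-alternatives --config java`, can also be done non-interactively. A sketch, assuming the usual Debian OpenJDK 8 path (verify yours with `--list` first):

```shell
# List the JVMs registered with the alternatives system.
update-alternatives --list java

# Point the java alternative back at Java 8 explicitly.
# The path below is an assumption -- use whatever --list printed.
sudo update-alternatives --set java /usr/lib/jvm/java-8-openjdk-amd64/bin/java

# Confirm sbt/scala will now see Java 8.
java -version
```

Marking the Java 8 package as the held default (or installing a `java6-runtime-headless` provider shim) also stops apt from pulling openjdk-7 back in as a dependency of the scala package.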

Difference between `{.}/*:name` and `*/*:name` in sbt?

谁都会走 提交于 2020-01-23 06:24:13
Question: From the sbt documentation (e.g. scopes), I see: {.}/*:name means name in the entire build (use name in ThisBuild to define it), and */*:name means name in the global project (use name in Global to define it). (PS: I ignored the config part *:.) But I still don't know the difference between them; they seem exactly the same to me. Is there anything I can do with one rather than the other? Answer 1: Whatever version you specified in ThisBuild will be applied to all projects in your build, overriding
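The two scopes can be shown side by side in a build.sbt sketch (sbt 0.13 `in`-style syntax, matching the question's `*:` notation; the version strings are illustrative):

```scala
// build.sbt (sketch): Global vs ThisBuild scoping.
version in Global    := "0.1-global"   // */*:version  -- outermost fallback,
                                       // consulted last, also visible to plugins
version in ThisBuild := "1.0"          // {.}/*:version -- applies to every
                                       // project in THIS build, beats Global

lazy val core = (project in file("core")).settings(
  version := "2.0"                     // per-project setting beats both
)
```

The practical difference is delegation order: a project without its own `version` falls back to `ThisBuild`, and only if that is unset to `Global`, so `ThisBuild` is the usual place for build-wide values while `Global` is mostly used by plugins to supply defaults.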

SBT: run code in project after compile

余生长醉 提交于 2020-01-23 01:13:21
Question: We need to run some code after the compile step. Making things happen after the compile step seems easy: compile in Compile <<= (compile in Compile) map { x => // post-compile work doFoo() x } but how do you run something in the freshly compiled code? More info on the scenario: we are using LESS for CSS in a Lift project. We wanted Lift to compile LESS into CSS on the fly (if needed) to help dev, but produce the CSS using the same code during the build, before tests etc. run. less-sbt may help but
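The `<<=` operator in the question is sbt 0.12-era syntax; the same post-compile hook with the newer `.value` macros looks like this (a sketch; `doFoo` stands in for the question's own post-compile work):

```scala
// build.sbt (sketch): rewire compile to run extra work afterwards.
compile in Compile := {
  val analysis = (compile in Compile).value  // the real compile runs first
  // post-compile work goes here, e.g. generating CSS from LESS sources
  streams.value.log.info("running post-compile step")
  analysis                                   // hand the analysis back to sbt
}
```

To execute a main class *from the freshly compiled code*, a separate task can depend on `fullClasspath in Compile` and invoke `(runMain in Compile).toTask(" com.example.CssGenerator").value` (class name hypothetical); doing that inside the `compile` override itself would recurse, so it belongs in its own task wired before `test`.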

Publish zip created by sbt-native-packager

自闭症网瘾萝莉.ら Submitted on 2020-01-22 14:37:40
Question: I am trying to publish the zip file generated by the sbt-native-packager plugin (through the universal:packageBin task) to a repository. I configured my project like this: publishTo := Some("Repo" at "http://repo") publishMavenStyle := true packagerSettings packageArchetype.java_application I am struggling to create a new sbt task (named publishZip) using the publish task and the packageBin task to publish the zip file. How can I achieve this? Answer 1: Add the following line to your sbt build (around
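Rather than writing a custom publishZip task, the zip can be attached as an extra artifact of the project so that the ordinary `publish` uploads it. A sketch using sbt's `addArtifact` helper (the artifact name is taken from the project; adjust to taste):

```scala
// build.sbt (sketch): register the universal zip as a published artifact.
// After this, plain `sbt publish` uploads the zip next to the jar/pom.
addArtifact(
  Artifact("myapp", "zip", "zip"),   // name/type/extension -- "myapp" is illustrative
  packageBin in Universal            // the task that produces the zip
)
```

Because `publish` now depends on `packageBin in Universal`, the zip is rebuilt automatically when publishing; no separate task is needed.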

Why Maven assembly works when SBT assembly finds conflicts

非 Y 不嫁゛ 提交于 2020-01-22 13:56:08
Question: The title could also be: What are the differences between the Maven and SBT assembly plugins? I found this to be an issue while migrating a project from Maven to SBT. To describe the problem I have created an example project with dependencies that I found to behave differently depending on the build tool: https://github.com/atais/mvn-sbt-assembly The only dependencies are (sbt style): "com.netflix.astyanax" % "astyanax-cassandra" % "3.9.0", "org.apache.cassandra" % "cassandra-all" % "3.4",
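The short version of the difference: maven-assembly-plugin silently keeps the first copy of a duplicated classpath entry, while sbt-assembly fails the build unless told how to merge. A hedged sketch reproducing Maven's "first wins" behavior for the conflicting entries:

```scala
// build.sbt (sketch): make sbt-assembly behave like maven-assembly-plugin,
// which quietly takes the first occurrence of each duplicate file.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard  // signatures etc.
  case _                             => MergeStrategy.first    // Maven-like default
}
```

`MergeStrategy.first` is a blunt instrument; where the duplicates actually matter (service loader files, reference.conf), `MergeStrategy.concat` or `MergeStrategy.filterDistinctLines` is usually the safer choice.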

SBT assembly jar exclusion

*爱你&永不变心* 提交于 2020-01-22 05:51:33
Question: I'm using Spark (via the Java API) and need a single jar that can be pushed to the cluster; however, the jar itself should not include Spark. The app that deploys the jobs, of course, should include Spark. I would like: sbt run - everything should be compiled and executed; sbt smallAssembly - create a jar without Spark; sbt assembly - create an uber jar with everything (including Spark) for ease of deployment. I have 1. and 3. working. Any ideas on how I can do 2.? What code would I need to add to my
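The usual way to get a "small" assembly is to mark Spark as "provided" so sbt-assembly leaves it out, then restore it to the classpath for local `sbt run`. A sketch (Spark coordinates illustrative; the run-task rewiring is the trick documented by sbt-assembly):

```scala
// build.sbt (sketch): Spark stays off the assembled jar but on the run classpath.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"

// `provided` deps are excluded from `run` by default; put them back so
// `sbt run` still works locally.
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated
```

With this, plain `sbt assembly` already produces the Spark-free jar (covering the asker's `smallAssembly`); building a second, Spark-including uber jar would need a separate configuration or sub-project that drops the `provided` qualifier.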

sbt compile-time warning: non-variable type argument String in type pattern List[String]

谁说我不能喝 Submitted on 2020-01-22 02:30:08
Question: My sbt build shows a warning: non-variable type argument String in type pattern List[String] (the underlying of List[String]) is unchecked since it is eliminated by erasure. I tried the answer (first solution) given in the link Erasure elimination in scala : non-variable type argument is unchecked since it is eliminated by erasure. Here is my code: case class ListStrings(values: scala.List[String]) { } def matchValue(value: Any) = { value match { case ListStrings(xs) => val userList = xs
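The warning fires whenever a pattern tries to check a type argument that the JVM has already erased: at runtime a `List[String]` is just a `List`, so `case xs: List[String]` cannot actually verify the element type. A self-contained sketch of the standard workaround, matching with a wildcard type argument instead (names are illustrative, not the asker's code):

```scala
object ErasureDemo {
  // `case xs: List[String]` would trigger the unchecked warning, because
  // the String type argument is erased and cannot be tested at runtime.
  // Matching List[_] checks only what the JVM can check: "it is a List".
  def describe(value: Any): String = value match {
    case xs: List[_] => s"a list of ${xs.length} elements"  // no warning
    case s: String   => s"a string: $s"
    case other       => s"something else: $other"
  }
}
```

When the element type genuinely must be trusted (e.g. the value always comes from a known producer), wrapping the list in a case class like the asker's `ListStrings` and matching on the wrapper is the type-safe route, since the extractor carries the `List[String]` type statically.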

Why does sbt.Extracted remove a previously defined TaskKey in its append method?

会有一股神秘感。 提交于 2020-01-21 12:18:27
Question: There is a suitable method in sbt.Extracted to add a TaskKey to the current state. Assume I have inState: State: val key1 = TaskKey[String]("key1") Project.extract(inState).append(Seq(key1 := "key1 value"), inState) I ran into strange behavior when I do it twice. I got an exception in the following example: val key1 = TaskKey[String]("key1") val key2 = TaskKey[String]("key2") val st1: State = Project.extract(inState).append(Seq(key1 := "key1 value"), inState) val st2:
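The behavior comes from `append` rebuilding the settings without the session that the first call produced, so the first key is dropped. A hedged fragment of the workaround (this assumes a newer sbt where `Extracted.appendWithSession` exists; it is not standalone-runnable, since it needs a live sbt `State`):

```scala
// sketch: keep the session settings from the first append alive by
// re-extracting from the RETURNED state and using appendWithSession,
// which preserves session-level settings instead of discarding them.
val st1: State = Project.extract(inState)
  .appendWithSession(Seq(key1 := "key1 value"), inState)
val st2: State = Project.extract(st1)
  .appendWithSession(Seq(key2 := "key2 value"), st1)
```

On sbt versions without `appendWithSession`, re-appending the first key's setting together with the second in a single `append` call is the blunt equivalent.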

Why does sbt report "No java installations was detected" with $JAVA_HOME set?

邮差的信 Submitted on 2020-01-21 12:10:12
Question: I have 2 sbt-android-scala projects. The first one is a single one: $ ls -al drwxr-xr-x 13 alex staff 442 Dec 24 20:44 . drwxr-xr-x 4 alex staff 136 Dec 24 21:08 .. drwxr-xr-x 12 alex staff 408 Dec 24 20:38 .git -rw-r--r-- 1 alex staff 141 Dec 24 20:38 .gitignore -rw-r--r-- 1 alex staff 115 Dec 24 20:38 .travis.yml -rw-r--r-- 1 alex staff 664 Dec 24 20:38 CHANGES -rw-r--r-- 1 alex staff 1418 Dec 24 20:38 LICENSE -rw-r--r-- 1 alex staff 2491 Dec 24 20:38 README.md -rw-r--r-- 1 alex staff 874
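When sbt's launcher script claims no Java installation despite `$JAVA_HOME` being set, the quickest checks are that the variable actually points at a JDK layout and, failing that, bypassing detection with the launcher's `-java-home` flag. A sketch:

```shell
# Verify JAVA_HOME points at a real JDK (there must be a bin/java under it).
echo "$JAVA_HOME"
ls "$JAVA_HOME/bin/java"

# Bypass sbt's own detection and hand it the JDK explicitly.
sbt -java-home "$JAVA_HOME" compile
```

If the explicit flag works while the bare `sbt` invocation does not, the problem is typically that `$JAVA_HOME` is set in a shell profile the sbt wrapper script never sources (e.g. a login-only profile when sbt is launched from an IDE or a non-login shell).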