sbt

Why is sbt's current project name “default” in 0.10?

匆匆过客 Submitted on 2020-01-21 04:33:14
Question: I'm using sbt 0.10 to build a Scala project with just a build.sbt file instead of a full configuration. Every time I start sbt it prints messages like these:

[info] Set current project to default-ee699e (in build file:/Users/.../project/plugins/)
[info] Set current project to default-8febe7 (in build file:/Users/.../)

I did set the name and mainClass settings in the build.sbt file, so I don't know what I need to set to make the default-XXXX project names go away. EDIT: the answer given …
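For reference, the usual way to control the project id shown in "Set current project to ..." under sbt 0.10 is to move from a bare build.sbt to a full configuration. A minimal sketch of that approach, with "myproject" as a placeholder id (the line mentioning project/plugins/ comes from the separate plugins build, which gets its own auto-generated id):

// project/Build.scala -- a minimal sketch of a full configuration for sbt 0.10/0.11;
// the id passed to Project(...) is what appears in "Set current project to ...".
import sbt._
import Keys._

object MyBuild extends Build {
  lazy val root = Project(id = "myproject", base = file(".")).settings(
    name := "My Project",      // artifact/display name
    version := "0.1",
    scalaVersion := "2.9.1"    // placeholder Scala version
  )
}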

SBT: How to access environment variable or configuration?

我只是一个虾纸丫 Submitted on 2020-01-21 02:47:06
Question: I publish to an internal Nexus repository. We have two repos, "dev" and "production". Developers use the dev repo; the build team uses the production repo, which they access from machines in a secure area. I would like to add an environment variable or SBT config that defines STAGE with a default value of "dev". On the production build boxes STAGE would be overridden to "production". How can I do this? I am able to define stage in my build.sbt file and use it in the publishTo task, I just can't …
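One way to wire this up (not taken from the question itself) is to read the variable inside the publishTo setting. A minimal sketch, assuming an environment variable named STAGE and hypothetical repository URLs:

// build.sbt -- a minimal sketch; the Nexus host and repository paths are placeholders.
publishTo := {
  val stage = sys.env.getOrElse("STAGE", "dev")   // defaults to "dev"
  val nexus = "https://nexus.example.com/content/repositories/"
  if (stage == "production")
    Some("production" at nexus + "production")
  else
    Some("dev" at nexus + "dev")
}

On a production build box you would then run something like STAGE=production sbt publish, while developers get the dev repo by default.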

NoSuchMethodError from spark-cassandra-connector with assembled jar

五迷三道 Submitted on 2020-01-20 07:29:25
Question: I'm fairly new to Scala and am trying to build a Spark job. I've built a job that contains the DataStax connector and assembled it into a fat jar. When I try to execute it, it fails with a java.lang.NoSuchMethodError. I've cracked open the JAR and can see that the DataStax library is included. Am I missing something obvious? Is there a good tutorial to look at regarding this process? Thanks.

$ spark-submit --class org.bobbrez.CasCountJob ./target/scala-2.11/bobbrez-spark-assembly-0.0.1 …
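Without seeing the build file it is hard to say, but with assembled Spark jobs this error usually points to a version clash inside the fat jar. A sketch of the commonly suggested setup, assuming sbt-assembly and placeholder version numbers that you would align with your cluster:

// build.sbt -- a sketch, not the asker's build: mark Spark as "provided" so the
// cluster's own Spark classes are used, and shade Guava, which the Cassandra
// connector and Spark/Hadoop tend to disagree on.
libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-core"                % "1.6.0" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.0"
)

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "shaded.guava.@1").inAll
)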

Spark MLlib example, NoSuchMethodError: org.apache.spark.sql.SQLContext.createDataFrame()

≯℡__Kan透↙ Submitted on 2020-01-16 06:54:00
Question: I'm following the documentation example Example: Estimator, Transformer, and Param, and I get this error message:

15/09/23 11:46:51 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
at SimpleApp$.main(hw.scala:75)

Line 75 is the call to sqlContext.createDataFrame():

import java.util.Random
import org.apache.log4j.Logger
import org …
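This particular NoSuchMethodError on scala.reflect.api.JavaUniverse.runtimeMirror is the classic symptom of compiling against a different Scala binary version than the one Spark was built with. A minimal sketch of the version alignment, with placeholder versions to match to your cluster:

// build.sbt -- a sketch; pick the Scala and Spark versions that match your cluster.
scalaVersion := "2.10.5"   // must match the Scala line (2.10 vs 2.11) of your Spark build

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.5.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.5.0" % "provided"
)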

Play! 2.1: sbt publish of a dist fails with a "cycle in tasks" error

守給你的承諾、 Submitted on 2020-01-16 02:52:07
Question: Upon migrating from 2.0.4 to 2.1, we encountered a problem with our publish task, which sends the zip file from the dist task to Artifactory. Now we are getting the following error: Internal task engine error: nothing running. This usually indicates a cycle in tasks. There is discussion about this in the Play framework user groups: https://github.com/playframework/Play20/pull/535 and https://groups.google.com/forum/#!topic/play-framework/BoWw65F6vg8 We basically tried to do what they …
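One pattern that avoids redefining publish (and hence the task cycle) is to attach the dist zip as an additional artifact and let the stock publish task pick it up. The following is only a sketch: the dist task key and artifact name below are assumptions to adapt to Play 2.1's actual keys.

// Build.scala excerpt -- a sketch; `distTask` stands in for whatever TaskKey[File]
// produces the zip in your build, and "myapp" is a placeholder artifact name.
import sbt._
import Keys._

object ApplicationBuild extends Build {
  val distTask = TaskKey[File]("dist")   // assumed to be the task that builds the zip

  lazy val main = Project("myapp", file(".")).settings(
    addArtifact(Artifact("myapp", "zip", "zip"), distTask).settings: _*
  )
}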

Conditional libraries in build.sbt

浪尽此生 Submitted on 2020-01-15 23:38:15
Question: Using build.sbt, I'm trying to do something like this:

if (condition) {
  libraryDependencies += ...        // library from Maven
} else {
  unmanagedJars in Compile += ...   // local library instead
}

However, build.sbt doesn't like this at all. I've been able to accomplish it using side effects, but that's obviously undesirable. Any advice would be appreciated. Thanks.

Answer 1: This might be easier to do using a Build.scala build definition. Here is an example build.sbt file (this is optional): import sbt. …
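It can also stay in build.sbt if the conditional is moved inside the settings themselves, since each statement in a .sbt file must evaluate to a setting. A minimal sketch, assuming sbt 0.13 or newer (which allows val definitions in .sbt files), with a system-property check and hypothetical coordinates standing in for the real condition and library:

// build.sbt -- a minimal sketch; `useLocalLib`, the coordinates, and the jar path
// are placeholders for your actual condition and dependencies.
val useLocalLib = sys.props.getOrElse("useLocalLib", "false") == "true"

libraryDependencies ++= {
  if (useLocalLib) Seq.empty
  else Seq("org.example" %% "some-library" % "1.0.0")   // library from Maven
}

unmanagedJars in Compile ++= {
  if (useLocalLib) Seq(Attributed.blank(file("lib/some-library.jar")))   // local jar instead
  else Seq.empty
}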

Play 2.3.6 Java - OutOfMemory errors w/ sbt-uglify

有些话、适合烂在心里 Submitted on 2020-01-15 10:20:17
Question: I'm having an issue when trying to use the sbt-uglify plugin. I've configured the project per https://github.com/sbt/sbt-uglify:

in plugins.sbt: addSbtPlugin("com.typesafe.sbt" % "sbt-uglify" % "1.0.3")
in build.sbt: pipelineStages := Seq(uglify, digest, gzip)

I have a non-trivial number of JS files (60+). What I've been getting is an OutOfMemory exception, which prevents me from completing a dist command. Has anyone else encountered this problem? Any ideas/solutions are …
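Two mitigations that commonly come up for this (not confirmed fixes from the question): give sbt's JVM more heap, for example by launching it with SBT_OPTS="-Xmx2G", and switch the sbt-web JavaScript engine from the in-JVM Trireme to Node, which moves the uglify work out of sbt's heap. A sketch of the latter, assuming sbt-js-engine's keys and Node available on the PATH:

// build.sbt -- a sketch; requires a local Node installation and sbt-js-engine
// (pulled in by the sbt-web plugins). The import may already be provided for you.
import com.typesafe.sbt.jse.JsEngineKeys

JsEngineKeys.engineType := JsEngineKeys.EngineType.Node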

How do I get the SBT staging directory at build time?

牧云@^-^@ Submitted on 2020-01-15 07:08:26
Question: How do I get the SBT staging directory at build time? I want to do a tricky clone of a remote repo, and SBT's stagingDirectory seems to be a nice fit. How do I get the directory inside Build.scala? SBT source code: http://www.scala-sbt.org/0.13.1/sxr/sbt/BuildPaths.scala.html#sbt.BuildPaths.stagingDirectory

======= Underlying problem, NOT directly relevant to the question: I wanted to use a subdirectory of a git dependency in SBT. SBT doesn't provide this out of the box, so I wrote a simple …
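A sketch against sbt 0.13's internal BuildPaths helpers (the object linked above); these are not a stable public API, so the exact names are an assumption to verify against your sbt release:

// Build.scala excerpt -- a sketch using sbt 0.13 internals; check the method
// names against the BuildPaths source for your sbt version.
import sbt._
import sbt.BuildPaths.{ getGlobalBase, getStagingDirectory }

// Works anywhere you have access to the sbt State, e.g. inside a command.
def stagingDir(state: State): File =
  getStagingDirectory(state, getGlobalBase(state))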