sbt

java.lang.NoClassDefFoundError: org/slf4j/Logger class not found with IntelliJ and SBT

Submitted by 这一生的挚爱 on 2020-08-08 05:15:34

Question: I am trying to run a Spark Scala program in which I have used `import org.apache.spark.internal.Logging`. The program was running fine locally until I tried to build a fat JAR and added assembly.sbt. Here's my build.sbt:

```scala
lazy val root = (project in file(".")).
  settings(
    name := "TestingSparkApplicationProject",
    version := "0.0.1-SNAPSHOT",
    scalaVersion := "2.11.12",
    mainClass in Compile := Some("com.test.spark.job.TestSparkJob.scala")
  )

val sparkVersion = "2.4.3"

libraryDependencies
```
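In this setup the slf4j classes usually arrive transitively through the Spark artifacts, so marking Spark as Provided for the assembly and then launching the main class from IntelliJ or sbt leaves org.slf4j.Logger off the runtime classpath. Below is a minimal sketch of one common fix; the versions come from the question, but the run-task override is an assumption about the intended setup, not the asker's verbatim build:

```scala
val sparkVersion = "2.4.3"

// Provided: compiled against, but excluded from the assembly fat JAR.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
  "org.apache.spark" %% "spark-sql"  % sparkVersion % Provided
)

// Put the provided dependencies back on the classpath for `sbt run`,
// so slf4j (pulled in transitively by Spark) is present at runtime.
Compile / run := Defaults.runTask(
  Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated
```

Separately, note that `mainClass in Compile := Some("com.test.spark.job.TestSparkJob.scala")` in the question names a file rather than a class; the `.scala` suffix almost certainly needs to be dropped.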

sbt - object apache is not a member of package org

Submitted by 谁说我不能喝 on 2020-08-04 07:41:27

Question: I want to deploy and submit a Spark program using sbt, but it is throwing an error. Code:

```scala
package in.goai.spark

import org.apache.spark.{SparkContext, SparkConf}

object SparkMeApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("First Spark")
    val sc = new SparkContext(conf)
    val fileName = args(0)
    val lines = sc.textFile(fileName).cache
    val c = lines.count
    println(s"There are $c lines in $fileName")
  }
}
```

build.sbt:

```scala
name := "First Spark"
version := "1.0"
organization := "in.goai
```
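This error means the compiler cannot resolve the `org.apache.spark` package, which almost always comes down to the Spark dependency missing from libraryDependencies (or a scalaVersion for which no Spark artifact is published). A minimal build.sbt sketch that would compile the code above; the Spark and Scala versions here are assumptions, so pick the ones matching your cluster:

```scala
name := "First Spark"
version := "1.0"
organization := "in.goai"

// Must match the Scala binary version your Spark artifacts were built for.
scalaVersion := "2.11.12"

// %% appends the Scala binary suffix, resolving e.g. spark-core_2.11.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3"
```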

How to work efficiently with SBT, Spark and “provided” dependencies?

Submitted by 大憨熊 on 2020-07-31 06:57:20

Question: I'm building an Apache Spark application in Scala and I'm using SBT to build it. Here is the thing:

- when I'm developing under IntelliJ IDEA, I want Spark dependencies to be included in the classpath (I'm launching a regular application with a main class)
- when I package the application (thanks to the sbt-assembly plugin), I do not want Spark dependencies to be included in my fat JAR
- when I run unit tests through `sbt test`, I want Spark dependencies to be included in the classpath (same as #1
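A sketch of the usual pattern for these requirements (the version is an assumption): mark Spark as Provided so sbt-assembly excludes it from the fat JAR, and override the run task so `sbt run` still sees it. Provided dependencies already sit on the test classpath in sbt, so `sbt test` needs no extra wiring:

```scala
val sparkVersion = "2.4.3" // assumption; use your Spark version

// Compiled against, but excluded from the sbt-assembly fat JAR.
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % Provided

// Re-add provided dependencies to the classpath for `sbt run`,
// which also covers IntelliJ runs delegated to sbt.
Compile / run := Defaults.runTask(
  Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated
```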

Setting an environment variable from within the sbt shell

Submitted by 假装没事ソ on 2020-07-30 05:22:21

Question: I would like to be able to set an environment variable from within the interactive sbt shell, but I can't seem to find a way to do that. (I have looked in the official sbt docs as well as on Stack Overflow, without success.) To be clear, I don't want to set this environment variable in the build.sbt file; rather, I want to change it on the fly in my interactive sbt shell session, so that the variable is used for the next sbt commands I run. For example, I
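A running JVM cannot mutate its own environment, so sbt cannot literally change an environment variable for itself. The commonly suggested workaround, sketched below under that assumption, is to pass variables to forked subprocesses via the `envVars` setting, which `set` can change on the fly for the rest of the shell session (this only affects forked `run`/`test`, not the sbt process itself):

```
sbt:myproject> set fork := true
sbt:myproject> set envVars += ("MY_VAR" -> "some-value")
sbt:myproject> run
```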

Exception in thread “main” java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps

Submitted by 怎甘沉沦 on 2020-07-15 09:00:07

Question: When I run in a terminal:

```
sudo spark-submit --master local --class xxx.xxxx.xxx.xxxx.xxxxxxxxxxxxJob --conf 'spark.driver.extraJavaOptions=-Dconfig.resource=xxx.conf' /home/xxxxx/workspace/prueba/pruebas/target/scala-2.11/MiPrueba.jar
```

I get the following error:

```
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
  at pureconfig.DurationUtils$.words(DurationUtils.scala:36)
  at pureconfig.DurationUtils$.pureconfig
```
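A `NoSuchMethodError` on `scala.Predef$.refArrayOps` is the classic signature of a Scala binary-version mismatch: the JAR under `scala-2.11` is presumably being run against, or bundled with, libraries built for a different Scala version. A hedged sketch of the usual fix, aligning everything on one binary version (the specific versions here are assumptions):

```scala
// All artifacts must share one Scala binary version. If spark-submit runs on
// a Spark built for Scala 2.11, the fat JAR's libraries must be _2.11 builds.
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark"      %% "spark-core" % "2.4.3" % Provided,
  "com.github.pureconfig" %% "pureconfig" % "0.12.3" // %% selects the _2.11 artifact
)
```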

How to load a class from the source code using reflection inside SBT task?

Submitted by 送分小仙女□ on 2020-07-09 09:34:30

Question: I'm using SBT to build my project. I want to analyze some classes from my source code using either Scala or Java reflection during the build process. How do I define an SBT task that loads a single known class, or all classes, from my source code?

```scala
import sbt._

val loadedClasses = taskKey[Seq[Class[_]]]("All classes from the source")
val classToLoad = settingKey[String]("Scala class name to load")
val loadedClass = taskKey[Seq[Class[_]]]("Loaded classToLoad")
```

Answer 1: You can use the output of the
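The truncated answer points at reusing the compilation output. One plausible sketch of such a task (the classloader wiring is an assumption, not the verbatim answer): depend on `Compile / compile` so the class files exist, then load the class through a URLClassLoader built from the project's full classpath:

```scala
import sbt._
import sbt.Keys._

val classToLoad = settingKey[String]("Scala class name to load")
val loadedClass = taskKey[Class[_]]("Loaded classToLoad")

loadedClass := {
  // Ensure class files are up to date before loading.
  val _ = (Compile / compile).value
  // Turn each classpath entry (class directory or JAR) into a URL.
  val urls = (Compile / fullClasspath).value.map(_.data.toURI.toURL).toArray
  val loader = new java.net.URLClassLoader(urls, getClass.getClassLoader)
  loader.loadClass(classToLoad.value)
}
```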