sbt

Dependencies for Spark-Streaming and Twitter-Streaming in SBT

廉价感情. Submitted on 2020-05-16 03:09:30
Question: I was trying to use the following dependencies in my build.sbt, but it keeps giving an "unresolved dependency" error:

    libraryDependencies += "org.apache.bahir" %% "spark-streaming-twitter_2.11" % "2.2.0.1.0.0-SNAPSHOT"
    libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.2.0"

I'm using Spark 2.2.0. What are the correct dependencies? Answer 1: The question was posted a while ago, but I ran into the same problem this week. Here is the solution for those who still have the problem: As
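A likely cause, offered here only as an assumption since the answer above is cut off: %% already appends the Scala binary-version suffix, so combining it with an artifact id that already ends in _2.11 asks for spark-streaming-twitter_2.11_2.11, and the SNAPSHOT version is not published to Maven Central anyway. A minimal sketch, assuming Scala 2.11 and the released Bahir 2.2.0 artifact:

    // build.sbt -- sketch, assuming Scala 2.11 and Spark 2.2.0
    scalaVersion := "2.11.12"

    // Use %% with the bare artifact name, or % with the suffixed name, but not both.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming"         % "2.2.0",
      "org.apache.bahir" %% "spark-streaming-twitter" % "2.2.0"
    )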

SBT: Cross build project for two Scala versions with different dependencies

こ雲淡風輕ζ Submitted on 2020-05-15 02:39:26
Question: I have the following use case. I would like to build the same Scala project for Scala 2.10 and 2.12. When doing so, I would like some of the dependencies for the 2.10 version to be marked as provided, whereas for 2.12 I'd like them compiled into the jar. I was looking at SBT's docs and found how to split a build.sbt into separate declarations, but those are always described as sub-modules. In my case I'd like to cross-build the whole app - not specific parts of it. Any hints or
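One common pattern, offered as an assumption since the answer is not shown above, is to branch on the Scala version when declaring dependencies, so the same project gets different configurations per cross-build without sub-modules. A sketch, where "some-lib" is a hypothetical dependency:

    // build.sbt -- sketch; com.example %% some-lib is a placeholder
    crossScalaVersions := Seq("2.10.7", "2.12.10")

    libraryDependencies ++= {
      CrossVersion.partialVersion(scalaVersion.value) match {
        // In the 2.10 build, mark the dependency as provided
        case Some((2, 10)) => Seq("com.example" %% "some-lib" % "1.0.0" % "provided")
        // In the 2.12 build (and any other), compile it into the jar
        case _             => Seq("com.example" %% "some-lib" % "1.0.0")
      }
    }

Running sbt +package then builds both variants with their respective dependency scopes.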

scoverage: Combine Coverage from test and it:test

半世苍凉 Submitted on 2020-05-13 02:05:10
Question: I split my unit and integration tests with a filter:

    lazy val FunTest = config("it") extend Test
    def funTestFilter(name: String): Boolean = name endsWith "Spec"
    def unitTestFilter(name: String): Boolean = name endsWith "Test"
    ...
    testOptions in Test := Seq(Tests.Filter(unitTestFilter)),
    testOptions in FunTest := Seq(Tests.Filter(funTestFilter)),
    ...

So I can do something like this:

    sbt clean coverage test dockerComposeUp it:test dockerComposeStop coverageReport

Sadly that kills all my
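The body above is truncated, so the following is only a sketch of the usual wiring for such a custom test configuration, not the missing answer. scoverage writes its measurements under target/.../scoverage-data, so both test runs have to happen in one coverage-enabled session, with nothing in between cleaning that directory:

    // build.sbt -- sketch of an "it" configuration sharing scoverage instrumentation
    lazy val FunTest = config("it") extend Test

    lazy val root = (project in file("."))
      .configs(FunTest)
      .settings(
        inConfig(FunTest)(Defaults.testSettings),
        testOptions in Test    := Seq(Tests.Filter(name => name endsWith "Test")),
        testOptions in FunTest := Seq(Tests.Filter(name => name endsWith "Spec"))
      )

    // Then run both suites in a single coverage-enabled session, e.g.:
    //   sbt clean coverage test it:test coverageReport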

Spark/Scala - Project runs fine from IntelliJ but throws error with SBT

送分小仙女□ Submitted on 2020-05-11 14:49:52
Question: I have a Spark project that I'm running locally in IntelliJ, and it works fine when I run it from there. The project is very simple and just a toy example for the moment. Below is the code:

    package mls.main

    import org.apache.spark.SparkContext._
    import org.apache.spark.rdd.RDD
    import org.apache.spark.sql.{DataFrame, SQLContext}
    import org.apache.spark.{SparkConf, SparkContext}
    import java.nio.file.{Paths, Files}
    import scala.io.Source

    object Main {
      def main(args: Array[String]) {
        import org
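The actual error is not shown above, so this is only an assumption: a frequent cause of "works in IntelliJ, fails under sbt" is marking the Spark dependencies as provided (for spark-submit) and then invoking sbt run, which leaves provided jars off the runtime classpath. A sketch of build.sbt settings that keep provided scope but still let sbt run work:

    // build.sbt -- sketch, assuming the Spark artifacts are marked "provided"
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.2.0" % "provided"
    )

    // Put provided dependencies back on the classpath for `sbt run`
    run in Compile := Defaults
      .runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
      .evaluated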

How to release a Scala library to Maven Central using sbt?

别等时光非礼了梦想. Submitted on 2020-05-09 17:55:08
Question: I have an open-source Scala project using SBT and I would like to release my library to Maven Central. How do I do it? Answer 1: I always forget how to do this, so here are my notes.

Once in your life: create a Sonatype account.

For every new developer machine: install gpg, e.g. on OSX: brew install gpg. Run gpg --gen-key to generate a new key. Remember the passphrase and email you used. Make sure you see it when you list your secret keys:

    > gpg --list-secret-keys
    ~/.gnupg/pubring.kbx
    ------------------------
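The notes above are cut off. As a rough sketch of where a setup like this usually ends up (plugin versions, organization, URLs and developer details below are placeholders, not taken from the post), publishing is typically done with the sbt-pgp and sbt-sonatype plugins plus the POM metadata Sonatype requires:

    // project/plugins.sbt -- sketch; pick current plugin versions
    addSbtPlugin("com.jsuereth"   % "sbt-pgp"      % "2.0.1")
    addSbtPlugin("org.xerial.sbt" % "sbt-sonatype" % "3.9.7")

    // build.sbt -- metadata Sonatype checks before syncing to Maven Central
    organization := "com.example"   // must match your verified Sonatype group id
    homepage     := Some(url("https://github.com/example/my-lib"))
    licenses     := Seq("Apache-2.0" -> url("https://www.apache.org/licenses/LICENSE-2.0"))
    developers   := List(Developer("example", "Example Author", "me@example.com", url("https://example.com")))
    publishTo    := sonatypePublishToBundle.value

    // then: sbt publishSigned sonatypeBundleRelease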

How to define Kafka (data source) dependencies for Spark Streaming?

对着背影说爱祢 Submitted on 2020-05-08 08:11:14
Question: I'm trying to consume a Kafka 0.8 topic using Spark Streaming 2.0.0, and I'm trying to identify the required dependencies. I have tried using these dependencies in my build.sbt file:

    libraryDependencies += "org.apache.spark" %% "spark-streaming_2.11" % "2.0.0"

When I run sbt package I get unresolved dependencies for all three of these jars, but these jars do exist: https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-8_2.11/2.0.0 Please help in debugging this issue. I'm new
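The answer is not shown here, but a likely culprit is the double Scala-suffix problem: %% already appends _2.11, so the artifact name should not repeat it. A sketch of the dependencies for Spark 2.0.0 with the Kafka 0.8 connector, assuming Scala 2.11:

    // build.sbt -- sketch, assuming Scala 2.11 and Spark 2.0.0
    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming"           % "2.0.0",
      "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
    )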

IntelliJ 2020.1 sbt mainRunner configuration

让人想犯罪 __ Submitted on 2020-04-30 06:33:26
Question: I am trying to follow the instructions for setting up an IntelliJ Scala project to work with sbt. However, I am not finding the run configuration described for IntelliJ 2020.1. Based on this post I understand that the way this is configured has changed. However, that post describes how to make an old project work. What do I do for new projects? Steps to reproduce: create a new sbt project with idea.sbt already configured with mainRunner:

    sbt new tillrohrmann/flink-project.g8

This includes idea.sbt: lazy
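The idea.sbt snippet above is cut off; the following is only a sketch of the kind of mainRunner module that pattern defines, not a quote of the template. The idea is that IntelliJ run configurations point their "Use classpath of module" setting at mainRunner, which re-includes the dependencies the root project marks as provided:

    // idea.sbt -- sketch of a mainRunner module that re-includes "provided" dependencies
    lazy val mainRunner = project.in(file("mainRunner"))
      .dependsOn(RootProject(file(".")))
      .settings(
        // Copy the root project's dependencies, dropping the "provided" scope so that
        // run configurations using this module see the full classpath.
        libraryDependencies := (libraryDependencies in RootProject(file("."))).value.map { module =>
          if (module.configurations.contains("provided")) module.withConfigurations(None)
          else module
        }
      )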

what does a single colon mean in sbt (between two commands)

陌路散爱 Submitted on 2020-04-16 08:02:09
Question: In a .travis.yml file using sbt, I see this script:

    - sbt ++$TRAVIS_SCALA_VERSION test:fastOptJS test:fullOptJS

In sbt, I can run test, and I can run fastOptJS. What does the single colon between them do? In Travis, can one run a sequence of commands? I.e. what does it mean for test:fastOptJS to be followed by test:fullOptJS? Answer 1: "In sbt, I can run test, and I can run fastOptJS. What does the single colon between them do?" test:fastOptJS means fastOptJS in the test scope. The confusion comes
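To make the scoping concrete (this goes beyond the truncated answer above): config:task selects a task inside a configuration, and passing several such arguments to a single sbt invocation simply runs them in sequence. A sketch:

    # "test:fastOptJS" is the fastOptJS task in the Test configuration;
    # in sbt 1.x slash syntax the same key is written Test / fastOptJS.
    sbt "test:fastOptJS" "test:fullOptJS"

    # inside the sbt shell, the equivalent commands are:
    #   Test / fastOptJS
    #   Test / fullOptJS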
