scalatest

Scala unit testing stdin/stdout

Submitted by ♀尐吖头ヾ on 2019-12-07 07:14:50

Question: Is it common practice to unit test stdin/stdout? If so, how would you test something like this?

```scala
import scala.io.StdIn._

object Test {
  def main(args: Array[String]) = {
    println("Please input your text. Leaving an empty line will indicate end of the input.")
    val input = Iterator.continually(readLine()).takeWhile(_ != "").mkString("\n")
    val result = doSomethingWithInput(input)
    println("Result:")
    println(result)
  }
}
```

I'm normally using ScalaTest, if that makes any difference.

Answer 1: Since Scala
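One standard-library-only way to exercise code like the above is to swap the console streams with `Console.withIn`/`Console.withOut`, since `scala.io.StdIn.readLine` reads from `Console.in`. A minimal sketch (the `run` method here is a hypothetical stand-in for the function under test, not from the original post):

```scala
import java.io.{ByteArrayOutputStream, StringReader}

object StdInOutDemo {
  // Hypothetical function under test: reads lines until an empty line, echoes them upper-cased.
  def run(): Unit = {
    val input = Iterator.continually(scala.io.StdIn.readLine()).takeWhile(_ != "").mkString("\n")
    println(input.toUpperCase)
  }

  def main(args: Array[String]): Unit = {
    val fakeIn  = new StringReader("hello\nworld\n\n")
    val fakeOut = new ByteArrayOutputStream()
    // Console.withIn/withOut temporarily replace the streams StdIn.readLine and println use,
    // so the interaction can be driven and captured without a real terminal.
    Console.withIn(fakeIn) {
      Console.withOut(fakeOut) {
        run()
      }
    }
    println(fakeOut.toString.trim) // the captured output: HELLO\nWORLD
  }
}
```

The same `withIn`/`withOut` wrapping can sit inside a ScalaTest test body unchanged, with assertions on the captured `ByteArrayOutputStream`.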

Play/Scala injecting controller into test

Submitted by …衆ロ難τιáo~ on 2019-12-07 05:16:53

Question: According to the Play 2.4 documentation (https://playframework.com/documentation/2.4.x/ScalaTestingWithScalaTest#Unit-Testing-Controllers), the controller should be set up as a trait, like this:

```scala
trait ExampleController {
  this: Controller =>
  def index() = Action {
    Ok("ok")
  }
}

object ExampleController extends Controller with ExampleController
```

in order for a test to work like this:

```scala
class ExampleControllerSpec extends PlaySpec with Results {
  class TestController() extends Controller with
```
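Play aside, the mechanism the documentation relies on is the self-type (`this: Controller =>`): the trait can only be mixed into something that is a `Controller`, which lets a test supply its own. A dependency-free sketch of that wiring, where `Controller` and `Ok` are hypothetical stand-ins for Play's types (not Play's real API):

```scala
// Hypothetical stand-in for Play's Controller, just to show the self-type pattern.
trait Controller {
  def Ok(body: String): String = s"200 $body"
}

trait ExampleController { this: Controller =>
  // The self-type makes Ok available without the trait extending Controller itself.
  def index(): String = Ok("ok")
}

// Production object mixes the trait into a real Controller...
object ExampleController extends Controller with ExampleController

object Demo extends App {
  // ...while a test mixes the same trait into a local test controller.
  class TestController extends Controller with ExampleController
  println(new TestController().index()) // prints "200 ok"
}
```

In the real Play setup the test would extend `PlaySpec with Results` and assert on the `Action`'s result rather than a plain `String`.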

scalatest : object scalatest is not a member of package org

Submitted by 谁说胖子不能爱 on 2019-12-06 18:50:00

Question: EDIT: it works if the file is in src/test/scala/tests/ but not in src/main/scala/mypackage/. Why? I have tried solutions from topics where people have nearly the same issue, but none works. In detail, I have this in build.sbt:

```scala
libraryDependencies ++= Seq(
  ...
  "org.scalatest" % "scalatest_2.10" % "2.2.1" % "test",
  ...
```

In IntelliJ, a class with:

```scala
import org.scalatest.{BeforeAndAfterAll, Suite}
```

shows {BeforeAndAfterAll, Suite} in red, so I guess ScalaTest is not found. sbt package does not work either:
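The `% "test"` at the end of the dependency is the likely culprit: it confines the jar to the test classpath, which is exactly why the import resolves under src/test/scala but not src/main/scala. A sketch of the fix, assuming the code under src/main/scala genuinely needs ScalaTest:

```scala
// build.sbt — with % "test", the jar is only on the test classpath, so imports from
// src/main/scala fail with "object scalatest is not a member of package org".
// Dropping the configuration puts it on the compile classpath as well:
libraryDependencies += "org.scalatest" % "scalatest_2.10" % "2.2.1"
```

If the file is really a test, the cleaner fix is moving it to src/test/scala and keeping the `"test"` scope.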

Hive configuration for Spark integration tests

Submitted by 拟墨画扇 on 2019-12-06 15:54:06

I am looking for a way to configure Hive for Spark SQL integration testing such that tables are written either to a temporary directory or somewhere under the test root. My investigation suggests that this requires setting both fs.defaultFS and hive.metastore.warehouse.dir before the HiveContext is created. Just setting the latter, as mentioned in this answer, does not work on Spark 1.6.1:

```scala
val sqlc = new HiveContext(sparkContext)
sqlc.setConf("hive.metastore.warehouse.dir", hiveWarehouseDir)
```

The table metadata goes to the right place, but the written files go to /user/hive/warehouse. If a dataframe
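A sketch of setting both properties before the HiveContext exists, along the lines the question suggests (assumes Spark 1.6.x; `testRoot` and `hiveWarehouseDir` are hypothetical test-scoped temp directories, and this is an untested configuration sketch, not a verified recipe):

```scala
// Both settings must be in place before the HiveContext is instantiated,
// because the embedded metastore captures them at startup.
val sc = sparkContext
sc.hadoopConfiguration.set("fs.defaultFS", s"file://$testRoot")
System.setProperty("hive.metastore.warehouse.dir", hiveWarehouseDir)

val sqlc = new HiveContext(sc)
// Tables written from here on should land under hiveWarehouseDir
// rather than the default /user/hive/warehouse.
```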

ParallelTestExecution with FlatSpec, Selenium DSL and Spring

Submitted by 天涯浪子 on 2019-12-06 13:31:07

Question: I am using ScalaTest, FlatSpec, Spring, the Selenium DSL, and BeforeAndAfterAll. One of these things seems to stop ParallelTestExecution from working properly. This is what happens when I run a class with two tests:

1. One browser opens and does some beforeAll stuff (but not the Spring stuff)
2. Another browser opens and does the beforeAll stuff
3. The second browser gets used for the first test, then closes
4. Another browser opens and does the beforeAll stuff, followed by the second test
5. The first and third browsers close

So basically

How to suppress deprecation warnings when testing deprecated Scala functions?

Submitted by 元气小坏坏 on 2019-12-06 06:38:13

Question: Suppose I have a library which contains both a deprecated function and a preferred function:

```scala
object MyLib {
  def preferredFunction() = ()

  @deprecated("Use preferredFunction instead", "1.0")
  def deprecatedFunction() = ()
}
```

I want to test both preferredFunction and deprecatedFunction in ScalaTest:

```scala
class MyLibSpec extends FreeSpec with Matchers {
  "preferred function" in {
    MyLib.preferredFunction() should be(())
  }
  "deprecated function" in {
    MyLib.deprecatedFunction() should be(())
  }
}
```

However, a
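One approach (assuming Scala 2.13+, where `scala.annotation.nowarn` is in the standard library; not from the original post) is to confine the deprecated call to an annotated helper, so only that call site is exempted from the warning while the rest of the suite still gets `-deprecation` checking:

```scala
import scala.annotation.nowarn

object MyLib {
  def preferredFunction(): Unit = ()

  @deprecated("Use preferredFunction instead", "1.0")
  def deprecatedFunction(): Unit = ()
}

object Demo extends App {
  // @nowarn silences only the deprecation category, and only inside this method.
  @nowarn("cat=deprecation")
  def callDeprecated(): Unit = MyLib.deprecatedFunction()

  callDeprecated() // compiles without a deprecation warning
  println("called deprecated function without a warning")
}
```

Inside a ScalaTest suite the same annotation can go on the individual test body's helper method, keeping the warning suppression as narrow as possible.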

embedmongo with reactivemongo process does not exit

Submitted by 佐手、 on 2019-12-06 04:13:15

I am trying to run some tests using ScalaTest + embedmongo + reactivemongo, but I am failing. My first problem is that after a test the mongod process does not shut down; I get this message in the console:

INFO: stopOrDestroyProcess: process has not exited

and the tests are paused until I kill the process manually. That happens even if the body of my test is empty. I am running Windows 8.1. The other issue is that when I try to connect to the db inside a test using ReactiveMongo and insert anything into the db, I get this exception:

reactivemongo.core.errors.ConnectionNotInitialized: MongoError['Connection is missing metadata

mocking methods which use ClassTag in scala using scalamock

Submitted by 大憨熊 on 2019-12-06 03:34:50

My first question was answered, but it uncovered another issue I am having. Here is the scenario. Example code (expanded from the previous question). A model:

```scala
case class User(first: String, last: String, enabled: Boolean)
```

Component definition:

```scala
trait DataProviderComponent {
  def find[T: ClassTag](id: Int): Try[T]
  def update[T](data: T): Try[T]
}
```

One of the concrete component implementations (updated implementation):

```scala
class DbProvider extends DataProviderComponent {
  override def find[T: ClassTag](id: Int): Try[T] = {
    Try {
      val gson = new Gson()
      val testJson = """{"first": "doe", "last":
```
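Part of why `ClassTag`-bound methods trip up mocking frameworks is that the context bound is sugar for an extra implicit parameter list, which a mock's expectations must also match. A minimal sketch of the two equivalent forms, with no scalamock involved (`TaggingProvider` is a hypothetical stub, not from the original post):

```scala
import scala.reflect.ClassTag
import scala.util.{Success, Try}

trait DataProviderComponent {
  // Context-bound form, as in the question...
  def find[T: ClassTag](id: Int): Try[T]
}

class TaggingProvider extends DataProviderComponent {
  // ...overridden in its desugared form: an explicit implicit ClassTag parameter.
  // This hidden parameter is what mock expectations have to account for too.
  override def find[T](id: Int)(implicit ct: ClassTag[T]): Try[T] =
    // Hypothetical stub behaviour: return the requested type's simple name.
    // The cast is only sound when T is String, which is all this demo calls.
    Success(ct.runtimeClass.getSimpleName.asInstanceOf[T])
}

object Demo extends App {
  // The compiler fills in the ClassTag implicitly at the call site:
  println(new TaggingProvider().find[String](1)) // prints Success(String)
}
```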

sbt: Add dependency on scalatest library. Where?

Submitted by ぐ巨炮叔叔 on 2019-12-06 02:21:49

Question: I created a Scala project using sbt and wrote some unit test cases using ScalaTest. But somehow sbt test can't find the org.scalatest package. Here are my project definition files (sources are in src/main/scala and src/test/scala respectively).

build.sbt:

```scala
name := "MyProject"

version := "0.1"

scalaVersion := "2.9.2"
```

project/plugins.sbt:

```scala
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.1.0")
```

project/build.sbt:

```scala
libraryDependencies += "org.scalatest" %% "scalatest" % "1.8" % "test"
```

I'm
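The placement is the likely problem: files under project/ configure the build definition itself (sbt's meta-build), so a `libraryDependencies` entry in project/build.sbt adds a dependency to the build definition, never to the project's own test classpath. A sketch of the corrected layout:

```scala
// build.sbt at the project root — NOT project/build.sbt.
// Everything under project/ belongs to the meta-build, which is why the
// dependency declared there was invisible to `sbt test`.
name := "MyProject"

version := "0.1"

scalaVersion := "2.9.2"

libraryDependencies += "org.scalatest" %% "scalatest" % "1.8" % "test"
```

project/plugins.sbt is the right place only for sbt plugins like sbteclipse, which genuinely extend the build itself.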

Mocking case classes with primitive types

Submitted by 给你一囗甜甜゛ on 2019-12-06 02:18:36

Question: Consider the following type structure:

```scala
trait HasId[T] {
  def id: T
}

case class Entity(id: Long) extends HasId[Long]
```

Let's say we want to mock the Entity class in some test:

```scala
val entityMock = mock[Entity]
Mockito.when(entityMock.id).thenReturn(0)
```

Running such a test results in a NullPointerException being thrown (on the second line), probably because of the Scala compiler's behaviour when wrapping primitive types (if we replace Long with String, the test executes correctly). An exception or error caused a run to abort.
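A common workaround (not from the original post, just a frequently used alternative) sidesteps the primitive-boxing pitfall entirely: case classes are cheap immutable value objects, so constructing a real instance usually beats stubbing one:

```scala
trait HasId[T] { def id: T }
case class Entity(id: Long) extends HasId[Long]

object Demo extends App {
  // A real instance needs no stubbing, so there is no mock to return a
  // null-boxed primitive and no NullPointerException.
  val entity = Entity(0L)
  assert(entity.id == 0L)

  // copy() makes per-test variations as easy as re-stubbing a mock would be.
  val other = entity.copy(id = 42L)
  println(other) // prints Entity(42)
}
```

When a mock really is required, sticking to Long-typed stubs (e.g. `thenReturn(0L)` rather than the Int literal `0`) avoids one of the boxing mismatches, though plain construction remains the simpler route for case classes.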