How to correctly use Spark in ScalaTest tests?

Asked by 蹲街弑〆低调 on 2020-01-02 06:00:29

Question


I have multiple ScalaTest classes which use BeforeAndAfterAll to construct a SparkContext and stop it afterwards like so:

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FlatSpec, Matchers}

class MyTest extends FlatSpec with Matchers with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override protected def beforeAll(): Unit = {
    // Create the SparkContext before any test in this suite runs
    // (a local-mode context shown here for illustration)
    sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("MyTest"))
  }

  override protected def afterAll(): Unit = {
    // Stop the context so that the next suite can create its own
    sc.stop()
  }

  // my tests follow
}

These tests run fine when started from IntelliJ IDEA, but when I run sbt test, I get:

WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243).

After that comes a bunch of other exceptions which are, I suppose, related to this issue.

What is the correct way to use Spark in this setup? Do I have to create one global SparkContext for the whole test run, and if so, how do I do this?
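To make the "one global SparkContext" idea concrete, here is a minimal sketch of what I have in mind (the SharedSparkContext object is a hypothetical helper of my own, local mode assumed):

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{FlatSpec, Matchers}

// Hypothetical helper: one lazily created SparkContext shared by every
// suite in the JVM; nothing ever stops it, so it lives until JVM exit
object SharedSparkContext {
  lazy val sc: SparkContext =
    new SparkContext(new SparkConf().setMaster("local[2]").setAppName("shared-test-context"))
}

class MyOtherTest extends FlatSpec with Matchers {
  private def sc = SharedSparkContext.sc

  "the shared context" should "be usable from any suite" in {
    sc.parallelize(1 to 10).count() shouldBe 10L
  }
}

The obvious drawback is that no suite may ever call sc.stop(), because another suite might still need the context.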


Answer 1:


It seems I couldn't see the wood for the trees: I had forgotten the following line in my build.sbt:

parallelExecution in Test := false

With this line, the tests pass. Disabling parallel execution makes sbt run the test suites sequentially, so at most one SparkContext is alive in the JVM at any time, which is exactly the constraint described in SPARK-2243.
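For anyone on sbt 1.x, the same setting is written with slash syntax these days:

// build.sbt (sbt 1.x): run test suites sequentially so that at most
// one SparkContext exists in the JVM at any time
Test / parallelExecution := false

Forking the test JVM (Test / fork := true) is another common way to keep the Spark tests isolated from sbt's own JVM.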



Source: https://stackoverflow.com/questions/33237856/how-to-correctly-use-spark-in-scalatest-tests
