How to write unit tests in Spark 2.0+?

Backend · open · 6 answers · 442 views

Asked by 日久生厌 on 2020-11-29 16:00

I've been trying to find a reasonable way to test SparkSession with the JUnit testing framework. While there seem to be good examples for SparkContext, I couldn't figure out how to get a corresponding example working for SparkSession.

6 Answers
  •  伪装坚强ぢ · 2020-11-29 16:57

    I was able to solve the problem with the code below.

    The spark-hive dependency is added in the project pom; DataFrameSuiteBase comes from the spark-testing-base library, which provides a ready-made SparkSession (`spark`) for each suite.

    class DataFrameTest extends FunSuite with DataFrameSuiteBase {
      test("test dataframe") {
        val sparkSession = spark
        import sparkSession.implicits._
        val df = sparkSession.read.format("csv").load("path/to/csv")
        // rest of the operations
      }
    }
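    If you prefer not to pull in spark-testing-base, a common alternative is to manage the SparkSession yourself with a ScalaTest `BeforeAndAfterAll` fixture. A minimal sketch, assuming Spark and ScalaTest are on the test classpath (the class name, app name, and sample data are illustrative):

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.scalatest.{BeforeAndAfterAll, FunSuite}

    class ManualSparkSessionTest extends FunSuite with BeforeAndAfterAll {

      // Shared session for the whole suite; created in beforeAll, stopped in afterAll
      @transient private var spark: SparkSession = _

      override def beforeAll(): Unit = {
        spark = SparkSession.builder()
          .master("local[2]")   // run locally with two threads, no cluster needed
          .appName("unit-test")
          .getOrCreate()
      }

      override def afterAll(): Unit = {
        if (spark != null) spark.stop() // release resources once all tests finish
      }

      test("count rows of an in-memory DataFrame") {
        val session = spark
        import session.implicits._
        val df = Seq((1, "a"), (2, "b")).toDF("id", "value")
        assert(df.count() === 2)
      }
    }
    ```

    Using `local[2]` keeps the tests self-contained and fast; stopping the session in `afterAll` avoids leaking the local Spark context between suites.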
    
