Spark 2.0 missing spark implicits

夕颜 2020-12-25 12:27

Using Spark 2.0, I'm seeing that it is possible to turn a DataFrame of Rows into a DataFrame of case classes. When I try to do so, I'm greeted with a message stating to import spark.implicits._.

2 Answers
  •  攒了一身酷
    2020-12-25 13:01

    Spark uses the identifier spark for the SparkSession it pre-creates in spark-shell (which is why the message suggests import spark.implicits._), and this is what causes the confusion. If you created your own session with something like,

    import org.apache.spark.sql.SparkSession

    val ss = SparkSession
      .builder()
      .appName("test")
      .master("local[2]")
      .getOrCreate()
    

    then the correct way to import the implicits is:

    import ss.implicits._
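
    With those implicits in scope, the conversion the question describes is just a call to .as[...]. Here is a minimal sketch, assuming a hypothetical case class Person whose field names and types match the DataFrame's columns:

    // Hypothetical case class for illustration; field names/types must match the DataFrame columns.
    case class Person(name: String, age: Long)

    import ss.implicits._

    // Assumed sample data; toDF comes from the implicits imported above.
    val df = Seq(("Alice", 30L), ("Bob", 25L)).toDF("name", "age")

    // .as[Person] uses the implicit Encoder for case classes to produce a Dataset[Person].
    val people = df.as[Person]
    people.show()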
    

    Let me know if this helps. Cheers.
