How to convert a dataframe to dataset in Apache Spark in Scala?

那年仲夏 submitted on 2019-12-03 05:56:10

The error message you are reading is a pretty good pointer.

When you convert a DataFrame to a Dataset you have to have a proper Encoder for whatever is stored in the DataFrame rows.

Encoders for primitive-like types (Int, String, and so on) and for case classes are provided simply by importing the implicits for your SparkSession, as follows:

import org.apache.spark.sql.{DataFrame, Dataset, SparkSession}

case class MyData(intField: Int, boolField: Boolean) // e.g.

val spark: SparkSession = ???
val df: DataFrame = ???

// brings the implicit Encoders for primitives, case classes, tuples, etc. into scope
import spark.implicits._

val ds: Dataset[MyData] = df.as[MyData]

If that still doesn't work, it is because the type you are trying to cast the DataFrame to isn't supported. In that case, you would have to write your own Encoder: you may find more information about it here and see an example (an Encoder for java.time.LocalDateTime) here.
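As a sketch of that fallback (not part of the original answer): for a type without a built-in Encoder, such as java.time.LocalDateTime in older Spark versions, one common option is Encoders.kryo, which serializes the whole object into a single binary column. Note this trades away columnar access to the fields:

```scala
import org.apache.spark.sql.{Encoder, Encoders}

// Kryo-based Encoder for a type Spark has no built-in Encoder for.
// The resulting schema is a single binary "value" column, so you lose
// per-field columnar operations on the encoded type.
implicit val localDateTimeEncoder: Encoder[java.time.LocalDateTime] =
  Encoders.kryo[java.time.LocalDateTime]

println(localDateTimeEncoder.schema) // one binary column named "value"
```

With this implicit in scope, `df.as[java.time.LocalDateTime]` compiles, at the cost of opaque binary storage.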

Shang Gao

Spark 1.6.0

case class MyCase(id: Int, name: String)

val encoder = org.apache.spark.sql.catalyst.encoders.ExpressionEncoder[MyCase]()

val dataframe = …

val dataset = dataframe.as(encoder)

Spark 2.0 or above

case class MyCase(id: Int, name: String)

val encoder = org.apache.spark.sql.Encoders.product[MyCase]

val dataframe = …

val dataset = dataframe.as(encoder)
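To see why this works, note that Encoders.product derives the Encoder's schema from the case class fields at compile time; `dataframe.as(encoder)` then checks that schema against the DataFrame's columns. A minimal sketch (no SparkSession needed just to inspect the encoder):

```scala
import org.apache.spark.sql.Encoders

case class MyCase(id: Int, name: String)

// Derive an Encoder from the case class; its schema mirrors the fields.
val encoder = Encoders.product[MyCase]

// The DataFrame being converted must have matching column names and types.
println(encoder.schema.fieldNames.mkString(", ")) // prints "id, name"
```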