Spark: How can DataFrame be Dataset[Row] if DataFrames have a schema?

面向向阳花 2021-01-03 13:16

This article claims that a DataFrame in Spark is equivalent to a Dataset[Row], but this blog post shows that a DataFrame has a schema. How can a DataFrame be a Dataset[Row] if DataFrames have a schema?

2 answers
  • 2021-01-03 13:42

    In Spark 2.0, the source code contains: type DataFrame = Dataset[Row]

    It is a Dataset[Row] simply by definition.
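
    For reference, that alias is declared in Spark's org.apache.spark.sql package object; the declaration is essentially this:

    package org.apache.spark

    package object sql {
      // a DataFrame is just an untyped Dataset of Rows
      type DataFrame = Dataset[Row]
    }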

    A Dataset also has a schema; you can print it with the printSchema() function. Normally Spark infers the schema, so you don't have to write it yourself - but it's still there ;)
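
    A minimal sketch of that (the Person class and local SparkSession are just for illustration): the typed Dataset carries the schema Spark inferred from the case class.

    import org.apache.spark.sql.SparkSession

    case class Person(name: String, age: Int)

    val spark = SparkSession.builder().master("local[*]").appName("schema-demo").getOrCreate()
    import spark.implicits._

    val ds = Seq(Person("Ann", 30), Person("Bob", 25)).toDS()   // Dataset[Person]
    ds.printSchema()
    // root
    //  |-- name: string (nullable = true)
    //  |-- age: integer (nullable = false)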

    You can also call createTempView(name) on a Dataset and use it in SQL queries, just like with DataFrames.
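
    Continuing the sketch above, a typed Dataset can be registered and queried exactly like a DataFrame:

    ds.createTempView("people")    // register the Dataset[Person] as a view
    spark.sql("SELECT name FROM people WHERE age > 26").show()
    // +----+
    // |name|
    // +----+
    // | Ann|
    // +----+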

    In other words, Dataset = the Spark 1.5 DataFrame + an encoder that converts rows to your classes. After the types were merged in Spark 2.0, DataFrame became just an alias for Dataset[Row], i.e. a Dataset without a specific encoder.
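
    A small sketch of that difference, reusing the Person Dataset from above: converting to a DataFrame discards the typed encoder, and .as[...] attaches one again.

    import org.apache.spark.sql.{DataFrame, Dataset}

    val df: DataFrame = ds.toDF()                 // Dataset[Row]: same data, no Person encoder
    val typed: Dataset[Person] = df.as[Person]    // re-attach the Person encoder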

    About conversions: rdd.map() also returns an RDD; it never returns a DataFrame. You can do:

    // assumes rdd: RDD[String] and import sparkSession.implicits._
    // Dataset[Row] = DataFrame, without an encoder; createDataFrame cannot take
    // an RDD[String] directly, so each String is wrapped in a Tuple1
    val rddToDF = sparkSession.createDataFrame(rdd.map(Tuple1.apply))
    // Now Spark knows the encoder for String should be used - it becomes Dataset[String]
    val rddToDataset = rddToDF.as[String]
    
    // however, it can be shortened to:
    val dataset = sparkSession.createDataset(rdd)
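
    With import sparkSession.implicits._ in scope, the same conversions are also available as methods on the RDD itself (a sketch under the same RDD[String] assumption):

    val datasetViaImplicits = rdd.toDS()   // Dataset[String]
    val dfViaImplicits = rdd.toDF()        // DataFrame with a single column named "value"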
    
  • 2021-01-03 13:53

    Note (in addition to T Gaweda's answer) that there is a schema associated with each Row (Row.schema). However, this schema is not set until the Row is integrated into a DataFrame (or Dataset[Row]):

    scala> import org.apache.spark.sql.Row
    scala> import org.apache.spark.sql.types.{IntegerType, StructField, StructType}
    scala> val schema = StructType(StructField("a", IntegerType, true) :: Nil)
    schema: org.apache.spark.sql.types.StructType = StructType(StructField(a,IntegerType,true))
    
    scala> Row(1).schema
    res12: org.apache.spark.sql.types.StructType = null
    
    scala> val rdd = sc.parallelize(List(Row(1)))
    rdd: org.apache.spark.rdd.RDD[org.apache.spark.sql.Row] = ParallelCollectionRDD[5] at parallelize at <console>:28
    
    scala> spark.createDataFrame(rdd, schema).first
    res15: org.apache.spark.sql.Row = [1]
    
    scala> spark.createDataFrame(rdd, schema).first.schema
    res16: org.apache.spark.sql.types.StructType = StructType(StructField(a,IntegerType,true))
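
    A small follow-up sketch: once a Row carries that schema, its fields can be accessed by name; calling getAs(name) on the schema-less Row(1) above would throw instead, because fieldIndex needs a schema.

    scala> spark.createDataFrame(rdd, schema).first.getAs[Int]("a")
    res17: Int = 1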
    