Convert an RDD to a DataFrame in Spark using Scala

Submitted by Deadly on 2019-12-24 16:19:13

Question


I have textRDD: org.apache.spark.rdd.RDD[(String, String)]

I would like to convert it to a DataFrame. The columns correspond to the title and content of each page (row).


Answer 1:


Use toDF() and provide the column names if you have them:

val textDF = textRDD.toDF("title": String, "content": String)
textDF: org.apache.spark.sql.DataFrame = [title: string, content: string]

or

val textDF = textRDD.toDF()
textDF: org.apache.spark.sql.DataFrame = [_1: string, _2: string]

The Spark shell imports these implicits automatically (I am using version 1.5), but in a standalone application you may need import sqlContext.implicits._.
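
For reference, here is a minimal sketch of what that can look like in a standalone Spark 1.x application. The object name, master setting, and sample data are placeholders, not part of the original question:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object RddToDf {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("rdd-to-df").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // Brings the implicit conversion that adds toDF() to RDDs of tuples/case classes.
    import sqlContext.implicits._

    // Stand-in for textRDD: an RDD[(String, String)] of (title, content) pairs.
    val textRDD = sc.parallelize(Seq(("Page A", "content of A"), ("Page B", "content of B")))

    val textDF = textRDD.toDF("title", "content")
    textDF.printSchema() // title: string, content: string

    sc.stop()
  }
}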




Answer 2:


I usually do it like this:

Create a case class like this:

case class DataFrameRecord(property1: String, property2: String)

Then use map to convert each tuple into the new structure via the case class:

rdd.map(p => DataFrameRecord(p._1, p._2)).toDF()
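
Applied to the RDD[(String, String)] from the question, a minimal sketch of this approach could look like the following; the field names title and content are assumptions and become the DataFrame column names:

// Case class fields become the DataFrame column names.
case class DataFrameRecord(title: String, content: String)

// textRDD is the RDD[(String, String)] from the question.
val textDF = textRDD.map { case (t, c) => DataFrameRecord(t, c) }.toDF()

textDF.printSchema()
// root
//  |-- title: string (nullable = true)
//  |-- content: string (nullable = true)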


Source: https://stackoverflow.com/questions/33023330/convert-an-rdd-to-a-dataframe-in-spark-using-scala
