How to convert a DataFrame back to normal RDD in pyspark?

Asked by 青春惊慌失措 on 2020-12-12 19:00

I need to use the

(rdd.)partitionBy(npartitions, custom_partitioner)

method, which is not available on the DataFrame. All of the DataFrame methods return only DataFrame results, so how can I get a plain RDD back from a DataFrame?

3 Answers
  •  Answered by 被撕碎了的回忆 on 2020-12-12 19:31

    @dapangmao's answer works, but it returns an RDD of Row objects rather than a plain Spark RDD of tuples or lists. If you want the regular RDD format:

    Try this:

    rdd = df.rdd.map(tuple)
    

    or

    rdd = df.rdd.map(list)
    
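    To tie this back to the original question, here is a minimal, self-contained sketch of going from a DataFrame to a plain RDD of tuples and then applying `partitionBy` with a custom partitioner. The column names, sample data, and `custom_partitioner` function are illustrative, not from the original post; note that `partitionBy` only works on a pair RDD of `(key, value)` tuples.

    ```python
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.master("local[2]").appName("df-to-rdd").getOrCreate()

    # A hypothetical DataFrame for demonstration
    df = spark.createDataFrame([(1, "a"), (2, "b"), (3, "c")], ["id", "letter"])

    # df.rdd yields Row objects; map(tuple) converts them to plain tuples
    rdd = df.rdd.map(tuple)

    # partitionBy requires a pair RDD, so the 2-tuples already qualify:
    # element 0 is the key, element 1 the value
    def custom_partitioner(key):
        # illustrative partitioner: even keys to partition 0, odd to partition 1
        return key % 2

    partitioned = rdd.partitionBy(2, custom_partitioner)

    num_parts = partitioned.getNumPartitions()
    result = sorted(partitioned.collect())
    print(num_parts, result)

    spark.stop()
    ```

    The data itself is unchanged by the conversion; only the wrapper type differs (Row vs. tuple), which is why `map(tuple)` or `map(list)` is all that is needed.
    
    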
