Spark DataFrame reduceByKey-like operation

星月不相逢 2021-02-08 11:40

I have a Spark DataFrame with the following data (loaded with spark-csv):

key,value
1,10
2,12
3,0
1,20
I would like to perform a reduceByKey-like operation on this data, summing the values for each key.
3 Answers
  •  耶瑟儿~
    2021-02-08 12:18

    How about this? I agree it still converts to an RDD and then back to a DataFrame.

    df.select('key', 'value').rdd \
        .map(lambda row: (row['key'], row['value'])) \
        .reduceByKey(lambda a, b: a + b) \
        .toDF(['key', 'value'])
    
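    For reference, `reduceByKey` groups the pairs by key and folds each key's values together with the supplied function. On the sample data above, the per-key sum can be sketched in plain Python (no Spark needed; `reduce_by_key` is a hypothetical helper mimicking the RDD semantics):

    ```python
    from functools import reduce
    from collections import defaultdict

    # The (key, value) pairs from the question's CSV.
    pairs = [(1, 10), (2, 12), (3, 0), (1, 20)]

    # reduceByKey semantics: group values by key, then fold them
    # pairwise with the given function (here, addition).
    def reduce_by_key(pairs, fn):
        grouped = defaultdict(list)
        for k, v in pairs:
            grouped[k].append(v)
        return {k: reduce(fn, vs) for k, vs in grouped.items()}

    result = reduce_by_key(pairs, lambda a, b: a + b)
    print(result)  # → {1: 30, 2: 12, 3: 0}
    ```

    In PySpark itself, the DataFrame-native equivalent is `df.groupBy('key').sum('value')`, which achieves the same aggregation without the round-trip through an RDD.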
