Show all pyspark columns after group and agg

陌路散爱 submitted on 2020-01-25 06:40:52

Question


I wish to group by a column and then find the max of another column, and finally show all the columns of the rows that satisfy this condition. However, my code only shows 2 columns rather than all of them.

# Normal way of creating dataframe in pyspark
sdataframe_temp = spark.createDataFrame([
    (2,2,'0-2'),
    (2,23,'22-24')],
    ['a', 'b', 'c']
)

sdataframe_temp2 = spark.createDataFrame([
    (4,6,'4-6'),
    (5,7,'6-8')],
    ['a', 'b', 'c']
)
# Concat two different pyspark dataframe
sdataframe_union_1_2 = sdataframe_temp.union(sdataframe_temp2)

sdataframe_union_1_2_g = sdataframe_union_1_2.groupby('a').agg({'b':'max'})

sdataframe_union_1_2_g.show()

Output:

+---+------+
|  a|max(b)|
+---+------+
|  5|     7|
|  2|    23|
|  4|     6|
+---+------+

Expected output:

+---+------+-----+
|  a|max(b)|    c|
+---+------+-----+
|  5|     7|  6-8|
|  2|    23|22-24|
|  4|     6|  4-6|
+---+------+-----+

Answer 1:


You can use a Window function to make it work:

Method 1: Using Window function

import pyspark.sql.functions as F
from pyspark.sql.window import Window

w = Window().partitionBy("a").orderBy(F.desc("b"))

(sdataframe_union_1_2
.withColumn('max_val', F.row_number().over(w) == 1)
.where("max_val == True")
.drop("max_val")
.show())

+---+---+-----+
|  a|  b|    c|
+---+---+-----+
|  5|  7|  6-8|
|  2| 23|22-24|
|  4|  6|  4-6|
+---+---+-----+

Explanation

  1. Window functions are useful when we want to attach a new column to the existing set of columns.
  2. In this case, I tell the Window function to partition by column a with partitionBy('a') and to sort column b in descending order with F.desc("b"). This makes the first value of b in each group its max value.
  3. Then we use F.row_number() and keep only the rows where the row number equals 1, i.e. the max rows.
  4. Finally, we drop the new column since it is not being used after filtering the data frame.

Method 2: Using groupby + inner join

f = sdataframe_union_1_2.groupby('a').agg(F.max('b').alias('b'))

sdataframe_union_1_2.join(f, on=['a','b'], how='inner').show()

+---+---+-----+
|  a|  b|    c|
+---+---+-----+
|  2| 23|22-24|
|  5|  7|  6-8|
|  4|  6|  4-6|
+---+---+-----+


Source: https://stackoverflow.com/questions/59807555/show-all-pyspark-columns-after-group-and-agg
