Spark - Group by HAVING with dataframe syntax?

抹茶落季 2020-12-18 19:56

What's the syntax for using GROUP BY ... HAVING in Spark without an sql/hiveContext? I know I can do

DataFrame df = some_df
df.registerTempTable("df");
df = sqlContext.sql("SELECT ... FROM df GROUP BY col1 HAVING some stuff")


        
2 Answers
  •  余生分开走
    2020-12-18 20:18

    No, a `HAVING` clause doesn't exist in the DataFrame API. You express the same logic with `agg` followed by `where`:

    df.groupBy(someExpr).agg(someAgg).where(somePredicate)
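    A concrete PySpark sketch of the same pattern, using made-up column names (`key`, `value`) and toy data for illustration; the `where` on the aggregated column plays the role of `HAVING`:

    ```python
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.master("local[1]").appName("having-demo").getOrCreate()

    # Toy data, purely illustrative
    df = spark.createDataFrame(
        [("a", 10), ("a", 20), ("b", 5)],
        ["key", "value"],
    )

    # SQL equivalent:
    #   SELECT key, SUM(value) AS total FROM df GROUP BY key HAVING SUM(value) > 15
    result = (
        df.groupBy("key")
          .agg(F.sum("value").alias("total"))
          .where(F.col("total") > 15)   # filter AFTER aggregation, i.e. HAVING
    )
    rows = result.collect()
    ```

    The key point is ordering: `where` called after `agg` filters the aggregated rows (HAVING semantics), whereas `where` called before `groupBy` filters the input rows (WHERE semantics).
    
    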
    
