Spark Scala: turn a column into a map of its unique values and their counts

佛祖请我去吃肉 2020-12-03 21:24

How can I correctly get a column's values as a Map(k -> v), where k is a unique value and v is its occurrence count? I am doing this within a groupBy.

val getMapUDF = udf((arr: Arra         
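
The UDF in the question is cut off, so here is a minimal sketch of two common ways to build such a map. It assumes a DataFrame df with a grouping column "id" and a string column "value"; both column names and the sample data are placeholders, not taken from the original post.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().appName("value-counts-sketch").master("local[*]").getOrCreate()
import spark.implicits._

// Placeholder data: group key "id", value column "value"
val df = Seq(("a", "x"), ("a", "x"), ("a", "y"), ("b", "z")).toDF("id", "value")

// Option 1: UDF approach, in the spirit of the truncated getMapUDF.
// Collect the values per group, then count occurrences inside the UDF.
val getMapUDF = udf((arr: Seq[String]) => arr.groupBy(identity).mapValues(_.size).toMap)

val withCountsUdf = df
  .groupBy("id")
  .agg(collect_list("value").as("values"))
  .withColumn("valueCounts", getMapUDF(col("values")))

// Option 2: no UDF. Count per (id, value), then fold the (value, count)
// pairs into a map per id with map_from_entries (Spark 2.4+).
val withCountsNative = df
  .groupBy("id", "value")
  .count()
  .groupBy("id")
  .agg(map_from_entries(collect_list(struct(col("value"), col("count")))).as("valueCounts"))

Both versions produce one row per group with a MapType column such as Map("x" -> 2, "y" -> 1) for id "a"; the built-in variant avoids UDF serialization overhead and keeps the plan optimizable by Catalyst.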


        