How to use Analytic/Window Functions in Spark Java?

Submitted by 故事扮演 on 2019-11-29 14:48:46

You're mixing DataFrame syntax and SQL syntax: specifically, you created a WindowSpec but then never used it.

Import `org.apache.spark.sql.functions` to get the `row_number` function, then create the column you're trying to select:

Column rowNum = functions.row_number().over(ws);

Then select it using the DataFrame API:

df.select(each, column, you, want, rowNum);

My syntax may be slightly off, I'm used to scala or python, but the gist is something like that.
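Putting the pieces together, here is a minimal self-contained sketch in Java. The dataset, its column names (`group`, `amount`), and the partition/order keys are all hypothetical stand-ins for whatever your actual DataFrame contains; the point is only to show a WindowSpec being defined and then actually used via `row_number().over(ws)`:

```java
import java.io.Serializable;
import java.util.Arrays;

import org.apache.spark.sql.Column;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.expressions.Window;
import org.apache.spark.sql.expressions.WindowSpec;
import org.apache.spark.sql.functions;

public class WindowExample {

    // Simple bean for the example data (hypothetical schema: group, amount).
    public static class Sale implements Serializable {
        private String group;
        private int amount;
        public Sale() {}
        public Sale(String group, int amount) { this.group = group; this.amount = amount; }
        public String getGroup() { return group; }
        public void setGroup(String group) { this.group = group; }
        public int getAmount() { return amount; }
        public void setAmount(int amount) { this.amount = amount; }
    }

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("WindowExample")
                .master("local[*]")
                .getOrCreate();

        // Toy data standing in for your real DataFrame.
        Dataset<Row> df = spark.createDataFrame(
                Arrays.asList(new Sale("a", 10), new Sale("a", 20), new Sale("b", 30)),
                Sale.class);

        // Define the window: partition by group, order by amount descending.
        WindowSpec ws = Window.partitionBy("group")
                              .orderBy(functions.col("amount").desc());

        // Actually use the WindowSpec by applying row_number() over it.
        Column rowNum = functions.row_number().over(ws);

        // Select the original columns plus the window column via the DataFrame API.
        df.select(functions.col("group"), functions.col("amount"), rowNum.alias("row_num"))
          .show();

        spark.stop();
    }
}
```

Running this prints each row with a `row_num` column that restarts at 1 within each `group`, ordered by `amount` descending. Note that this requires the Spark SQL dependency on the classpath.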
