Spark count("*").over(window) returning incorrect values (PySpark 2.3)

Asked by 刺人心 on 2020-12-18 09:13

We have a window specification and a count("*") over that window as follows:

from pyspark.sql.functions import count as spark_count, rank
window_spec = Win
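The snippet is cut off after "window_spec = Win", so the following is only a minimal sketch of the kind of setup described; the DataFrame and the column names "group" and "ts" are assumptions for illustration, not taken from the post. It also shows how the default frame behaves when the window spec includes an orderBy, which is one thing worth checking when the counts look wrong.

    # Minimal sketch (assumed data and column names, since the original snippet is truncated).
    from pyspark.sql import SparkSession, Window
    from pyspark.sql.functions import count as spark_count, rank

    spark = SparkSession.builder.master("local[*]").getOrCreate()
    df = spark.createDataFrame(
        [("a", 1), ("a", 2), ("a", 3), ("b", 1)], ["group", "ts"]
    )

    # With an orderBy and no explicit frame, Spark defaults to a running frame
    # (unbounded preceding to current row), so count("*") grows row by row.
    ordered_window = Window.partitionBy("group").orderBy("ts")

    # Without orderBy, count("*") is the total number of rows in the partition.
    partition_window = Window.partitionBy("group")

    df.select(
        "group",
        "ts",
        spark_count("*").over(ordered_window).alias("running_count"),
        spark_count("*").over(partition_window).alias("partition_count"),
        rank().over(ordered_window).alias("rank"),
    ).show()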
