PySpark Numeric Window Group By
Question: I'd like to be able to have Spark group by a step size, as opposed to just single values. Is there anything in Spark similar to PySpark 2.x's window function for numeric (non-date) values? Something along the lines of:

```python
sqlContext = SQLContext(sc)
df = sqlContext.createDataFrame([10, 11, 12, 13], "integer").toDF("foo")
res = df.groupBy(window("foo", step=2, start=10)).count()
```

Answer 1: You can reuse the timestamp window and express its parameters in seconds. Tumbling:

```python
from pyspark.sql.functions import col, window
```
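The answer breaks off after the import; what follows is a minimal sketch of the tumbling-window trick it describes, assuming Spark 2.x. The idea: cast the integer column to a timestamp (an integer is interpreted as seconds since the epoch), bucket it with the ordinary `window` function using a duration expressed in seconds, then cast the resulting window struct back to integers. The `"2 seconds"` duration (mirroring `step=2` from the question) and the cast to `struct<start:bigint,end:bigint>` are this sketch's reading of the answer, not a quoted continuation:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([10, 11, 12, 13], "integer").toDF("foo")

res = df.groupBy(
    # An integer cast to timestamp is read as seconds since the epoch, so a
    # "2 seconds" tumbling window buckets the values in steps of 2 (10-12,
    # 12-14, ...). Windows align to the epoch; pass startTime to window()
    # if a different alignment is needed.
    window(col("foo").cast("timestamp"), "2 seconds")
    .cast("struct<start:bigint,end:bigint>")  # back to integer bounds
    .alias("window")
).count()

res.show(truncate=False)
```

For this input, each bucket spans two consecutive integers, so both `[10, 12)` and `[12, 14)` should report a count of 2.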