Apply a window function over multiple columns
Question: I would like to apply a window function (specifically a moving average) over all columns of a DataFrame. I can do it this way:

```python
from pyspark.sql import SparkSession, functions as func

df = ...
df.select([func.avg(df[col]).over(windowSpec).alias(col) for col in df.columns])
```

but I'm afraid this isn't very efficient. Is there a better way to do it?

Answer 1: An alternative that may be better is to create a new DataFrame where you group by the columns used in the window specification and apply the average to the remaining columns.