get min and max from a specific column scala spark dataframe

Backend · 7 answers · 1184 views
梦谈多话 · 2021-02-01 04:37

I would like to access the min and max of a specific column from my dataframe, but I don't have the header of the column, just its number. How should I do this using Scala?

7 Answers
  •  别跟我提以往
     2021-02-01 04:56

    You can use pattern matching when assigning the variables:

    import org.apache.spark.sql.functions.{min, max}
    import org.apache.spark.sql.Row
    
    val Row(minValue: Double, maxValue: Double) = df.agg(min(q), max(q)).head
    

    Here q is either a Column or the name of a column (String). This assumes the column's data type is Double.
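    Since the question only has the column's position rather than its name, one possible approach (a minimal sketch, assuming a zero-based position held in a hypothetical colIndex variable and a Double-typed column) is to look up the header via df.columns and feed it into the same aggregation:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{min, max}
    import org.apache.spark.sql.Row

    // A small example DataFrame stands in for your real data;
    // colIndex is a hypothetical zero-based position of the target column.
    val spark = SparkSession.builder().appName("minmax-by-index").master("local[*]").getOrCreate()
    import spark.implicits._

    val df = Seq((1, 2.0, "a"), (3, 7.5, "b"), (5, 4.2, "c")).toDF("c0", "c1", "c2")
    val colIndex = 1

    // Look up the header from the position, then aggregate exactly as in the answer above.
    val colName = df.columns(colIndex)
    val Row(minValue: Double, maxValue: Double) = df.agg(min(colName), max(colName)).head

    println(s"min of $colName = $minValue, max of $colName = $maxValue")

    Adjust the matched types in the Row pattern to whatever type the column actually has.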
