I'm trying to figure out the best way to get the largest value in a Spark dataframe column.
Consider the following example:
df = spark.createDataFrame([(1., 4.), (2., 5.), (3., 6.)], ["A", "B"])
In case someone wonders how to do this in Scala (using Spark 2.0+), here you go:
scala> df.createOrReplaceTempView("TEMP_DF")
scala> val myMax = spark.sql("SELECT MAX(x) as maxval FROM TEMP_DF").collect()(0).getInt(0)
scala> print(myMax)
117
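For comparison, a roughly equivalent PySpark sketch (assuming a numeric column named x, to match the Scala example above) can skip the temp view and aggregate on the DataFrame directly:

from pyspark.sql import functions as F

# Aggregate the max of column "x"; collect() returns a list of Rows,
# so take the first row and its first (only) field.
my_max = df.agg(F.max("x")).collect()[0][0]
print(my_max)

This avoids registering TEMP_DF, but both approaches run the same aggregation under the hood.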