In Spark, iterate through each column and find the max length

清歌不尽 · 2021-01-15 20:28

I am new to Spark Scala and I have the following situation: I have a table "TEST_TABLE" on the cluster (it can be a Hive table), and I am converting it to a DataFrame as:
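A minimal sketch of that conversion, assuming TEST_TABLE is registered in the Hive metastore (the variable name testDF is illustrative):

    // Read the Hive table into a DataFrame (sketch; assumes Hive support is enabled)
    val testDF = spark.sql("SELECT * FROM TEST_TABLE")
    testDF.printSchema()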

3 Answers
  •  醉酒成梦 · 2021-01-15 21:03

    Plain and simple:

    import org.apache.spark.sql.functions._

    // Load the table and compute max(length(...)) over every column in a single pass
    val df = spark.table("TEST_TABLE")
    df.select(df.columns.map(c => max(length(col(c))).as(c)): _*).show()

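    This returns a one-row DataFrame whose value for each column is that column's maximum string length. A small follow-up sketch (not part of the original answer) for pulling that row back into a plain Scala Map keyed by column name, assuming length() yields integer lengths:

    // Hypothetical follow-up: collect the single result row into a
    // column-name -> max-length lookup.
    val maxRow = df.select(df.columns.map(c => max(length(col(c))).as(c)): _*).first()
    val maxLengths: Map[String, Int] = df.columns.map(c => c -> maxRow.getAs[Int](c)).toMap
    // maxLengths now maps each column of TEST_TABLE to the length of its longest value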
