Renaming column names of a DataFrame in Spark Scala

攒了一身酷 2020-11-28 02:04

I am trying to convert all the headers / column names of a DataFrame in Spark-Scala. As of now I have come up with the following code, which only replaces a single column name.
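A minimal sketch of the single-column approach the question describes (the original snippet is truncated; `withColumnRenamed` and the column names here are assumed for illustration):

    import org.apache.spark.sql.DataFrame

    // Renames one column at a time; repeating this for every header
    // quickly becomes unwieldy on a wide DataFrame.
    def renameOne(df: DataFrame): DataFrame =
      df.withColumnRenamed("old_name", "new_name")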

6 Answers
  •  北海茫月
    2020-11-28 02:37

    import org.apache.spark.sql.DataFrame

    // Returns a new DataFrame with every column aliased as
    // prefix (p) + original name + suffix (s).
    def aliasAllColumns(t: DataFrame, p: String = "", s: String = ""): DataFrame =
    {
      t.select( t.columns.map { c => t.col(c).as(p + c + s) } : _* )
    }
    

    In case it isn't obvious, this adds a prefix and a suffix to each of the current column names. This is useful when you have two tables with one or more columns sharing the same name, and you wish to join them while still being able to disambiguate the columns in the resulting table. It sure would be nice if there were a similar way to do this in "normal" SQL.
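
    For example (a hedged sketch; the table and column names are made up), prefixing both sides before a join keeps every output column unambiguous:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("alias-demo").master("local[*]").getOrCreate()
    import spark.implicits._

    // Two tables sharing the column names "id" and "name".
    val left  = Seq((1, "alice")).toDF("id", "name")
    val right = Seq((1, "bob")).toDF("id", "name")

    val l = aliasAllColumns(left,  p = "l_")
    val r = aliasAllColumns(right, p = "r_")

    // The result exposes l_id, l_name, r_id, r_name -- no ambiguous references.
    val joined = l.join(r, l("l_id") === r("r_id"))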
