Renaming column names of a DataFrame in Spark Scala

Asked 2020-11-28 02:04

I am trying to convert all the headers / column names of a DataFrame in Spark-Scala. As of now I have come up with the following code, which only replaces a single column name:
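The asker's snippet is not shown, but the usual single-column approach is `withColumnRenamed`, while renaming all headers at once can be done by transforming the column list and passing it to `toDF`. A minimal sketch of the name transformation, using hypothetical header names (in Spark the resulting list would be applied with `df.toDF(renamed: _*)`):

```scala
// Hypothetical raw headers with spaces and mixed case
val headers = Seq("Customer ID", "Order Date", "Total Amount")

// Normalise every header: lowercase, spaces -> underscores
val renamed = headers.map(_.toLowerCase.replace(" ", "_"))
// In Spark: df.toDF(renamed: _*)
```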

6 Answers
  •  渐次进展
    2020-11-28 02:39

    When joining two tables, rename every column except the join key, so the non-key columns don't collide while the key itself stays joinable:

    // Assumes `day1` is declared as a var and `key` holds the join-key column name.

    // Method 1: create a new DataFrame via toDF, suffixing every
    // column except the join key with "_d1"
    day1 = day1.toDF(day1.columns.map(x => if (x.equals(key)) x else s"${x}_d1"): _*)
    
    // Method 2: rename the non-key columns one at a time with withColumnRenamed
    for ((x, y) <- day1.columns.filter(!_.equals(key)).map(x => (x, s"${x}_d1"))) {
        day1 = day1.withColumnRenamed(x, y)
    }
    

    works!
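    The renaming rule in method 1 can be checked without a Spark session, since `toDF` only needs the transformed name list. A sketch with a hypothetical join key and column set:

    ```scala
    // Keep the join key unchanged, suffix every other column with "_d1"
    val key = "id"                               // hypothetical join key
    val columns = Array("id", "clicks", "views") // hypothetical day1 columns

    val suffixed = columns.map(x => if (x == key) x else s"${x}_d1")
    // In Spark: day1.toDF(suffixed: _*)
    ```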
