Convert a Row to a Map in Spark Scala

傲寒 2021-02-04 18:23

I have a row from a data frame and I want to convert it to a Map[String, Any] that maps column names to the values in the row for that column.

Is there an easy way to do this?

4 Answers
  •  青春惊慌失措
    2021-02-04 18:51

    Let's say you have a DataFrame with these columns:

    [time(TimestampType), col1(DoubleType), col2(DoubleType)]

    You can do something like this:

    val rowMaps = df.rdd.map { row =>
        // values of the two Double columns, keyed by column name
        val doubleValues = row.getValuesMap[Double](Seq("col1", "col2"))
        // the timestamp column as a single-entry map
        val timeValue = Map("time" -> row.getAs[java.sql.Timestamp]("time"))
        // combine into one Map[String, Any] per row
        doubleValues ++ timeValue
    }
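
    More generally, a sketch (not from the original answer) that builds a Map[String, Any] for every column is to pass the row's schema field names to getValuesMap[Any]; the helper name rowToMap below is my own:

    import org.apache.spark.sql.Row

    // Hypothetical helper: convert any schema-bearing Row (e.g. a row taken
    // from a DataFrame) into a map from column name to value.
    def rowToMap(row: Row): Map[String, Any] =
      row.getValuesMap[Any](row.schema.fieldNames.toSeq)

    // Usage: collect the DataFrame into an Array of column-name -> value maps
    val allRowMaps: Array[Map[String, Any]] = df.collect().map(rowToMap)

    This avoids hard-coding column names, at the cost of losing the static types of the individual columns.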
    
