How to add a new column to a Spark RDD?


You do not have to use Tuple* objects at all for adding a new column to an RDD.

It can be done by mapping each row, taking its original contents plus the elements you want to append, for example:

import org.apache.spark.sql.Row

val rdd = ...  // an RDD[Row]
val withAppendedColumnsRdd = rdd.map(row => {
  val originalColumns = row.toSeq.toList
  // here the new column is the sum of two existing Int columns
  val secondColValue = originalColumns(1).asInstanceOf[Int]
  val thirdColValue = originalColumns(2).asInstanceOf[Int]
  val newColumnValue = secondColValue + thirdColValue
  Row.fromSeq(originalColumns :+ newColumnValue)
  // Row.fromSeq(originalColumns ++ List(newColumnValue1, newColumnValue2, ...)) // or add several new columns
})
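
As a self-contained sketch, assuming an existing SparkContext `sc` and some hypothetical sample rows with a string key and two Int columns:

import org.apache.spark.sql.Row

// hypothetical sample data, just to illustrate the mapping
val source = sc.parallelize(Seq(
  Row("a", 1, 2),
  Row("b", 3, 4)
))
val appended = source.map { row =>
  val cols = row.toSeq.toList
  // append the sum of the second and third columns as a new column
  Row.fromSeq(cols :+ (cols(1).asInstanceOf[Int] + cols(2).asInstanceOf[Int]))
}
appended.collect().foreach(println)  // prints [a,1,2,3] and [b,3,4,7]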

If you have an RDD of Tuple4 instead, apply map and convert it to a Tuple5:

val rddTuple4RDD = ...........
// inside the map, reference the lambda parameter r, not the RDD itself
val rddTuple5RDD = rddTuple4RDD.map(r => Tuple5(r._1, r._2, r._3, r._4, r._2 + r._3))
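
A minimal runnable sketch with hypothetical data, assuming an existing SparkContext `sc` and that the second and third tuple elements are Ints:

// hypothetical Tuple4 records
val rddTuple4RDD = sc.parallelize(Seq(("a", 1, 2, "x"), ("b", 3, 4, "y")))
// widen each Tuple4 to a Tuple5 by appending the sum of the Int fields
val rddTuple5RDD = rddTuple4RDD.map(r => (r._1, r._2, r._3, r._4, r._2 + r._3))
rddTuple5RDD.collect().foreach(println)  // prints (a,1,2,x,3) and (b,3,4,y,7)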