How to add multiple columns in Apache Spark

Submitted by 非 Y 不嫁゛ on 2019-12-24 02:02:11

Question


Here is my input data, with four space-delimited columns. I want to add the second and third columns together and print the result:

sachin 200 10 2
sachin 900 20 2
sachin 500 30 3
Raju 400 40 4
Mike 100 50 5
Raju 50 60 6

My code so far:

from pyspark import SparkContext
sc = SparkContext()
def getLineInfo(lines):
    spLine = lines.split(' ')
    name = str(spLine[0])
    cash = int(spLine[1])   # 2nd column
    cash2 = int(spLine[2])  # 3rd column
    cash3 = int(spLine[3])  # 4th column
    return (name, cash, cash2)
myFile = sc.textFile(r"D:\PYSK\cash.txt")  # raw string so backslashes are not treated as escapes
rdd = myFile.map(getLineInfo)
print(rdd.collect())

Running this gives:

[('sachin', 200, 10), ('sachin', 900, 20), ('sachin', 500, 30), ('Raju', 400, 40), ('Mike', 100, 50), ('Raju', 50, 60)]

The final result I need is below: the 2nd and 3rd columns summed, with the remaining fields displayed as-is:

sachin 210 2
sachin 920 2
sachin 530 3
Raju 440 4
Mike 150 5
Raju 110 6

Answer 1:


Use this:

def getLineInfo(lines):
    spLine = lines.split(' ')
    name = str(spLine[0])
    cash = int(spLine[1])   # 2nd column
    cash2 = int(spLine[2])  # 3rd column
    cash3 = int(spLine[3])  # 4th column
    # Sum the 2nd and 3rd columns, and keep the 4th
    return (name, cash + cash2, cash3)
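Since the column arithmetic is plain Python, the corrected mapper can be sanity-checked without a running Spark cluster. A minimal sketch using the sample lines from the question (the assumption is that `myFile.map(getLineInfo)` applies the same function line by line):

```python
# Sketch: verify the corrected mapper logic on the question's sample data,
# no Spark cluster needed.

def getLineInfo(lines):
    spLine = lines.split(' ')
    name = str(spLine[0])
    cash = int(spLine[1])   # 2nd column
    cash2 = int(spLine[2])  # 3rd column
    cash3 = int(spLine[3])  # 4th column
    return (name, cash + cash2, cash3)

sample = [
    "sachin 200 10 2",
    "sachin 900 20 2",
    "sachin 500 30 3",
    "Raju 400 40 4",
    "Mike 100 50 5",
    "Raju 50 60 6",
]

# Stand-in for rdd = myFile.map(getLineInfo); rdd.collect()
rows = [getLineInfo(line) for line in sample]
for name, total, last in rows:
    print(name, total, last)
```

This prints the expected result from the question, e.g. `sachin 210 2` for the first line (200 + 10 = 210).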


Source: https://stackoverflow.com/questions/39392237/how-to-add-multiple-columns-in-apache-spark
