PySpark: Union of all the dataframes in a Python dictionary

Submitted by 不打扰是莪最后的温柔 on 2020-12-26 05:02:51

Question


I have a dictionary my_dict_of_df that holds a variable number of dataframes each time my program runs. I want to create a new dataframe that is the union of all these dataframes.

My dataframes look like this:

my_dict_of_df["df_1"], my_dict_of_df["df_2"] and so on...

How do I union all these dataframes?


Answer 1:


I consulted the solution given here; thanks to @pault.

from functools import reduce
from pyspark.sql import DataFrame

def union_all(*dfs):
    # Repeatedly apply DataFrame.union, stacking rows by column position.
    return reduce(DataFrame.union, dfs)

df1 = sqlContext.createDataFrame([(1, "foo1"), (2, "bar1")], ("k", "v"))
df2 = sqlContext.createDataFrame([(3, "foo2"), (4, "bar2")], ("k", "v"))
df3 = sqlContext.createDataFrame([(5, "foo3"), (6, "bar3")], ("k", "v"))

my_dic = {}
my_dic["df1"] = df1
my_dic["df2"] = df2
my_dic["df3"] = df3

new_df = union_all(*my_dic.values())

print(type(new_df))   # <class 'pyspark.sql.dataframe.DataFrame'>
new_df.show()         # show() prints the table itself and returns None, so no extra print() is needed

"""
+---+----+
|  k|   v|
+---+----+
|  1|foo1|
|  2|bar1|
|  3|foo2|
|  4|bar2|
|  5|foo3|
|  6|bar3|
+---+----+
"""

Edit: using DataFrame.union instead of DataFrame.unionAll since the latter is deprecated.
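If the dataframes are not guaranteed to have their columns in the same order, unioning by column name can be safer than the positional union above. A hedged sketch, not from the original answer, reusing the my_dic dictionary defined earlier (allowMissingColumns requires Spark 3.1+):

from functools import reduce
from pyspark.sql import DataFrame

def union_all_by_name(*dfs):
    # unionByName matches columns by name instead of position;
    # allowMissingColumns=True (Spark 3.1+) fills absent columns with nulls.
    return reduce(lambda a, b: a.unionByName(b, allowMissingColumns=True), dfs)

new_df = union_all_by_name(*my_dic.values())
new_df.show()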



Source: https://stackoverflow.com/questions/55168411/pyspark-union-of-all-the-dataframes-in-a-python-dictionary
