PySpark: How to add ten days to an existing date column


Question


I have a dataframe in Pyspark with a date column called "report_date".

I want to create a new column called "report_date_10" that is 10 days added to the original report_date column.

Below is the code I tried:

df_dc["report_date_10"] = df_dc["report_date"] + timedelta(days=10)

This is the error I got:

AttributeError: 'datetime.timedelta' object has no attribute '_get_object_id'

Help! thx


Answer 1:


It seems you are using pandas syntax to add a column. In Spark, you need withColumn to add a new column, and for adding days to a date column there is the built-in date_add function:

import pyspark.sql.functions as F

# sample dataframe with a single date column
df_dc = spark.createDataFrame([['2018-05-30']], ['report_date'])

# date_add shifts the date forward by 10 days
df_dc.withColumn('report_date_10', F.date_add(df_dc['report_date'], 10)).show()
+-----------+--------------+
|report_date|report_date_10|
+-----------+--------------+
| 2018-05-30|    2018-06-09|
+-----------+--------------+
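If you prefer Spark SQL expression syntax, the same result can be written with F.expr; a minimal sketch, assuming report_date is a date (or a date-parsable string) column as above:

import pyspark.sql.functions as F

# date_add here is the Spark SQL function of the same name,
# evaluated from the expression string
df_dc.withColumn('report_date_10', F.expr('date_add(report_date, 10)')).show()

To shift in the other direction, date_sub subtracts days the same way.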


Source: https://stackoverflow.com/questions/50703284/pyspark-how-to-add-ten-days-to-existing-date-column
