Computing First Day of Previous Quarter in Spark SQL


Question


How do I derive the first day of the previous quarter for any given date in a Spark SQL query using the SQL API? A few sample inputs and expected outputs are below:

input_date | start_date
------------------------
2020-01-21 | 2019-10-01
2020-02-06 | 2019-10-01
2020-04-15 | 2020-01-01
2020-07-10 | 2020-04-01
2020-10-20 | 2020-07-01
2021-02-04 | 2020-10-01

The quarters are defined as:

1 | Jan - Mar
2 | Apr - Jun
3 | Jul - Sep
4 | Oct - Dec

Note: I am using Spark SQL v2.4.

Any help is appreciated. Thanks.


Answer 1:


Use date_trunc on the input date shifted back by 3 months: subtracting 3 months moves any date into the previous quarter, and truncating to 'quarter' then gives that quarter's first day.

df.withColumn("start_date", to_date(date_trunc("quarter", expr("input_date - interval 3 months"))))
  .show()

+----------+----------+
|input_date|start_date|
+----------+----------+
|2020-01-21|2019-10-01|
|2020-02-06|2019-10-01|
|2020-04-15|2020-01-01|
|2020-07-10|2020-04-01|
|2020-10-20|2020-07-01|
|2021-02-04|2020-10-01|
+----------+----------+
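If the query has to go through the SQL API, the same logic can be written directly in SQL. A minimal sketch, assuming the data is registered as a temporary view named dates with a DATE column input_date:

SELECT input_date,
       to_date(date_trunc('quarter', input_date - interval 3 months)) AS start_date
FROM dates

The to_date wrapper is only needed because date_trunc returns a timestamp rather than a date.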



Answer 2:


Personally, I would create a table with the dates in it, covering the next twenty years, using Excel or something, and just reference that table, as sketched below.
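A minimal sketch of that lookup approach, assuming a hypothetical pre-built table quarter_ranges(range_start, range_end, prev_quarter_start) with one row per quarter:

SELECT d.input_date,
       q.prev_quarter_start AS start_date
FROM dates d
JOIN quarter_ranges q
  ON d.input_date BETWEEN q.range_start AND q.range_end

The join picks the row whose quarter contains input_date, so the lookup table only needs to be extended when its covered date range runs out.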



Source: https://stackoverflow.com/questions/63813537/computing-first-day-of-previous-quarter-in-spark-sql
