DATEDIFF in Spark SQL

Submitted by 冷暖自知 on 2019-12-05 21:32:52

Question


I am new to Spark SQL. We are migrating data from SQL Server to Databricks, and I am using Spark SQL. Can you please suggest how to achieve the functionality of the date functions below in Spark SQL? I can see that datediff in Spark SQL only gives days.

DATEDIFF(YEAR, StartDate, EndDate)
DATEDIFF(Month, StartDate, EndDate)
DATEDIFF(Quarter, StartDate, EndDate)


Answer 1:


As you have mentioned, Spark SQL does support DATEDIFF, but for days only. I would also be careful, as the parameters are in the opposite order for Spark, i.e.

--SQL Server
DATEDIFF ( datepart , startdate , enddate )

--Spark
DATEDIFF ( enddate , startdate )
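
For example, with Spark's argument order the end date comes first and the result is a day count (the literal dates below are just an illustration):

-- Spark SQL: datediff(enddate, startdate) returns the number of days
SELECT DATEDIFF( DATE '2018-10-01', DATE '2018-09-01' ) AS diff_days;   -- 30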

Spark does however support a similar function called months_between, which you could use in place of DATEDIFF(MONTH, ...). This function returns a decimal amount, so optionally cast it to INT for functionality similar to SQL Server's DATEDIFF:

SELECT startDate, endDate, 
  DATEDIFF( endDate, startDate ) AS diff_days,
  CAST( months_between( endDate, startDate ) AS INT ) AS diff_months      
FROM yourTable
ORDER BY 1;

There are also year and quarter functions for determining the year and quarter of a date respectively. You could simply subtract the years, but quarters are trickier; you may have to 'do the math' or end up using a calendar table. A sketch follows below.
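If you want to mimic SQL Server's boundary-counting behaviour (DATEDIFF(YEAR, ...) counts calendar-year boundaries crossed, DATEDIFF(QUARTER, ...) counts quarter boundaries crossed), a minimal sketch using Spark's year and quarter functions could look like this; yourTable, startDate, and endDate are placeholder names:

SELECT startDate, endDate,
  year( endDate ) - year( startDate )                    AS diff_years,     -- calendar-year boundaries crossed
  ( year( endDate ) * 4 + quarter( endDate ) )
    - ( year( startDate ) * 4 + quarter( startDate ) )   AS diff_quarters   -- quarter boundaries crossed
FROM yourTable
ORDER BY 1;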



Source: https://stackoverflow.com/questions/52527571/datediff-in-spark-sql
