How to get "today - 1 day" date in Spark SQL?

Backend · open · 4 answers · 575 views
北荒 2020-12-30 02:40

How do I get current_date - 1 day in Spark SQL, the same as cur_date()-1 in MySQL?

4 Answers
  • 2020-12-30 02:55

    The arithmetic functions allow you to perform arithmetic operations on columns containing dates.

    For example, you can calculate the difference between two dates, add days to a date, or subtract days from a date. The built-in date arithmetic functions include datediff, date_add, date_sub, add_months, last_day, next_day, and months_between.

    Of these, the one we need is

    date_sub(timestamp startdate, int days). Purpose: Subtracts a specified number of days from a TIMESTAMP value. The first argument can be a string, which is automatically cast to TIMESTAMP if it uses the recognized format, as described in TIMESTAMP Data Type. Return type: the date that is days days before startdate.

    and we have

    current_timestamp(). Purpose: Alias for the now() function. Return type: timestamp

    so you can select

    date_sub(CAST(current_timestamp() as DATE), 1)
    

    See https://spark.apache.org/docs/1.6.2/api/java/org/apache/spark/sql/functions.html
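    The same computation can be sketched in plain Python with the standard library (a rough analogue of `date_sub(CAST(current_timestamp() AS DATE), 1)`, not Spark itself):

    ```python
    from datetime import date, datetime, timedelta

    # Cast "now" down to a date (drops the time-of-day), then subtract one day,
    # mirroring date_sub(CAST(current_timestamp() AS DATE), 1).
    today = datetime.now().date()
    yesterday = today - timedelta(days=1)
    print(yesterday)  # e.g. 2020-12-29 when run on 2020-12-30
    ```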

  • 2020-12-30 02:59

    Yes, the date_sub() function is the right one for this question; however, there's an error in the accepted answer:

    Return type: timestamp

    The return type should be date instead: date_sub() trims any hh:mm:ss part of the timestamp and returns only a date.
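    The trimming behaviour described here can be illustrated with a small standard-library sketch (a Python analogue, not Spark's implementation):

    ```python
    from datetime import datetime, timedelta

    # Start from a full timestamp with an hh:mm:ss component.
    ts = datetime(2020, 12, 30, 2, 59, 0)

    # Subtract a day, then keep only the date part -- the time-of-day is gone,
    # just as date_sub() returns a date rather than a timestamp.
    result = (ts - timedelta(days=1)).date()
    print(result)  # 2020-12-29
    ```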

  • 2020-12-30 03:02

    You can easily perform this task; there are many date-related functions, and the one to use here is date_sub.

    Example on Spark-REPL:

    scala> spark.sql("select date_sub(current_timestamp(), 1)").show
    +----------------------------------------------+
    |date_sub(CAST(current_timestamp() AS DATE), 1)|
    +----------------------------------------------+
    |                                    2016-12-12|
    +----------------------------------------------+
    
  • 2020-12-30 03:14

    You can try

    date_add(current_date(), -1)
    

    I don't know Spark either, but I found this on Google. You can also use this link for reference
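    Adding a negative number of days is equivalent to subtracting; a minimal standard-library sketch of this equivalence, using a hypothetical `date_add` helper named after the Spark SQL function:

    ```python
    from datetime import date, timedelta

    def date_add(start: date, days: int) -> date:
        """Hypothetical analogue of Spark SQL's date_add(start, days)."""
        return start + timedelta(days=days)

    # date_add(current_date(), -1) is the same as subtracting one day.
    assert date_add(date(2020, 12, 30), -1) == date(2020, 12, 29)
    ```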
