Apache Spark subtract days from timestamp column

走了就别回头了 asked 2020-12-21 07:46

I am using the Spark Dataset API and am having trouble subtracting days from a timestamp column.

I would like to subtract days from a timestamp column and get a new column that keeps the full datetime.

2 Answers
  • 2020-12-21 08:15

    Or you can simply use the date_sub function, available since Spark 1.5:

    from pyspark.sql.functions import col, date_sub
    
    # shift back 10 days, then cast the resulting date back to a timestamp
    df.withColumn("10_days_before", date_sub(col("timestamp"), 10).cast("timestamp"))
    
  • 2020-12-21 08:21

    You can cast the data to timestamp and use expr to subtract an INTERVAL:

    import org.apache.spark.sql.functions.expr
    import spark.implicits._  // for toDF and the $ column syntax
    
    val df = Seq("2017-09-22 13:17:39.900").toDF("timestamp")
    
    df.withColumn(
      "10_days_before",
      $"timestamp".cast("timestamp") - expr("INTERVAL 10 DAYS")).show(false)
    
    +-----------------------+---------------------+
    |timestamp              |10_days_before       |
    +-----------------------+---------------------+
    |2017-09-22 13:17:39.900|2017-09-12 13:17:39.9|
    +-----------------------+---------------------+
    

    If the data is already of TimestampType, you can skip the cast.

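The two approaches differ in one subtle way: date_sub operates on dates, so casting its result back to timestamp lands on midnight of the shifted day, while the INTERVAL subtraction keeps the original time of day. A plain-Python sketch of the two behaviors, using datetime as a stand-in for Spark (no Spark required):

```python
from datetime import datetime, timedelta

ts = datetime(2017, 9, 22, 13, 17, 39)

# INTERVAL-style subtraction: the time of day is preserved
interval_result = ts - timedelta(days=10)

# date_sub-style: shift the date, then casting the date back to a
# timestamp lands on midnight of that day
date_sub_result = datetime.combine((ts - timedelta(days=10)).date(),
                                   datetime.min.time())

print(interval_result)   # 2017-09-12 13:17:39
print(date_sub_result)   # 2017-09-12 00:00:00
```

If you need the time component preserved, prefer the INTERVAL form (or subtract an `expr("INTERVAL 10 DAYS")` from PySpark as well); use date_sub when a plain date is enough.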