Spark Scala: DateDiff of two columns by hour or minute

半阙折子戏 2020-12-01 16:37

I have two timestamp columns in a dataframe that I'd like to get the minute difference of, or alternatively, the hour difference of. Currently I'm only able to get the day difference.

2 Answers
  •  抹茶落季
    2020-12-01 16:43

    You can get the difference in seconds by casting both timestamps to long (Unix epoch seconds) and subtracting:

    import org.apache.spark.sql.functions._
    val diff_secs_col = col("ts1").cast("long") - col("ts2").cast("long")
    

    Then you can do some math to get the unit you want. For example:

    val df2 = df1
      .withColumn( "diff_secs", diff_secs_col )
      .withColumn( "diff_mins", diff_secs_col / 60D )
      .withColumn( "diff_hrs",  diff_secs_col / 3600D )
      .withColumn( "diff_days", diff_secs_col / (24D * 3600D) )
    

    Or, in PySpark:

    from pyspark.sql.functions import col
    diff_secs_col = col("ts1").cast("long") - col("ts2").cast("long")

    df2 = df1 \
      .withColumn("diff_secs", diff_secs_col) \
      .withColumn("diff_mins", diff_secs_col / 60.0) \
      .withColumn("diff_hrs",  diff_secs_col / 3600.0) \
      .withColumn("diff_days", diff_secs_col / (24.0 * 3600.0))
    
