How to convert unix timestamp to date in Spark

伪装坚强ぢ 2020-12-01 11:42

I have a data frame with a column of unix timestamps (e.g. 1435655706000), and I want to convert it to a date with the format 'yyyy-MM-DD'. I've tried nscala-time but it doesn't work.

7 Answers
  •  南笙 (OP)
     2020-12-01 12:21

    What you can do is:

    // needs: import org.apache.spark.sql.functions._
    input.withColumn("time",
      concat(
        // divide by 1000: from_unixtime expects seconds, not milliseconds
        from_unixtime(input.col("COL_WITH_UNIX_TIME") / 1000, "yyyy-MM-dd'T'HH:mm:ss"),
        typedLit("."),
        // take the last 3 digits of the 13-digit millisecond timestamp
        substring(input.col("COL_WITH_UNIX_TIME"), 11, 3),
        typedLit("Z")))
    

    where time is the new column name and COL_WITH_UNIX_TIME is the name of the column you want to convert. The seconds part is formatted with from_unixtime, and the last three digits of the millisecond timestamp are appended, so you keep millisecond precision in the form "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'".
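    To see the arithmetic the Spark expression performs, here is a minimal sketch in plain Python (the function name is illustrative, not part of any Spark API): split the millisecond timestamp into seconds and a millisecond remainder, format the seconds as ISO-8601, then append the millisecond digits.

    ```python
    from datetime import datetime, timezone

    def format_millis_timestamp(ts_millis: int) -> str:
        """Mirror the Spark expression above: format the seconds part
        as an ISO-8601 string and append the millisecond digits."""
        # equivalent of dividing the column by 1000 and keeping the remainder
        seconds, millis = divmod(ts_millis, 1000)
        base = datetime.fromtimestamp(seconds, tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%S")
        return f"{base}.{millis:03d}Z"

    # timestamp from the question
    print(format_millis_timestamp(1435655706000))
    ```

    Note that the Spark answer extracts the millisecond digits with substring(..., 11, 3), which assumes a 13-digit timestamp; divmod avoids that assumption.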
