Scala: Spark SQL to_date(unix_timestamp) returning NULL

Asked by 悲哀的现实 on 2021-01-03 01:36

Spark Version: spark-2.0.1-bin-hadoop2.7 Scala: 2.11.8

I am loading a raw csv into a DataFrame. In the csv, although the column is supposed to be in date format, the values are written as yyyyMMdd (e.g. 20161025), and converting them with to_date(unix_timestamp(...)) returns NULL.

1 Answer
  • 2021-01-03 02:33

    To convert yyyyMMdd to yyyy-MM-dd you can:

    spark.sql("""SELECT DATE_FORMAT(
      CAST(UNIX_TIMESTAMP('20161025', 'yyyyMMdd') AS TIMESTAMP), 'yyyy-MM-dd'
    )""")
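
    Applied to a DataFrame column instead of a literal, the same SQL can be run through a temp view (a sketch; df stands for the DataFrame loaded from the csv, and the view/column names raw_csv and dt are assumptions, not from the question):

    df.createOrReplaceTempView("raw_csv")
    spark.sql("""SELECT DATE_FORMAT(
      CAST(UNIX_TIMESTAMP(dt, 'yyyyMMdd') AS TIMESTAMP), 'yyyy-MM-dd'
    ) AS dt FROM raw_csv""").show()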
    

    or, with the functions API:

    date_format(unix_timestamp(col, "yyyyMMdd").cast("timestamp"), "yyyy-MM-dd")
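
    Putting the functions version together end to end (a minimal sketch, assuming an active SparkSession named spark, e.g. in spark-shell; the sample data and column names are illustrative):

    import org.apache.spark.sql.functions.{col, date_format, unix_timestamp}
    import spark.implicits._

    // Stand-in for the raw csv: the dates arrive as yyyyMMdd strings.
    val df = Seq("20161025", "20170101").toDF("dt")

    // Parse yyyyMMdd, cast the epoch seconds to timestamp, re-format as yyyy-MM-dd.
    val out = df.withColumn(
      "dt_formatted",
      date_format(unix_timestamp(col("dt"), "yyyyMMdd").cast("timestamp"), "yyyy-MM-dd")
    )

    out.show()   // dt_formatted is "2016-10-25", "2017-01-01"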
    