How to get day of week in SparkSQL?

半阙折子戏 2020-12-03 14:19

I am trying to select all records recorded on Sunday through SparkSQL. I tried the following, but in vain.

SELECT * FROM mytable WHERE DATEPART(WEEKDAY, create_time) = 1

3 Answers
  •  伪装坚强ぢ
    2020-12-03 15:12

    If create_time is stored as a Unix timestamp (UTC), you can use the following to filter for specific days in SparkSQL. I used Spark 1.6.1:

    select id, date_format(from_unixtime(created_utc), 'EEEE') from testTable where date_format(from_unixtime(created_utc), 'EEEE') = 'Wednesday'

    If you specify 'EEEE', the day of the week is spelled out completely. You can use 'E' for the shortened version, e.g. Wed. You can find more info here: http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame http://docs.oracle.com/javase/6/docs/api/java/text/SimpleDateFormat.html
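
    Applied to the original question, a minimal sketch along the same lines (assuming, as in the question, that mytable has a Unix-timestamp column create_time):

    -- from_unixtime converts epoch seconds to a timestamp,
    -- and date_format(..., 'EEEE') renders the full weekday name
    select * from mytable where date_format(from_unixtime(create_time), 'EEEE') = 'Sunday'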
