How to convert unix timestamp to date in Spark
Question: I have a data frame with a column of unix timestamps (e.g. 1435655706000), and I want to convert it to a date with the format 'yyyy-MM-dd'. I've tried nscala-time but it doesn't work.

    val time_col = sqlc.sql("select ts from mr").map(_(0).toString.toDateTime)
    time_col.collect().foreach(println)

and I got the error:

    java.lang.IllegalArgumentException: Invalid format: "1435655706000" is malformed at "6000"

Answer 1: Since Spark 1.5, there is a built-in UDF for doing that:

    val df = sqlContext.sql("select from_unixtime(ts, 'yyyy-MM-dd') as `ts` from mr")

Please check the built-in Spark date functions for details.
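Note that from_unixtime expects the timestamp in seconds since the epoch, while the example value (1435655706000) looks like milliseconds, so it would likely need to be divided by 1000 first. Below is a minimal sketch, assuming the table is registered as mr and the column is named ts, shown both as SQL and with the equivalent DataFrame API:

    // from_unixtime interprets its argument as seconds since the epoch,
    // so a millisecond timestamp must be scaled down first.
    val df = sqlContext.sql(
      "select from_unixtime(ts / 1000, 'yyyy-MM-dd') as `ts` from mr")

    // Equivalent DataFrame API version (Spark 1.5+):
    import org.apache.spark.sql.functions.{col, from_unixtime}
    val df2 = sqlContext.table("mr")
      .select(from_unixtime(col("ts") / 1000, "yyyy-MM-dd").as("ts"))

If the timestamps are already in seconds, the division by 1000 should be dropped.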