Spark: java.lang.UnsupportedOperationException: No Encoder found for java.time.LocalDate

Submitted by 馋奶兔 on 2019-12-05 11:57:47

For a custom dataset type, you can use the Kryo serialization framework, as long as your data is actually serializable (i.e. implements Serializable). Here is one example of using Kryo: Spark No Encoder found for java.io.Serializable in Map[String, java.io.Serializable].
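A minimal sketch of that approach, assuming Spark 2.x and a hypothetical `Event` case class that wraps a `LocalDate` (both names are illustrative, not from the original answer):

```scala
import java.time.LocalDate
import org.apache.spark.sql.{Encoder, Encoders, SparkSession}

// Hypothetical type that Spark has no built-in Encoder for,
// because it contains a java.time.LocalDate field
case class Event(name: String, date: LocalDate)

object KryoEncoderExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("kryo-encoder-example")
      .getOrCreate()

    // Kryo-based encoder: works because LocalDate implements Serializable
    val eventEncoder: Encoder[Event] = Encoders.kryo[Event]

    val ds = spark.createDataset(
      Seq(Event("release", LocalDate.of(2019, 12, 5)))
    )(eventEncoder)

    ds.show()
    spark.stop()
  }
}
```

Note that a Kryo encoder stores each row as a single opaque binary column rather than a structured schema, so `ds.show()` will display serialized bytes, not named fields.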

Kryo is generally recommended since it is much faster, and it handles anything the Java serialization framework can handle. You can certainly choose native Java serialization (ObjectOutputStream/ObjectInputStream) instead, but it is much slower.
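Both options are exposed through Spark's `Encoders` factory; the choice is a one-line swap (sketch below, both produce a binary-encoded column):

```scala
import java.time.LocalDate
import org.apache.spark.sql.{Encoder, Encoders}

// Kryo-based binary encoder — usually faster and more compact
val kryoEncoder: Encoder[LocalDate] = Encoders.kryo[LocalDate]

// Java-serialization-based encoder — works for any Serializable type, but slower
val javaEncoder: Encoder[LocalDate] = Encoders.javaSerialization[LocalDate]
```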

As the comments above note, Spark SQL ships with lots of useful Encoders under sqlContext.implicits._, but those won't cover everything, so you might have to plug in your own Encoder.
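If you would rather keep a readable, structured schema instead of an opaque Kryo column, one common alternative (a sketch under my own assumptions, not part of the original answer) is to store the date as a type the built-in implicits already encode, such as the epoch day as a `Long`:

```scala
import java.time.LocalDate

// Hypothetical row type: Long is natively encodable via spark.implicits._,
// and the LocalDate can be reconstructed on demand
case class EventRow(name: String, epochDay: Long) {
  def date: LocalDate = LocalDate.ofEpochDay(epochDay)
}

object EventRow {
  // Convenience constructor from a LocalDate
  def fromDate(name: String, date: LocalDate): EventRow =
    EventRow(name, date.toEpochDay)
}
```

With this shape, `Seq(EventRow.fromDate(...)).toDS()` works with the standard implicits and the Dataset keeps proper column names and types.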

As mentioned, your custom data has to be serializable, and according to https://docs.oracle.com/javase/8/docs/api/java/time/LocalDate.html, java.time.LocalDate implements the Serializable interface, so you are definitely fine here.
