NPE in spark with Joda DateTime

Submitted by 余生颓废 on 2019-12-08 03:07:59

Question


When executing a simple map over a Joda DateTime field in Spark, I am receiving a NullPointerException.

Code snippet:

// `spark` is a SparkContext and `accountId` is defined elsewhere in the test
val me1 = (accountId, DateTime.now())
val me2 = (accountId, DateTime.now())
val me3 = (accountId, DateTime.now())
val rdd = spark.parallelize(List(me1, me2, me3))

val result = rdd.map{case (a,d) => (a,d.dayOfMonth().roundFloorCopy())}.collect.toList

Stacktrace:

java.lang.NullPointerException
    at org.joda.time.DateTime$Property.roundFloorCopy(DateTime.java:2280)
    at x.y.z.jobs.info.AggJobTest$$anonfun$1$$anonfun$2.apply(AggJobTest.scala:47)
    at x.y.z.jobs.info.AggJobTest$$anonfun$1$$anonfun$2.apply(AggJobTest.scala:47)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
    at scala.collection.AbstractIterator.to(Iterator.scala:1157)
    at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
    at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
    at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
    at org.apache.spark.rdd.RDD$$anonfun$16.apply(RDD.scala:780)
    at org.apache.spark.rdd.RDD$$anonfun$16.apply(RDD.scala:780)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
    at org.apache.spark.scheduler.Task.run(Task.scala:56)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Any suggestions on how to solve this problem?

Update: in order to reproduce the problem you need to use the KryoSerializer:

.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")


Answer 1:


As you have pointed out, you are using the KryoSerializer with Joda DateTime objects. It appears that Kryo's default serialization leaves out some internal state the DateTime needs, which is why the NPE only surfaces after the object is deserialized on the executor. You may want to use one of the projects that adds Joda DateTime support to Kryo. For example, https://github.com/magro/kryo-serializers provides a serializer called JodaDateTimeSerializer, which you can register with kryo.register(DateTime.class, new JodaDateTimeSerializer());
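A minimal sketch of wiring that serializer into Spark via a KryoRegistrator (the class name JodaKryoRegistrator and the package x.y.z are hypothetical, and this assumes the de.javakaffee:kryo-serializers artifact is on the classpath):

```scala
import com.esotericsoftware.kryo.Kryo
import de.javakaffee.kryoserializers.jodatime.JodaDateTimeSerializer
import org.apache.spark.SparkConf
import org.apache.spark.serializer.KryoRegistrator
import org.joda.time.DateTime

// Registers the Joda DateTime serializer with Kryo so DateTime
// survives the round trip to the executors intact.
class JodaKryoRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(classOf[DateTime], new JodaDateTimeSerializer())
  }
}

// Point Spark at the registrator alongside the existing serializer setting.
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", "x.y.z.JodaKryoRegistrator")
```

With this in place the original rdd.map over dayOfMonth().roundFloorCopy() should no longer hit the NPE, since the deserialized DateTime carries its full state.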



Source: https://stackoverflow.com/questions/32256318/npe-in-spark-with-joda-datetime
