JavaSparkContext not serializable

Submitted by 柔情痞子 on 2019-12-04 04:58:02

No, JavaSparkContext is not serializable, and it is not supposed to be: it cannot be used in a function you send to remote workers. Here you are not referencing it explicitly, but a reference to it is being serialized anyway, because your anonymous inner class function is not static and therefore holds a hidden reference to the enclosing class.

Try rewriting your code with this function as a static, stand-alone object.
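The capture problem can be reproduced without Spark at all. In this minimal sketch (all names are illustrative, not from the question), `Driver` stands in for the non-serializable enclosing class, the anonymous `SerializableFunction` drags it along via its hidden outer reference, and the static stand-alone `AddOne` serializes cleanly:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class CaptureDemo {

    public interface SerializableFunction<T, R> extends Serializable {
        R apply(T t);
    }

    // Non-serializable enclosing class, like a driver holding a JavaSparkContext.
    public static class Driver {
        private final int offset = 1; // instance state used by the closure

        public SerializableFunction<Integer, Integer> makeAdder() {
            // Anonymous inner class: holds a hidden reference to Driver.this,
            // so serializing it tries (and fails) to serialize Driver too.
            return new SerializableFunction<Integer, Integer>() {
                @Override
                public Integer apply(Integer x) {
                    return x + offset;
                }
            };
        }
    }

    // Static, stand-alone version: no outer reference, serializes cleanly.
    public static class AddOne implements SerializableFunction<Integer, Integer> {
        @Override
        public Integer apply(Integer x) {
            return x + 1;
        }
    }

    public static boolean serializes(Object o) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (IOException e) { // NotSerializableException lands here
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(serializes(new Driver().makeAdder())); // false
        System.out.println(serializes(new AddOne()));             // true
    }
}
```

The same applies to Spark closures: functions passed to `map`, `filter`, etc. should be static nested classes (or top-level classes) so they don't capture the driver object.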

You cannot use the SparkContext, or create other RDDs, from within an executor (i.e. inside an RDD's map function).

You have to create the Cassandra RDD (sc.cassandraTable) in the driver and then do a join between the two RDDs (the client RDD and the Cassandra table RDD).
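A hedged sketch of that pattern, assuming the DataStax spark-cassandra-connector Java API; the `Order` POJO, the `"shop"`/`"orders"` keyspace and table names, and `getClientId()` are illustrative assumptions, not the asker's code, and running it requires a Spark cluster with Cassandra:

```java
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

import java.io.Serializable;

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapRowTo;

public class DriverSideJoin {

    // Minimal placeholder POJO; real code would map the actual table columns.
    public static class Order implements Serializable {
        private String clientId;
        public String getClientId() { return clientId; }
        public void setClientId(String clientId) { this.clientId = clientId; }
    }

    public static JavaPairRDD<String, Tuple2<String, Order>> joinWithOrders(
            JavaSparkContext sc, JavaPairRDD<String, String> clients) {
        // Build the Cassandra RDD here, in the driver -- never inside a
        // map/flatMap function running on an executor.
        JavaPairRDD<String, Order> orders = javaFunctions(sc)
                .cassandraTable("shop", "orders", mapRowTo(Order.class))
                .mapToPair(o -> new Tuple2<>(o.getClientId(), o));

        // Let Spark perform the join across the cluster instead of calling
        // sc.cassandraTable from within a task.
        return clients.join(orders);
    }
}
```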

Declare it with the transient keyword so it is skipped during serialization:

private transient JavaSparkContext sparkContext;
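A transient field is simply omitted when the object is serialized, so a non-serializable context no longer breaks the closure; the trade-off is that the field is null after deserialization. A minimal Spark-free sketch (class names are illustrative, with `Context` standing in for JavaSparkContext):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class TransientDemo {

    // Stand-in for JavaSparkContext: any non-serializable object.
    public static class Context { }

    public static class Job implements Serializable {
        // transient: skipped during serialization, so the non-serializable
        // Context no longer causes NotSerializableException.
        private transient Context sparkContext = new Context();

        public Context getContext() { return sparkContext; }
    }

    // Serialize and deserialize a Job to see what survives the trip.
    public static Job roundTrip(Job job) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(job);
        }
        try (ObjectInputStream in =
                 new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
            return (Job) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Job copy = roundTrip(new Job());
        System.out.println(copy.getContext() == null); // true: transient field not restored
    }
}
```

This is why marking the field transient only helps if the code that runs on executors never actually dereferences it.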