Task not Serializable - Spark Java

Submitted by 不想你离开。 on 2019-12-30 13:24:19

Question


I'm getting the "Task not serializable" error in Spark. I've searched and tried to use a static function, as suggested in some posts, but it still gives the same error.

Code is as below:

import java.io.Serializable;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.SparkSession;

public class Rating implements Serializable {
    private SparkSession spark;
    private SparkConf sparkConf;
    private JavaSparkContext jsc;
    private static Function<String, Rating> mapFunc;

    public Rating() {
        mapFunc = new Function<String, Rating>() {
            public Rating call(String str) {
                return Rating.parseRating(str);
            }
        };
    }

    public void runProcedure() { 
        sparkConf = new SparkConf().setAppName("Filter Example").setMaster("local");
        jsc = new JavaSparkContext(sparkConf);
        SparkSession spark = SparkSession.builder().master("local").appName("Word Count")
            .config("spark.some.config.option", "some-value").getOrCreate();        

        JavaRDD<Rating> ratingsRDD = spark.read().textFile("sample_movielens_ratings.txt")
                .javaRDD()
                .map(mapFunc);
    }

    public static void main(String[] args) {
        Rating newRating = new Rating();
        newRating.runProcedure();
    }
}

The error it gives is the "Task not serializable" exception.

How do I solve this error? Thanks in advance.


Answer 1:


Clearly, Rating cannot actually be serialized, because it holds references to Spark structures (SparkSession, SparkConf, JavaSparkContext) as fields.

The problem here is in

JavaRDD<Rating> ratingsRDD = spark.read().textFile("sample_movielens_ratings.txt")
            .javaRDD()
            .map(mapFunc);

If you look at the definition of mapFunc, you're returning a Rating object.

mapFunc = new Function<String, Rating>() {
    public Rating call(String str) {
        return Rating.parseRating(str);
    }
};

This function is used inside a map (a transformation, in Spark terms). Because transformations are executed on the worker nodes rather than on the driver node, their code, and everything it references, must be serializable. Since mapFunc is an anonymous inner class created inside the Rating constructor, it carries a hidden reference to the enclosing Rating instance. This forces Spark to try to serialize the Rating class, which is not possible.
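The capture problem can be reproduced without Spark at all, using plain Java serialization. The sketch below uses a hypothetical `SerFunc` interface as a stand-in for Spark's `Function` (which likewise extends `Serializable`): an anonymous implementation created in an instance context keeps a hidden `this$0` reference to its non-serializable enclosing object and fails to serialize, while the same anonymous class created in a static context serializes fine.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

class ClosureDemo {
    // Stand-in for Spark's Function interface, which also extends Serializable.
    public interface SerFunc extends Function<String, Integer>, Serializable {}

    // Stand-in for the Rating class in the question: not serializable itself.
    public static class Driver {
        // Created in an instance context, so the anonymous class keeps a
        // hidden this$0 reference to the enclosing (non-serializable) Driver.
        public final SerFunc mapFunc = new SerFunc() {
            public Integer apply(String s) { return s.length(); }
        };
    }

    // Created in a static context: no enclosing instance is captured.
    public static SerFunc makeStaticFunc() {
        return new SerFunc() {
            public Integer apply(String s) { return s.length(); }
        };
    }

    // True if Java serialization can write the object.
    public static boolean serializes(Object o) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(serializes(new Driver().mapFunc));  // false: drags in Driver
        System.out.println(serializes(makeStaticFunc()));      // true
    }
}
```

This is exactly what happens in the question: mapFunc is built in the Rating constructor, so serializing it drags in the whole Rating instance, SparkSession and all.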

Try extracting the fields you need from Rating and placing them in a separate class that does not own any Spark structures. Then use this new class as the return type of your mapFunc function.
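A minimal sketch of such a class, here called `ParsedRating` (a hypothetical name), is below. It assumes MovieLens-style lines of the form `userId::movieId::rating::timestamp`; adjust the separator and fields to match the actual contents of sample_movielens_ratings.txt.

```java
import java.io.Serializable;

// A plain data class that owns no Spark structures, so it serializes cleanly.
class ParsedRating implements Serializable {
    private final int userId;
    private final int movieId;
    private final double rating;
    private final long timestamp;

    public ParsedRating(int userId, int movieId, double rating, long timestamp) {
        this.userId = userId;
        this.movieId = movieId;
        this.rating = rating;
        this.timestamp = timestamp;
    }

    // Parses one "userId::movieId::rating::timestamp" line (assumed format).
    public static ParsedRating parse(String line) {
        String[] parts = line.split("::");
        return new ParsedRating(
                Integer.parseInt(parts[0]),
                Integer.parseInt(parts[1]),
                Double.parseDouble(parts[2]),
                Long.parseLong(parts[3]));
    }

    public int getUserId() { return userId; }
    public int getMovieId() { return movieId; }
    public double getRating() { return rating; }
    public long getTimestamp() { return timestamp; }
}
```

In runProcedure, mapFunc would then be a `Function<String, ParsedRating>` that calls `ParsedRating.parse`, ideally defined as a static nested class so it captures no enclosing instance.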




Answer 2:


In addition, be sure not to include non-serializable fields such as JavaSparkContext and SparkSession in your class. If you do need them, declare them transient so they are skipped during serialization:

private transient JavaSparkContext sparkCtx;
private transient SparkSession spark;
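The effect of transient can be demonstrated with plain Java serialization. In this sketch, `FakeContext` is a hypothetical stand-in for a non-serializable Spark handle such as JavaSparkContext: the class that marks it transient serializes, the one that does not fails.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

class TransientDemo {
    // Stand-in for a non-serializable Spark handle such as JavaSparkContext.
    public static class FakeContext {}

    // transient: serialization skips the FakeContext field entirely.
    public static class Job implements Serializable {
        private transient FakeContext ctx = new FakeContext();
        private int id = 7;
        public int getId() { return id; }
    }

    // No transient: the FakeContext field blocks serialization.
    public static class BadJob implements Serializable {
        private FakeContext ctx = new FakeContext();
    }

    // True if Java serialization can write the object.
    public static boolean serializes(Object o) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(serializes(new Job()));    // true
        System.out.println(serializes(new BadJob())); // false
    }
}
```

Note that a transient field is null after deserialization on the worker, which is fine here: the Spark handles are only meant to be used on the driver.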

Good luck.



Source: https://stackoverflow.com/questions/40480311/task-not-serializable-spark-java
