Spark Accumulator value not read by task
I am initializing an accumulator:

```java
final Accumulator<Integer> accum = sc.accumulator(0);
```

Then, inside a `map` function, I try to increment the accumulator and use its value to set a field:

```java
JavaRDD<UserSetGet> UserProfileRDD1 = temp.map(new Function<String, UserSetGet>() {
    @Override
    public UserSetGet call(String arg0) throws Exception {
        UserSetGet usg = new UserSetGet();
        accum.add(1);
        usg.setPid(accum.value().toString());
        return usg;
    }
});
```

But I'm getting the following error:

```
16/03/14 09:12:58 ERROR executor.Executor: Exception in task 0.0 in stage 2.0 (TID 2)
java.lang
```
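For comparison, a stripped-down local test where I only *add* to the accumulator inside the task and read the value back on the driver after the action works fine (the local-mode config, app name, and sample data below are placeholders I added just for this minimal example):

```java
import java.util.Arrays;

import org.apache.spark.Accumulator;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class AccumLocalTest {
    public static void main(String[] args) {
        // Local-mode context just for this minimal test (placeholder app name).
        SparkConf conf = new SparkConf().setAppName("accum-test").setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        final Accumulator<Integer> accum = sc.accumulator(0);

        // Incrementing inside a task (one add per element) causes no error...
        sc.parallelize(Arrays.asList("a", "b", "c"))
          .foreach(s -> accum.add(1));

        // ...and reading the value back on the driver, after the action, works.
        System.out.println(accum.value());

        sc.stop();
    }
}
```

The difference from my failing code is that here `accum.value()` is only called on the driver after the action has finished, never inside the task itself.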