I am trying to run a MapReduce job in Java. Below are my files:
WordCount.java
package counter;
public class
Add these two lines to your code:
job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(IntWritable.class);
You are using TextOutputFormat, which emits a LongWritable key and a Text value by default, but you are emitting Text as the key and IntWritable as the value. You need to tell this to the framework.
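To show where those two lines belong, here is a minimal driver sketch. The class names (WordCountDriver, WordCountMapper, WordCountReducer) are illustrative assumptions, not taken from the question:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCountDriver.class);
        job.setMapperClass(WordCountMapper.class);   // assumed custom mapper
        job.setReducerClass(WordCountReducer.class); // assumed custom reducer

        // Declare the mapper's output types explicitly, since they
        // differ from the framework's defaults:
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);

        // Final (reducer) output types:
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Without the two setMapOutput* calls, the framework assumes the map output types match the job's final output defaults, which is what triggers the type-mismatch exception.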
HTH
Removing this line from the code solved the issue:
super.map(key, value, context);
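The reason this fixes it: the base Mapper.map is an identity function that writes the input (LongWritable, Text) pair straight to the context, which conflicts with the Text/IntWritable types your own map emits. A sketch of a word-count mapper without that call (class name is an assumption):

```java
import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Do NOT call super.map(key, value, context) here: it would emit
        // the raw (LongWritable, Text) input and cause the type mismatch.
        StringTokenizer itr = new StringTokenizer(value.toString());
        while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, one); // emit (Text, IntWritable)
        }
    }
}
```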
This may not be your issue, but I had this silly issue once. Make sure you are not mixing the old and the new libraries, i.e., mapred vs. mapreduce. Annotate @Override on your map and reduce methods. If you see errors, you are not properly overriding the methods.
I got a similar exception stack trace due to an incorrect mapper class set in my code (a typo :) )
job.setMapperClass(Mapper.class); // mistakenly set to org.apache.hadoop.mapreduce.Mapper due to a typo
Notice that I was mistakenly using the Mapper class from the mapreduce package; I changed it to my custom mapper class:
job.setMapperClass(LogProcMapperClass.class); // LogProcMapperClass is my custom mapper.
The exception was resolved after I corrected the mapper class.
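For clarity, the wrong and right driver lines side by side (LogProcMapperClass is the custom mapper name from this answer):

```java
// Wrong: registers the framework's base Mapper, which is an identity
// function and passes (LongWritable, Text) through unchanged.
// job.setMapperClass(org.apache.hadoop.mapreduce.Mapper.class);

// Right: register the custom mapper class.
job.setMapperClass(LogProcMapperClass.class);
```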