Class Cast exception for the Hadoop new API


Question


I have been trying to come up with some simple code using the new MapReduce framework. Previously I had implemented this using the mapred package, where I was able to specify the input format class as KeyValueTextInputFormat, but in the new API (the mapreduce package) this class is not present. I tried using TextInputFormat.class instead, but I still get the following exception:

- job_local_0001
java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.Text
    at com.hp.hpl.mapReduceprocessing.MapReduceWrapper$HitFileProccesorMapper_internal.map(MapReduceWrapper.java:1)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:621)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:177)

Here is a sample snippet of the code:

Configuration conf = new Configuration();
conf.set("key.value.separator.output.line", ",");

Job job = new Job(conf, "Result Aggregation");
job.setJarByClass(ProcessInputFile.class);

job.setInputFormatClass(TextInputFormat.class);
job.setOutputFormatClass(TextOutputFormat.class);

// Types of the intermediate (map-side) output
job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(Text.class);

// The real map work is delegated to HitFileProccesorMapper_internal,
// run inside MultithreadedMapper with 3 threads
job.setMapperClass(MultithreadedMapper.class);
MultithreadedMapper.setMapperClass(job, HitFileProccesorMapper_internal.class);
MultithreadedMapper.setNumberOfThreads(job, 3);
//job.setMapperClass(HitFileProccesorMapper_internal.class);
job.setReducerClass(HitFileReducer_internal.class);

// Types of the final (reduce-side) output
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(Text.class);

FileInputFormat.addInputPath(job, new Path(inputFileofhits.getName()));
FileOutputFormat.setOutputPath(job, new Path(ProcessInputFile.resultAggProps
        .getProperty("OUTPUT_DIRECTORY")));

try {
    job.waitForCompletion(true);
} catch (InterruptedException e) {
    e.printStackTrace();
} catch (ClassNotFoundException e) {
    e.printStackTrace();
}

Please let me know what configuration changes need to be made so that the ClassCastException can be avoided.


Answer 1:


This usually happens when there is a mismatch between the key/value types MapReduce is trying to pass in and the types the Mapper or Reducer class is parameterized with.

You say that you want KeyValueTextInputFormat, but in your code you are using TextInputFormat. TextInputFormat delivers records as <LongWritable, Text>: the key is the byte offset of the line in the file, and the value is the line itself.

I'm going to guess that your Mapper is typed <Text, Text, ?, ?>. MapReduce is therefore trying to cast the LongWritable key that TextInputFormat gives it to a Text, can't, and bombs out with the exception above.
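
For concreteness, here is a minimal sketch of a mapper whose input types match TextInputFormat. The class name is reused from your question, but the comma-separated layout of each line is an assumption on my part:

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class HitFileProccesorMapper_internal
        extends Mapper<LongWritable, Text, Text, Text> {

    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        // TextInputFormat hands us the byte offset of the line as the key,
        // so the real key/value pair has to be parsed out of the line itself.
        String[] parts = line.toString().split(",", 2);
        if (parts.length == 2) {
            context.write(new Text(parts[0]), new Text(parts[1]));
        }
    }
}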

I suggest you either use KeyValueTextInputFormat, or change the type of your mapper to <LongWritable, Text, ?, ?> as in the sketch above.
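
If you go the KeyValueTextInputFormat route, the setup might look like the following sketch. It assumes a Hadoop release whose new API ships the class under org.apache.hadoop.mapreduce.lib.input; the wrapper class KvJobSetup is hypothetical, and the separator property name shown is the Hadoop 2.x one:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;

public class KvJobSetup {
    public static Job configure() throws Exception {
        Configuration conf = new Configuration();
        // Separator used to split each input line into key and value.
        // This property name is the Hadoop 2.x one; older releases used
        // "key.value.separator.in.input.line".
        conf.set("mapreduce.input.keyvaluelinerecordreader.key.value.separator", ",");

        Job job = new Job(conf, "Result Aggregation");
        // KeyValueTextInputFormat delivers <Text, Text> records, so a mapper
        // typed <Text, Text, ?, ?> receives them without any cast.
        job.setInputFormatClass(KeyValueTextInputFormat.class);
        return job;
    }
}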



Source: https://stackoverflow.com/questions/11541345/class-cast-exception-for-the-hadoop-new-api
