Conflicting API when trying to run MRUnit example


Question


I've been playing around with MRUnit and tried running it on the Hadoop word-count example, following the tutorial for word count and unit testing.

Though not a fan, I've been using Eclipse to run the code, and I keep getting an error on the setMapper call:

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;


import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver;
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;

import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

import org.junit.Before;
import org.junit.Test;

public class TestWordCount {
  MapReduceDriver<LongWritable, Text, Text, IntWritable, Text, IntWritable> mapReduceDriver;
  MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;
  ReduceDriver<Text, IntWritable, Text, IntWritable> reduceDriver;

  @Before
  public void setUp() throws IOException
  {
      WordCountMapper mapper = new WordCountMapper();
      mapDriver = new MapDriver<LongWritable, Text, Text, IntWritable>();
      mapDriver.setMapper(mapper);  //<--Issue here

      WordCountReducer reducer = new WordCountReducer();
      reduceDriver = new ReduceDriver<Text, IntWritable, Text, IntWritable>();
      reduceDriver.setReducer(reducer);

      mapReduceDriver = new MapReduceDriver<LongWritable, Text, Text, IntWritable, Text, IntWritable>();
      mapReduceDriver.setMapper(mapper); //<--Issue here
      mapReduceDriver.setReducer(reducer);
  }
}

Error message:

java.lang.Error: Unresolved compilation problems: 
    The method setMapper(Mapper<LongWritable,Text,Text,IntWritable>) in the type MapDriver<LongWritable,Text,Text,IntWritable> is not applicable for the arguments (WordCountMapper)
    The method setMapper(Mapper<LongWritable,Text,Text,IntWritable>) in the type MapReduceDriver<LongWritable,Text,Text,IntWritable,Text,IntWritable> is not applicable for the arguments (WordCountMapper)

Looking into this issue, I think it might be an API conflict, but I'm not sure where to look for it. Has anybody else had this issue before?

EDIT: I'm using a user-defined library with the hadoop2 jar and the latest JUnit (4.10) jar in it.

EDIT 2: Here is the code for WordCountMapper:

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class WordCountMapper extends Mapper<Object, Text, Text, IntWritable> 
{

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();


    public void map(Object key, Text value, Context context)throws IOException, InterruptedException 
    {
        StringTokenizer itr = new StringTokenizer(value.toString());
        while (itr.hasMoreTokens()) 
        {
            word.set(itr.nextToken());
            context.write(word, one);
        }
    }
}

FINAL EDIT / IT WORKS

Turns out I needed to set

WordCountMapper mapper = new WordCountMapper();

to

Mapper mapper = new WordCountMapper();

since there was an issue with the generics. I also needed to add the Mockito library to my user-defined library.
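For completeness, here is a minimal sketch of the adjusted setUp method with that change applied (assuming the driver fields are declared as in the original post); the raw Mapper reference is what sidesteps the generics mismatch:

// Declaring the reference as the raw Mapper type (rather than WordCountMapper)
// lets setMapper accept it despite the Mapper<Object, ...> vs. Mapper<LongWritable, ...>
// mismatch. The raw type triggers unchecked warnings, hence the suppression.
@SuppressWarnings({ "rawtypes", "unchecked" })
@Before
public void setUp() throws IOException
{
    Mapper mapper = new WordCountMapper();   // raw Mapper reference, as described above
    mapDriver = new MapDriver<LongWritable, Text, Text, IntWritable>();
    mapDriver.setMapper(mapper);

    WordCountReducer reducer = new WordCountReducer();
    reduceDriver = new ReduceDriver<Text, IntWritable, Text, IntWritable>();
    reduceDriver.setReducer(reducer);

    mapReduceDriver = new MapReduceDriver<LongWritable, Text, Text, IntWritable, Text, IntWritable>();
    mapReduceDriver.setMapper(mapper);
    mapReduceDriver.setReducer(reducer);
}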


Answer 1:


Here's your problem:

public class WordCountMapper extends Mapper<Object, Text, Text, IntWritable>
....
MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

Your WordCountMapper input type (Object) is not compatible with the MapDriver input type (LongWritable). Change your Mapper definition to

class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable>

You probably want to change your map method argument from Object key to LongWritable key as well.
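For reference, a minimal sketch of the corrected mapper with those two changes applied (the rest of the class is as posted in the question):

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Input key type now matches MapDriver<LongWritable, Text, Text, IntWritable>
public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable>
{
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    @Override
    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException
    {
        StringTokenizer itr = new StringTokenizer(value.toString());
        while (itr.hasMoreTokens())
        {
            word.set(itr.nextToken());
            context.write(word, one);
        }
    }
}

With the input key type aligned, the original WordCountMapper mapper = new WordCountMapper(); assignment in setUp compiles as-is, without falling back to a raw Mapper reference.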




Answer 2:


Make sure you have imported the correct class. I faced the same error; unlike the case above, my program had the correct parameters in both the Reducer and reduce_test classes, but because I had imported the wrong class I got the same error message reported above.

Wrong import:

import org.apache.hadoop.mrunit.ReduceDriver;

Correct import:

import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;

The same fix applies to the mapper test, if you are sure your parameters match between the Mapper class and the Mapper test.
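As a quick check (assuming you are testing the new-API org.apache.hadoop.mapreduce classes, as in the question), all three drivers should come from the mrunit.mapreduce package:

// New-API (org.apache.hadoop.mapreduce) tests should use these drivers:
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;
import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver;

// The old-API (org.apache.hadoop.mapred) drivers live directly under
// org.apache.hadoop.mrunit and produce the same "not applicable for the
// arguments" error if mixed with new-API mappers/reducers:
// import org.apache.hadoop.mrunit.MapDriver;
// import org.apache.hadoop.mrunit.ReduceDriver;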



Source: https://stackoverflow.com/questions/24084265/conflicting-api-when-trying-to-run-mrunit-example
