hadoop MultipleInputs fails with ClassCastException


Following up on my comment, the Javadoc for TaggedInputSplit confirms that you are probably casting the input split to a FileSplit incorrectly:

/**
 * An {@link InputSplit} that tags another InputSplit with extra data for use
 * by {@link DelegatingInputFormat}s and {@link DelegatingMapper}s.
 */

My guess is your setup method looks something like this:

@Override
protected void setup(Context context) throws IOException,
        InterruptedException {
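    // this cast is what throws the ClassCastException - with MultipleInputs,
    // the split handed to the mapper is actually a TaggedInputSplit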
    FileSplit split = (FileSplit) context.getInputSplit();
}

Unfortunately TaggedInputSplit is not publicly visible, so you can't easily do an instanceof-style check followed by a cast and a call to TaggedInputSplit.getInputSplit() to get the actual underlying FileSplit. So either you'll need to update the source yourself and re-compile and deploy, file a JIRA ticket asking for this to be fixed in a future version (if it hasn't already been actioned in 2+), or perform some nasty reflection hackery to get to the underlying InputSplit.

This is completely untested:

// requires: java.io.IOException, java.lang.reflect.Method,
// org.apache.hadoop.mapreduce.InputSplit and
// org.apache.hadoop.mapreduce.lib.input.FileSplit
@Override
protected void setup(Context context) throws IOException,
        InterruptedException {
    InputSplit split = context.getInputSplit();
    Class<? extends InputSplit> splitClass = split.getClass();

    FileSplit fileSplit = null;
    if (splitClass.equals(FileSplit.class)) {
        fileSplit = (FileSplit) split;
    } else if (splitClass.getName().equals(
            "org.apache.hadoop.mapreduce.lib.input.TaggedInputSplit")) {
        // begin reflection hackery...

        try {
            Method getInputSplitMethod = splitClass
                    .getDeclaredMethod("getInputSplit");
            getInputSplitMethod.setAccessible(true);
            fileSplit = (FileSplit) getInputSplitMethod.invoke(split);
        } catch (Exception e) {
            // wrap and re-throw error
            throw new IOException(e);
        }

        // end reflection hackery
    }
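
    // fileSplit now holds the underlying FileSplit (or null if the split
    // was of an unexpected type) - typically you'd store it in a field here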
}

Reflection Hackery Explained:

With TaggedInputSplit declared with default (package-private) visibility, it's not visible to classes outside the org.apache.hadoop.mapreduce.lib.input package, and therefore you cannot reference that class in your setup method. To get around this, we perform a number of reflection-based operations:

  1. Inspecting the class name, we can test for the type TaggedInputSplit using its fully qualified name

    splitClass.getName().equals("org.apache.hadoop.mapreduce.lib.input.TaggedInputSplit")

  2. We know we want to call the TaggedInputSplit.getInputSplit() method to recover the wrapped input split, so we use the Class.getDeclaredMethod(..) reflection method to acquire a reference to it:

    Method getInputSplitMethod = splitClass.getDeclaredMethod("getInputSplit");

  3. The enclosing class still isn't publicly visible, so we use the setAccessible(..) method to override this, stopping the reflection access check from throwing an IllegalAccessException when we invoke the method

    getInputSplitMethod.setAccessible(true);

  4. Finally, we invoke the method on the input split and cast the result to a FileSplit (optimistically hoping it's an instance of this type!):

    fileSplit = (FileSplit) getInputSplitMethod.invoke(split);

I had this same problem, but the actual cause was that I was still setting the InputFormat after setting up MultipleInputs:

job.setInputFormatClass(SequenceFileInputFormat.class);

Once I removed this line everything worked fine.
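
For reference, here is a minimal, untested sketch of a driver that lets MultipleInputs own the input format. The paths and the MyTextMapper/MySeqMapper classes are made-up placeholders, and the rest of the job setup (output types, etc.) is abbreviated:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.MultipleInputs;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MultiInputDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "multiple inputs example");
        job.setJarByClass(MultiInputDriver.class);

        // each path gets its own InputFormat and Mapper; behind the scenes
        // MultipleInputs sets DelegatingInputFormat as the job's input format
        MultipleInputs.addInputPath(job, new Path("/data/text"),
                TextInputFormat.class, MyTextMapper.class);
        MultipleInputs.addInputPath(job, new Path("/data/seq"),
                SequenceFileInputFormat.class, MySeqMapper.class);

        // do NOT call job.setInputFormatClass(..) after this point - it would
        // overwrite the DelegatingInputFormat configured above
        FileOutputFormat.setOutputPath(job, new Path("/data/out"));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}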
