I am looking for a little clarification on the answers to this question here:
Generating Separate Output files in Hadoop Streaming
My use case is as follows:
You can do something like the following, though it involves compiling a little Java; that shouldn't be a problem even if the rest of your use case is in Python. As far as I know, it's not directly possible from Python alone to drop the filename tag from the final output in a single job, as your use case demands, but the class below makes it easy.
Here is the Java class that needs to be compiled -
package com.custom;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.lib.MultipleTextOutputFormat;

public class CustomMultiOutputFormat extends MultipleTextOutputFormat<Text, Text> {

    /**
     * Use the key as part of the path for the final output file.
     */
    @Override
    protected String generateFileNameForKeyValue(Text key, Text value, String leaf) {
        return new Path(key.toString(), leaf).toString();
    }

    /**
     * Discard the key from the final output, as your use case requires.
     */
    @Override
    protected Text generateActualKey(Text key, Text value) {
        return null;
    }
}
Steps to compile:
While in the directory where you saved the file above, type -
$JAVA_HOME/bin/javac -cp $(hadoop classpath) -d . CustomMultiOutputFormat.java
Make sure JAVA_HOME is set to /path/to/your/SUNJDK before attempting the above command.
Make your custom.jar file using (type exactly) -
$JAVA_HOME/bin/jar cvf custom.jar com/custom/CustomMultiOutputFormat.class
Finally, run your job like -
hadoop jar /path/to/your/hadoop-streaming-*.jar -libjars custom.jar -outputformat com.custom.CustomMultiOutputFormat -file your_script.py -mapper your_script.py -input inputpath -output outputpath -numReduceTasks 0
After doing this you should see two directories inside your outputpath: one named valid_file_name and the other err_file_name. All records tagged with valid_file_name go to the valid_file_name directory, and all records tagged with err_file_name go to the err_file_name directory.
I hope all of this makes sense.
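For reference, here is a minimal sketch of what your_script.py might look like. The key it emits (before the tab) becomes the output subdirectory via generateFileNameForKeyValue, and generateActualKey then strips that key from the written records. The tag names valid_file_name/err_file_name match the directories above, but the validity rule itself (three tab-separated fields) is just a placeholder assumption - replace it with your own check.

```python
#!/usr/bin/env python
# Hypothetical streaming mapper sketch. It emits key<TAB>value, where the
# key is the directory tag consumed by CustomMultiOutputFormat.
import sys

def tag_record(line):
    """Return (directory_tag, record). A record counts as 'valid' here if
    it has exactly three tab-separated fields -- a placeholder rule."""
    record = line.rstrip("\n")
    if len(record.split("\t")) == 3:
        return ("valid_file_name", record)
    return ("err_file_name", record)

def main():
    for line in sys.stdin:
        tag, record = tag_record(line)
        # The tag routes the record to its directory and is then discarded
        # from the final output by generateActualKey.
        print("%s\t%s" % (tag, record))

if __name__ == "__main__":
    main()
```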