HBase mapreduce: write into HBase in Reducer

Submitted by 夙愿已清 on 2019-12-12 03:16:07

Question


I am learning HBase. I know how to write a Java program using Hadoop MapReduce and write the output into HDFS; but now I want to write the same output into HBase instead of HDFS. I expect it to use code similar to what I wrote for HDFS:

context.write(key,value);

Could anyone show me an example to achieve this?


Answer 1:


Here's one way to do this:

import java.io.IOException;

import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapper;

public static class MyMapper extends TableMapper<ImmutableBytesWritable, Put> {

    @Override
    public void map(ImmutableBytesWritable row, Result value, Context context)
            throws IOException, InterruptedException {
        // This example simply copies every cell of the row from the source table.
        context.write(row, resultToPut(row, value));
    }

    private static Put resultToPut(ImmutableBytesWritable key, Result result) throws IOException {
        Put put = new Put(key.get());
        for (KeyValue kv : result.raw()) {
            put.add(kv);
        }
        return put;
    }
}

You can read more about TableMapper in the HBase API docs.
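To wire a TableMapper like the one above into a job, HBase provides the TableMapReduceUtil helper class. Here is a minimal driver sketch; the table names "source_table" and "target_table" are assumptions for illustration:

```java
Configuration conf = HBaseConfiguration.create();
Job job = Job.getInstance(conf, "copy-table");
job.setJarByClass(MyDriver.class); // MyDriver is a hypothetical driver class

Scan scan = new Scan();
scan.setCaching(500);        // fetch rows in batches to reduce scanner round-trips
scan.setCacheBlocks(false);  // don't pollute the region server block cache from MR

TableMapReduceUtil.initTableMapperJob(
        "source_table",               // assumed source table name
        scan,
        MyMapper.class,
        ImmutableBytesWritable.class, // mapper output key
        Put.class,                    // mapper output value
        job);
TableMapReduceUtil.initTableReducerJob(
        "target_table",               // assumed target table name
        null,                         // null = identity reducer; Puts go straight to the table
        job);

job.waitForCompletion(true);
```

Passing null as the reducer class makes TableMapReduceUtil install an identity reducer, so the Put objects emitted by the mapper are written directly to the target table.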




Answer 2:


Instead of using the FileOutputFormat when setting up your job, you should be able to use the TableOutputFormat.

http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/mapreduce/TableOutputFormat.html

You will still have to modify your Reducer a little.

A quote from the page above:

Convert Map/Reduce output and write it to an HBase table. The KEY is ignored while the output value must be either a Put or a Delete instance.
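As a concrete sketch of that quote, here is a word-count-style reducer that emits Put objects for TableOutputFormat (via the TableReducer base class). The column family "cf" and qualifier "count" are assumptions for illustration:

```java
public static class MyTableReducer
        extends TableReducer<Text, IntWritable, ImmutableBytesWritable> {

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable v : values) {
            sum += v.get();
        }
        // Row key is the reduce key; "cf"/"count" are assumed family/qualifier.
        Put put = new Put(Bytes.toBytes(key.toString()));
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("count"), Bytes.toBytes(sum));
        // The output key is ignored by TableOutputFormat; only the Put
        // (or Delete) value is applied to the table, so null is fine here.
        context.write(null, put);
    }
}
```

In the driver you would then set this class with TableMapReduceUtil.initTableReducerJob (or set TableOutputFormat and the output table name on the job configuration yourself).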



Source: https://stackoverflow.com/questions/18603031/hbase-mapreduce-write-into-hbase-in-reducer
