NoSuchMethodError HTableDescriptor.addFamily

Submitted by ℡╲_俬逩灬 on 2019-12-10 11:35:42

Question


I have installed Hadoop 2.5.2 and HBase 1.0.1.1 (which are compatible with each other), and in my Hadoop code I am trying to add a column family to an HBase table.

My code is:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.mapreduce.Job;

Configuration hbaseConfiguration = HBaseConfiguration.create();
Job hbaseImportJob = new Job(hbaseConfiguration, "FileToHBase");

HBaseAdmin hbaseAdmin = new HBaseAdmin(hbaseConfiguration);

if (!hbaseAdmin.tableExists(Config_values.tableName)) {
    TableName tableName1 = TableName.valueOf("tableName");
    HTableDescriptor hTableDescriptor = new HTableDescriptor(tableName1);
    HColumnDescriptor hColumnDescriptor1 = new HColumnDescriptor("columnFamily1");
    hTableDescriptor.addFamily(hColumnDescriptor1); // this call throws the NoSuchMethodError
    hbaseAdmin.createTable(hTableDescriptor);
}

I am getting this error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
    at com.atm_ex.atm_ex.Profiles.profiles(Profiles.java:177)
    at com.atm_ex.atm_ex.App.main(App.java:28)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)


Answer 1:


To be safe, you should use the same version of HBase for both compiling and running your jar. When you run mvn clean package -DskipTests=true, make sure the HBase dependency in your POM matches the HBase actually deployed on your cluster (here, a CDH build): not just the version number, but the set of methods it contains, because CDH builds do not always track upstream Apache exactly. You can try the POM coordinates that Cloudera publishes in its Maven repository, documented on its website:

    <name>c-cdh-maven-dep</name>
    <!-- you may need to try both URLs; the Cloudera one worked for me -->
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    <!-- <url>http://maven.apache.org</url> -->

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <!-- the cdh-versioned artifacts resolve from Cloudera's repository -->
    <repositories>
        <repository>
            <id>cloudera</id>
            <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
        </repository>
    </repositories>

    <dependencies>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.6.0-cdh5.7.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>1.2.0-cdh5.7.0</version>
        </dependency>


        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-hadoop2-compat</artifactId>
            <version>1.2.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>

        <!-- hadoop dependency start -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.0</version>
        </dependency>
        <!-- Hadoop dep end -->

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <!-- spark dep end -->

        <dependency>
            <groupId>org.clojure</groupId>
            <artifactId>clojure</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>11.0.2</version>
        </dependency>

        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
            <version>2.5.0</version>
        </dependency>
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty</artifactId>
            <version>3.6.6.Final</version>
        </dependency>

        <dependency>
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
            <version>3.4.5</version>
        </dependency>
        <dependency>
            <groupId>org.cloudera.htrace</groupId>
            <artifactId>htrace-core</artifactId>
            <version>2.01</version>
        </dependency>


        <!-- hbase dep start -->
        <!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase -->
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-server</artifactId>
            <version>1.2.0</version>
        </dependency>
    </dependencies>
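After fixing the versions, you can double-check which hbase-client actually wins dependency resolution with mvn dependency:tree -Dincludes=org.apache.hbase (a standard maven-dependency-plugin goal); the version it prints should match the HBase running on your cluster.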



Answer 2:


The HBase-related jar files have to be included on the MapReduce job's classpath. Check this Cloudera blog post for more details.
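One common way to do this (a sketch, not necessarily what the blog post describes; the class name SubmitWithHBaseJars is just for illustration) is HBase's own helper TableMapReduceUtil, which ships the client jars with the job through the distributed cache:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

public class SubmitWithHBaseJars {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "FileToHBase");
        // Ships hbase-client and its transitive jars (zookeeper, guava, ...)
        // with the job, so the task JVMs load the same HBase version the
        // code was compiled against instead of whatever is on the cluster.
        TableMapReduceUtil.addDependencyJars(job);
        // ... set input/output, mapper/reducer, then submit as usual ...
    }
}

Alternatively, exporting HADOOP_CLASSPATH=$(hbase classpath) before running hadoop jar puts the HBase jars on the client-side classpath, which is where this particular stack trace (RunJar.main) fails.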




Answer 3:


I have encountered such an error too, but in a different scenario: I compiled my jar against hbase-0.98, yet ran it with hbase-1.0.1.1.

Finally I found that in hbase-0.98, the method signature is

  • void addFamily(final HColumnDescriptor family)

whereas in hbase-1.0.1.1 it is

  • HTableDescriptor addFamily(final HColumnDescriptor family)

They are different! The return type is part of the JVM method descriptor, so a jar compiled against 0.98 links to addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V, and the runtime throws NoSuchMethodError when only the 1.0.1.1 variant, which returns the HTableDescriptor, is present. That is exactly the descriptor shown in your stack trace.

To be safe, you should use the same version of HBase for both compiling and running your jar.
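If you are not sure which HBase jar is actually loaded at runtime, one quick diagnostic (the class name WhichJar is just for illustration) is to print where HTableDescriptor was loaded from:

import org.apache.hadoop.hbase.HTableDescriptor;

public class WhichJar {
    public static void main(String[] args) {
        // getCodeSource() can return null for bootstrap classes,
        // so guard against that before printing the jar location.
        java.security.CodeSource src =
                HTableDescriptor.class.getProtectionDomain().getCodeSource();
        System.out.println(src == null ? "bootstrap classpath" : src.getLocation());
    }
}

Run it with the same classpath your job uses (for example through hadoop jar) and compare the printed jar against the version you compiled with.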



Source: https://stackoverflow.com/questions/31032693/nosuchmethoderror-htabledescriptor-addfamily
