HBase

How to get all rows containing (or equaling) a particular ID from an HBase table?

落爺英雄遲暮 submitted on 2019-12-24 07:36:18
Question: I have a method that selects the rows whose row key contains the parameter passed in.

HTable table = new HTable(Bytes.toBytes(objectsTableName), connection);

public List<ObjectId> lookUp(String partialId) {
    if (partialId.matches("[a-fA-F0-9]+")) {
        // create a regular expression from partialId, which can
        // match any rowkey that contains partialId as a substring,
        // and then get all the rows with the specified rowkey
    } else {
        throw new IllegalArgumentException(
            "query must be done with
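In the real HBase API this is typically done with a Scan plus a RowFilter using a RegexStringComparator; as a dependency-free sketch, the substring-matching logic the question describes looks roughly like this (the sample row keys and the exception message are made up for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

public class PartialRowKeyLookup {
    // Validate that partialId is hex, then match row keys containing it,
    // mirroring what a RowFilter with a substring regex would return.
    static List<String> lookUp(String partialId, List<String> rowKeys) {
        if (!partialId.matches("[a-fA-F0-9]+")) {
            // illustrative message; the original question's string is truncated
            throw new IllegalArgumentException("query must be done with a hex string");
        }
        Pattern p = Pattern.compile(".*" + Pattern.quote(partialId) + ".*");
        List<String> hits = new ArrayList<>();
        for (String key : rowKeys) {
            if (p.matcher(key).matches()) {
                hits.add(key);
            }
        }
        return hits;
    }

    public static void main(String[] args) {
        System.out.println(lookUp("ab", List.of("xxabzz", "cdef"))); // prints "[xxabzz]"
    }
}
```

Note that a regex row filter still scans the whole table; if prefix matching is enough, a prefix scan over the row key range is far cheaper.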

Getting stuck when using the Java API to connect to HBase

谁说胖子不能爱 submitted on 2019-12-24 07:28:16
Question: My local environment: OS X 10.9.2, HBase 0.94.17, Java 1.6. My HBase mode: standalone. I was able to do operations in the shell, but when I used the Java API, it did not work. My Java code:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org

About HBase: zookeeper.MetaTableLocator: Failed verification of hbase:meta, NotServingRegionException

我的未来我决定 submitted on 2019-12-24 06:06:04
Question: I use Hadoop 2.7.3, HBase 1.2.3, and ZooKeeper 3.4.9. Each time I stop HBase and restart it, it throws this exception:

[hadoop01:16000.activeMasterManager] zookeeper.MetaTableLocator: Failed verification of hbase:meta,,1 at address=hadoop05,16020,1478663588885, exception=org.apache.hadoop.hbase.NotServingRegionException: Region hbase:meta,,1 is not online on hadoop05,16020,1478664215143
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:2922)
    at

A connection error in remote mode of Titan-1.0.0 + HBase-0.98.20 using Java

♀尐吖头ヾ submitted on 2019-12-24 03:05:54
Question: I am learning the Titan database. I have run it successfully in local mode. Now I am trying to use Titan in the "Remote Server Mode" introduced in the Titan documentation. My Titan version is Titan-1.0.0-hadoop1. I have a cluster in my LAN consisting of cloud12 and cloud13. I installed hadoop-1.2.1 on it; the master is cloud12 and the slave is cloud13. I want to test the performance of creating a graph, so I plan to start my HBase-0.98.20 in pseudo-distributed mode on machine cloud12 with

HBase Introduction and Installation

社会主义新天地 submitted on 2019-12-24 02:51:49
HBase introduction

HBase is a highly reliable, high-performance, column-oriented, scalable distributed database with real-time read/write access. It uses Hadoop HDFS as its file storage system, Hadoop MapReduce to process the massive data stored in HBase, and ZooKeeper as its distributed coordination service. It is mainly used to store loosely structured semi-structured and unstructured data (column storage; a NoSQL database).

The row key identifies a row. Rows are sorted lexicographically. A row key can be at most 64 KB, but overly long keys make queries very inefficient, so row key design matters (e.g. adding a timestamp; see row key design strategies).

A column family (Column Family) can contain many columns; access control, storage, and tuning are all handled at the column-family level. Values are untyped byte data.

HBase 1.2:
1. HBase has row keys, column families, columns, and cells. Each cell can keep multiple versions (1 or 10, for example), distinguished by timestamp; new versions can be written and earlier versions restored, which solves the data-update problem. Historical cell versions are not deleted right away; they are only removed during operations such as file compaction.
2. Nulls take up no storage space in HBase. A cell holds an uninterpreted byte array; cells are untyped and stored as bytes.
3. A row key can be at most 64 KB; row key design is critical and affects efficiency.
4. HBase can store structured, semi-structured, and unstructured data.
5. HBase sorts lexicographically. If a cell has multiple versions, you can use a maximum value minus the earlier data
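The "maximum value minus" trick mentioned above follows from lexicographic sorting: subtracting a timestamp from a fixed maximum and zero-padding it makes newer rows sort first. A minimal sketch of that idea (plain Java, no HBase dependency; the key format is illustrative, with a TreeMap standing in for HBase's sorted row keys):

```java
import java.util.TreeMap;

public class ReversedTimestampKeys {
    // Build a row key whose lexicographic order is newest-first by
    // subtracting the timestamp from Long.MAX_VALUE and zero-padding.
    static String rowKey(String id, long timestampMillis) {
        return String.format("%s-%019d", id, Long.MAX_VALUE - timestampMillis);
    }

    public static void main(String[] args) {
        // TreeMap sorts its keys lexicographically, like HBase row keys.
        TreeMap<String, String> table = new TreeMap<>();
        table.put(rowKey("user1", 1000L), "older event");
        table.put(rowKey("user1", 2000L), "newer event");
        // The more recent event now sorts first for this id.
        System.out.println(table.firstEntry().getValue()); // prints "newer event"
    }
}
```

Zero-padding matters: without a fixed width, "9" would sort after "10" and the ordering would break.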

Configuring Hive with HBase

大兔子大兔子 submitted on 2019-12-24 02:00:22
Question: I need to execute queries on HBase using Hive. I have downloaded HBase and Hive, and my HMaster is running fine. I need to know what configuration changes are required for Hive to work with HBase as the backend database. Any link or tutorial will be appreciated. Thanks in advance.

Answer 1: The Apache Hive wiki explains it nicely: https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration To create an HBase table that is managed from Hive, try something like

CREATE TABLE hive_managed(key string,

Writing a Spark RDD to HBase

烂漫一生 submitted on 2019-12-24 01:24:22
Question: I am able to read the messages from Kafka using the code below:

val ssc = new StreamingContext(sc, Seconds(50))
val topicmap = Map("test" -> 1)
val lines = KafkaUtils.createStream(ssc, "127.0.0.1:2181", "test-consumer-group", topicmap)

But I am trying to read each message from Kafka and put it into HBase. This is my code to write into HBase, but with no success.

lines.foreachRDD(rdd => {
  rdd.foreach(record => {
    val i = +1
    val hConf = new HBaseConfiguration()
    val hTable = new HTable(hConf, "test")
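A common reason this pattern fails, besides missing ZooKeeper configuration, is that the snippet opens a new table handle for every single record; the usual fix is to open one handle per partition and issue one put per record. A dependency-free sketch of that shape, with a hypothetical RowSink interface standing in for the real HTable (real code would use org.apache.hadoop.hbase.client.Put against a live cluster):

```java
import java.util.ArrayList;
import java.util.List;

public class PartitionWriter {
    // Hypothetical stand-in for an HBase table handle, invented for this sketch.
    interface RowSink {
        void put(String rowKey, String family, String qualifier, byte[] value);
    }

    // The sink is created once for the whole partition; the loop only builds
    // and issues puts, one cell per record.
    static int writePartition(List<String> records, RowSink sink) {
        int written = 0;
        for (String record : records) {
            sink.put("row-" + written, "cf", "message", record.getBytes());
            written++;
        }
        return written;
    }

    public static void main(String[] args) {
        List<String> seen = new ArrayList<>();
        RowSink sink = (row, cf, q, v) -> seen.add(row + "/" + cf + ":" + q);
        int n = writePartition(List.of("a", "b"), sink);
        System.out.println(n + " puts"); // prints "2 puts"
    }
}
```

In Spark terms this corresponds to foreachPartition rather than a per-record foreach, so connection setup cost is paid once per partition instead of once per message.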

hbase shell: TypeError: can't dup NilClass

可紊 submitted on 2019-12-23 23:08:42
Question: I'm getting this strange error when trying to launch hbase shell. I'm using CDH5.

# hbase shell
TypeError: can't dup NilClass
    dup at org/jruby/RubyKernel.java:1940
    initialize at file:/usr/lib/hbase/lib/jruby-complete-1.6.8.jar!/META-INF/jruby.home/lib/ruby/1.8/pathname.rb:212
    (root) at /usr/lib/hbase/bin/../bin/hirb.rb:41

Has this occurred to anyone? How can I resolve it?

Answer 1: I faced a similar exception. The reason was that my older version of HBase was actually the one pointed to in the path. so

Transport exception

不羁岁月 submitted on 2019-12-23 18:54:41
Question: I'm trying to import happybase, but I get the following error message while connecting. I have a Hadoop pseudo-distributed cluster and HBase already running. The versions of the installed components are as follows:

Hadoop version - 1.0.4
HBase version - 0.94.4
happybase - 0.4

Can someone have a look at the exceptions below and let me know if any Thrift-specific settings are needed, or offer any guidance in getting this fixed? Thank you.

Python 2.6.1 (r261:67515, Jun 24 2010, 21:47:49) [GCC 4.2.1 (Apple Inc. build 5646)] on

How to store complex objects into Hadoop HBase?

十年热恋 submitted on 2019-12-23 18:14:53
Question: I have complex objects with collection fields that need to be stored in Hadoop. I don't want to walk the whole object tree and explicitly store each field, so I'm thinking about serializing the complex fields, storing each as one big piece, and then deserializing it when reading the object back. What is the best way to do this? I thought about using some kind of serialization, but I hope Hadoop has a means to handle this situation. A sample object class to store:

class ComplexClass {
<simple
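Storing a complex field as one opaque blob is exactly what serialization gives you: Java's built-in serialization, Hadoop Writables, or a schema-based format like Avro or Protobuf all produce a byte array that can sit in a single HBase cell. A minimal round-trip sketch with plain java.io serialization (the ComplexClass fields here are invented for illustration, since the original class is truncated):

```java
import java.io.*;
import java.util.List;

public class BlobStorage {
    // Stand-in for the question's ComplexClass; Serializable lets the whole
    // object graph, collections included, be flattened to bytes in one go.
    static class ComplexClass implements Serializable {
        String name;
        List<Integer> values;
        ComplexClass(String name, List<Integer> values) {
            this.name = name;
            this.values = values;
        }
    }

    // Serialize the object into a byte[] suitable for a single cell value.
    static byte[] toBytes(ComplexClass obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return bos.toByteArray();
    }

    // Deserialize the cell value back into the object.
    static ComplexClass fromBytes(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return (ComplexClass) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        ComplexClass original = new ComplexClass("sample", List.of(1, 2, 3));
        ComplexClass restored = fromBytes(toBytes(original));
        System.out.println(restored.name); // prints "sample"
    }
}
```

The trade-off of the blob approach is that the serialized fields are no longer queryable or filterable server-side; a schema-based format like Avro at least keeps the blob readable across class changes.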