Questions tagged [hbase]

HBase is the Hadoop database (columnar). Use it when you need random, real-time read/write access to your Big Data. This project's goal is the hosting of very large tables -- billions of rows by millions of columns -- atop clusters of commodity hardware.

HBase is an open-source, non-relational, distributed, versioned, column-oriented database modeled after Google's Bigtable ("Bigtable: A Distributed Storage System for Structured Data" by Chang et al.) and is written in Java. It is developed as part of the Apache Software Foundation's Apache Hadoop project and runs on top of HDFS (the Hadoop Distributed File System): just as Bigtable leverages the distributed data storage provided by the Google File System, HBase provides Bigtable-like capabilities on top of HDFS. HBase includes (a minimal Java client sketch follows the feature list):

  • Convenient base classes for backing Hadoop MapReduce jobs with HBase tables, including Cascading, Hive, and Pig source and sink modules
  • Query predicate push-down via server-side scan and get filters
  • Optimizations for real-time queries
  • A Thrift gateway and a RESTful web service that supports XML, Protobuf, and binary data encoding options
  • An extensible JRuby-based (JIRB) shell
  • Support for exporting metrics via the Hadoop metrics subsystem to files or Ganglia, or via JMX
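
A minimal sketch of the HBase 1.x Java client API: it assumes a running cluster reachable via hbase-site.xml on the classpath and an existing table "t" with column family "cf" (all names here are illustrative only).

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.*;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseHello {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();   // reads hbase-site.xml from the classpath
            try (Connection conn = ConnectionFactory.createConnection(conf);
                 Table table = conn.getTable(TableName.valueOf("t"))) {
                // Write one cell: row "row1", column family "cf", qualifier "q"
                Put put = new Put(Bytes.toBytes("row1"));
                put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v"));
                table.put(put);
                // Read it back
                Result result = table.get(new Get(Bytes.toBytes("row1")));
                System.out.println(Bytes.toString(result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("q"))));
            }
        }
    }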
6961 questions
2
votes
1 answer

HBase doesn't start because of Master exiting error

I am attempting to start HBase 0.98.4 on Windows 8. I installed Hadoop 2.3.0 and ZooKeeper 3.3.6. I started Hadoop with start-dfs.cmd and start-yarn.cmd, then started HBase with start-hbase.cmd, but the logs give the reason as: log4j:ERROR Could not find…
NASRIN
  • 475
  • 7
  • 22
2
votes
0 answers

Getting NoClassDefFoundError for HBaseContext

Even after adding the dependencies at runtime I'm getting the following error: Exception in thread "main" java.lang.NoClassDefFoundError: com/cloudera/spark/hbase/HBaseContext at Spark_HBase.SparkHBaseExample$.main(SparkHBaseExample.scala:36) at…
Gunjan
  • 19
  • 6
2
votes
0 answers

Is it possible to block region splitting in an HBase coprocessor?

I have some processing (enrichment of trades) in my coprocessor service which modifies the existing data in place. It iterates over every row, modifies it, and puts it back to the region. The table can be modified by only one client. During the processing…
dpolaczanski
  • 386
  • 1
  • 3
  • 18
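
For the coprocessor question above, one hedged possibility (not taken from the question itself): in the HBase 1.x coprocessor API a RegionObserver can intercept splits in preSplit and ask the framework to skip them via bypass(); whether bypass is honored for splits has varied across releases, and a table-level alternative is DisabledRegionSplitPolicy. A rough, untested sketch:

    import java.io.IOException;
    import org.apache.hadoop.hbase.coprocessor.BaseRegionObserver;
    import org.apache.hadoop.hbase.coprocessor.ObserverContext;
    import org.apache.hadoop.hbase.coprocessor.RegionCoprocessorEnvironment;

    // Sketch of a RegionObserver that vetoes region splits while attached to a table.
    public class BlockSplitsObserver extends BaseRegionObserver {
        @Override
        public void preSplit(ObserverContext<RegionCoprocessorEnvironment> ctx) throws IOException {
            // bypass() asks the framework to skip the default split handling for this invocation
            ctx.bypass();
        }
    }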
2
votes
5 answers

Apache Kylin Unable to find HBase common lib

I have installed Hadoop 2.6.0, HBase 0.99.0, Hive 1.2, and Kylin 1.5.0, all set up in standalone mode. While running, Kylin checks at an early stage for Hadoop, HBase, and Hive. Each and everything…
Kunal Gupta
  • 83
  • 1
  • 10
2
votes
1 answer

Delete HBase cell using Spark

Is there any API available to delete a specific HBase cell using Spark with Scala? We are able to read and write using the Spark-HBase Connector. Any suggestion for cell deletion is highly appreciated.
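
The excerpt doesn't show the connector calls, but here is a hedged sketch of the underlying HBase 1.x Delete API, which could equally be invoked from a Spark foreachPartition; the table, row, family, and qualifier names are placeholders:

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.*;
    import org.apache.hadoop.hbase.util.Bytes;

    public class DeleteCellExample {
        public static void main(String[] args) throws Exception {
            try (Connection conn = ConnectionFactory.createConnection(HBaseConfiguration.create());
                 Table table = conn.getTable(TableName.valueOf("trades"))) {
                Delete delete = new Delete(Bytes.toBytes("row1"));
                // addColumns deletes all versions of one cell; addColumn deletes only the latest version
                delete.addColumns(Bytes.toBytes("cf"), Bytes.toBytes("price"));
                table.delete(delete);
            }
        }
    }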
2
votes
1 answer

How to get salt bucket count in Apache Phoenix?

Apache Phoenix recommends salted buckets for improved performance. I wish to get the count of salt buckets for a table I created some time ago. The SQLLine-based client doesn't offer anything similar to MySQL's SHOW CREATE TABLE. Also HBase…
Alavalathi
  • 713
  • 2
  • 9
  • 21
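
One approach for the Phoenix question above, hedged because the catalog layout can vary across Phoenix versions, is to read SALT_BUCKETS from SYSTEM.CATALOG over the Phoenix JDBC driver; the ZooKeeper quorum and table name below are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class SaltBucketCount {
        public static void main(String[] args) throws Exception {
            // Phoenix JDBC URL format: jdbc:phoenix:<zookeeper quorum>
            try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181");
                 PreparedStatement ps = conn.prepareStatement(
                     "SELECT SALT_BUCKETS FROM SYSTEM.CATALOG " +
                     "WHERE TABLE_NAME = ? AND SALT_BUCKETS IS NOT NULL")) {
                ps.setString(1, "MY_TABLE");
                try (ResultSet rs = ps.executeQuery()) {
                    if (rs.next()) {
                        System.out.println("Salt buckets: " + rs.getInt(1));
                    }
                }
            }
        }
    }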
2
votes
0 answers

Example of spinning up an HBase mock-style test for integration testing in Scala

I am trying to find an example of how to spin up an HBase server in a mock or integration style, so I can test my code locally in my IDE. I have tried fake-hbase and the HBase testing utility and receive errors, especially when trying to start the…
eboni
  • 883
  • 2
  • 10
  • 25
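
A hedged sketch of the testing-utility route for the question above: HBaseTestingUtility (from the hbase-server/hbase-testing-util test artifacts) starts an in-process mini cluster; the table and family names are placeholders:

    import org.apache.hadoop.hbase.HBaseTestingUtility;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class MiniClusterExample {
        public static void main(String[] args) throws Exception {
            HBaseTestingUtility utility = new HBaseTestingUtility();
            utility.startMiniCluster();   // starts an in-process HDFS + ZooKeeper + HBase
            try {
                Table table = utility.createTable(TableName.valueOf("test"), Bytes.toBytes("cf"));
                Put put = new Put(Bytes.toBytes("row1"));
                put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v"));
                table.put(put);
            } finally {
                utility.shutdownMiniCluster();
            }
        }
    }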
2
votes
1 answer

master.HMaster: Failed to become active master SIMPLE authentication is not enabled. Available:[TOKEN]

I am trying to set up HBase on my local Mac machine. I installed Hadoop and HBase via brew. The versions of Hadoop and HBase are 2.7.1 and 1.1.2 respectively. I am trying to run in pseudo-distributed mode and want to disable authentication so…
Jatin
  • 63
  • 2
  • 8
2
votes
2 answers

Which Phoenix version should I use with HBase in Cloudera 5.5 and Hortonworks 2.4?

Is there a single version of Phoenix that is compatible with the HBase provided in both Cloudera 5.5 and Hortonworks 2.4?
2
votes
1 answer

HBase range scan while eliminating region server hotspotting

I have an HBase table and the row key will be like <>_<> where the timestamp is yyyyMMddHHmm. My concern is how to query user details in a given time range, e.g. "201602021310_user1". HTable table = new HTable(conf, tableName); …
tiroshanm
  • 123
  • 3
  • 13
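
Assuming the row key really is timestamp-first as in the excerpt above, here is a hedged HBase 1.x sketch of a time-range scan using start/stop rows; note that salting or prefixing the key to avoid hotspotting would mean repeating such a scan once per prefix:

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.*;
    import org.apache.hadoop.hbase.util.Bytes;

    public class TimeRangeScanExample {
        public static void main(String[] args) throws Exception {
            try (Connection conn = ConnectionFactory.createConnection(HBaseConfiguration.create());
                 Table table = conn.getTable(TableName.valueOf("user_events"))) {
                // Rows sort lexicographically, so yyyyMMddHHmm prefixes scan as a contiguous range
                Scan scan = new Scan(Bytes.toBytes("201602021300"), Bytes.toBytes("201602021400"));
                try (ResultScanner scanner = table.getScanner(scan)) {
                    for (Result result : scanner) {
                        System.out.println(Bytes.toString(result.getRow()));
                    }
                }
            }
        }
    }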
2
votes
2 answers

HBase read data returns null

I have created a Java application to read data from HBase. I checked link1, link2, link3, and link4. The program returns null even though there is data in my table. hbase shell: hbase(main):009:0> get 'login','1' COLUMN CELL …
Alican Balik
  • 1,284
  • 1
  • 8
  • 22
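
For comparison with the shell output above, a hedged sketch of reading the same row through the Java API; a common cause of null here is a family or qualifier that doesn't exactly match what the shell shows, so the "cf" and "username" names below are placeholders:

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.*;
    import org.apache.hadoop.hbase.util.Bytes;

    public class ReadRowExample {
        public static void main(String[] args) throws Exception {
            try (Connection conn = ConnectionFactory.createConnection(HBaseConfiguration.create());
                 Table table = conn.getTable(TableName.valueOf("login"))) {
                Result result = table.get(new Get(Bytes.toBytes("1")));
                // getValue returns null if the family:qualifier pair doesn't exist in the row
                byte[] value = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("username"));
                System.out.println(value == null ? "null" : Bytes.toString(value));
            }
        }
    }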
2
votes
0 answers

HBase schema design for Facebook data

I am trying to get the posts from my Facebook page and put them into HBase. If I design a relational data model I have three tables with one-to-many mappings like this: a post has multiple comments and a comment has multiple reply comments. After doing…
anusha
  • 51
  • 4
2
votes
1 answer

Set row TTL in HBase

I'm trying to set a TTL on a row in HBase. I can't set the TTL on the column family because the table already exists and I can't change that. What I tried is to use the setTTL function on the Put operation, but it seems that it is not working because…
Javi Ortiz
  • 568
  • 1
  • 7
  • 22
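
A hedged sketch of per-mutation TTL with the HBase 1.x client for the question above: Mutation.setTTL takes milliseconds and is only honored by sufficiently recent servers (late 0.98.x and the 1.x lines); the names are placeholders:

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.*;
    import org.apache.hadoop.hbase.util.Bytes;

    public class PutWithTtlExample {
        public static void main(String[] args) throws Exception {
            try (Connection conn = ConnectionFactory.createConnection(HBaseConfiguration.create());
                 Table table = conn.getTable(TableName.valueOf("t"))) {
                Put put = new Put(Bytes.toBytes("row1"));
                put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v"));
                put.setTTL(60_000L);   // TTL in milliseconds; the cell should expire about a minute after the write
                table.put(put);
            }
        }
    }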
2
votes
1 answer

Performance of OpenTSDB

It takes 6 seconds to return JSON for 9000 data points. I have approximately 10 GB of data in 12 metrics, say x.open, x.close... Data storage pattern: Metric: x.open, tagk: symbol, tagv: stringValue; Metric: x.close, tagk: symbol, tagv: stringValue. My…
2
votes
0 answers

Exporting HBase scan command output as CSV/pipe-separated

I am looking for a way to export the output of a scan command from HBase. hbase(main):028:0* scan 'version_test_2',{FILTER => "MultipleColumnPrefixFilter('name','sal')", TIMERANGE => [0, 2]} ROW COLUMN+CELL 1 …
dijin
  • 61
  • 6
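
The shell has no built-in CSV export, so one hedged option for the question above is a small Java client that runs the equivalent scan and writes delimited lines itself; the filter and table mirror the excerpt, while the output path is a placeholder:

    import java.io.PrintWriter;
    import org.apache.hadoop.hbase.Cell;
    import org.apache.hadoop.hbase.CellUtil;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.*;
    import org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter;
    import org.apache.hadoop.hbase.util.Bytes;

    public class ScanToCsv {
        public static void main(String[] args) throws Exception {
            byte[][] prefixes = { Bytes.toBytes("name"), Bytes.toBytes("sal") };
            Scan scan = new Scan();
            scan.setFilter(new MultipleColumnPrefixFilter(prefixes));
            scan.setTimeRange(0, 2);
            try (Connection conn = ConnectionFactory.createConnection(HBaseConfiguration.create());
                 Table table = conn.getTable(TableName.valueOf("version_test_2"));
                 ResultScanner scanner = table.getScanner(scan);
                 PrintWriter out = new PrintWriter("scan_output.csv")) {
                // One CSV line per cell: row, family:qualifier, value
                for (Result result : scanner) {
                    for (Cell cell : result.rawCells()) {
                        out.println(Bytes.toString(CellUtil.cloneRow(cell)) + ","
                                  + Bytes.toString(CellUtil.cloneFamily(cell)) + ":"
                                  + Bytes.toString(CellUtil.cloneQualifier(cell)) + ","
                                  + Bytes.toString(CellUtil.cloneValue(cell)));
                    }
                }
            }
        }
    }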