Questions tagged [hbase]

HBase is the Hadoop database (columnar). Use it when you need random, real-time read/write access to your Big Data. This project's goal is the hosting of very large tables -- billions of rows × millions of columns -- atop clusters of commodity hardware.

HBase is an open-source, non-relational, distributed, versioned, column-oriented database modeled after Google's Bigtable ("Bigtable: A Distributed Storage System for Structured Data" by Chang et al.) and is written in Java. Just as Bigtable leverages the distributed data storage provided by the Google File System, HBase provides Bigtable-like capabilities on top of the Hadoop Distributed File System (HDFS). It is developed as part of the Apache Software Foundation's Apache Hadoop project. HBase includes:

  • Convenient base classes for backing Hadoop MapReduce jobs with HBase tables, including Cascading, Hive, and Pig source and sink modules
  • Query predicate push-down via server-side scan and get filters
  • Optimizations for real-time queries
  • A Thrift gateway and a RESTful Web service that supports XML, Protobuf, and binary data encoding options
  • Extensible JRuby-based (JIRB) shell
  • Support for exporting metrics via the Hadoop metrics subsystem to files or Ganglia, or via JMX
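The versioned, column-oriented data model described above can be sketched in plain Java as a toy map-of-maps -- rows map to columns, and each column holds timestamped versions of a value. This is only an illustration of the model and uses no HBase API; the class and method names are made up:

```java
import java.util.TreeMap;

// Toy model of an HBase/Bigtable cell store: row -> column -> (timestamp -> value).
// Illustration of the versioned, column-oriented data model only; not an HBase API.
public class ToyCellStore {
    private final TreeMap<String, TreeMap<String, TreeMap<Long, String>>> rows = new TreeMap<>();

    public void put(String row, String column, long ts, String value) {
        rows.computeIfAbsent(row, r -> new TreeMap<>())
            .computeIfAbsent(column, c -> new TreeMap<>())
            .put(ts, value);
    }

    // As in HBase, a read with no explicit timestamp returns the newest version.
    public String get(String row, String column) {
        TreeMap<Long, String> versions =
            rows.getOrDefault(row, new TreeMap<>()).get(column);
        return versions == null ? null : versions.lastEntry().getValue();
    }

    public static void main(String[] args) {
        ToyCellStore store = new ToyCellStore();
        store.put("row1", "info:email", 1L, "old@example.com");
        store.put("row1", "info:email", 2L, "new@example.com");
        System.out.println(store.get("row1", "info:email")); // prints new@example.com
    }
}
```

Real HBase adds column families, sorted on-disk storage (HFiles), and distribution across region servers on top of this basic cell model.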
6961 questions
2 votes, 0 answers

Why can't I get the data when I use HBase CompareOp.valueOf('EQUAL') in a scan FILTER to query rows?

I want to query rows with HBase CompareOp.valueOf('EQUAL') in a scan FILTER like this, but I can't get the data, and I'm sure there is data matching the condition in HBase. import org.apache.hadoop.hbase.filter.CompareFilter import…
W.Dylan
  • 21
  • 2
2 votes, 1 answer

Using HBase Shell To Do A Put With A TTL Different From That Of The Column Family

I am playing around with the HBase shell to see how HBase behaves, but I cannot find any equivalent in the hbase shell of doing a put with a TTL different from that of the column family. The Java Put class (at least in HBase 1.x) has a…
user2456600
  • 515
  • 4
  • 14
2 votes, 1 answer

TableNotFoundException: hbase:labels

I'm trying to upload tsv file to HBase using this command hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.columns=HBASE_ROW_KEY,info:email,info:country,info:continent test_table /user/test/1row.tsv My table defined as create…
2 votes, 1 answer

HBase always starts a Zookeeper server

In Docker I have 2 containers: HBase and Zookeeper. I am configuring hbase-site.xml with hbase.zookeeper.quorum set to zookeeper
Thomas Decaux
  • 21,738
  • 2
  • 113
  • 124
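By default HBase manages its own ZooKeeper instance unless told otherwise. A minimal sketch of pointing HBase at an external quorum (assuming the ZooKeeper container is reachable at hostname `zookeeper`): set `export HBASE_MANAGES_ZK=false` in hbase-env.sh, and in hbase-site.xml:

```xml
<configuration>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>zookeeper</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
</configuration>
```

With `HBASE_MANAGES_ZK=false`, the start scripts skip launching the bundled ZooKeeper and connect to the configured quorum instead.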
2 votes, 1 answer

Check if Phoenix server is correctly set up with HBase

I am using: Hadoop 2.7.1 in single node HBase 1.1.2 with HDFS, single node Phoenix 4.6 Everything is running with Docker, on separate containers, with Docker network (https://docs.docker.com/engine/userguide/networking/dockernetworks/). I copied…
Thomas Decaux
  • 21,738
  • 2
  • 113
  • 124
2 votes, 1 answer

How to serve a web request using HBase

I have about 3 million documents that are PDFs, docs and images. I have built a website, and when a user searches from the website interface, I have to serve those HBase-stored documents as required. How can I do it? Is it good to use HBase for serving web…
Hafiz Muhammad Shafiq
  • 8,168
  • 12
  • 63
  • 121
2 votes, 0 answers

Why doesn't the spark-submit package reference always work?

So, I am running into the same issue that many are experiencing - see the error below. WARN scheduler.TaskSetManager: Lost task 0.0 in stage 3.0 (): java.io.IOException: java.lang.reflect.InvocationTargetException Caused by:…
davidpricedev
  • 2,107
  • 2
  • 20
  • 34
2 votes, 0 answers

Java serialize byte[] for HBase

I want to store a UUID in HBase using Java. Today I write it as a String of 36 characters, but it takes too much space, so I need to optimize. I want to write a raw byte[16], but it is not Serializable. Is there any easy way to do that? Am I doing…
Costin
  • 2,699
  • 5
  • 25
  • 43
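A UUID can be packed into 16 raw bytes with the JDK alone, since it is just two longs. A minimal sketch (the class name is made up for illustration; in HBase the resulting byte[] would be stored directly as a key or value):

```java
import java.nio.ByteBuffer;
import java.util.UUID;

// Pack a UUID into 16 raw bytes (vs. 36 bytes as a hex string) and back.
public class UuidBytes {
    public static byte[] toBytes(UUID uuid) {
        ByteBuffer buf = ByteBuffer.allocate(16);
        buf.putLong(uuid.getMostSignificantBits());
        buf.putLong(uuid.getLeastSignificantBits());
        return buf.array();
    }

    public static UUID fromBytes(byte[] bytes) {
        ByteBuffer buf = ByteBuffer.wrap(bytes);
        // Java evaluates arguments left to right: first long is the most significant.
        return new UUID(buf.getLong(), buf.getLong());
    }

    public static void main(String[] args) {
        UUID id = UUID.randomUUID();
        byte[] raw = toBytes(id);
        System.out.println(raw.length + " " + fromBytes(raw).equals(id)); // 16 true
    }
}
```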
2 votes, 1 answer

HBase rowkey which includes timestamp

I would like to know whether it is bad to have rowkeys like the following: username-timestamp. These rows would be read from MapReduce jobs and will be put using the Java client API. Also, a subset would be selected using STARTROW and ENDROW. On one side this…
Kobe-Wan Kenobi
  • 3,694
  • 2
  • 40
  • 67
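One common variant of such a key, sketched below in plain Java, is `username-(Long.MAX_VALUE - timestamp)`: zero-padding keeps keys fixed-width, and the reversed timestamp makes the newest rows for a user sort first in a STARTROW/ENDROW scan over that user's prefix. Illustrative only; the names are made up, not an HBase API:

```java
// Sketch of a "username-timestamp" rowkey with a reversed, zero-padded timestamp
// so that, within one user's prefix, newer rows sort lexicographically first.
public class RowKeys {
    public static String rowKey(String username, long timestampMillis) {
        // Long.MAX_VALUE has 19 decimal digits, so pad to 19 for fixed width.
        return username + "-" + String.format("%019d", Long.MAX_VALUE - timestampMillis);
    }

    public static void main(String[] args) {
        String newer = rowKey("alice", 2000L);
        String older = rowKey("alice", 1000L);
        System.out.println(newer.compareTo(older) < 0); // newer sorts first: true
    }
}
```

Because the key starts with the username, writes for many users spread across regions; keys that start with a raw timestamp alone would instead hotspot a single region.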
2 votes, 1 answer

Indefinite pause while trying to insert data into HBase

While trying to insert data into HBase, I see that after doing a number of writes (~100000000) the insert operation simply hangs (0 writes/sec in UI). The master and region servers remain up and the java HBase client process too seems alive. All I…
Clyde D'Cruz
  • 1,915
  • 1
  • 14
  • 36
2 votes, 1 answer

Import table from HDFS into HBase

I created a table in pig and stored it in hdfs: STORE mapping INTO 'hdfs://localhost:9000/hbase/data/default/mapping' USING PigStorage ('\t'); Running the ls command on hdfs, I'm getting the table: bin/hdfs dfs -ls /hbase/data/default Found 1…
Jane Doe
  • 33
  • 6
2 votes, 1 answer

guava version conflict with HBase 1 and ES 2

I have a project using both HBase 1.0.0 (Cloudera version) and Elasticsearch. With the upgrade to ES 2.0 I'm experiencing a problem with the guava version: ES 2.0 requires guava 18.0, but Cloudera requires guava 14.0.1. No matter what…
divadpoc
  • 903
  • 10
  • 31
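A common workaround for this kind of guava clash is to shade (relocate) one side's copy so the two versions can coexist on the classpath. A sketch of a maven-shade-plugin relocation, assuming a Maven build (the `shadedPattern` prefix is illustrative):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>myapp.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

The shaded jar rewrites its own guava references to the relocated package, leaving the other dependency's guava untouched.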
2 votes, 1 answer

What is the right way to set up HBase, Hadoop and Hive to access HBase through Hive?

I have a problem with configuring and installing HBase/Hadoop/Hive. What I did so far on a VM with Ubuntu 14.04.3 LTS: installed the JDK like this with the Version…
dino
  • 239
  • 3
  • 12
2 votes, 1 answer

Apache Hue only shows ten entries from an HBase database

I use HBase to store some data from the web, and I also use Apache Hue to visually view the content in HBase. But it only shows the first ten entries from the database and I could not get it to show more; there is no next-page button. I know I can…
Iching Chang
  • 638
  • 1
  • 7
  • 17
2 votes, 0 answers

Loading Data to HBase from HDFS Through Apache Phoenix Using Play Java Application

I'm a beginner with Hadoop technologies and I'm struggling to load data into HBase from HDFS through Phoenix using a Play Java application. I tried loading data into HBase from HDFS through Pig scripts and it works well, but I couldn't find a way to do…
Chanaka
  • 21
  • 3