
I created a table in Pig and stored it in HDFS:

STORE mapping INTO 'hdfs://localhost:9000/hbase/data/default/mapping' USING PigStorage('\t');

Running the ls command on HDFS, I can see the table:

bin/hdfs dfs -ls /hbase/data/default
Found 1 item
drwxr-xr-x   - hfu supergroup          0 2015-11-09 13:33 /hbase/data/default/mapping

But when I run the list command in the HBase shell, the table doesn't appear.

I'm using:

hbase-0.98.0-hadoop2
hadoop-2.6.1
pig-0.15.0

all running on one virtual machine

How can I import the table in HBase?

Jane Doe
1 Answer


First of all, create the table in HBase, either through HCatalog's HBaseHCatStorageHandler or directly from the HBase shell.

CREATE TABLE meters (col1 STRING, col2 STRING)
STORED BY 'org.apache.hcatalog.hbase.HBaseHCatStorageHandler'
TBLPROPERTIES (
    'hbase.table.name' = 'meters',
    'hbase.columns.mapping' = 'd:col2',
    'hcat.hbase.output.bulkMode' = 'true'
);

col1 will be the row key of the HBase table; col2 will be the column qualifier under the column family "d".
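If you prefer to skip HCatalog, a minimal sketch of creating an equivalent table directly from the HBase shell (table name and column family taken from the example above):

```
create 'meters', 'd'
```

This only declares the table and its column family "d"; column qualifiers such as col2 are created implicitly when data is written.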

Now use Pig's STORE command to load data into this table.
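A hedged sketch of the Pig side, using Pig's built-in HBaseStorage rather than PigStorage. The load path and field names here are illustrative assumptions, not taken from the question; adjust them to wherever your input actually lives (but note it should be a normal HDFS path, not HBase's own /hbase/data directory):

```
-- load the source data from an ordinary HDFS location (path is an assumption)
mapping = LOAD 'hdfs://localhost:9000/user/hfu/mapping_input'
          USING PigStorage('\t')
          AS (col1:chararray, col2:chararray);

-- the first field (col1) becomes the HBase row key;
-- the remaining fields map, in order, to the listed columns
STORE mapping INTO 'hbase://meters'
      USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('d:col2');
```

With HBaseStorage, the table must already exist in HBase; Pig writes into it through the HBase API instead of writing raw files under HBase's data directory, which is why the table then shows up in the list command.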

Maddy RS
  • I've created an HBase table from the HBase shell and then tried to store the data from the Pig table into HBase. But the store command in the Pig shell stops after connecting to ZooKeeper: `[main-SendThread(localhost:2181)] INFO org.apache.zookeeper.ClientCnxn - Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x150ec3d937d000a, negotiated timeout = 90000` – Jane Doe Nov 11 '15 at 11:36
  • with the error "unable to store" – Jane Doe Nov 11 '15 at 11:48