
My local environment: OS X 10.9.2, HBase 0.98.0, Java 1.6

conf/hbase-site.xml

 <property>
     <name>hbase.rootdir</name>
     <!--<value>hdfs://127.0.0.1:9000/hbase</value> need to run dfs -->
     <value>file:///Users/apple/Documents/tools/hbase-rootdir/hbase</value>
 </property>

 <property>
        <name>hbase.zookeeper.property.dataDir</name>
        <value>/Users/apple/Documents/tools/hbase-zookeeper/zookeeper</value>
 </property> 
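For reference, these properties only take effect when they sit inside the single `<configuration>` root element of conf/hbase-site.xml; a minimal standalone file using the paths above looks like this:

```xml
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///Users/apple/Documents/tools/hbase-rootdir/hbase</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/Users/apple/Documents/tools/hbase-zookeeper/zookeeper</value>
  </property>
</configuration>
```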

conf/hbase-env.sh

export JAVA_HOME=$(/usr/libexec/java_home -d 64 -v 1.6)
export HBASE_OPTS="-XX:+UseConcMarkSweepGC"

And when I ran

> list

in the HBase shell, I got the following errors:

2014-03-29 10:25:53.412 java[2434:1003] Unable to load realm info from SCDynamicStore
2014-03-29 10:25:53,416 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-03-29 10:26:14,470 ERROR [main] zookeeper.RecoverableZooKeeper: ZooKeeper exists failed after 4 attempts
2014-03-29 10:26:14,471 WARN  [main] zookeeper.ZKUtil: hconnection-0x5e15e68d, quorum=localhost:2181, baseZNode=/hbase Unable to set watcher on znode (/hbase/hbaseid)
org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
    at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
    at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1041)
    at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:199)
    at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
    at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
    at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:857)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:662)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:414)
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:393)
    at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:274)
    at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:183)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.jruby.javasupport.JavaConstructor.newInstanceDirect(JavaConstructor.java:275)
    at org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:91)
    at org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:178)
    at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
    at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
    at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:182)
    at org.jruby.java.proxies.ConcreteJavaProxy$2.call(ConcreteJavaProxy.java:48)
    at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
    at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
    at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:182)
    at org.jruby.RubyClass.newInstance(RubyClass.java:829)
        ...
at Users.apple.Documents.tools.hbase_minus_0_dot_98_dot_0_minus_hadoop2.bin.hirb.block_2$RUBY$start(/Users/apple/Documents/tools/hbase-0.98.0-hadoop2/bin/hirb.rb:185)
    at Users$apple$Documents$tools$hbase_minus_0_dot_98_dot_0_minus_hadoop2$bin$hirb$block_2$RUBY$start.call(Users$apple$Documents$tools$hbase_minus_0_dot_98_dot_0_minus_hadoop2$bin$hirb$block_2$RUBY$start:65535)
    at org.jruby.runtime.CompiledBlock.yield(CompiledBlock.java:112)
    at org.jruby.runtime.CompiledBlock.yield(CompiledBlock.java:95)
    at org.jruby.runtime.Block.yield(Block.java:130)
    at org.jruby.RubyContinuation.enter(RubyContinuation.java:106)
    at org.jruby.RubyKernel.rbCatch(RubyKernel.java:1212)
    at org.jruby.RubyKernel$s$1$0$rbCatch.call(RubyKernel$s$1$0$rbCatch.gen:65535)
    at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
    at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
    at org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:187)
    at Users.apple.Documents.tools.hbase_minus_0_dot_98_dot_0_minus_hadoop2.bin.hirb.method__5$RUBY$start(/Users/apple/Documents/tools/hbase-0.98.0-hadoop2/bin/hirb.rb:184)
    at Users$apple$Documents$tools$hbase_minus_0_dot_98_dot_0_minus_hadoop2$bin$hirb$method__5$RUBY$start.call(Users$apple$Documents$tools$hbase_minus_0_dot_98_dot_0_minus_hadoop2$bin$hirb$method__5$RUBY$start:65535)
    at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:203)
    at org.jruby.internal.runtime.methods.CompiledMethod.call(CompiledMethod.java:255)
    at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:292)
    at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:135)
    at Users.apple.Documents.tools.hbase_minus_0_dot_98_dot_0_minus_hadoop2.bin.hirb.__file__(/Users/apple/Documents/tools/hbase-0.98.0-hadoop2/bin/hirb.rb:190)
    at Users.apple.Documents.tools.hbase_minus_0_dot_98_dot_0_minus_hadoop2.bin.hirb.load(/Users/apple/Documents/tools/hbase-0.98.0-hadoop2/bin/hirb.rb)
    at org.jruby.Ruby.runScript(Ruby.java:697)
    at org.jruby.Ruby.runScript(Ruby.java:690)
    at org.jruby.Ruby.runNormally(Ruby.java:597)
    at org.jruby.Ruby.runFromMain(Ruby.java:446)
    at org.jruby.Main.doRunFromMain(Main.java:369)
    at org.jruby.Main.internalRun(Main.java:258)
    at org.jruby.Main.run(Main.java:224)
    at org.jruby.Main.run(Main.java:208)
    at org.jruby.Main.main(Main.java:188)
2014-03-29 10:28:21,137 ERROR [main] client.HConnectionManager$HConnectionImplementation: Can't get connection to ZooKeeper: KeeperErrorCode = ConnectionLoss for /hbase

ERROR: KeeperErrorCode = ConnectionLoss for /hbase
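The ConnectionLoss error just means nothing answered on the ZooKeeper client port (localhost:2181 in the log above); in standalone mode HBase runs ZooKeeper inside the HMaster process, so this usually means HBase itself is not running. A quick way to check from bash, as a sketch (check_port is a hypothetical helper; `timeout` is GNU coreutils and may need installing on OS X):

```shell
#!/usr/bin/env bash
# Quick probe: is anything listening on the ZooKeeper client port?
# Uses bash's /dev/tcp; host and port are the defaults from the error log above.
check_port() {
  local host=$1 port=$2
  if timeout 2 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "open"
  else
    echo "closed"
  fi
}

check_port localhost 2181   # "open" once HBase (and its embedded ZooKeeper) is up
```

If this prints "closed", the fix is to start HBase first rather than to tune any ZooKeeper setting.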

And my /etc/hosts looks right:

127.0.0.1   localhost
255.255.255.255 broadcasthost
::1             localhost 
fe80::1%lo0 localhost
127.0.0.1 activate.adobe.com
127.0.0.1 practivate.adobe.com
127.0.0.1 ereg.adobe.com
127.0.0.1 activate.wip3.adobe.com
127.0.0.1 wip3.adobe.com
127.0.0.1 3dns-3.adobe.com
127.0.0.1 3dns-2.adobe.com
127.0.0.1 adobe-dns.adobe.com
127.0.0.1 adobe-dns-2.adobe.com
127.0.0.1 adobe-dns-3.adobe.com
127.0.0.1 ereg.wip3.adobe.com
127.0.0.1 activate-sea.adobe.com
127.0.0.1 wwis-dubc1-vip60.adobe.com
127.0.0.1 activate-sjc0.adobe.com
127.0.0.1 adobe.activate.com
127.0.0.1 209.34.83.73:443
127.0.0.1 209.34.83.73:43
127.0.0.1 209.34.83.73
127.0.0.1 209.34.83.67:443
127.0.0.1 209.34.83.67:43
127.0.0.1 209.34.83.67
127.0.0.1 ood.opsource.net
127.0.0.1 CRL.VERISIGN.NET
127.0.0.1 199.7.52.190:80
127.0.0.1 199.7.52.190
127.0.0.1 adobeereg.com
127.0.0.1 OCSP.SPO1.VERISIGN.COM
127.0.0.1 199.7.54.72:80
127.0.0.1 199.7.54.72
ztirom
Rickie Lau

5 Answers


I also met the same problem and struggled with it for a long time. Following the instructions here, you should run ./bin/start-hbase.sh before running the ./bin/hbase shell command. That solved my problem.

freedomn-m
Kehe CAI

As your hbase-site.xml shows, you previously tried running HBase on HDFS and are now trying to run it on the local file system.
Solution: run hadoop.x.x.x/bin/start-dfs.sh first, and then run hbase.x.x.x/bin/start-hbase.sh. It will then run as expected on the local file system.
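Whether start-dfs.sh is needed at all follows from the scheme of hbase.rootdir. A small sketch of that check (the temp file path and its contents are illustrative; point CONF at your real conf/hbase-site.xml instead):

```shell
#!/usr/bin/env bash
# Decide which start scripts are needed from the hbase.rootdir scheme.
# Writes a throwaway config for demonstration; use your real hbase-site.xml.
CONF=/tmp/hbase-site.demo.xml
cat > "$CONF" <<'EOF'
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///Users/apple/Documents/tools/hbase-rootdir/hbase</value>
  </property>
</configuration>
EOF

if grep -q '<value>hdfs://' "$CONF"; then
  echo "rootdir is on HDFS: run start-dfs.sh first, then start-hbase.sh"
else
  echo "rootdir is local: start-hbase.sh alone is enough"
fi
```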

Chandra kant
  • still get the same error... and actually I want to run HBase in standalone mode and use the Java API to connect to HBase without Hadoop; do you have any idea how I can do that? Thank you very much. – Rickie Lau Mar 30 '14 at 07:28
  • can you post the output of hbase-...-local.log from the logs directory? It will be inside your hbase.x.x.x directory. – Chandra kant Mar 30 '14 at 07:31
  • here is my log file: https://docs.google.com/file/d/0BxtBre5A8J61SWRsclE2dnQzdVk/edit – Rickie Lau Mar 30 '14 at 12:41
  • I looked at the log files. It looks like there is an incompatibility between the filesystem HBase wants and what it is getting from HDFS. I could offer you a solution, but truthfully, unless I work on the hbase-0.98 version I won't be able to give you the exact one. My suggestion: since you are new to this, go for a stable version; people have worked on those extensively and can give you exact solutions. Also, roll back from the Hadoop 2.x version to the stable 1.x version. Once you get comfortable, move to the latest versions. That said, try the "hbase hbck -fixVersionFile" command and see what happens. – Chandra kant Mar 30 '14 at 13:36
  • actually I want to get rid of Hadoop and maybe use HBase 0.94.x; anyway, thanks very much for your patient answer :) – Rickie Lau Mar 30 '14 at 14:19

I ran into this problem too.

If you are trying to run in standalone mode, use only the HBase libraries: remove Hadoop from your libraries and use the Hadoop jars that ship with HBase.

  • And how could I do that? Just remove the whole hadoop folder? – Rickie Lau Mar 29 '14 at 16:05
  • I'm using the libraries in this path: hbase-0.94.16/lib/* and then run HBase in the terminal (I'm using CentOS). –  Mar 30 '14 at 08:15
  • and you cannot use both HDFS and your OS filesystem. –  Mar 30 '14 at 10:10
  • I switched my HBase version to 0.94.17 and removed all the local Hadoop configuration; now it works fine with the shell. But I still have some problems here: http://stackoverflow.com/questions/22755912/get-stuck-when-using-java-api-to-connect-hbase – Rickie Lau Mar 31 '14 at 07:40

I faced this problem when I hadn't added my hostname to the /etc/hosts file.

For example, my hostname is node1, so I add this line to /etc/hosts:

127.0.0.1 node1
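A sketch of that fix, applied to a throwaway copy of the hosts file so it can run without root (node1 is the example hostname from this answer; on a real machine edit /etc/hosts itself, using your actual `hostname` output):

```shell
#!/usr/bin/env bash
# Append a loopback mapping for the hostname if it is not already present.
# /tmp/hosts.demo stands in for /etc/hosts so this is safe to run anywhere.
HOSTS_FILE=/tmp/hosts.demo
printf '127.0.0.1\tlocalhost\n' > "$HOSTS_FILE"

NODE=node1   # substitute the real hostname, e.g. NODE=$(hostname -s)
if ! grep -q "[[:space:]]${NODE}\$" "$HOSTS_FILE"; then
  printf '127.0.0.1\t%s\n' "$NODE" >> "$HOSTS_FILE"
fi
grep "$NODE" "$HOSTS_FILE"   # shows the mapping HBase/ZooKeeper will resolve
```

The guard makes the script idempotent: running it twice does not duplicate the entry.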
Shawn Mehan
user1531214

I also faced this problem and eventually reached a conclusion.

When I typed start-hbase.sh directly into the shell, it showed a "No Command" error (the script was not on my PATH).

Then I navigated to the HBase bin folder with cd /usr/local/hbase/bin and ran ./start-hbase.sh. It started working (the ZooKeeper and master services were found running).

Likewise, for the HBase shell, first enter the HBase bin folder and then type ./hbase shell.

Hope this works :)