
I'm trying to use hbase-testing-util (1.2.0) in my project, but I get the following error:

An exception or error caused a run to abort: All datanodes 127.0.0.1:54655 are bad. Aborting... 

I'm using IntelliJ on Windows 10, and I've correctly set up the HADOOP_HOME environment variable.

I read that the `mapred.max.split.size` property could help me, but I don't know which file I have to modify.
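
For reference, here is a minimal sketch of how a test typically starts the mini cluster with hbase-testing-util. When using HBaseTestingUtility, a property like `mapred.max.split.size` is usually set programmatically on the utility's Configuration rather than in a file; the class name and the split-size value below are made up for illustration.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseTestingUtility;

public class MiniClusterSketch {
    public static void main(String[] args) throws Exception {
        HBaseTestingUtility utility = new HBaseTestingUtility();

        // Properties such as mapred.max.split.size are not read from a file here:
        // they are set on the utility's Configuration before the mini cluster starts.
        Configuration conf = utility.getConfiguration();
        conf.set("mapred.max.split.size", "134217728"); // 128 MB; example value only

        utility.startMiniCluster();       // starts an in-process HDFS + HBase
        try {
            // ... run test code against the mini cluster here ...
        } finally {
            utility.shutdownMiniCluster();
        }
    }
}
```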

  • Do you have a datanode running on your host? – tk421 May 13 '18 at 15:22
  • No. On Windows I only have the IDE and the Hadoop binaries. I've also added hbase-testing-util to the pom.xml of my project. – Marco Catalano May 13 '18 at 15:51
  • It won't work unless you run Hadoop (all daemons) in addition to all the services HBase needs. – tk421 May 14 '18 at 00:14
  • I've installed Hadoop on Windows and it's up and running... but I get the same error. Do you know how to set the datanode port in the HBaseTestingUtility() definition? – Marco Catalano May 19 '18 at 10:48
  • Which version of Hadoop? Look at https://stackoverflow.com/questions/50221379/how-do-i-set-up-hbase-with-hdfs-3-1-0/ for a reference. – tk421 May 19 '18 at 15:38
  • HBase 1.2.0 and Hadoop 2.7.3: this combination is supported. Do you have a code example? – Marco Catalano May 20 '18 at 10:23
  • `dfs.datanode.address`, `dfs.datanode.http.address`, and `dfs.datanode.ipc.address` should be defined in your hdfs-site.xml. See https://hadoop.apache.org/docs/r2.7.3/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml as this shows the defaults. – tk421 May 21 '18 at 18:10
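
Following tk421's last comment, one way to try those properties with HBaseTestingUtility is to set them on its Configuration before starting the mini cluster. Whether the embedded datanode honors fixed ports set this way is an assumption, and the addresses below are only the hdfs-default.xml defaults bound to localhost, used as examples.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseTestingUtility;

public class DataNodePortSketch {
    public static void main(String[] args) throws Exception {
        HBaseTestingUtility utility = new HBaseTestingUtility();
        Configuration conf = utility.getConfiguration();

        // The same properties tk421 points to in hdfs-site.xml, set programmatically.
        // These addresses are illustrative defaults, not values known to fix the error.
        conf.set("dfs.datanode.address", "127.0.0.1:50010");
        conf.set("dfs.datanode.http.address", "127.0.0.1:50075");
        conf.set("dfs.datanode.ipc.address", "127.0.0.1:50020");

        utility.startMiniCluster();
        // ... tests ...
        utility.shutdownMiniCluster();
    }
}
```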
