
Some time ago I installed and configured Hadoop on my computer, and it had been working fine. However, when I now try to connect to HDFS, I get an error:

Call From USER-MacBook-Air.local/192.168.0.174 to localhost:8020 failed on connection exception: java.net.ConnectException: Connection refused;

My configuration of core-site.xml is:

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/Cellar/hadoop/hdfs/tmp</value>
    <description>A base for other temporary directories</description>             
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>

Configuration of hdfs-site.xml:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>localhost:8020</value>
  </property>

  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

When I run the NameNode format command to check it:

bin/hadoop namenode -format

STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = USER-MacBook-Air.local/192.168.0.174
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 3.3.0

I don't know why, but I suspect the problem is with the NameNode host, and I'm not sure how to fix it or what the value should be. I'm working on a Mac with Java 15. At the moment I'm pretty confused, so any suggestions on how to solve this would be appreciated. Thank you.
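For what it's worth, a quick way to tell whether the "Connection refused" is a host problem or simply a NameNode that never started is to look at the running Java processes and at what is bound to port 8020 (this sketch assumes `jps` from the JDK and `lsof` are available, which they normally are on macOS):

```shell
# List running Java processes; a healthy single-node HDFS setup shows
# NameNode, DataNode and SecondaryNameNode here.
jps

# Check whether anything is listening on the port the client dials.
# No output means nothing is bound to 8020, so "Connection refused"
# is exactly what you would expect from the client side.
lsof -nP -iTCP:8020 -sTCP:LISTEN || echo "nothing listening on 8020"
```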

OneCricketeer
Ksenija
  • The connection failed, so HDFS isn't running... Note: Hadoop doesn't support Java 15 and any crashes would be shown in the logs, not the output of a specific command – OneCricketeer Jun 26 '21 at 22:25

1 Answer


Try changing your configs to the following for a single-node setup. Note that `fs.defaultFS` is the current name for the deprecated `fs.default.name`, and it belongs in core-site.xml, not hdfs-site.xml:

hdfs-site.xml

<configuration>
  <property>
    <name>dfs.datanode.max.transfer.threads</name>
    <value>4096</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

core-site.xml:

<configuration>
 <property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
 </property>
 <property>
  <name>hadoop.tmp.dir</name>
  <value>/yourpath/hadoop-3.x.x/dirdata</value>
 </property>
</configuration>
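After editing the configs, the NameNode has to be reformatted against the new `hadoop.tmp.dir` and the daemons restarted; roughly (assuming `HADOOP_HOME` points at your Hadoop install, and noting that reformatting wipes any existing HDFS data):

```shell
# Stop any half-started daemons first.
$HADOOP_HOME/sbin/stop-dfs.sh

# Reformat the NameNode so it writes its metadata under the new tmp dir.
# WARNING: this destroys any data already stored in HDFS.
$HADOOP_HOME/bin/hdfs namenode -format

# Bring HDFS back up and verify the daemons and the filesystem respond.
$HADOOP_HOME/sbin/start-dfs.sh
jps
$HADOOP_HOME/bin/hdfs dfs -ls /
```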

Also make sure you can run `ssh localhost` without any problem or password prompt (see this).
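If `ssh localhost` still asks for a password, the usual fix is a passphrase-less key authorized for your own account, along the lines of (paths are the OpenSSH defaults; on macOS you may also need to enable Remote Login in System Preferences first):

```shell
# Generate a key only if one does not exist yet, with an empty passphrase.
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

# Authorize the key for logins to this machine.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# This should now succeed without prompting for a password.
ssh localhost exit
```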

Majid Hajibaba