
I'm trying to run Hadoop 2.7.1 on CentOS 7.

When I invoke the start-dfs.sh script, I get the following error:

    localhost: starting datanode, logging to /usr/bin/hadoop/logs/hadoop-hadoop-datanode-localhost.localdomain.out
    localhost: nice: /usr/bin/hadoop/share/hadoop/hdfs/bin/hdfs: Permission denied
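
In case it helps, this is roughly what I ran to inspect the permissions on that script (the chmod line is just my own guess at a fix, not something from a guide):

    # check ownership and mode of the hdfs launcher the error complains about
    ls -l /usr/bin/hadoop/share/hadoop/hdfs/bin/hdfs

    # add the execute bit for the owner in case it is missing
    chmod u+x /usr/bin/hadoop/share/hadoop/hdfs/bin/hdfs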

Environment variables are set as follows:

    HADOOP_HOME=/usr/bin/hadoop
    HADOOP_HDFS_HOME=/usr/bin/hadoop/share/hadoop/hdfs
    HADOOP_COMMON_LIB_NATIVE_DIR=/usr/bin/hadoop/lib/native
    HADOOP_CLASSPATH=/usr/bin/hadoop/share/hadoop/hdfs/*
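
For completeness, this is roughly how they are exported in the hadoop user's ~/.bashrc (the PATH line and the quoting of the classpath glob are my own additions, not from any guide):

    export HADOOP_HOME=/usr/bin/hadoop
    export HADOOP_HDFS_HOME=/usr/bin/hadoop/share/hadoop/hdfs
    export HADOOP_COMMON_LIB_NATIVE_DIR=/usr/bin/hadoop/lib/native
    # quoted so the shell does not expand the glob at export time
    export HADOOP_CLASSPATH="/usr/bin/hadoop/share/hadoop/hdfs/*"
    export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin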

core-site.xml:

    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
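
(In the file itself that property sits inside the standard <configuration> wrapper; the whole file is essentially this minimal sketch:)

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
        <property>
            <name>fs.defaultFS</name>
            <value>hdfs://localhost:9000</value>
        </property>
    </configuration>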

hdfs-site.xml has not been set up yet; a minimal sketch of what I was planning to try is below. I've been struggling with this for the last few days, so I would also be grateful for a link to a site with a detailed explanation of each option.
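
This is the minimal hdfs-site.xml I was planning to try (the replication value and the two local directories are my own guesses for a single-node setup, not something I've verified):

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
        <property>
            <name>dfs.namenode.name.dir</name>
            <value>file:///home/hadoop/hdfs/namenode</value>
        </property>
        <property>
            <name>dfs.datanode.data.dir</name>
            <value>file:///home/hadoop/hdfs/datanode</value>
        </property>
    </configuration>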

Thanks.

Update: here is the full output I get from start-dfs.sh:

    /usr/bin/hadoop/logs:hadoop]start-dfs.sh
    Starting namenodes on [localhost]
    59: starting hadoop-daemons.sh
    hadoop@localhost's password: 
    localhost: starting namenode, logging to /usr/bin/hadoop/logs/hadoop-hadoop-namenode-localhost.localdomain.out
    localhost: nice: /usr/bin/hadoop/share/hadoop/hdfs/bin/hdfs: No such file or directory
    73: starting hadoop-daemons.sh
    hadoop@localhost's password: 
    localhost: starting datanode, logging to /usr/bin/hadoop/logs/hadoop-hadoop-datanode-localhost.localdomain.out
    localhost: nice: /usr/bin/hadoop/share/hadoop/hdfs/bin/hdfs: No such file or directory
    Starting secondary namenodes [0.0.0.0]
    The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
    ECDSA key fingerprint is 14:f8:1e:51:32:e9:25:94:45:99:5d:a6:01:89:a2:7d.
    Are you sure you want to continue connecting (yes/no)? no
    0.0.0.0: Host key verification failed.
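
Since every step asks for the hadoop user's password and the secondary namenode step dies on host key verification, I assume passwordless SSH to localhost (and 0.0.0.0) is expected; the usual setup, which I have not yet applied, would be something like:

    # generate a key pair for the hadoop user, with an empty passphrase
    ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa

    # authorize the key for logins to this machine
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    chmod 600 ~/.ssh/authorized_keys

    # pre-accept the host keys so start-dfs.sh stops prompting
    ssh-keyscan localhost 0.0.0.0 >> ~/.ssh/known_hosts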

– bongo
  • check for permissions on /usr/bin/hadoop/share/hadoop/hdfs/bin/hdfs – duck Sep 02 '15 at 11:10
  • group and owner are correctly set up (for 'hadoop' user). Also, I checked all other directories. – bongo Sep 02 '15 at 11:27
  • By "other directories", do you mean the entire tree under /usr/bin/hadoop, @bongo? – chomp Sep 02 '15 at 12:03
  • Could you provide the error reported in the logfile /usr/bin/hadoop/logs/hadoop-hadoop-datanode-localhost.localdomain.log while running start-dfs.sh? – Shubhangi Sep 02 '15 at 12:44
  • http://codesfusion.blogspot.in/2013/10/setup-hadoop-2x-220-on-ubuntu.html – Kishore Sep 02 '15 at 14:04
  • nice: /usr/bin/hadoop/share/hadoop/hdfs/bin/hdfs: Permission denied. ulimit -a for user hadoop: core file size (blocks, -c) 0; data seg size (kbytes, -d) unlimited; scheduling priority (-e) 0; file size (blocks, -f) unlimited; pending signals (-i) 15036; max locked memory (kbytes, -l) 64; max memory size (kbytes, -m) unlimited; open files (-n) 1024 – bongo Sep 02 '15 at 17:44
  • I have changed my initial post to give you the full message that I got. – bongo Sep 02 '15 at 17:56

0 Answers