I'm trying to run Hadoop 2.7.1 on CentOS 7.
When I invoke the start-dfs.sh script, I get the following error:
localhost: starting datanode, logging to /usr/bin/hadoop/logs/hadoop-hadoop-datanode-localhost.localdomain.out
localhost: nice: /usr/bin/hadoop/share/hadoop/hdfs/bin/hdfs: Permission denied
My environment variables are set as follows:
HADOOP_HOME=/usr/bin/hadoop
HADOOP_HDFS_HOME=/usr/bin/hadoop/share/hadoop/hdfs
HADOOP_COMMON_LIB_NATIVE_DIR=/usr/bin/hadoop/lib/native
HADOOP_CLASSPATH=/usr/bin/hadoop/share/hadoop/hdfs/*
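Since the error is "Permission denied" on the hdfs launcher, this is the kind of check I can run to see whether the paths from my environment actually exist and are executable by the hadoop user (the two paths below are taken from my setup; adjust as needed):

```shell
# Report whether a path exists and is executable by the current user.
check_exec() {
  if [ -x "$1" ]; then
    echo "OK  $1"
  else
    echo "BAD $1"
  fi
}

# Paths from my installation under HADOOP_HOME=/usr/bin/hadoop:
check_exec /usr/bin/hadoop/bin/hdfs
check_exec /usr/bin/hadoop/sbin/start-dfs.sh
```

Any line reported as BAD would need a `chmod +x` (or an ownership fix with `chown`) before start-dfs.sh can launch the daemons.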
core-site.xml:
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
hdfs-site.xml has not been set up yet. I have been struggling with this for the last few days, so I would also be grateful for a link to a site with a detailed explanation of each option.
Thanks.
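For reference, this is the minimal single-node hdfs-site.xml I understand the standard guides recommend (the property names are standard Hadoop 2.x; the local directory paths are placeholders I would pick myself):

```xml
<configuration>
  <!-- Single replica, since this is a single-node setup. -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <!-- Placeholder local directories; choose paths the hadoop user can write to. -->
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///home/hadoop/hdfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///home/hadoop/hdfs/datanode</value>
  </property>
</configuration>
```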
Additional information:
/usr/bin/hadoop/logs:hadoop]start-dfs.sh
Starting namenodes on [localhost]
59: starting hadoop-daemons.sh
hadoop@localhost's password:
localhost: starting namenode, logging to /usr/bin/hadoop/logs/hadoop-hadoop-namenode-localhost.localdomain.out
localhost: nice: /usr/bin/hadoop/share/hadoop/hdfs/bin/hdfs: No such file or directory
73: starting hadoop-daemons.sh
hadoop@localhost's password:
localhost: starting datanode, logging to /usr/bin/hadoop/logs/hadoop-hadoop-datanode-localhost.localdomain.out
localhost: nice: /usr/bin/hadoop/share/hadoop/hdfs/bin/hdfs: No such file or directory
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is 14:f8:1e:51:32:e9:25:94:45:99:5d:a6:01:89:a2:7d.
Are you sure you want to continue connecting (yes/no)? no
0.0.0.0: Host key verification failed.
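Regarding the repeated password prompts and the host-key failure for 0.0.0.0: as I understand it, start-dfs.sh expects passwordless SSH to localhost for the hadoop user. The usual setup is sketched below against a scratch directory so it can be run safely; on the real machine the files would go in ~/.ssh instead of "$SSH_DIR":

```shell
# Scratch directory standing in for ~/.ssh, so this sketch is safe to run.
SSH_DIR=$(mktemp -d)

# Generate a key pair with an empty passphrase (-N "").
ssh-keygen -q -t rsa -N "" -f "$SSH_DIR/id_rsa"

# Authorize the new public key and set the permissions sshd requires.
cat "$SSH_DIR/id_rsa.pub" >> "$SSH_DIR/authorized_keys"
chmod 600 "$SSH_DIR/authorized_keys"
```

After that, a one-time `ssh localhost` (answering "yes" to the host-key prompt, and likewise for 0.0.0.0) should stop both the password prompts and the "Host key verification failed" message.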