My environment is CentOS 7 with Spark 1.6.1 and Hadoop 2.6.4, running in cluster mode with two slave nodes.
When I ran a hadoop command, I got: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
When I then checked with hadoop checknative -a, everything came back false. Part of this problem was solved by adding
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_HOME}/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native/"
to hadoop-env.sh and reinstalling openssl-devel.
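As a sanity check (assuming HADOOP_HOME is /usr/local/hadoop, as the checknative output below suggests), the native libraries can be confirmed to actually be in that directory:
# sketch: list the native libraries and confirm libhadoop.so is a 64-bit ELF shared object
ls -l /usr/local/hadoop/lib/native/
file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0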
However, I still get a warning when I run hadoop checknative -a:
[hadoop@host-10-174-101-17 ~]$ hadoop checknative -a
16/07/13 14:36:24 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
16/07/13 14:36:24 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
zlib: true /lib64/libz.so.1
snappy: true /lib64/libsnappy.so.1
lz4: true revision:99
bzip2: false
openssl: true /lib64/libcrypto.so
16/07/13 14:36:24 INFO util.ExitUtil: Exiting with status 1
I reinstalled bzip2 and checked bzip2 --version:
[hadoop@host-10-174-101-17 ~]$ bzip2 --version
bzip2, a block-sorting file compressor. Version 1.0.6, 6-Sept-2010.
Copyright (C) 1996-2010 by Julian Seward.
This program is free software; you can redistribute it and/or modify
it under the terms set out in the LICENSE file, which is included
in the bzip2-1.0.6 source distribution.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
LICENSE file for more details.
bzip2: I won't write compressed data to a terminal.
bzip2: For help, type: `bzip2 --help'.
What's more, I checked the directory /lib64/: both libbz2.so.1 and libbz2.so exist there, and from what I have read that is where the library should be if bzip2 is installed correctly. So bzip2 itself seems to be installed properly, but Hadoop cannot load it.
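As far as I understand, libhadoop.so does not pick up bzip2 just because the runtime library is present: native bzip2 support has to be compiled into libhadoop.so itself, which requires the bzip2 development headers at build time. A quick check for the headers on CentOS 7 (package name and header path are my assumption) would be:
# sketch: check for the bzip2 development headers needed to build native bzip2 support
rpm -q bzip2-devel
ls /usr/include/bzlib.h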
I also tried to recompile, as https://issues.apache.org/jira/browse/HADOOP-10409 mentions. If I follow it exactly, it doesn't work, because that path doesn't exist on my machine:
[hadoop@host-10-174-101-17 ~]$ strings /export/apps/hadoop/latest/lib/native/libhadoop.so | grep initIDs
strings: '/export/apps/hadoop/latest/lib/native/libhadoop.so': No such file
If I change the path to my Hadoop installation, /usr/local/hadoop/lib/native/libhadoop.so, this is what I get:
[hadoop@host-10-174-101-17 ~]$ strings /usr/local/hadoop/lib/native/libhadoop.so | grep initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Compressor_initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Decompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs
Java_org_apache_hadoop_crypto_OpensslCipher_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibDecompressor_initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Compressor_initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Decompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs
Java_org_apache_hadoop_crypto_OpensslCipher_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibDecompressor_initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Decompressor_initIDs
Java_org_apache_hadoop_io_compress_lz4_Lz4Compressor_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibDecompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs
Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs
Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs
Java_org_apache_hadoop_crypto_OpensslCipher_initIDs
Even so, when I run hadoop checknative -a again, bzip2 is still reported as false. I also notice that, unlike lz4, snappy, zlib, and openssl, no Bzip2Compressor/Bzip2Decompressor initIDs symbols appear in the strings output above.
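My current guess is that libhadoop.so has to be rebuilt with the bzip2 headers installed. Roughly, as far as I understand it (a sketch only, assuming the hadoop-2.6.4 source tarball, CentOS 7 package names, and the usual build prerequisites):
# sketch: rebuild the native libraries with the bzip2 headers available
sudo yum install -y gcc gcc-c++ make cmake zlib-devel openssl-devel bzip2-devel
# (the Hadoop build also needs Maven and protoc 2.5.0 on the PATH; omitted here)
cd hadoop-2.6.4-src
mvn package -Pdist,native -DskipTests -Dtar
# copy the rebuilt libraries over the existing ones on every node
cp -r hadoop-dist/target/hadoop-2.6.4/lib/native/* /usr/local/hadoop/lib/native/
If that is the right direction, hadoop checknative -a should then report bzip2: true.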
What can I do in this situation? Thank you very much.