57

I am getting this error on startup of Hadoop on OS X 10.7:

Unable to load realm info from SCDynamicStore put: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/travis/input/conf. Name node is in safe mode.

It doesn't appear to be causing any issues with the functionality of Hadoop.

Travis Nelson

7 Answers

76

Matthew Buckett's suggestion in HADOOP-7489 worked for me. Add the following to your hadoop-env.sh file:

export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"
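
If you're not sure where hadoop-env.sh lives, the paths below are assumptions based on the usual layouts (conf/ in Hadoop 1.x, etc/hadoop/ in 2.x), so adjust for your install. As a sketch, one way to append the line and restart:

# assumed Hadoop 1.x layout; in 2.x the file is usually $HADOOP_HOME/etc/hadoop/hadoop-env.sh
echo 'export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"' >> $HADOOP_HOME/conf/hadoop-env.sh
$HADOOP_HOME/bin/stop-all.sh && $HADOOP_HOME/bin/start-all.sh
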
Jeromy Carriere
  • I don't have a hadoop-env.sh because I'm just compiling against hadoop, but I'm running into this issue. Could I set these flags in my gradle script or something? – tutuca Apr 11 '14 at 21:07
39

As an update to this (and to address David Williams' point about Java 1.7), I found that setting only the .realm and .kdc properties was not enough to stop the offending message.

However, by examining the source file that is emitting the message, I was able to determine that setting the .krb5.conf property to /dev/null was enough to suppress it. Obviously, if you actually have a krb5 configuration, it's better to specify the actual path to it.

In total, my hadoop-env.sh snippet is as follows:

HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm= -Djava.security.krb5.kdc="
HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf=/dev/null"
mdaniel
  • Thanks! This was the only answer that worked on Java 1.7 on OS X! – Mark Roddy Mar 07 '14 at 16:26
  • Works brilliantly for me – davideanastasia Mar 22 '14 at 19:38
  • It also worked for me (but only after the second restart, which is quite puzzling) on OS X 10.10.1 and Java 1.7. Thanks. – mgaido Feb 12 '15 at 07:52
  • This issue can appear in HBase (OS X Yosemite). Extending HBASE_OPTS in hbase-env.sh as follows can resolve it: `export HBASE_OPTS="${HBASE_OPTS} -Djava.security.krb5.realm= -Djava.security.krb5.kdc=" export HBASE_OPTS="${HBASE_OPTS} -Djava.security.krb5.conf=/dev/null"` – Mahesh May 25 '15 at 02:25
16

I'm having the same issue on OS X 10.8.2, Java version 1.7.0_21. Unfortunately, the above solution does not fix the problem with this version :(

Edit: I found the solution to this, based on a hint I saw here. In the hadoop-env.sh file, change the JAVA_HOME setting to:

export JAVA_HOME=`/usr/libexec/java_home -v 1.6`

(Note the backticks here: they perform command substitution.)
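
To check what that command resolves to, you can run it directly in a terminal. The output below is just an illustration of a typical Apple Java 6 install; the exact path will differ on your machine:

$ /usr/libexec/java_home -v 1.6
/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home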

Adam Cataldo
  • I had the same problem, and this was the fix for me. – Clay Aug 24 '13 at 15:09
  • I was facing this problem with HBase; adding this line to conf/hbase-env.sh fixed the issue for me. – Arun Gupta Oct 07 '13 at 20:33
  • A simpler example of this same problem can also be found in this question: http://stackoverflow.com/questions/14716910/still-getting-unable-to-load-realm-info-from-scdynamicstore-after-bug-fix – TheCatParty Apr 14 '14 at 22:10
13

FYI, you can simplify this further by only specifying the following:

export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc="

This is mentioned in HADOOP-7489 as well.

btiernay
5

I had a similar problem on Mac OS, and after trying different combinations this is what worked for me universally (on both Hadoop 1.2 and 2.2):

In $HADOOP_HOME/conf/hadoop-env.sh, set the following lines:

# Set Hadoop-specific environment variables here.
export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc="

# The java implementation to use.
export JAVA_HOME=`/usr/libexec/java_home -v 1.6`

Hope this helps.

Vladimir Kroz
  • This worked for me; probably worth mentioning that I was working with HBase, so I had to change HADOOP_OPTS to HBASE_OPTS throughout. – Tom McIntyre Dec 20 '13 at 14:32
4

Also add

YARN_OPTS="$YARN_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

before executing start-yarn.sh (or start-all.sh) on CDH 4.1.3.
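
For example (a sketch, assuming the standard Hadoop 2 / CDH4 layout where the YARN scripts source yarn-env.sh), this can go in yarn-env.sh or simply be exported in your shell session:

# in yarn-env.sh (assumed location), or exported in the shell before starting YARN
export YARN_OPTS="$YARN_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"
$HADOOP_HOME/sbin/start-yarn.sh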

KaKa
1

I had this error when debugging MapReduce from Eclipse, but it was a red herring. The real problem was that I should have been remote debugging by adding debugging parameters to JAVA_OPTS:

-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=1044

Then create a new "Remote Java Application" profile in the Eclipse debug configuration that points to port 1044.
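
For example, one way to get those flags onto the JVM you want to attach to (a sketch, assuming you launch through the Hadoop scripts and that HADOOP_OPTS is picked up; the port matches the one used above):

# make the JVM listen for a debugger on port 1044, then attach Eclipse's "Remote Java Application" configuration
export HADOOP_OPTS="$HADOOP_OPTS -Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=1044"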

This article has some more in-depth information about the debugging side of things. It's talking about Solr, but works much the same with Hadoop. If you have trouble, stick a message below and I'll try to help.

JnBrymn