
I'm trying to get the simplest Hadoop "hello world" setup to work, but when I run the following command:

hadoop jar /usr/share/hadoop/hadoop-examples-1.0.4.jar grep input output 'dfs[a-z.]+'

I get the following warning:

12/11/30 16:36:40 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

And a full error trace that looks like:

12/11/30 16:57:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/11/30 16:57:18 WARN snappy.LoadSnappy: Snappy native library not loaded
12/11/30 16:57:18 INFO mapred.FileInputFormat: Total input paths to process : 6
12/11/30 16:57:18 INFO mapred.JobClient: Running job: job_local_0001
12/11/30 16:57:18 INFO util.ProcessTree: setsid exited with exit code 0
12/11/30 16:57:18 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@7d4ecfa4
12/11/30 16:57:18 INFO mapred.MapTask: numReduceTasks: 1
12/11/30 16:57:18 INFO mapred.MapTask: io.sort.mb = 100
12/11/30 16:57:18 WARN mapred.LocalJobRunner: job_local_0001
java.lang.OutOfMemoryError: Java heap space
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:949)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:428)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
12/11/30 16:57:19 INFO mapred.JobClient:  map 0% reduce 0%
12/11/30 16:57:19 INFO mapred.JobClient: Job complete: job_local_0001
12/11/30 16:57:19 INFO mapred.JobClient: Counters: 0
12/11/30 16:57:19 INFO mapred.JobClient: Job Failed: NA
java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)
    at org.apache.hadoop.examples.Grep.run(Grep.java:69)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.examples.Grep.main(Grep.java:93)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

I'm running Ubuntu 12.04 and Java version:

java version "1.7.0"
Java(TM) SE Runtime Environment (build 1.7.0-b147)
Java HotSpot(TM) 64-Bit Server VM (build 21.0-b17, mixed mode)

Any ideas?

Alex Averbuch
  • This is just a warning, not an error (and a very normal one, nothing to be concerned about). Are you certain that you setup up your environment correctly? What do the logs say? – anonymous1fsdfds Nov 30 '12 at 17:44

1 Answer


The warning tells you that the native compression codec is not (properly) installed for Hadoop. To install Snappy compression, have a look at: http://code.google.com/p/hadoop-snappy/

However, the more serious issue is the OutOfMemoryError. Check your input and increase the heap size if necessary. You might also have a look at this related question:
out of Memory Error in Hadoop
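
In Hadoop 1.x the per-task JVM heap is controlled by mapred.child.java.opts, which defaults to -Xmx200m; that is often too small for the sort buffer (io.sort.mb = 100 in your log). A minimal sketch of raising it in conf/mapred-site.xml (the 512m value below is just an illustrative choice, not a recommendation; tune it to your machine):

```xml
<!-- conf/mapred-site.xml: raise the heap for each map/reduce task JVM.
     -Xmx512m is an example value; pick one that fits your memory. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx512m</value>
</property>
```

Since your job runs through LocalJobRunner (job_local_0001), the map task executes inside the client JVM, so exporting a larger client heap before launching, e.g. `export HADOOP_CLIENT_OPTS="-Xmx512m"`, may also be enough without touching the config files.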

Lorand Bendig