
I am following this tutorial: http://ebiquity.umbc.edu/Tutorials/Hadoop/14%20-%20start%20up%20the%20cluster.html. It uses Hadoop version hadoop-0.19.1, while the version I installed is hadoop-0.20.204.0. I can format HDFS fine with the command bin/hadoop namenode -format.

The problem arises when I want to start the JobTracker: running bin/hadoop jobtracker says "no such file or directory". When I try to run the DataNode and TaskTracker, the same error message is returned. Have the files been moved to a different place, or have I not installed it correctly?

It looks as if the file is missing from the download package. I get the error message while I am in the hadoop-0.20.204.0 folder on the Cygwin system.
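
For reference, these are roughly the commands I am running from the hadoop-0.20.204.0 directory, plus the start scripts I expected to find in bin/ (assuming I understand the 0.20.x layout correctly):

bin/hadoop jobtracker      # fails with "no such file or directory"
bin/hadoop datanode        # same error
bin/hadoop tasktracker     # same error

# the start scripts shipped in bin/, which should launch the same daemons
bin/start-dfs.sh
bin/start-mapred.sh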

Here is the HDFS being formatted:

$ bin/hadoop namenode -format
bin/hadoop: line 301: C:\Program: command not found
12/06/27 22:17:51 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = ALEXDEV-PC/192.168.1.2
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 0.20.204.0
STARTUP_MSG:   build = git://hrt8n35.cc1.ygridcore.net/ on branch branch-0.20-security-204 -r 65e258bf0813ac2b15bb4c954660eaf9e8fba141; compiled by 'hortonow' on Thu Aug 25 23:35:31 UTC 2011
************************************************************/
Re-format filesystem in \tmp\hadoop-ALEXDEV\dfs\name ? (Y or N) Y
12/06/27 22:17:57 INFO util.GSet: VM type       = 32-bit
12/06/27 22:17:57 INFO util.GSet: 2% max memory = 19.33375 MB
12/06/27 22:17:57 INFO util.GSet: capacity      = 2^22 = 4194304 entries
12/06/27 22:17:57 INFO util.GSet: recommended=4194304, actual=4194304
12/06/27 22:17:57 INFO namenode.FSNamesystem: fsOwner=ALEXDEV
12/06/27 22:17:57 INFO namenode.FSNamesystem: supergroup=supergroup
12/06/27 22:17:57 INFO namenode.FSNamesystem: isPermissionEnabled=true
12/06/27 22:17:57 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
12/06/27 22:17:57 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
12/06/27 22:17:57 INFO namenode.NameNode: Caching file names occuring more than 10 times
12/06/27 22:17:58 INFO common.Storage: Image file of size 113 saved in 0 seconds.
12/06/27 22:17:58 INFO common.Storage: Storage directory \tmp\hadoop-ALEXDEV\dfs\name has been successfully formatted.
12/06/27 22:17:58 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ALEXDEV-PC/192.168.1.2

The JAVA_HOME environment variable is set to C:\Program Files (x86)\Java\jdk1.6.0_32.

I ran the command again:

ALEXDEV@ALEXDEV-PC ~/hadoop-0.20.204.0
$ bin/hadoop namenode
Error: JAVA_HOME is not set.
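
One thing I can check from the Cygwin shell, in case the Windows-side variable simply is not inherited (the export line below is only my guess at the Cygwin-style syntax for my JDK path):

echo "$JAVA_HOME"          # empty output would mean the shell never saw it
export JAVA_HOME="/cygdrive/c/Program Files (x86)/Java/jdk1.6.0_32"
bin/hadoop namenode -format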

I am not sure how to escape the brackets in the (x86) part of the path.
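
For what it is worth, one workaround I have seen suggested is to use the Windows 8.3 short name, which avoids the spaces and brackets entirely. I have not confirmed that PROGRA~2 is the right short name on my machine; running dir /x at a Windows command prompt in C:\ shows the short names.

export JAVA_HOME="C:/PROGRA~2/Java/jdk1.6.0_32"    # assumed short name for "Program Files (x86)"
# the same line could go in conf/hadoop-env.sh so that every hadoop command picks it up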

alex
  • Can you post the error msg / stack trace? Is the file missing from the local file system, or HDFS? Does the path to the file exist, and does the hadoop user under which these services start have enough permissions to create files/folders at the specified path? – Chris White Jun 27 '12 at 10:27

1 Answer


Sounds like unescaped spaces in the path name that it infers. Try moving the hadoop directory to the root of your drive (so something like c:\hadoop\ or similar).
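
For example, a rough sketch of that move from a Cygwin shell (the paths are illustrative; adjust them to wherever you unpacked the tarball):

mv ~/hadoop-0.20.204.0 /cygdrive/c/hadoop
cd /cygdrive/c/hadoop
bin/hadoop namenode -format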

Drizzt321
  • I set up the environment variable as explained above. I think you are right, though, about the path name. Do I need to set JAVA_HOME inside Cygwin? I have only set it in the Windows 7 environment variables. – alex Jun 27 '12 at 22:20
  • Dunno; check what's on line 301 of the bin/hadoop script (a quick way to look at it is shown below). It may mean you need to trace variables elsewhere in the script. Just try moving the hadoop dir out of a path that has spaces in it, such as c:/hadoop/. – Drizzt321 Jun 27 '12 at 22:50
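
To see what line 301 of the script is actually doing, something like the following should work from the hadoop directory (plain sed; the range is just a window around line 301):

sed -n '295,305p' bin/hadoop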