
I am running a MapReduce program, but it is getting stuck. The log stops at the last line below and never progresses:

19/09/16 09:35:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/09/16 09:35:04 INFO client.RMProxy: Connecting to ResourceManager at localhost/127.0.0.1:8032
19/09/16 09:35:05 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
19/09/16 09:35:05 INFO input.FileInputFormat: Total input files to process : 1
19/09/16 09:35:06 INFO mapreduce.JobSubmitter: number of splits:1
19/09/16 09:35:06 INFO Configuration.deprecation: yarn.resourcemanager.system-metrics-publisher.enabled is deprecated. Instead, use yarn.system-metrics-publisher.enabled
19/09/16 09:35:07 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1568605566346_0002
19/09/16 09:35:07 INFO impl.YarnClientImpl: Submitted application application_1568605566346_0002
19/09/16 09:35:07 INFO mapreduce.Job: The url to track the job: http://ec2-18-222-170-204.us-east-2.compute.amazonaws.com:8088/proxy/application_1568605566346_0002/
19/09/16 09:35:07 INFO mapreduce.Job: Running job: job_1568605566346_0002
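The job never moves past this point. For reference, the stuck application can also be inspected from the command line (a sketch assuming the stock `yarn` CLI is on the PATH; the application ID is the one printed in the log above):

```shell
# Query the state and diagnostics of the submitted application.
# A job that never leaves the ACCEPTED state is typically waiting
# for the ResourceManager to allocate containers.
yarn application -status application_1568605566346_0002

# Fetch the ApplicationMaster logs, once (if) the AM has started.
yarn logs -applicationId application_1568605566346_0002
```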

Here is my disk availability:

Filesystem      Size  Used Avail Use% Mounted on
udev            3.9G     0  3.9G   0% /dev
tmpfs           786M  9.5M  776M   2% /run
/dev/sda3       184G   12G  163G   7% /
tmpfs           3.9G  138M  3.7G   4% /dev/shm
tmpfs           5.0M  4.0K  5.0M   1% /run/lock
tmpfs           3.9G     0  3.9G   0% /sys/fs/cgroup
/dev/sda5       125G   21G   98G  18% /home
cgmfs           100K     0  100K   0% /run/cgmanager/fs
tmpfs           786M   60K  786M   1% /run/user/1000
tmpfs           786M     0  786M   0% /run/user/1001

Could someone please tell me what is going wrong? It is just a single-node Hadoop cluster.

Thanks

Jenny Rose
  • You need to look at the ResourceManager UI to determine 1) Are NodeManagers active 2) is there enough resources to run the job. Otherwise the job is waiting for available resources – OneCricketeer Sep 16 '19 at 05:42
  • Possible duplicate of [Wordcount program is stuck in hadoop-2.3.0](https://stackoverflow.com/questions/23397763/wordcount-program-is-stuck-in-hadoop-2-3-0) – Skanda Shastry Sep 16 '19 at 14:57
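The checks suggested in the first comment can also be run from the shell instead of the ResourceManager UI (assuming the standard `yarn` CLI on the cluster node):

```shell
# List NodeManagers and their state. If none are RUNNING, the job
# will sit in ACCEPTED state indefinitely waiting for containers.
yarn node -list -all

# Show cluster-wide memory/vcore usage to see whether enough
# resources are free to start the ApplicationMaster.
yarn top
```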

0 Answers