
We have executed the Hadoop command below; the program uses the MapReduce API. When we run it, the job throws an exception even though the file is present at the input location. Please guide us.

[cloudera@quickstart mapreduce-bitcoinblock-1.0]$ hadoop jar example-hcl-mr-bitcoinblock-0.1.0.jar org.zuinnote.hadoop.bitcoin.example.driver.BitcoinBlockCounterDriver /user/cloudera/bitcoin/input /user/cloudera/bitcoin/output1
args[0]=/user/cloudera/bitcoin/input
 args[1]=/user/cloudera/bitcoin/output1
##### before the setoutputpath
##### after the setoutputpath
16/10/07 00:04:09 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/10/07 00:04:10 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/10/07 00:04:11 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
16/10/07 00:04:11 INFO mapred.FileInputFormat: Total input paths to process : 1
16/10/07 00:04:12 WARN hdfs.DFSClient: Caught exception 
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1281)
    at java.lang.Thread.join(Thread.java:1355)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:862)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:600)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:789)
16/10/07 00:04:12 INFO mapreduce.JobSubmitter: number of splits:1
16/10/07 00:04:12 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1475232088753_0022
16/10/07 00:04:13 INFO impl.YarnClientImpl: Submitted application application_1475232088753_0022
16/10/07 00:04:13 INFO mapreduce.Job: The url to track the job: http://quickstart.cloudera:8088/proxy/application_1475232088753_0022/
16/10/07 00:04:13 INFO mapreduce.Job: Running job: job_1475232088753_0022
16/10/07 00:04:24 INFO mapreduce.Job: Job job_1475232088753_0022 running in uber mode : false
16/10/07 00:04:24 INFO mapreduce.Job:  map 0% reduce 0%
16/10/07 00:04:38 INFO mapreduce.Job: Task Id : attempt_1475232088753_0022_m_000000_0, Status : FAILED
Error: java.lang.NullPointerException
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.close(MapTask.java:210)
    at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:1976)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:468)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

16/10/07 00:04:59 INFO mapreduce.Job: Task Id : attempt_1475232088753_0022_m_000000_1, Status : FAILED
Error: java.lang.NullPointerException
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.close(MapTask.java:210)
    at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:1976)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:468)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

16/10/07 00:05:20 INFO mapreduce.Job: Task Id : attempt_1475232088753_0022_m_000000_2, Status : FAILED
Error: java.lang.NullPointerException
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.close(MapTask.java:210)
    at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:1976)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:468)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

16/10/07 00:05:39 INFO mapreduce.Job:  map 100% reduce 0%
16/10/07 00:05:40 INFO mapreduce.Job:  map 100% reduce 100%
16/10/07 00:05:41 INFO mapreduce.Job: Job job_1475232088753_0022 failed with state FAILED due to: Task failed task_1475232088753_0022_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

16/10/07 00:05:41 INFO mapreduce.Job: Counters: 10
    Job Counters 
        Failed map tasks=4
        Killed reduce tasks=1
        Launched map tasks=4
        Other local map tasks=3
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=65085
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=65085
        Total vcore-seconds taken by all map tasks=65085
        Total megabyte-seconds taken by all map tasks=66647040
Exception in thread "main" java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:876)
    at org.zuinnote.hadoop.bitcoin.example.driver.BitcoinBlockCounterDriver.main(BitcoinBlockCounterDriver.java:66)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

The driver program looks like this:

/**
* Copyright 2016 ZuInnoTe (Jörn Franke) <zuinnote@gmail.com>
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
*    http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
**/

/**
 * Simple Driver for a MapReduce job counting the number of transactions in the given blocks from the specified files containing Bitcoin blockchain data
 */
package org.zuinnote.hadoop.bitcoin.example.driver;

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.util.*;
import org.zuinnote.hadoop.bitcoin.example.tasks.BitcoinBlockMap;
import org.zuinnote.hadoop.bitcoin.example.tasks.BitcoinBlockReducer;

import org.zuinnote.hadoop.bitcoin.format.*;

/**
* Author: Jörn Franke <zuinnote@gmail.com>
*
*/

public class BitcoinBlockCounterDriver  {



 public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf(new Configuration(), BitcoinBlockCounterDriver.class);

    conf.setJobName("example-hadoop-bitcoin-transactioncounter-job");

    conf.setMapOutputKeyClass(Text.class);
    conf.setMapOutputValueClass(IntWritable.class);

    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(LongWritable.class);

    conf.setMapperClass(BitcoinBlockMap.class);
    conf.setReducerClass(BitcoinBlockReducer.class);

    conf.setInputFormat(BitcoinBlockFileInputFormat.class);
    conf.setOutputFormat(TextOutputFormat.class);

    /** Set as an example some of the options to configure the Bitcoin fileformat **/
     /** Find here all configuration options: https://github.com/ZuInnoTe/hadoopcryptoledger/wiki/Hadoop-File-Format **/
    conf.set("hadoopcryptoledger.bitcoinblockinputformat.filter.magic","F9BEB4D9");

    System.out.println("args[0]=" + args[0] + "\n args[1]=" + args[1]);
    FileInputFormat.addInputPath(conf, new Path(args[0]));
    System.out.println("##### before the setoutputpath");
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));
    System.out.println("##### after the setoutputpath");    


    JobClient.runJob(conf);
 }

}
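
As an aside, the JobResourceUploader warning in the log above only concerns command-line option parsing, so it would not by itself explain the NullPointerException. Below is a minimal sketch of how the same job setup could be wrapped in a Tool/ToolRunner driver to silence that warning; the class name BitcoinBlockCounterToolDriver is hypothetical, and everything else simply reuses the configuration from the posted driver:

package org.zuinnote.hadoop.bitcoin.example.driver;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.TextOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
import org.zuinnote.hadoop.bitcoin.example.tasks.BitcoinBlockMap;
import org.zuinnote.hadoop.bitcoin.example.tasks.BitcoinBlockReducer;
import org.zuinnote.hadoop.bitcoin.format.*;

// hypothetical Tool-based variant of the posted driver
public class BitcoinBlockCounterToolDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // getConf() returns the Configuration already populated by ToolRunner
        // (generic options such as -D and -conf are parsed before run() is called)
        JobConf conf = new JobConf(getConf(), BitcoinBlockCounterToolDriver.class);
        conf.setJobName("example-hadoop-bitcoin-transactioncounter-job");

        conf.setMapOutputKeyClass(Text.class);
        conf.setMapOutputValueClass(IntWritable.class);
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(LongWritable.class);

        conf.setMapperClass(BitcoinBlockMap.class);
        conf.setReducerClass(BitcoinBlockReducer.class);

        conf.setInputFormat(BitcoinBlockFileInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);
        conf.set("hadoopcryptoledger.bitcoinblockinputformat.filter.magic", "F9BEB4D9");

        FileInputFormat.addInputPath(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips the generic Hadoop options and then delegates to run()
        int exitCode = ToolRunner.run(new Configuration(), new BitcoinBlockCounterToolDriver(), args);
        System.exit(exitCode);
    }
}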

The mapper class is shown below:

/**
* Copyright 2016 ZuInnoTe (Jörn Franke) <zuinnote@gmail.com>
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
*    http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
**/


/**
 * Simple Mapper for counting the number of Bitcoin transactions in a file on HDFS
 */
package org.zuinnote.hadoop.bitcoin.example.tasks;

/**
* Author: Jörn Franke <zuinnote@gmail.com>
*
*/
import java.io.IOException;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.io.*;
import org.zuinnote.hadoop.bitcoin.format.*;


import java.util.*;

public class BitcoinBlockMap extends MapReduceBase implements Mapper<BytesWritable, BitcoinBlock, Text, IntWritable> {

    private final static Text defaultKey = new Text("Transaction Count:");

    public void map(BytesWritable key, BitcoinBlock value, OutputCollector<Text, IntWritable> output, Reporter reporter) throws IOException {
        // get the number of transactions and emit it under the default key
        output.collect(defaultKey, new IntWritable(value.getTransactions().size()));
    }

}
Dayanand
Hi, Can you please provide the output from logfiles on the nodes where the mapper was executed? Can you raise an issue on github: https://github.com/ZuInnoTe/hadoopcryptoledger/issues Additionally, can you please check with the newest version of the library and use the new MapReduce API (see examples in hadoopcryptoledger)? Finally, can you please provide insights on the used Hadoop version? – Jörn Franke Jan 11 '17 at 20:33
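
For reference, the "new MapReduce API" mentioned in the comment is the org.apache.hadoop.mapreduce package. A minimal sketch of what the same mapper could look like against that API follows; the class name is made up, and the import of BitcoinBlock is an assumption, since the package layout of hadoopcryptoledger differs between library versions (check the library's own examples for the exact classes):

package org.zuinnote.hadoop.bitcoin.example.tasks;

import java.io.IOException;

import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
// BitcoinBlock lives in org.zuinnote.hadoop.bitcoin.format in the 0.1.0 jar used above;
// newer library versions may place it in a different package
import org.zuinnote.hadoop.bitcoin.format.*;

// hypothetical class name; same logic as the posted mapper, written against the new mapreduce API
public class BitcoinBlockCountMapper extends Mapper<BytesWritable, BitcoinBlock, Text, IntWritable> {

    private static final Text defaultKey = new Text("Transaction Count:");

    @Override
    protected void map(BytesWritable key, BitcoinBlock value, Context context)
            throws IOException, InterruptedException {
        // emit one count per block: the number of transactions it contains
        context.write(defaultKey, new IntWritable(value.getTransactions().size()));
    }
}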
