
Why am I getting an empty txt file in Hadoop while reading from HDFS? I am using an iterative method in Hadoop, so I have to place the output txt file into HDFS and then retrieve it from HDFS for the next iteration. At the retrieval step, my Map-only job gets the txt file with the correct name, but it is totally empty.

    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] str = value.toString().split("\\s+");
        int noToken = str.length - 1;
        String token = "";
        String curNode = str[0];
        float p = 0;
        String[] keyRank = null;

        try {
            // Read the ranks file from the previous iteration out of the cache
            URI[] localpath = context.getCacheFiles();
            FileReader fr = new FileReader(localpath[0].toString());
            BufferedReader br = new BufferedReader(fr);

            String line = "inf";
            while (line != null) {
                line = br.readLine();
                if (line == null)
                    break;

                //System.out.println(line+" line");
                keyRank = line.split("\\s+");

                try {
                    //System.out.println(keyRank[1].toString()+" key rank ");
                    // Accumulate the total rank and the count of nodes read so far
                    tsum = tsum + Float.parseFloat(keyRank[1]);
                    tNode++;
                } catch (NumberFormatException e) {
                    System.out.println(" rank MapOnly float exception");
                }

1 Answer


Instead of using this:

    FileReader fr = new FileReader(localpath[0].toString());
    BufferedReader br = new BufferedReader(fr);

use this code:

    FileSystem fs = FileSystem.get(context.getConfiguration());
    Path path = new Path(localpath[0].toString());
    InputStreamReader fr = new InputStreamReader(fs.open(path));
    BufferedReader br = new BufferedReader(fr);

You also need to import the Hadoop FileSystem and Path classes and Java's InputStreamReader, as shown below:

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import java.io.InputStreamReader;
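
The reason this helps is that getCacheFiles() returns the URIs that were added with addCacheFile() (i.e. HDFS paths), so they should be opened through the FileSystem API rather than with a plain FileReader, which looks on the task's local disk. Putting it together, a minimal self-contained mapper sketch along those lines is below; it reads the cached ranks file once in setup() rather than on every map() call. The class name, output types, and field names are assumptions based on the question, not the actual code:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.URI;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class RankMapper extends Mapper<LongWritable, Text, Text, Text> {
        private float tsum = 0;  // running sum of ranks read from the cache file
        private int tNode = 0;   // number of nodes seen in the cache file

        @Override
        protected void setup(Context context) throws IOException, InterruptedException {
            URI[] cacheFiles = context.getCacheFiles();
            if (cacheFiles == null || cacheFiles.length == 0) {
                return; // nothing was cached for this iteration
            }

            // Open the cached file through the job's FileSystem so the HDFS path resolves
            FileSystem fs = FileSystem.get(context.getConfiguration());
            Path path = new Path(cacheFiles[0].toString());
            try (BufferedReader br = new BufferedReader(new InputStreamReader(fs.open(path)))) {
                String line;
                while ((line = br.readLine()) != null) {
                    String[] keyRank = line.split("\\s+");
                    try {
                        tsum += Float.parseFloat(keyRank[1]);
                        tNode++;
                    } catch (NumberFormatException | ArrayIndexOutOfBoundsException e) {
                        // skip malformed lines
                    }
                }
            }
        }

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // per-record logic from the question goes here
        }
    }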