
I want to use my Java classes on Hadoop HDFS, so I have to rewrite my file-reading functions. The problem is that when I read through an InputStreamReader over the HDFS stream, my app reads wrong values.

Here is my code (this version works; I want to use the currently commented-out HDFS part instead):

public static GeoTimeDataCenter[] readCentersArrayFromFile(int iteration) {
    Properties pro = new Properties();
    try {
        pro.load(GeoTimeDataHelper.class.getResourceAsStream("/config.properties"));
    } catch (Exception e) {
        e.printStackTrace();
    }
    int k = Integer.parseInt(pro.getProperty("k"));
    GeoTimeDataCenter[] centers = new GeoTimeDataCenter[k];
    BufferedReader br;
    try {
        // HDFS variant -- this is the part I want to use instead:
        //Path pt = new Path(pro.getProperty("seed.file") + (iteration - 1));
        //FileSystem fs = FileSystem.get(new Configuration());
        //br = new BufferedReader(new InputStreamReader(fs.open(pt)));
        br = new BufferedReader(new FileReader(pro.getProperty("seed.file") + (iteration - 1)));
        for (int i = 0; i < centers.length; i++) {
            String[] temp;
            try {
                temp = br.readLine().split("\t");
                centers[i] = new GeoTimeDataCenter(Integer.parseInt(temp[0]),
                        new LatLong(Double.parseDouble(temp[1]), Double.parseDouble(temp[2])),
                        Long.parseLong(temp[3]));
            } catch (Exception e) {
                // fall back to a random seed if the line is missing or malformed
                temp = Seeding.randomSingleSeed().split("\t");
                centers[i] = new GeoTimeDataCenter(i,
                        new LatLong(Double.parseDouble(temp[0]), Double.parseDouble(temp[1])),
                        DateToLong(temp[2]));
            }
        }
        br.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return centers;
}
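Independent of HDFS, the parse-and-fallback loop itself can be exercised without any Hadoop types. A minimal, self-contained sketch (the `SeedParseDemo` class, its `parseLine` helper, and the sample data are made up for illustration, not from my project):

```java
import java.io.BufferedReader;
import java.io.StringReader;

public class SeedParseDemo {
    // Splits one tab-separated seed line into its fields:
    // id, latitude, longitude, timestamp.
    static String[] parseLine(String line) {
        return line.split("\t");
    }

    public static void main(String[] args) throws Exception {
        String data = "0\t52.5\t13.4\t1000\n1\t48.1\t11.6\t2000\n";
        BufferedReader br = new BufferedReader(new StringReader(data));
        String line;
        while ((line = br.readLine()) != null) {
            String[] t = parseLine(line);
            System.out.println(Integer.parseInt(t[0]) + " -> ("
                    + Double.parseDouble(t[1]) + ", " + Double.parseDouble(t[2])
                    + ") @ " + Long.parseLong(t[3]));
        }
        br.close();
    }
}
```

If parsing works locally like this but fails over HDFS, the problem is in the stream being read, not in the split/parse logic.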

Does anyone know this problem?

Best regards,

Pa Rö

1 Answer


I have found the problem: I was getting a ChecksumException. After deleting all the .crc files next to my input file, I no longer get the checksum exception and the BufferedReader works fine (with the commented-out code part above, uncommented).
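For the record, those .crc files are written by Hadoop's checksummed local file system, and a stale or mismatched .crc triggers the ChecksumException on read. Instead of deleting them by hand, checksum verification can be switched off on the FileSystem object. An untested sketch (assumes hadoop-common on the classpath; the seed-file path comes from the question's config):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SeedReader {
    public static BufferedReader openSeed(String seedPath) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // Skip CRC verification so stale .crc files no longer
        // cause a ChecksumException when the stream is read.
        fs.setVerifyChecksum(false);
        return new BufferedReader(new InputStreamReader(fs.open(new Path(seedPath))));
    }
}
```

Deleting the .crc files also works, but they will be recreated the next time the file is written through the checksummed file system.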

Pa Rö