I have a Map function and a Reduce function outputting key-value pairs of class Text and IntWritable. This is just the gist of the Map part in the main function:
TableMapReduceUtil.initTableMapperJob(
        tablename,            // input HBase table name
        scan,                 // Scan instance to control CF and attribute selection
        AnalyzeMapper.class,  // mapper
        Text.class,           // mapper output key
        IntWritable.class,    // mapper output value
        job);
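For reference, AnalyzeMapper is a TableMapper<Text, IntWritable>. The actual analysis logic isn't relevant to the question, so this is only a stripped-down sketch of its shape:

// uses org.apache.hadoop.hbase.mapreduce.TableMapper, org.apache.hadoop.hbase.io.ImmutableBytesWritable,
// org.apache.hadoop.hbase.client.Result and org.apache.hadoop.hbase.util.Bytes
public static class AnalyzeMapper extends TableMapper<Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);

    @Override
    protected void map(ImmutableBytesWritable rowKey, Result columns, Context context)
            throws IOException, InterruptedException {
        // placeholder: the real code inspects the columns and emits counts per key
        context.write(new Text(Bytes.toString(rowKey.get())), ONE);
    }
}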
And here is the Reducer part in the main function, which writes the output to HDFS:
job.setReducerClass(AnalyzeReducerFile.class);
job.setNumReduceTasks(1);
FileOutputFormat.setOutputPath(job, new Path("hdfs://localhost:54310/output_file"));
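AnalyzeReducerFile itself is an ordinary Reducer<Text, IntWritable, Text, IntWritable>. Simplified (the real aggregation is more involved), it looks something like this:

public static class AnalyzeReducerFile extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();   // placeholder: just sums the counts for each key
        }
        context.write(key, new IntWritable(sum));
    }
}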
How do I make the reducer write to a SequenceFile instead?
I've tried the following code, but it doesn't work:
job.setReducerClass(AnalyzeReducerFile.class);
job.setNumReduceTasks(1);
job.setOutputFormatClass(SequenceFileOutputFormat.class);
SequenceFileOutputFormat.setOutputPath(job, new Path("hdfs://localhost:54310/sequenceOutput"));
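In case it matters, the rest of the driver is roughly this (the class and job names here are placeholders, not my exact ones):

Configuration conf = HBaseConfiguration.create();
Job job = new Job(conf, "analyze");          // running against Hadoop 1.x
job.setJarByClass(AnalyzeDriver.class);      // placeholder driver class name
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
// ... mapper/reducer setup shown above ...
System.exit(job.waitForCompletion(true) ? 0 : 1);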
Edit: here is the output I get when I run the job:
WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /sequenceOutput/_temporary/_attempt_local_0001_r_000000_0/part-r-00000 File does not exist. Holder DFSClient_NONMAPREDUCE_-79044441_1 does not have any open files.
13/07/29 17:04:20 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
13/07/29 17:04:20 WARN hdfs.DFSClient: Could not get block locations. Source file "/sequenceOutput/_temporary/_attempt_local_0001_r_000000_0/part-r-00000" - Aborting...
13/07/29 17:04:20 ERROR hdfs.DFSClient: Failed to close file /sequenceOutput/_temporary/_attempt_local_0001_r_000000_0/part-r-00000