Hi everybody. I'm using MapReduce to process some log files stored on HDFS. I want to extract some info from the files and store it in HBase.
So I launch the job like this:
HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar crm_hbase-1.0.jar /datastream/music/useraction/2014-11-30/music_useraction_20141130-230003072+0800.24576015364769354.00018022.lzo
If I just run the job as "hadoop jar xxxx" (without the HBase classpath), it fails saying it cannot find HBaseConfiguration.
My code is quite simple:
public int run(String[] strings) throws Exception {
    Configuration config = HBaseConfiguration.create();

    // log in from the keytab on the client (job-submitting) side
    String kerbConfPrincipal = "ndir@HADOOP.HZ.NETEASE.COM";
    String kerbKeytab = "/srv/zwj/ndir.keytab";
    UserGroupInformation.loginUserFromKeytab(kerbConfPrincipal, kerbKeytab);
    UserGroupInformation ugi = UserGroupInformation.getLoginUser();
    System.out.println(" auth: " + ugi.getAuthenticationMethod());
    System.out.println(" name: " + ugi.getUserName());
    System.out.println(" using keytab: " + ugi.isFromKeytab());
    HBaseAdmin.checkHBaseAvailable(config);

    // set job name
    Job job = new Job(config, "Import from file");
    job.setJarByClass(LogRun.class);
    // set map class
    job.setMapperClass(LogMapper.class);
    // set output format and output table name
    job.setOutputFormatClass(TableOutputFormat.class);
    job.getConfiguration().set(TableOutputFormat.OUTPUT_TABLE, "crm_data");
    job.setOutputKeyClass(ImmutableBytesWritable.class);
    job.setOutputValueClass(Put.class);
    job.setNumReduceTasks(0);
    TableMapReduceUtil.addDependencyJars(job);

    // set the input path from the command-line argument and submit
    FileInputFormat.addInputPath(job, new Path(strings[0]));
    return job.waitForCompletion(true) ? 0 : 1;
}
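For reference, the relevant part of LogMapper is essentially this (a simplified sketch: the real parsing logic is omitted, and the row key and the "cf"/"raw" column names below are just placeholders):

import java.io.IOException;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class LogMapper extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // build a Put from one log line (real parsing omitted)
        Put put = new Put(Bytes.toBytes(value.toString().split("\t")[0]));
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("raw"), Bytes.toBytes(value.toString()));
        // with TableOutputFormat the key is ignored, so null is passed here
        context.write(null, put); // <-- this is the line that never returns
    }
}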
But when I run this MR job, the map task hangs at "context.write(null, put)"; that line never completes. I suspect it is related to the keytab login: since the map tasks run on other nodes, does it mean I also need to "login" inside the "map" process?
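Or do I instead need to ship an HBase delegation token with the job, rather than relying on the client-side keytab login? From the TableMapReduceUtil javadoc I would guess something like this before submitting the job (untested, just a guess):

    // obtain an HBase delegation token for the submitting user,
    // so the map tasks can authenticate without the keytab
    TableMapReduceUtil.initCredentials(job);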