
I am new to the Hadoop FileSystem API and couldn't find any helpful documentation on it.

I want to authenticate with Kerberos while using the Hadoop FileSystem API.

Sample Code :

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.web.WebHdfsFileSystem;

    Path src = new Path("C:\\testing\\a\\a.avro");
    Path dest = new Path("webhdfs://aaaa:50070/dummy/dummy.avro");
    WebHdfsFileSystem web = new WebHdfsFileSystem();
    try {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "webhdfs://aaaa:50070");
        web.setConf(conf);
        FileSystem fs = FileSystem.get(web.getConf());
        fs.copyFromLocalFile(false, src, dest);
    } catch (IOException e) {
        e.printStackTrace();
    }

How do I achieve Kerberos authentication with the above code? How do I set the principal and keytab values?

user608020

1 Answer


The short answer to your question is: your HDFS client code knows how to authenticate if you configure everything correctly; it has almost nothing to do with the code you posted here.

I recommend you go over the Hadoop security tutorials.
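Those tutorials boil down to logging in from a keytab before you touch the `FileSystem`. As a hedged sketch (the host `aaaa` comes from your snippet, but the principal `user@EXAMPLE.COM` and the keytab path are placeholders, not values from your cluster), the login step looks roughly like this, using Hadoop's `UserGroupInformation` API:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberizedWebHdfsCopy {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Tell the client the cluster is Kerberized; this must match the
        // cluster-side core-site.xml settings.
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("fs.defaultFS", "webhdfs://aaaa:50070");

        // Log in from a keytab; principal and keytab path are placeholders.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(
                "user@EXAMPLE.COM", "C:\\security\\user.keytab");

        // After login, FileSystem calls run as the authenticated user.
        FileSystem fs = FileSystem.get(conf);
        fs.copyFromLocalFile(false,
                new Path("C:\\testing\\a\\a.avro"),
                new Path("/dummy/dummy.avro"));
        fs.close();
    }
}
```

Note there is no need to instantiate `WebHdfsFileSystem` yourself; `FileSystem.get(conf)` resolves the right implementation from the `webhdfs://` scheme.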

It seems that your HDFS client is a Windows system. To access a Kerberized cluster (including Kerberized HDFS) you'll need to enable TGT session key access, see Registry Key to Allow Session Keys to Be Sent in Kerberos Ticket-Granting-Ticket, and you have to make sure your process is not a UAC-restricted administrator (see Access to Session Keys not possible using a restricted Token).
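For reference, the registry change described in that Microsoft article is the `AllowTgtSessionKey` value (apply with care and reboot afterwards):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\Kerberos\Parameters]
"AllowTgtSessionKey"=dword:00000001
```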

Your cluster cannot be a Windows cluster: Kerberized Windows clusters are still in development (YARN-1063, YARN-1972, etc.).

Remus Rusanu