
In my Hadoop cluster, we needed to add authentication to the web UI components, so I searched and found this option:

https://streever.atlassian.net/wiki/spaces/HADOOP/blog/2014/03/07/491558/Securing+Hadoop+HDP+Web+UI+Component+s

I used the authentication described above and got it working: it now prompts for a username and password when I connect to the web UI.

My problem now is that when I look at the NameNode logs, they show an unauthorized error when the NameNode connects to the JournalNode.

I looked online, but all the links point to enabling Kerberos authentication. I cannot do that, as my manager told me it would require a considerable amount of time to get working in our cluster.

So my question is: is there a setting in the NameNode (or HDFS in general, for that matter) where I can specify the Jetty authentication credentials?

For example, I can connect to the JournalNode using:

curl -u username:password http://192.168.14.22:8480

or using an Authorization header:

curl -H "authorization: Basic ZGF2aWQ6aGFkb29w" http://192.168.14.22:8480
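(For reference, the value in the Basic header is nothing secret: it is just the base64 encoding of `username:password`. The credentials below are the example pair used in the curl calls above, not real ones.)

```shell
# The Basic auth header value is the base64 encoding of "username:password".
# "david:hadoop" is the example credential pair from the curl call above.
echo -n "david:hadoop" | base64
# -> ZGF2aWQ6aGFkb29w
```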

The unauthorized error from the NameNode log:

org.apache.hadoop.hdfs.server.namenode.TransferFsImage$HttpGetFailedException: Fetch of http://node1.qaperf.flytxt.com:8480/getJournal?jid=flycluster&segmentTxId=6938&storageInfo=-63%3A2141723110%3A0%3ACID-26cc5859-c0e5-4ddb-acfd-c96c7a10b238 failed with status code 401
    Response message:
    Unauthorized
            at org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream$URLLog$1.run(EditLogFileInputStream.java:471)
            at org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream$URLLog$1.run(EditLogFileInputStream.java:456)
            at java.security.AccessController.doPrivileged(Native Method)
            at javax.security.auth.Subject.doAs(Subject.java:422)
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
            at org.apache.hadoop.security.SecurityUtil.doAsUser(SecurityUtil.java:448)
            at org.apache.hadoop.security.SecurityUtil.doAsCurrentUser(SecurityUtil.java:442)
            at org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream$URLLog.getInputStream(EditLogFileInputStream.java:455)
            at org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream.init(EditLogFileInputStream.java:141)
            at org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream.nextOpImpl(EditLogFileInputStream.java:192)
            at org.apache.hadoop.hdfs.server.namenode.EditLogFileInputStream.nextOp(EditLogFileInputStream.java:250)
            at org.apache.hadoop.hdfs.server.namenode.EditLogInputStream.readOp(EditLogInputStream.java:85)
            at org.apache.hadoop.hdfs.server.namenode.EditLogInputStream.skipUntil(EditLogInputStream.java:151)
            at org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream.nextOp(RedundantEditLogInputStream.java:178)
            at org.apache.hadoop.hdfs.server.namenode.EditLogInputStream.readOp(EditLogInputStream.java:85)
            at org.apache.hadoop.hdfs.server.namenode.EditLogInputStream.skipUntil(EditLogInputStream.java:151)
            at org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream.nextOp(RedundantEditLogInputStream.java:178)
            at org.apache.hadoop.hdfs.server.namenode.EditLogInputStream.readOp(EditLogInputStream.java:85)
            at org.apache.hadoop.hdfs.server.namenode.FSEditLogLoader.loadEditRecords(FSEditLogLoader.java:190)
            at org.apache.hadoop.hdfs.server.namenode.FSEditLogLoader.loadFSEdits(FSEditLogLoader.java:143)
            at org.apache.hadoop.hdfs.server.namenode.FSImage.loadEdits(FSImage.java:898)
            at org.apache.hadoop.hdfs.server.namenode.FSImage.loadFSImage(FSImage.java:753)
            at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:329)
            at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:984)
            at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:686)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:586)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:646)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:820)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:804)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1516)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1582)

Please help.

