We have a small Hadoop cluster where the JobTracker is dynamic (it can move from node to node). We'd like to make the data, log files, and job interactions more publicly available through the built-in web UI (the Hadoop MapReduce JobTracker page) rather than through the command line.
The implementation idea is to open the web UI ports on all nodes in the Hadoop cluster for inbound traffic and create a common DNS alias that always points at whichever node currently hosts the JobTracker, so there's a constant reference to it. Is this a best practice? I'm also interested in installing a front-end such as Apache Hue (http://www.gethue.com) that end users can access.
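To make the idea concrete, here is a rough sketch of the discovery step I have in mind: probe each cluster node for an open JobTracker web UI port (50030 by default in MRv1) and report the one that answers, which could then feed a DNS update (the node names below are hypothetical, and this assumes the UI port is only open on the node actually running the JobTracker):

```python
import socket

JOBTRACKER_UI_PORT = 50030  # default MRv1 JobTracker web UI port


def find_jobtracker(nodes, port=JOBTRACKER_UI_PORT, timeout=1.0):
    """Return the first node whose JobTracker web UI port accepts a
    TCP connection, or None if no node answers."""
    for host in nodes:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return host
        except OSError:
            # DNS failure, refused connection, or timeout: try the next node
            continue
    return None


if __name__ == "__main__":
    # hypothetical node names; the result would be used to repoint the
    # shared DNS alias (e.g. via nsupdate or the DNS provider's API)
    nodes = ["hadoop-node1", "hadoop-node2", "hadoop-node3"]
    print(find_jobtracker(nodes))
```

Something like this could run from cron and repoint the alias whenever the JobTracker moves, but I don't know whether that's a sane approach compared with just pinning the JobTracker.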
I know the JobTracker can be made static, which would solve this problem but probably introduce others. Still, I suspect that dedicating a node to that role works against some of the intended flexibility and power of a Hadoop cluster.
I'd appreciate any insight on the best strategy for providing a consistent, accessible URL for both admins and end users.