I am trying to run JUnit tests for a Spark project in IntelliJ. The tests initialize a local Hadoop cluster using the hadoop-minicluster dependency. They run fine with Hadoop version 2.7.3.2.6.5.0-292. Since we upgraded our environment, I need to rebuild the project with Hadoop version 3.2.2, and the test cases now fail with the error below:
Dependencies: hadoop-hdfs (3.2.2), hadoop-common (3.2.2), hadoop-minicluster (3.2.2)
IOE creating namenodes. Permissions dump:
path '/tmp/cluster1/data':
  absolute:/tmp/cluster1/data
  permissions: ----
path '/tmp/cluster1':
  absolute:/tmp/cluster1
  permissions: drwx
path '/tmp':
  absolute:/tmp
  permissions: drwx
path '/':
  absolute:/
  permissions: dr-x
java.io.IOException: Failed to save in any storage directories while saving namespace.
    at org.apache.hadoop.hdfs.server.namenode.FSImage.saveFSImageInAllDirs(FSImage.java:1241) ~[hadoop-hdfs-3.2.2.jar:?]
    at org.apache.hadoop.hdfs.server.namenode.FSImage.saveFSImageInAllDirs(FSImage.java:1198) ~[hadoop-hdfs-3.2.2.jar:?]
    at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:190) ~[hadoop-hdfs-3.2.2.jar:?]
    at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1230) ~[hadoop-hdfs-3.2.2.jar:?]
    at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:427) ~[hadoop-hdfs-3.2.2.jar:?]
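For context, the tests spin up the cluster roughly like this (a minimal sketch, not our exact code; the builder options and the `/tmp/cluster1` base directory are illustrative, and this needs the hadoop-minicluster test artifacts on the classpath):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.MiniDFSCluster;

public class MiniClusterSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Base directory for NameNode/DataNode storage; the failing
        // "/tmp/cluster1/data" in the permissions dump is derived from this.
        conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, "/tmp/cluster1");

        MiniDFSCluster cluster = new MiniDFSCluster.Builder(conf)
                .numDataNodes(1)
                .build();
        cluster.waitActive();
        // ... tests run against cluster.getFileSystem() ...
        cluster.shutdown();
    }
}
```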
The initial code used the macOS temp directory, /private/var/../. Since that gave this error, I tried changing the path to "tmp". It seems to be failing on permissions at the absolute path "/User" itself, and I don't have permission to run it with sudo.
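One variation I also tried is letting the JVM create a per-run directory the current user definitely owns, rather than a hard-coded path (a sketch; the `minidfs-` prefix is just an example name):

```java
import java.io.File;
import java.nio.file.Files;

public class TempBaseDirSketch {
    public static void main(String[] args) throws Exception {
        // Create a fresh directory under the JVM's java.io.tmpdir,
        // owned and writable by the user running the tests.
        File baseDir = Files.createTempDirectory("minidfs-").toFile();
        System.out.println(baseDir.getAbsolutePath());
        System.out.println(baseDir.canWrite());
        // Then point the minicluster at it, e.g.:
        // conf.set("hdfs.minidfs.basedir", baseDir.getAbsolutePath());
    }
}
```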
Can someone please suggest what can be done here?