
I have this line in my code:

DistributedFileSystem.get(conf).delete(new Path(new URI(otherArgs[1])), true);    

otherArgs[1] has this value: hdfs://master:54310/input/results

I receive this exception:

Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: hdfs://master:54310/input/results, expected: file:///
at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:354)
at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:55)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:367)
at org.apache.hadoop.fs.ChecksumFileSystem.delete(ChecksumFileSystem.java:430)
at <package>.<classname>.main(Degree.java:137)    

Note: I tried using new Path(otherArgs[1]) without the URI wrapper, but got the exact same error!

Thanks, -K


2 Answers


It looks like you have not set fs.default.name in core-site.xml.
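If it is missing, a minimal core-site.xml entry might look like the sketch below. The host and port are taken from the hdfs:// URI in the question; substitute your own NameNode address. (On newer Hadoop versions the property is named fs.defaultFS, with fs.default.name kept as a deprecated alias.)

```xml
<configuration>
  <property>
    <!-- Default filesystem; must point at the NameNode, not file:/// -->
    <name>fs.default.name</name>
    <value>hdfs://master:54310</value>
  </property>
</configuration>
```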

check this link

If you have already set that, make sure the config files are on the classpath.

You can also set the fs.default.name property from your driver:

conf.set("fs.default.name", "hdfs://yourserver:port");
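Alternatively, you can sidestep the default-filesystem mismatch entirely by resolving the FileSystem from the Path's own URI, so the scheme (hdfs://) is honored regardless of what fs.default.name says. This is a minimal sketch using the question's host, port, and path (unverified against a live cluster); it requires the Hadoop client jars and a reachable NameNode.

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DeleteHdfsPath {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Resolve the FileSystem from the URI's scheme (hdfs://...),
        // instead of relying on fs.default.name pointing at HDFS.
        URI uri = new URI("hdfs://master:54310/input/results");
        FileSystem fs = FileSystem.get(uri, conf);

        // true = recursive delete, matching the original call
        fs.delete(new Path(uri), true);
        fs.close();
    }
}
```

FileSystem.get(URI, Configuration) picks the implementation by scheme, so this throws no "Wrong FS" error even when core-site.xml is absent from the classpath.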

It turns out that I was running my jar using "hadoop -jar" instead of "hadoop jar". All conf files are correct and in place.

Problem solved, but I still have no idea why using "-jar" made it run against the local filesystem (as if in local/standalone mode)!
