
I was trying to append a DataFrame to an existing Parquet file and found that the save mode can be set to append. But when I tried to append, it threw an error saying the path was not a directory.

import org.apache.spark.sql.SaveMode
data.coalesce(1).write.mode(SaveMode.Append).parquet("/user/root/AppendTest")

Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=EXECUTE, inode="/user/root/AppendTest":root:root:-rw-r--r-- (Ancestor /user/root/AppendTest is not a directory).
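To confirm whether the path from the error is still a directory on HDFS, here is a rough sketch using the Hadoop FileSystem API (only a diagnostic check, assuming it runs in the same cluster and user context as the job above):

import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("AppendTest").getOrCreate()

// Reuse the Hadoop configuration the Spark job already carries.
val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)
val target = new Path("/user/root/AppendTest")

if (fs.exists(target) && fs.getFileStatus(target).isDirectory) {
  println(s"$target is a directory, so append can add new part files")
} else {
  println(s"$target is missing or is a plain file, so append fails")
}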

P.S.: When the output was first created, Spark generated it as a folder, and I then renamed it to the desired file.
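From what I understand, append only works while the target stays the directory that Spark originally wrote, with Spark adding new part-*.parquet files next to the existing ones. A minimal sketch of that pattern would be the following; the spark.range DataFrame is only a placeholder for my real data:

import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder().appName("AppendTest").getOrCreate()
val data = spark.range(0, 10).toDF("id")  // placeholder DataFrame

// Append into the Spark-managed directory; each run adds new part files.
data.coalesce(1)
  .write
  .mode(SaveMode.Append)
  .parquet("/user/root/AppendTest")

// Reading the directory back returns the union of all appended part files.
spark.read.parquet("/user/root/AppendTest").show()

In my case, renaming the generated folder to a plain file appears to be what breaks this, since the directory Spark expects no longer exists.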

I have checked How to overwrite the output directory in spark, but that doesn't solve my problem here. I have tried the approaches mentioned in that question, and the issue described there is also different.

devanathan
  • Possible duplicate of [How to overwrite the output directory in spark](https://stackoverflow.com/questions/27033823/how-to-overwrite-the-output-directory-in-spark) – ernest_k Mar 29 '18 at 07:33
  • @ernest_k, I have checked that question, but that issue is different – devanathan Mar 29 '18 at 07:47
  • 1
    The file you try to access can only be changed by root so you have to run your spark job as root or change the permission to this file – TobiSH Mar 29 '18 at 08:55
