
I have a large (100 GB) backup file, which I compress with bzip2. But for some reason bzip2 keeps the original file.

If I try it manually, it deletes the original file as it should:

# touch file
# ls file*
file
# bzip2 -z file
# ls file*
file.bz2 

But when I run the same command from a backup cron script during the night, it keeps both the original and the compressed file. Does anybody know why? Could it be because of the file size?

Thanks for your advice!

Tomyk
  • Do you use `-k` (keep) as a flag in the script? Does the script run with permissions that would allow it to read the file but not write to either the file or the directory it is in (both are necessary to delete a file)? – Sven Apr 09 '14 at 09:29
  • No, I don't. I use `bzip2 -z file`. Permissions are also OK: the script itself writes the file there and then compresses it. Thanks. – Tomyk Apr 09 '14 at 12:17
  • Then add a check whether `bzip2` ran successfully and, if so, delete the file yourself. – Sven Apr 09 '14 at 12:30
  • Thanks, I'm doing it that way. But I wanted to know why this is happening... I don't like it when my production systems behave non-deterministically. – Tomyk Apr 09 '14 at 12:45
  • I think this could happen if you have read permission, but not write permission on the file. – 7yl4r Jan 16 '18 at 20:58
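The workaround described in the comments — check that `bzip2` exited successfully, then remove any leftover original yourself — could be sketched roughly like this (a minimal sketch; the temp file stands in for the real backup path, which is not shown in the question):

```shell
#!/bin/sh
# Sketch of the workaround from the comments: compress, verify bzip2
# succeeded, then remove the original explicitly if it survived.
set -u

# Stand-in for the real backup file (hypothetical path).
f=$(mktemp /tmp/backup.XXXXXX)
echo "some backup data" > "$f"

if bzip2 -z "$f"; then
    # bzip2 normally removes the original itself on success; if it is
    # still present for whatever reason, delete it explicitly.
    [ -e "$f" ] && rm -- "$f"
    echo "compressed: $f.bz2"
else
    echo "bzip2 failed; keeping original: $f" >&2
fi
```

Note that `bzip2` only removes the original when compression succeeds and `-k` is not given, so the explicit `rm` is purely defensive here.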

0 Answers