
I have a folder with many files inside; I don't know how many because I can't run `ls` or any other listing command.

These files are temporary, so I tried `rm -rf my-folder`, but it takes too much time and I get disconnected from the server.

I reconnect and restart the command, but maybe that's useless if `rm` has to read the entire directory before deleting files.

OS: Debian 10

Can anyone help? Thank you.

  • Set your ssh client to keep the connection alive, or put the `rm` command in the background (see the sketch after these comments). And yes, it takes a good amount of time. I am not aware of other safe utilities to do this. – Romeo Ninov Aug 26 '22 at 15:08
  • You can try `nohup rm -rf my-folder > /dev/null 2>&1 &` to allow the `rm` command to continue running even if you get disconnected. Alternatively, use something like `tmux` or `screen` that will let you re-attach to a terminal session after you get disconnected. – larsks Aug 26 '22 at 17:06
  • @larsks `-f` is always a bad idea to use in the first place. Most files can be deleted without a hassle by using `find` or even `xargs`. Remember that `-f` can wreck your system; using `screen` can solve a timeout issue. – djdomi Aug 26 '22 at 17:12
  • thanks for the tips – Shawn Forsman Aug 27 '22 at 19:55
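
As a reference for the two suggestions above, here is a minimal sketch. The host alias `myserver` and the session name `cleanup` are placeholder names, not something from the original question.

Keep the SSH connection alive by adding to `~/.ssh/config` on the client:

    Host myserver
        ServerAliveInterval 60
        ServerAliveCountMax 5

Or run the deletion inside a detachable session so it survives a disconnect:

    tmux new -s cleanup     # start a named session and run the slow rm inside it
    tmux attach -t cleanup  # after reconnecting, re-attach (detach with Ctrl-b d)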

2 Answers


If there are a large number of files in the folder, what I would do is:

Delete the files in my-folder with `find -delete`:

find /path/to/my-folder -type f -delete

Alternatively, delete the files by having `find` execute `rm`:

find /path/to/my-folder -type f -exec rm -f {} \;
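
If spawning one `rm` process per file turns out to be too slow, a variant worth trying is the same command with `-exec` terminated by `+`, so that `find` batches many paths into each `rm` invocation:

find /path/to/my-folder -type f -exec rm -f {} +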

If there are a lot of subdirectories, use `find` with `-depth` so directories are deleted bottom-up:

find /path/to/my-folder -depth -type d -exec rm -rf {} \;
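
A one-pass alternative, assuming the GNU find shipped with Debian 10: `-delete` processes entries depth-first, so a single command can remove the files and the emptied subdirectories while leaving my-folder itself in place:

find /path/to/my-folder -mindepth 1 -delete
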
karl

Problem solved. I waited for hours for `rm -rf some-dir` to finish.