230

When using sudo rm -r, how can I delete all files, with the exception of the following:

textfile.txt
backup.tar.gz
script.php
database.sql
info.txt
codeforester
masjoko
    Sounds like a question for http://unix.stackexchange.com/ – Jason Dec 01 '10 at 14:16
    There are 2 ways to read this question, and the existing answers cover both interpretations: EITHER: (a) preserve files with the specified names _directly_ located in the target directory and - as `rm -r` implies - _delete everything else, including subdirectories_ - even if they contain files with the specified names; OR: (b) traverse the entire subtree of the target directory and, in each directory, delete all files except those with the names listed. – mklement0 Jun 05 '14 at 04:31
    To everyone doing this, *please make a backup first*. I've just wasted several days worth of work because I forgot to exclude `.git`, and not having pushed, I was unable to recover over 30 commits. Make sure you exclude everything you care about, hidden folders included. And set `-maxdepth 1` if you're dealing with directories. – Lonami Feb 06 '21 at 21:38
  • @Jason [Remove all files/directories except for one file - Unix & Linux Stack Exchange](https://unix.stackexchange.com/q/153862/209677) – Pablo Bianchi Feb 10 '21 at 01:43

20 Answers

239
find [path] -type f -not -name 'textfile.txt' -not -name 'backup.tar.gz' -delete

If you don't specify -type f, find will also list directories, which you may not want.


Or a more general solution using the very useful combination find | xargs:

find [path] -type f -not -name 'EXPR' -print0 | xargs -0 rm --

For example, to delete all non-txt files in the current directory:

find . -type f -not -name '*txt' -print0 | xargs -0 rm --

The -print0 and -0 combination is needed if there are spaces (or other unusual characters) in any of the filenames that should be deleted.
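As a sketch, the same command can cover the question's whole keep-list (the filenames are the ones the OP listed; -maxdepth 1 restricts it to the current directory, and -print lets you preview before switching to -delete):

```shell
# Preview first: nothing is deleted while -print is used
find . -maxdepth 1 -type f \
  -not -name 'textfile.txt' -not -name 'backup.tar.gz' \
  -not -name 'script.php' -not -name 'database.sql' \
  -not -name 'info.txt' -print

# When the list looks right, replace -print with -delete
```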

awi
    it seems xargs has a text size limit – frazras Mar 05 '13 at 06:37
    to delete multiple patterns : `find . -type f -not -name '*ignore1' -not -name '*ignore2' | xargs rm` – Siwei Mar 18 '13 at 07:38
    what about directories? it will delete all files, but does it remove folders?! – orezvani Apr 30 '13 at 05:41
  • @emab: It will not delete directories, that's what the "-type f" does. – Emil Stenström Jul 16 '13 at 13:21
    Instead of "| xargs rm", find takes a -delete parameter. – Emil Stenström Jul 16 '13 at 13:22
  • Another couple of items that will reduce (probably) unwanted side effects: add `-not -name ".*" -maxdepth 1` to protect configuration and version control metadata. – Leo Mar 26 '14 at 13:39
  • it might be worth adding a -f after the rm, (ie. `rm -f --`) as this command throws a `rm: missing operand` error otherwise which may be problematic if the command is used in a script. – Rooster Aug 12 '16 at 20:03
    instead of `-print0 | xargs -0 rm --` you can just `-delete` – Andy Sep 11 '17 at 06:57
  • maybe somebody could make a bash function to remove everything except the arguments given and add it to the .bashrc. for example `rme textfile.txt backup.tar.gz` – Foad S. Farimani Feb 01 '18 at 11:59
  • using `-delete` but removing `-type f` is actually really handy, because it deletes all directories which end up empty from deleting the files, but keeps the ones that happen to have your kept files in them – Hashbrown Jul 11 '19 at 05:14
157
rm !(textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)

The extglob (extended pattern matching) shell option needs to be enabled in Bash first (if it isn't already):

shopt -s extglob
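A cautious way to run this in a script (a sketch: shopt -s extglob must be executed before the line containing !(...) is parsed, and the echo previews the expansion before anything is removed):

```shell
# Enable extended globbing on its own line first; !(...) is a
# syntax error while extglob is still off.
shopt -s extglob

# Preview which names the pattern matches -- nothing is deleted yet
echo !(textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)

# Then do the actual removal
rm -- !(textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)
```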
pl1nk
    I get "syntax error near unexpected token \`('" when I do `shopt -s extglob; rm -rf !(README|LICENSE)`. Any idea why? – Dennis Oct 26 '13 at 22:44
  • @nic I don't remember, sorry. I probably ended up using `find` with the `-delete` option. – Dennis Jan 14 '14 at 15:32
  • This is the best solution for me and it works by default on my Ubuntu 12.04.4 LTS with no need for shopt – xtian May 05 '14 at 10:16
  • @xtian: `extglob` is OFF by default, so if you find it's on already, something must have turned it on, such as your bash profile (either directly or by sourcing a script that turns it on, which happens with careless 3rd-party scripts). – mklement0 Jun 05 '14 at 03:35
    @nic, @Dennis: The syntax error suggests that something OTHER than `bash` was used, e.g., `dash`, where `extglob` is not supported. However, in an _interactive_ `bash` shell, the command will ALSO not work as stated, though for different reasons. The short of it: execute `shopt -s extglob` BY ITSELF; ONCE SUCCESSFULLY ENABLED (verify with `shopt extglob`), execute `rm -rf !(README|LICENSE)`. (While `extglob` is not yet enabled, `!(...)` is evaluated by _history expansion_ BEFORE the commands are executed; since this likely fails, NEITHER command is executed, and `extglob` is never turned on.) – mklement0 Jun 05 '14 at 03:45
    @nic, @Dennis, @mklement0: I had the same issue with "syntax error near unexpected token (" when executing the command within a .sh file (#! /bin/bash) but not when I was running it in a command line interface. It turned out that in addition to the `shopt -s extglob` run in the command line interface, I had to rerun it inside my script file. This solved the problem for me. – Ahresse Jan 06 '17 at 11:49
  • if someone would like to combine it, it's not always good idea. `rm -rf * .* !(src/|lost+found/)` <-- it didn't work as I expect, it deleted everything ;) – Qback Jan 23 '18 at 15:09
  • Deleted everything for me on OSX/zsh – theUtherSide Jun 01 '18 at 05:53
  • This works perfectly as long as there is only one of these. If I try to use two, the second one does not work, but gives weird paths with /./ in the middle. – Henrik Heino Jan 16 '19 at 13:18
43

find . | grep -v "excluded files criteria" | xargs rm

This will list all files in the current directory, then list all those that don't match your criteria (beware of it matching directory names) and then remove them.

Update: based on your edit, if you really want to delete everything from current directory except files you listed, this can be used:

mkdir /tmp_backup && mv textfile.txt backup.tar.gz script.php database.sql info.txt /tmp_backup/ && rm -r * && mv /tmp_backup/* . && rmdir /tmp_backup

It will create a backup directory /tmp_backup (you've got root privileges, right?), move the files you listed to that directory, recursively delete everything in the current directory (you know you're in the right directory, don't you?), move everything from /tmp_backup back into the current directory and, finally, delete /tmp_backup.

I chose the backup directory to be in the root, because if you're trying to delete everything recursively from /, your system will have big problems.

Surely there are more elegant ways to do this, but this one is pretty straightforward.
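One such variant, as a sketch (it assumes mktemp -d is available, as on most Linux systems): using a temporary directory avoids clobbering an existing /tmp_backup, and the files are moved back even if the delete step fails partway.

```shell
# Move the keepers aside, wipe the directory, then move them back
backup=$(mktemp -d) &&
mv textfile.txt backup.tar.gz script.php database.sql info.txt "$backup"/ &&
rm -r ./* ;
mv "$backup"/* . && rmdir "$backup"
```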

Anirudh Ramanathan
darioo
    Works also really well with egrep, e.g. for cleaning intermediate latex files: `find . | egrep -v "\.tex|\.bib" | xargs rm` – mtsz Feb 07 '13 at 18:39
  • amazing command! for removing directories just change to rm -r – Aris Aug 13 '13 at 11:24
    If you want to simply delete everything in the current directory other than the excluded criteria: `find . -maxdepth 1 | grep -v "exclude these" | xargs rm -r` works much faster as it doesn't need to drill down in to directories unnecessarily. – But those new buttons though.. Jan 17 '14 at 19:25
  • Re `find` pipeline: efficiency issues aside (3 commands are used for what `find` could do alone), this will not work as intended with filenames with embedded spaces and will potentially delete the wrong files. – mklement0 Jun 05 '14 at 04:18
  • A command that stores "to-be-saved" files in another location (`/tmp_backup`) doesn't end well if it's interrupted—from the user's perspective, *all* the files have vanished, unless they know where to go looking for them to get them back. For that reason I'm not in favour of this type of strategy. – Craig McQueen Mar 30 '16 at 05:40
23

I prefer to use a sub-query list:

rm -r `ls | grep -v "textfile.txt\|backup.tar.gz\|script.php\|database.sql\|info.txt"`

-v, --invert-match select non-matching lines

\| Separator

To avoid preserving files with similar names:

rm -r `ls | grep -v "^textfile.txt$\|^backup.tar.gz$"`
Nick Tsai
20

Assuming that files with those names exist in multiple places in the directory tree and you want to preserve all of them:

find . -type f ! -regex ".*/\(textfile.txt\|backup.tar.gz\|script.php\|database.sql\|info.txt\)" -delete
Dennis Williamson
19

You can use the GLOBIGNORE shell variable in Bash.

Suppose you want to delete all files except the .php and .sql ones; then you can do the following:

export GLOBIGNORE="*.php:*.sql"
rm *
export GLOBIGNORE=

Setting GLOBIGNORE like this excludes .php and .sql files from wildcards such as "ls *" or "rm *". So, using "rm *" after setting the variable will delete only the remaining files, i.e. in the question's example the .txt and .tar.gz files.
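A subshell keeps the variable change from leaking into your session; a sketch of the same idea:

```shell
# GLOBIGNORE is restored automatically when the subshell exits.
# Caveat: a non-null GLOBIGNORE also turns on dotglob, so hidden
# files are matched by * as well.
( GLOBIGNORE='*.php:*.sql'; rm -- * )
```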

theharshest
    +1; A simpler alternative to setting and restoring the `GLOBIGNORE` variable is to use a subshell: `(GLOBIGNORE='*.php:*.sql'; rm *)` – mklement0 Jun 05 '14 at 04:34
9

Since nobody mentioned it:

  • copy the files you don't want to delete in a safe place
  • delete all the files
  • move the copied files back in place
gniourf_gniourf
  • It is more complicated because you have to take care of permissions after you copy them back. – Romulus Oct 01 '14 at 15:59
    @RemusAvram You can use appropriate switches with `cp` or `rsync` to preserve permissions. Anyways, this is just an alternate method (given as a suggestion) that has its place here, as an answer to the OP. – gniourf_gniourf Oct 01 '14 at 16:03
  • This may not be appropriate in many situations wherein the files to be retained are actively being used by other processes. It is also cumbersome if the files to be removed are just a small subset and a large number of files are involved. – codeforester Jan 19 '18 at 18:41
  • @codeforester: sure. But there are situations where it's appropriate, though... in fact I don't really see the point of your comment `:)`. – gniourf_gniourf Jan 19 '18 at 21:34
7

You can write a for loop for this... %)

for x in *
do
        if [ "$x" != "exclude_criteria" ]
        then
                rm -f "$x"
        fi
done;
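For the question's whole keep-list, the same loop can use a case pattern instead of a chain of if tests (a sketch; quoting "$x" guards against spaces in names):

```shell
for x in *; do
    case "$x" in
        textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)
            ;;               # keep these
        *)
            rm -f -- "$x" ;; # delete everything else
    esac
done
```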
mishunika
6

A little late for the OP, but hopefully useful for anyone who gets here much later via Google...

I found the answer by @awi and comment on -delete by @Jamie Bullock really useful. A simple utility so you can do this in different directories ignoring different file names/types each time with minimal typing:

rm_except (or whatever you want to name it)

#!/bin/bash

# Build the -not -name tests in an array so patterns with
# spaces stay intact (a string with unquoted expansion would
# be split on every whitespace character).
ignore=()

for fignore in "$@"; do
  ignore+=( -not -name "$fignore" )
done

find . -type f "${ignore[@]}" -delete

e.g. to delete everything except for text files and foo.bar (quote the pattern so the shell doesn't expand it before the script sees it):

rm_except '*.txt' foo.bar

Similar to @mishunika, but without the if clause.

lhuber
5

If you're using zsh (which I highly recommend):

rm -rf ^pattern-to-avoid

With extended_glob

setopt extended_glob
rm -- ^*.txt
rm -- ^*.(sql|txt)
sudo bangbang
4

I tried it, and it worked with:

rm -r !(Applications|"Virtualbox VMs"|Downloads|Documents|Desktop|Public)

but names with spaces are (as always) tough. I also tried with Virtualbox\ VMs instead of the quotes; it always deletes that directory (Virtualbox VMs).

juanfal
  • Does NOT work for files. `rm !(myfile.txt)` removes all including `myfile.txt` – khaverim Feb 10 '17 at 23:20
  • should execute `shopt -s extglob` first as @pl1nk pointed out – zhuguowei Jun 21 '17 at 01:58
  • Yes, it's very important to activate extended globbing (*extglob*) in bash, I am doing this in a daily routine basis, over a few Macs in labs: `shopt -s extglob` and then `cd /Users/alumno/` and finally `rm -rf !(Applications|Virtualbox*VMs|Downloads|Documents|Desktop|Public|Library)` Read about extended globbing [here](https://www.linuxjournal.com/content/bash-extended-globbing) – juanfal Apr 03 '18 at 14:21
4

Just:

rm $(ls -I "*.txt")   # Deletes all files except *.txt

Or:

rm $(ls -I "*.txt" -I "*.pdf")   # Deletes all files except *.txt and *.pdf
wyx
  • This is not recommended. Parsing `ls` output can become a problem. See [Here](https://mywiki.wooledge.org/ParsingLs) for detailed information. – R J Aug 22 '18 at 10:27
2

Make the files immutable. Not even root will be allowed to delete them.

chattr +i textfile.txt backup.tar.gz script.php database.sql info.txt
rm *

All other files have been deleted.
Afterwards, you can make them mutable again:

chattr -i *
1

I believe you can use

rm -v !(filename)

Except for the filename, all the other files in the directory will be deleted; just make sure you are using it in the right directory.

LarsTech
0

This is similar to the comment from @siwei-shen, but you need the -o flag to do multiple patterns, and the tests must be grouped with escaped parentheses so the -not applies to all of them (without the grouping, every file matches at least one of the -not tests and gets deleted). The -o flag stands for 'or':

find . -type f -not \( -name '*ignore1' -o -name '*ignore2' \) | xargs rm

Daniel Chen
0

You can do this with two command sequences. First, define an array with the names of the files you do not want to delete:

files=( backup.tar.gz script.php database.sql info.txt )

After that, loop through all files in the directory, checking if each filename is in the array of files to keep; if it's not, delete the file.

for file in *; do
  if [[ ! " ${files[*]} " =~ " $file " ]]; then
    rm "$file"
  fi
done
tripleee
Leonardo Hermoso
  • The regex comparison is incorrect -- it will preserve any file whose name is a substring of one of the protected files (though the surrounding spaces somewhat mitigate that; but file names can contain spaces). You should collect an array of all files, then subtract the files you want to exclude from that array, then remove the remaining files. – tripleee Aug 22 '18 at 11:57
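Following that suggestion, a sketch that compares names exactly instead of via the regex operator (Bash; the inner loop does a literal, whole-name comparison, so substrings and spaces in names are safe):

```shell
files=( textfile.txt backup.tar.gz script.php database.sql info.txt )

for file in *; do
    keep=false
    for f in "${files[@]}"; do
        # literal, whole-name comparison
        if [ "$file" = "$f" ]; then keep=true; break; fi
    done
    if ! $keep; then rm -- "$file"; fi
done
```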
0

The answer I was looking for was to run a script, but I wanted to avoid deleting the script itself. So in case someone is looking for a similar answer, do the following.

Create a .sh file and write the following code:

    cp my_run_build.sh ../../
    rm -rf *
    cp ../../my_run_build.sh .
    # amend rest of the script
ouflak
-1

Since no one yet mentioned this, in one particular case:

OLD_FILES=`echo *`
... create new files ...
rm -r $OLD_FILES

(or just rm $OLD_FILES)

or

OLD_FILES=`ls *`
... create new files ...
rm -r $OLD_FILES

You may need to use shopt -s nullglob if some files may be either there or not there:

SET_OLD_NULLGLOB=`shopt -p nullglob`
shopt -s nullglob
FILES=`echo *.sh *.bash`
$SET_OLD_NULLGLOB

without nullglob, echo *.sh *.bash may give you "a.sh b.sh *.bash".

(Having said all that, I myself prefer this answer, even though it does not work in OSX)

18446744073709551615
  • The existing answer by Leonardo Hermoso does this with fewer bugs. This will fail for file names with spaces; using an array elegantly solves that particular problem (and also less crucially avoids using uppercase for his private variables, which is generally to be avoided). – tripleee Aug 22 '18 at 11:56
-3

Rather than going for a direct command, move the required files to a temp dir outside the current dir. Then delete all files using rm * or rm -r *.

Then move the required files back to the current dir.

fedorqui
Ajeesh
  • This may not be appropriate in many situations wherein the files to be retained are actively being used by other processes. It is also cumbersome of the files to be removed are just a small subset and a large number of files are involved. – codeforester Jan 19 '18 at 18:39
-3

Remove everything except file.name:

ls -d /path/to/your/files/* |grep -v file.name|xargs rm -rf
Brian Tompsett - 汤莱恩
mik-mak
    This will fail horribly and might cause data loss if your directory/files have spaces or unusual characters. You should never parse ls. https://mywiki.wooledge.org/ParsingLs – R J Sep 12 '18 at 04:09