How to find and delete multiple files in Linux at a time (getting rid of the “Argument list too long” error)

Today I needed to delete multiple files from a directory that had filled up the whole Linux server. The directory had grown to more than 200 GB due to a misconfiguration at the database level.

So I reached for the command I had always used until then:

$ find * -name testdb* | xargs rm -f {} \;
-bash: /bin/find: Argument list too long


It failed because the shell expands the unquoted * (and testdb*) into the argument list before find even runs, and that list exceeded the kernel’s limit on arguments passed to a new process. So I went looking for other ways to delete the files.
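To see the limit the shell was hitting, you can query ARG_MAX, the maximum combined size of the arguments and environment a new process may receive; the exact value varies by system:

```shell
# Print the kernel's limit on the argument list passed to a
# new process -- the source of "Argument list too long".
getconf ARG_MAX
```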

1. Using Linux find command to wipe out millions of files

a.) Finding and deleting files using find’s -exec switch:

# find . -type f -exec rm -fv {} \;

This method works fine, but it has one downside: deletion is slow, because the external rm command is invoked once for every file found.
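If your find supports the POSIX + terminator for -exec (GNU findutils does), it packs many filenames into a single rm invocation, much like xargs, which avoids most of that per-file overhead:

```shell
# The + terminator batches filenames into as few rm
# invocations as possible instead of one rm per file.
find . -type f -exec rm -fv {} +
```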

b.) Finding and deleting a large number of files with find’s -delete argument:

Luckily, there is a better way to delete the files, using find’s built-in -delete action:

# find . -type f -delete
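Applied to my original problem, the pattern (the testdb prefix from the failing command) can be quoted so the shell never expands it; find then matches it file by file and deletes only what matches:

```shell
# Quoting 'testdb*' keeps the shell from expanding it into a
# huge argument list; find evaluates the pattern itself.
find . -type f -name 'testdb*' -delete
```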

c.) Printing out deleted files with find’s -print argument

If you would like to see on your terminal which files find is deleting in “real time”, add -print:

# find . -type f -print -delete

To avoid stressing the server’s hard disk, and thus save yourself from “outages” in normal server operation, it is good to combine the find command with ionice, e.g.:

# ionice -c 3 find . -type f -print -delete
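ionice -c 3 places the command in the “idle” I/O scheduling class, so it is only served when nothing else needs the disk. On top of that you can lower the CPU priority with nice; a minimal sketch of combining both:

```shell
# Idle I/O class plus lowest CPU priority: deletion may take
# much longer, but competes less with production workloads.
nice -n 19 ionice -c 3 find . -type f -print -delete
```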

Just note that ionice cannot guarantee that find’s operations will not severely affect the disk’s I/O requests. On heavily loaded servers with a high volume of disk writes, even ionice may not prevent the server from hanging! Be sure to always keep an eye on the server while deleting the files, no matter whether you use ionice or not. If, during find’s execution, the server starts lagging in serving its ordinary client requests, stop the command immediately by killing it from another SSH session or TTY (if you are physically at the server).

2. Using a simple bash loop with rm command to delete “tons” of files

An alternative is a bash loop that iterates over each file in the directory and runs /bin/rm on each loop element (file), like so:

for i in *; do
    rm -f "$i"
done

If you’d like to print what you are deleting, add an echo to the loop:

# for i in *; do echo "Deleting: $i"; rm -f "$i"; done

The bash loop worked like a charm in my case, so I warmly recommend this method whenever you need to delete more than 500,000 files in a directory.
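For completeness, the classic fix for my original one-liner is to let find generate the names (no shell glob at all) and stream them to xargs, which splits them into batches that never exceed the argument limit; the -print0/-0 pair keeps filenames containing spaces or newlines safe:

```shell
# find emits NUL-delimited names; xargs -0 batches them into
# rm invocations sized to stay under the argument limit.
find . -type f -name 'testdb*' -print0 | xargs -0 rm -f
```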


Thanks!

Hope it helps.

