## File deletion problem....

The place to post if you need help or advice

Moderators: ChrThor, LXF moderators

### File deletion problem....

I look after a machine which is running out of disk space. I will be replacing the existing hard drives in it whenever I can, but that may take a while. In the meantime, I've copied one (large) section of files off the machine onto a USB drive. Since these files are of historic interest rather than everyday use, getting rid of them will free up around 180 GB of space, enough to keep them going until I can change the drives.

However, I've hit a bit of a problem. I need to leave the directory structure (which is quite deeply nested and full of silly Windows names with spaces in them) intact, to avoid confusing a bit of Windows software that accesses the file areas.

I can use Midnight Commander to scroll through the areas and delete the files, but that is going to take a month of Sundays (probably literally...) to complete. Not to mention driving me crazy in the process. So...

Is there any way I can automate the deletion of files while leaving the directory structure untouched?

Back in the days when I still used OS/2, I had a utility which would recurse through a directory structure, executing commands at each level. I can't find the source for it, and my scripting skills are somewhat limited - so far I've not found a way to kill the files without killing the directories as well.

Any suggestions?

Paul.
paulm
LXF regular

Posts: 242
Joined: Mon Apr 03, 2006 4:53 am
Location: Oxfordshire, UK

It's fairly simple, but it does involve deleting the directory structure: run a loop which ls's every directory, then use that output as input for a new loop, and so on until you hit only file names. Save the various paths (and names!) in a text file. Next, remove the entire dir structure. Then use the previously made text file to recreate the dirs with mkdir. You can create them recursively by using the final dir level:
Code: Select all
mkdir -p /path/to/final/dir/level
will do the trick, creating any parent dirs that don't already exist.
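The approach above can be sketched with find -type d standing in for the nested ls loops (the paths here are made up for the demo, and the real data would obviously need copying off first):

```shell
# Record every directory, wipe the tree, then rebuild the empty
# skeleton with mkdir -p. Uses a throwaway tree as a stand-in.
tree=$(mktemp -d)                       # stand-in for the real data tree
list=$(mktemp)                          # text file holding the dir paths
mkdir -p "$tree/a/b c/d"                # demo structure with a space in a name
touch "$tree/a/file1" "$tree/a/b c/d/file2"

find "$tree" -type d > "$list"          # 1. save every directory path
rm -rf "$tree"                          # 2. remove the whole tree
while IFS= read -r d; do                # 3. recreate the empty directories
  mkdir -p "$d"
done < "$list"
```

Reading the list line by line (rather than passing it through a bare xargs) keeps names with spaces intact.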
Dutch_Master
LXF regular

Posts: 2476
Joined: Tue Mar 27, 2007 1:49 am

Dutch_Master wrote:It's fairly simple, but it does involve deleting the directory structure: run a loop which ls's every directory, then use that output as input for a new loop, and so on until you hit only file names. Save the various paths (and names!) in a text file. Next, remove the entire dir structure. Then use the previously made text file to recreate the dirs with mkdir. You can create them recursively by using the final dir level:
Code: Select all
mkdir -p /path/to/final/dir/level
will do the trick, creating any parent dirs that don't already exist.

Thanks Dutch_Master. I'll have to remember that one...

I was about to answer my own question, though whether the answer I came up with is optimum on not, I don't know.

When I asked, I suspected that the answer would be some strange incantation of find. Being a lazy sod, I've always resisted learning anything other than the simplest uses of find. But I decided I'd better do some research.

In the end, this is what I came up with:

Code: Select all
find . -type f -exec rm {} ';'

You do need to be sure you're in the right directory tree before executing that little lot. And, since the box I'm cleaning up is an SME server, rm has to be unaliased first (the default in SME is alias rm='rm -i' for safety).

I tried it on one of the smaller tree sections to make sure nothing was going to blow up, then ran it on the base of the tree I needed to clean. Not all that fast - it took 20 minutes to run, reducing the tree from 154GB to 31MB. I suspect it probably hit the machine fairly hard as well - from my research, it seems that '-exec' executes the command once for every file, and there were a lot of files. Still, I was doing the job over an SSH connection, and at this time of night it's not likely anyone else was trying to use that server.
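As a side note, GNU find (as shipped on most Linux distros) also has a -delete action which removes matches in-process, avoiding one rm process per file entirely. A minimal sketch on a throwaway tree (the paths are made up for the demo):

```shell
# GNU find's -delete removes matching files without spawning rm;
# -type f means directories are left standing.
demo=$(mktemp -d)
mkdir -p "$demo/sub dir"
touch "$demo/top.txt" "$demo/sub dir/nested.txt"

find "$demo" -type f -delete           # files go, directory skeleton stays
find "$demo" -type d                   # prints the surviving directories
```

Since -delete doesn't go through the shell at all, aliases like rm='rm -i' never come into it.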

I guess I'll have to learn the rest of find's syntax - that was a lot easier than wading through the lot with Midnight Commander.

Paul.
paulm
LXF regular

Posts: 242
Joined: Mon Apr 03, 2006 4:53 am
Location: Oxfordshire, UK

Two tips: prefixing a command with \ tells the shell not to use the alias, so use \rm instead of rm.

As you've already found, -exec runs the command once for each file. Use + instead of ; (sometimes referred to as exec+) to have find batch as many files as possible into one invocation of the command.

Code: Select all
find . -type f -exec \rm "{}" +

I've also added quotes around the braces, in case any of the files have spaces in their names. It may be wise to use an absolute path instead of ., to avoid emptying the wrong directory through a misplaced cd.
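You can see the ';' versus '+' difference for yourself by substituting echo for rm on a throwaway tree (file names here are invented for the demo):

```shell
# With ';' the command runs once per file; with '+' find appends as
# many names as fit into a single invocation, much like xargs.
d=$(mktemp -d)
touch "$d/one" "$d/two" "$d/three"

echo "--- per-file (';') ---"
find "$d" -type f -exec echo {} ';'    # three invocations, one name each
echo "--- batched ('+') ---"
find "$d" -type f -exec echo {} +      # one invocation, all names on one line
```

Counting output lines confirms it: the batched form produces a single echo call for all three files.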
"Insanity: doing the same thing over and over again and expecting different results." (Albert Einstein)

nelz

Posts: 8615
Joined: Mon Apr 04, 2005 11:52 am
Location: Warrington, UK

nelz wrote:Two tips: prefixing a command with \ tells the shell not to use the alias, so use \rm instead of rm.

Thanks. Wasn't aware of that one.

As you've already found, -exec runs the command once for each file. Use + instead of ; (sometimes referred to as exec+) to have find batch as many files as possible into one invocation of the command.

Code: Select all
find . -type f -exec \rm "{}" +

I've also added quotes around the braces, in case any of the files have spaces in their names. It may be wise to use an absolute path instead of ., to avoid emptying the wrong directory through a misplaced cd.

Great. Thanks again. I didn't have quotes around the curly braces, but it seems to have dealt with silly Windows file names (of which there were quite a few).

I'll have to do some more reading. Find certainly saved me a heap of time....

Paul.
paulm
LXF regular

Posts: 242
Joined: Mon Apr 03, 2006 4:53 am
Location: Oxfordshire, UK