
Fastest Way to Delete Large Folders in Windows (2015) - megahz
http://mattpilz.com/fastest-way-to-delete-large-folders-windows/
======
makecheck
Unless you are extraordinarily low on disk space, you should never need to
"wait" to delete things, so speed shouldn't matter.

The correct method is to immediately move/rename the target, then launch an
expensive delete in the background. That frees up the location you are trying
to use.

I used to see people set up entire workflows that started with, essentially,
"wait 20 minutes to finish recursively deleting the previous data" instead of
just "move it out of the way and start immediately".
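
The move-then-delete pattern is simple enough to sketch; here's a minimal Python illustration (the function name and scratch-directory naming are my own, not from the comment):

    import os
    import shutil
    import tempfile
    import threading

    def fast_replace(target: str) -> threading.Thread:
        """Free up `target` immediately: rename it into a scratch
        directory (near-instant on the same volume), then run the
        expensive recursive delete in a background thread."""
        parent = os.path.dirname(os.path.abspath(target)) or "."
        graveyard = tempfile.mkdtemp(prefix="to-delete-", dir=parent)
        os.rename(target, os.path.join(graveyard, "victim"))
        worker = threading.Thread(target=shutil.rmtree, args=(graveyard,))
        worker.start()
        return worker

The caller can reuse the original path the instant os.rename returns; only the background thread waits on the slow recursive delete.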

~~~
canes123456
I needed to delete 70 GB and hundreds of thousands of files across 10 machines
last week. It took an hour using the command method on each machine. Using the
GUI froze Windows, and even if it had worked it would have taken at least a day.

~~~
43224gg252
Linux used to have a very similar problem, but it was finally fixed in kernel
4.10. Surprised Windows suffers from the same thing, and now it's got me
wondering if macOS does this too...

~~~
maxxxxx
macOS is much faster than Windows for similar operations.

------
captainmuon
Wasn't deleting, at some distant time in the past, done by just renaming a file
and changing the first character to a question mark? I think that also applied
to folders: Delete a folder's "file", and all the contents are gone,
recursively. Not sure how they found free blocks when creating a file though.
One downside is that you never knew how much space you had left. I think at
some point they changed it so that all file operations updated the "free space
left" figure for the partition. This may have been FAT around Win98, but maybe
I'm misremembering, or it may be some more obscure file system I'm thinking
of...

~~~
dspillett
[I'm assuming you mean DOS FAT filesystems (FAT12, FAT16, FAT32) - anything
else might be different]

> Wasn't deleting, at some distant time in the past, done by just renaming a
> file and changing the first character to a question mark?

That is just the entry in the directory listing - it would still walk the file
allocation table and mark the relevant blocks as unused. The FAT operated as a
collection of linked lists - the directory listing names the first block, and
each entry in the FAT states which block comes next in the file, or 0 if it is
the last block. Really a sub-directory was just a special file containing
filenames and other properties, so deleting an empty directory follows the same
process. The root directory is a special case, being of fixed length (12
blocks IIRC) at a fixed position in the structure.

> I think that also applied to folders: Delete a folder's "file", and all the
> contents are gone, recursively.

No, you couldn't delete a directory with contents by default. Ordering a
recursive delete would perform a depth-first search-and-delete on each
individual object.

> Not sure how they found free blocks when creating a file though.

As the FAT had been kept up to date, finding space was a simple
first-empty-block search.

Unless something _had_ deleted a file just by editing the directory entry, in
which case the blocks would be left alone, as they would still be marked as in
use. I did see this used as a way to try to hide sensitive information without
it getting overwritten, both as a naive copy-prevention mechanism and as a
"hide my porn" technique.
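
The linked-list walk and first-empty-block search can be sketched as a toy model (Python, purely illustrative; real FAT uses reserved high values such as 0xFFF8+ to mark end-of-chain, which is the distinction that lets allocation tell a free entry apart from a file's last block):

    FREE = 0   # a free FAT entry
    EOC = -1   # end-of-chain; real FAT reserves high values (e.g. 0xFFF8+)

    def free_chain(fat: list, first_block: int) -> int:
        """Delete a file: walk its linked list of blocks, marking each
        FAT entry free. Returns the number of blocks released."""
        block, freed = first_block, 0
        while block != EOC:
            nxt = fat[block]
            fat[block] = FREE
            block = nxt
            freed += 1
        return freed

    def first_free(fat: list) -> int:
        """Allocate: simple first-empty-block search (block 0 reserved)."""
        for i in range(1, len(fat)):
            if fat[i] == FREE:
                return i
        raise OSError("disk full")

This also shows why editing only the directory entry "hides" data: the chain is never walked, so the blocks stay marked in use and the allocator skips them.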

------
jongalloway2
If you're doing this regularly, a good alternative is to install on a separate
partition and symlink it. Then to delete, you just quick format the partition.

      echo Y | format Z: /FS:NTFS /X /Q

source: [https://superuser.com/a/352321](https://superuser.com/a/352321)

Also, on Windows 10 you can use Ubuntu bash and run rm -rf. I've read that
it's faster, but haven't tested extensively.

 _Update_ : I did a quick test with 100,000 files and the 'del /f/s/q
foldername > nul' approach was about 50% faster than 'rm -rf' on my machine.

~~~
tracker1
For that matter, if you use the MSYS *nix tools (they come with Git for
Windows), you get a bash prompt in Windows and can do the same... I find the
MSYS bash more to my liking for general terminal stuff, though I haven't tried
the Ubuntu bash in a while (it irked me in many ways).

------
EvanAnderson
He's not handling spaces in directory names in the "cd %1" portion of his
context-menu entry. Fortunately his "fastdel.bat" should prompt the user and,
hopefully, they'll catch the mistake.

Calling CMD.EXE from Explorer without double-quoting can have unintended
consequences when ampersands and parentheses are present in the filename too.
I'd hate to "Fast Delete" a directory named "& rd /s /q %SystemRoot% &" using
his shell context menu entry.

If I was hell-bent on doing this I'd probably add it to my user registry w/
the command:

       reg add "HKEY_CURRENT_USER\Software\Classes\Directory\shell\Fast Delete\command" /d "cmd /c rd /s ""%1"""

There's still probably some fun metacharacter injection possibilities there,
too, however.

------
bastijn
Unfortunately, RMDIR, like other Windows commands, suffers from the long-path
issue. You can turn on long path name support in Windows 10, but that broke
our regular build tooling.

My current way to delete any folder, regardless of the depth of the tree, is
to use robocopy (robocopy D:\EmptyFolder D:\FolderToDelete /MIR). It is
actually pretty damn fast, and might be faster than RMDIR /S /Q.

------
coverband
This whole page is making me claustrophobic with visions of so many things
that can go wrong just to optimize a folder delete... Compounded by a helpful
samaritan offering an EXE download to set up a batch file... :(

------
moomin
The answer to this used to be Visual SourceSafe.

~~~
computerex
SourceSafe is never the answer.

~~~
klodolph
"What source control system makes you doubt the logic of using source control
altogether?"

------
ramshanker
At some point/scale, quick format would be even faster.

------
laurent123456
Reading this article reminds me how tedious it is to customise Windows. All
these manual steps in system dialogs, registry and so on have to be repeated
every time you want to re-install Windows.

I wanted to run a command as admin on startup recently and had to create a
scheduled task for this, which of course will be lost the next time I
reinstall. Same for services, etc.

~~~
stevep001
You should get to know PowerShell. For example, scheduled tasks:
[https://blogs.technet.microsoft.com/heyscriptingguy/2015/01/...](https://blogs.technet.microsoft.com/heyscriptingguy/2015/01/13/use-powershell-to-create-scheduled-tasks/)

------
Filligree
I haven't used Windows seriously for a while, but I'm pretty sure you can use
filenames without extensions now.

Doesn't that mean that the "*.*" pattern won't match everything?

~~~
fredoralive
If you mean *.* [1], it's special-cased so it still works (and a trailing .*
will in general also match names with no extension).

[https://blogs.msdn.microsoft.com/oldnewthing/20071217-00/?p=...](https://blogs.msdn.microsoft.com/oldnewthing/20071217-00/?p=24143)

[1] Seeing as I fell into it too, it's probably the _intuitive_ formatting
codes that messed it up, not the parent poster themselves.

~~~
softawre
Technical interviews at my company are 1 question: how do you format a complex
regex on hacker news?

~~~
thaumasiotes

        *monospace it*

~~~
vxNsr
Btw, how do you do that?

Also how do you create lists?

~~~
thaumasiotes
I don't know of any list-creating functionality. Lines that begin with four
spaces get monospaced.

------
toyg
Note that these solutions will bypass the trashcan, which is why they
shouldn't really be used willy-nilly.

They will also trip up in the same way Explorer does, on paths that are too
long.

------
coin
Summary - use command line tools

------
krylon
Does anyone know how PowerShell compares to cmd.exe for this particular
problem? In PowerShell, one would type:

      rm -Recurse -Force $path

It has the advantage that it only takes a single command, but I am not
entirely certain about the performance.

~~~
jdmichal
My experience has been that Powershell is as fast as cmd for this task.

EDIT: As a test, I cloned my local Maven repository a few times. This resulted
in 48,613 files, 16,590 folders, and 2.71 GB on disk. Here's the result of:

        Measure-Command { rm '.\.m2 - Copy' -Recurse -Force }

        TotalDays         : 0.00196887874421296
        TotalHours        : 0.0472530898611111
        TotalMinutes      : 2.83518539166667
        TotalSeconds      : 170.1111235
        TotalMilliseconds : 170111.1235

~~And it had a CPU pegged the entire time. So no, Powershell is still terrible
at this. Stick with cmd.~~

EDIT2: Tried it again, with RMDIR and using the timing script found here:

[https://stackoverflow.com/a/6209392](https://stackoverflow.com/a/6209392)

        timecmd "RMDIR /S /Q .m2c > NUL"

        command took 0:2:40.15 (160.15s total)

So they're within the same order of magnitude of time.

I might try it one more time after lunch with the DEL followed by RMDIR combo
to see if that changes anything.

~~~
krylon
Thank you very much!!!

~~~
jdmichal
Had similar timing for DEL, so it looks like Powershell is as good as cmd for
this use case:

        timecmd "DEL /F /Q /S .m2c\*.* > NUL"

        command took 0:2:57.60 (177.60s total)

------
hardlianotion
Similar issue deleting files through OS X Finder, also solved by going to the
command line.

------
vorotato
Okay but presumably it's doing something with that extra time, is it valuable?

~~~
kijin
It's calculating the amount of space you've freed up and displaying it in a
pretty graph along with the name of every file and folder that is being
deleted.

This might be helpful if the filenames and sizes make you realize that you're
trying to delete the wrong folder so that you can abort immediately. Other
than that, it's just eye candy.

~~~
alkonaut
There is rarely any excuse for a UI task that takes longer than a few
seconds and doesn't have a progress indicator. That basically means that if
you want to perform a 30-second task, you _should_ show that progress bar even
if that means it's now a 60-second task.

I'd probably try to cheat in this particular scenario either by keeping a best
guess for the recursive delete complexity in the file system itself, or by
simply showing a worse progress indicator such as a counter of files without a
total count. The progress indication doesn't necessarily need to include time
remaining, especially when the cost is this high.
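
A minimal sketch of that "counter without a total" indicator (Python, hypothetical; not from the thread) — delete recursively while printing a running file count, so the user can see it's alive without the cost of pre-computing the tree size:

    import os
    import sys

    def rmtree_with_progress(root: str, every: int = 1000) -> int:
        """Recursive delete that prints a running file counter -- a
        cheap "still alive" indicator with no total and no ETA."""
        count = 0
        for dirpath, dirnames, filenames in os.walk(root, topdown=False):
            for name in filenames:
                os.remove(os.path.join(dirpath, name))
                count += 1
                if count % every == 0:
                    sys.stderr.write(f"\rdeleted {count} files")
            for name in dirnames:
                # Bottom-up walk: subdirectories are already empty here.
                os.rmdir(os.path.join(dirpath, name))
        os.rmdir(root)
        return count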

~~~
scopecreep
> The progress indication doesn't necessarily need to include time remaining,
> especially when the cost is this high.

Exactly. I don't particularly care _which_ file out of half a million it's on
-- I just want to know at a glance if it's still running or has somehow
frozen/locked up.

------
test6554
Now if we could only delete files that are allegedly in use by another
process.

~~~
alkonaut
Or, it could at least identify the process locking the file and offer to kill
it.

~~~
awiesenhofer
There's _handle.exe_, a Sysinternals tool that helps with that:

[http://technet.microsoft.com/en-us/sysinternals/bb896655.asp...](http://technet.microsoft.com/en-us/sysinternals/bb896655.aspx)

~~~
alkonaut
Yeah, or procexp, and it's even possible with resmon that's built in. But for
the average user it's basically impossible.

------
agumonkey
Never reached a case where shift + del wasn't fast enough, interesting

~~~
acuozzo
I restore 35mm film scans (movies) and I sometimes have to delete six figures
of raw frames (many TBs in total) from my write-RAID when a big intermediate
render doesn't go the way I want it to.

~~~
agumonkey
But but but, how many individual files? FS deletion is not the same as
erasing all the bytes. I can delete a 128PB file in an instant, by crossing it
out of the index.

~~~
acuozzo
> six figures of raw frames

------
damien207
I'd suggest trying the "Long Path Tool" program.

------
tsomctl
On a related note, Cygwin is also faster than Windows Explorer, although I
don't know how it compares to RMDIR.

------
winstonewert
Is there actually a reason to do both del and rmdir instead of just rmdir? Or
is the post just being superstitious?

~~~
masklinn
rmdir will only remove _visible_ non-system files, and will fail with the
non-obvious "The directory is not empty" when it fails to delete one such file
and subsequently can't remove the containing directory.

An alternative would be to use dir + attrib to make all files visible (I don't
know that stripping out the system flag by default is a good idea) before
running rmdir.

------
frederik0203
Try using the Long Path Tool. It worked for me. I hope it helps.

------
LinuXY
[https://windirstat.net/](https://windirstat.net/)

~~~
j_s
WizTree is faster since it uses only the NTFS MFT (eerily similar to the
command line vs. Windows Explorer comparison in the article).

[https://antibody-software.com/web/software/software/wiztree-...](https://antibody-software.com/web/software/software/wiztree-finds-the-files-and-folders-using-the-most-disk-space-on-your-hard-drive/)

------
nvivo
rimraf is extremely fast. Started using it to delete large node_modules
folders and got used to it.

~~~
to3m
Note that the latest npm has changed its behaviour for folder links - if you
refer to a dependency by a folder name ("fred":"file:../fred", that sort of
thing) rather than npm version or git link (etc.), npm now creates a junction
inside node_modules that links directly to the original folder.

Deletion tools that don't know how to distinguish junctions and folders may
then find the original files via the junction and delete them...

(del and rmdir don't suffer from this. I did get a strange error message from
rmdir, though, and it didn't actually delete the junction. GNU-Win32's rm
blithely follows the junction and deletes everything in it.)
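
The safe behaviour can be sketched in Python, using POSIX directory symlinks as a stand-in for NTFS junctions (illustrative only; on Windows, removing a directory junction actually takes os.rmdir rather than os.unlink):

    import os

    def safe_rmtree(path: str) -> None:
        """Recursively delete `path` without ever following directory
        links (symlinks here; NTFS junctions are the analogous hazard):
        a link is unlinked as a single entry, its target kept."""
        for entry in list(os.scandir(path)):
            if entry.is_symlink():
                os.unlink(entry.path)        # remove the link itself only
            elif entry.is_dir(follow_symlinks=False):
                safe_rmtree(entry.path)      # a real subdirectory
            else:
                os.unlink(entry.path)        # a regular file
        os.rmdir(path)

The key is checking is_symlink() before recursing; a naive walk that calls is_dir() with link-following (what GNU-Win32's rm apparently does here) descends through the link and destroys the original.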

------
jscholes
Legitimate question - why/how has this made it to the front page?

~~~
sammoth
Because developers may need to delete large folders in Windows

~~~
excalibur
This is incredibly basic computer knowledge. Has Windows actually fallen so
far out of the loop that there are developers who know Rust or Go, but don't
know something this fundamental?

~~~
Karunamon
It's incredibly basic computer knowledge that a modern day OS is so badly
designed that it requires the user to drop into the command line to delete
large folders efficiently because otherwise, it spends most of its time
telling you how long it will take to tell you how long it will take?

That's one hell of an indictment of Windows!

~~~
problems
There's some truth to it though - the Windows Explorer file operations are
flaky as hell.

I was just trying to copy files between an old system drive and a new one but
I kept getting infinite recursion issues with the "Documents and Settings"
(which is linked to Users) directory, permission issues, etc. Even as the
SYSTEM user. I wound up having to learn a little robocopy for it.

~~~
E6300
Yeah, since Windows Vista/7, when Documents and Settings was moved to Users,
the system creates new user directories with a bunch of symbolic
links/junctions that don't seem to be properly created and always cause issues
when copying the directory to another place or deleting the directory. I
always get this problem when moving my profile directory out of the system
partition.

Incidentally, it's amazing that Windows _still_ doesn't let you specify a
custom location to create the profile directory of a new user.

------
coding123
Install Linux?

------
xori
Can't you just SHIFT+DELETE?

~~~
jscholes
From the article:

> There is, in fact, a significant amount of overhead when you trigger the
> standard delete action in Windows, including when either emptying the Recycle
> Bin or directly deleting files via Shift+Del.

> Upon deleting the ~46,000 files from the NDK package, it took 38 seconds
> with console output enabled and 29 seconds with output disabled on a
> standard non-SSD hard drive, scraping off a quarter of the time. By
> comparison, the same deletion process via a standard Shift+Del in Windows
> Explorer took an agonizing 11 minutes.

