
Written in D!

The basic idea behind this project is that you only need to cast 100 random samples to get roughly 1% resolution, which is usually enough to know what ate your disk space. Even on very slow drives, resolving 100 samples should take no more than a few seconds.
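
To make that concrete, here is a rough sketch (in D, since that's what the project is written in) of how sampling-based attribution could work. It is not the project's actual code: ownerOf is a hypothetical placeholder for mapping a byte offset back to the path that owns it, which is the filesystem-specific part a real tool has to implement.

    // A rough sketch of the sampling idea, not the project's actual code.
    // ownerOf() is a hypothetical stand-in for resolving which path owns a
    // given byte offset; a real tool has to query the filesystem for this.
    import std.random : uniform;
    import std.stdio : writefln;

    string ownerOf(ulong offset)
    {
        // Pretend half the disk belongs to /var/log and half to /home.
        return offset % 2 == 0 ? "/var/log" : "/home";
    }

    void main()
    {
        enum ulong totalSize = 1UL << 40; // say, a 1 TiB filesystem
        enum samples = 100;               // ~1% resolution, as described above

        ulong[string] hits;
        foreach (i; 0 .. samples)
        {
            auto owner = ownerOf(uniform(0UL, totalSize));
            hits[owner] = hits.get(owner, 0) + 1;
        }

        // Each sample stands for totalSize / samples bytes -- the "resolution".
        immutable resolution = totalSize / samples;
        foreach (path, count; hits)
            writefln("%s: ~%s GiB", path, count * resolution / (1UL << 30));
    }

With 100 samples, each hit stands for about 1% of the total, so a path that shows up in 30 of them occupies roughly 30% of the disk, give or take a sample's worth.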




Does it show the accuracy estimate, or is it left entirely to the user to guess?


It is displayed as "Resolution" at the bottom. (Total size divided by the number of samples so far.)
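For example, on a 1 TiB filesystem with 100 samples resolved, that works out to roughly 10 GiB, so reported sizes are only meaningful to about that granularity.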


Good! The terminology was confusing to me; I'm not used to the word "resolution" being used in this sense and measured in KiB.



