Hacker News

I love it. I was just doing some Fermi estimates for a friend on the data for a project he has in the pipeline. I was curious whether it would be cost-efficient for his project's budget to go with NVMe SSDs or whether he'd have to stick with traditional SATA ones, and it turns out it doesn't even matter (for now): at least the first three months of data will fit in 256 GB of RAM, even allowing for a 2.5x factor to cover some (estimated) inefficient storage or data-structure use in a scripting language like Ruby or Python.

Edit: And after those first three months he'll know more about the use and performance demands of the project and will be able to make far more accurate decisions about storage categories.
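For anyone curious, the back-of-envelope works out like this, using only the numbers above (the 30-day month and the daily-ingest framing are my own simplifications, not from the original estimate):

```python
ram_gb = 256      # planned RAM budget
overhead = 2.5    # estimated blow-up from Ruby/Python data structures
days = 3 * 30     # "first three months", approximated as 90 days

# Raw payload that still fits in RAM after the overhead factor.
raw_gb = ram_gb / overhead

print(f"raw data that fits in RAM: {raw_gb:.0f} GB")
print(f"sustainable ingest rate:   {raw_gb / days:.2f} GB/day")
```

So roughly 100 GB of raw data, or a bit over 1 GB/day, before the in-memory working set outgrows a 256 GB box.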

Where's the 2.5x from? I'd be curious to see actual data comparing the memory footprint of the same problem in C/Go/Rust versus Python/Ruby. I'm sure it varies widely, but 2.5x might not be far off.
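No hard data here, but a quick CPython sketch gives a feel for the worst case: a plain list boxes every element as a full PyObject (pointer plus object header), while `array` packs values into a C-style buffer, roughly what a C/Go/Rust program would use. The exact factor depends heavily on element type and workload, so treat this as illustrative only:

```python
import sys
from array import array

N = 1_000_000

# Plain Python list: an array of pointers, each to a boxed int object.
ints = list(range(N))
list_bytes = sys.getsizeof(ints) + sum(sys.getsizeof(i) for i in ints)

# Packed buffer of 8-byte signed ints, akin to int64_t[] in C.
packed = array("q", range(N))
packed_bytes = sys.getsizeof(packed)

print(f"list:   {list_bytes / N:.1f} bytes/int")
print(f"packed: {packed_bytes / N:.1f} bytes/int")
print(f"factor: {list_bytes / packed_bytes:.1f}x")
```

For small ints this reports a factor well above 2.5x; records spread across dicts or objects can be worse still, while NumPy arrays or strings can bring the ratio back down near 1x.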
