> If you can fit your problem in memory, it’s probably trivial.
A corollary: "in-memory is much bigger than you probably think it is."
I thought I knew what a large amount of RAM was, and then all the major clouds started offering 12TB VMs for SAP HANA.
edit: this seems to be touched on very briefly under "Computers can do more than you think they can," but even that only talks about 24GB machines (admittedly in 2012, but still, I'm sure there were plenty of machines with 10x that much RAM back then).
Even comparatively senior engineers make this mistake relatively often. If you're a SaaS dealing with at most 100GB of analytical data per customer, (eventually sharded) Postgres is all you need.
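A quick back-of-envelope check makes the point concrete. Assuming an illustrative ~200 bytes per row (a made-up figure for the sketch, not a benchmark), a 12TB-RAM machine holds tens of billions of rows entirely in memory:

```python
# Back-of-envelope: how many rows fit in a 12 TiB-RAM machine?
# ROW_BYTES is an illustrative assumption (including per-row overhead),
# not a measured number for any particular database.
ROW_BYTES = 200
RAM_BYTES = 12 * 1024**4  # 12 TiB

rows = RAM_BYTES // ROW_BYTES
print(f"{rows:,} rows fit in RAM")  # on the order of 66 billion rows
```

Even if the real per-row cost were 5x higher, you'd still be comfortably in the billions of rows before touching disk.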