
NetflixOSS: Announcing Hollow
http://techblog.netflix.com/2016/12/netflixoss-announcing-hollow.html
======
e28eta
I was hoping they'd give some examples of the heap size needed for a given
dataset size, but I never found any (I looked through
[http://hollow.how](http://hollow.how) as well).

I also wondered what you do when the heap cost gets too large. The Tooling
section has a part on splitting/sharding that I think answers my question.

Finally, I wonder if/why keeping all the local data in RAM is a hard
requirement. For example, I suspect you could build a similar
producer/consumer delta/snapshot mechanism and shove all the data into
something like a local SQLite db. Or even just serialize their heap format to
disk and then mmap it (a sketch of that approach is below). I suspect the
answer is latency, and that they have the $$$ to spend on RAM and machines
for sharding.

