Hacker News

Out of curiosity, what does "working with" mean here? What operations did you need to perform on it? Streaming reads, transforms, indexed reads, appends, edits?

I'm thinking that any general-purpose JSON loader is likely to perform badly on a 100GB file, purely because it'll use 2x (or more likely 10x) the file's size in memory for the parsed representation. So you'd want some kind of special-case reader for huge files -- maybe it just builds some kind of sparse index with pointers back into an mmap of the raw data.
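To make that concrete, here's a minimal sketch of the sparse-index idea, assuming the file is a top-level JSON array of objects. One byte-level scan records the (start, end) span of each element without parsing values; individual records are then parsed on demand straight out of the mmap. (Names like `index_top_level_objects` are my own; a pure-Python byte loop would be far too slow for 100GB in practice -- you'd want the scanner in C or similar -- but the structure is the same.)

```python
import json
import mmap
import os
import tempfile

def index_top_level_objects(buf):
    """One pass over a JSON array of objects; records the (start, end)
    byte span of each top-level object so individual elements can be
    parsed later without materializing the whole file."""
    spans = []
    depth = 0
    start = None
    in_string = False
    escaped = False
    for i in range(len(buf)):
        c = buf[i]                  # int byte value (works for bytes and mmap)
        if in_string:
            if escaped:
                escaped = False
            elif c == 0x5C:         # backslash: next char is escaped
                escaped = True
            elif c == 0x22:         # closing quote
                in_string = False
        elif c == 0x22:             # opening quote
            in_string = True
        elif c in (0x7B, 0x5B):     # '{' or '['
            depth += 1
            if depth == 2 and c == 0x7B:   # object directly inside the array
                start = i
        elif c in (0x7D, 0x5D):     # '}' or ']'
            depth -= 1
            if depth == 1 and start is not None:
                spans.append((start, i + 1))
                start = None
    return spans

# Demo: write a small JSON array, mmap it, index it, fetch one record.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump([{"id": n, "val": "x" * n} for n in range(1000)], f)
    path = f.name

with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    spans = index_top_level_objects(mm)
    start, end = spans[500]                  # indexed read: one record only
    record = json.loads(mm[start:end])
    print(record["id"])                      # -> 500
    mm.close()
os.unlink(path)
```

The index costs two integers per record instead of a full parsed tree, and the OS page cache does the rest: only the pages you actually slice get faulted in.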
