Hacker News

At some point you’re in grep/sed/awk/duckdb/streaming data land.



> duckdb

Not SQLite, out of curiosity? Which do you use when? I'm going to need one of them soon, which is why I'm asking ...

FWIW, the aforementioned text editor working with the 200+ MB file could edit the entire thing almost immediately (at the UI level; I don't know what happened in memory). Each record was less than a screen length, so maybe 100 chars/line conservatively, so maybe 2 million lines. IIRC I could insert a character on every line without problematic latency; my memory says maybe 10 seconds, but unless it was a dirty operation, that can't be true?
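The back-of-envelope line count above checks out; a quick sketch using the comment's own rough figures (not measurements):

```python
# Rough estimate from the comment: a 200+ MB file at ~100 chars/line.
file_size_bytes = 200 * 1024 * 1024   # "200+ MB"
chars_per_line = 100                  # conservative guess from the comment
lines = file_size_bytes // chars_per_line
print(f"~{lines / 1e6:.1f} million lines")  # ~2.1 million lines
```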


DuckDB is better at ingesting big tables and at interacting with Parquet, Arrow, pandas, etc. directly. Better for analytics workflows. Better = faster and less boilerplate / data-conversion code.


Thanks!


SQLite is better for applications / transactions, though.
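A small sketch of that application/transaction use case, using Python's stdlib `sqlite3` with an in-memory database (a real app would pass a file path; the table and names are illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
con.executemany("INSERT INTO accounts VALUES (?, ?)",
                [("alice", 100), ("bob", 50)])

# Atomic transfer: using the connection as a context manager wraps the
# updates in one transaction, so both commit together or neither does.
with con:
    con.execute("UPDATE accounts SET balance = balance - 30 WHERE name = 'alice'")
    con.execute("UPDATE accounts SET balance = balance + 30 WHERE name = 'bob'")

print(con.execute("SELECT name, balance FROM accounts ORDER BY name").fetchall())
# [('alice', 70), ('bob', 80)]
```

This transactional, many-small-writes pattern is exactly where SQLite's OLTP design fits better than DuckDB's column-oriented analytics engine.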



