
How about reading the entire JSON document out of the database, making a copy of that document, updating the copy, and saving that copy to the database...

Is how someone on my old team designed an internal tool




Depending on load patterns, for an internal tool that might be perfectly reasonable; you can trivially look at the history, for one thing.


Isn't this how you update a JSON column on a DB that doesn't have a native JSON type (or where you choose not to use it for reasons), with a sprinkling of "data should be immutable"?
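For the "immutable data" flavor of this, each write can insert a new versioned row instead of mutating the old one, which is what makes the history trivially queryable. A minimal sketch with SQLite, where the `docs` table, its columns, and the versioning scheme are all assumptions for illustration, not anything described in the thread:

```python
import json
import sqlite3

# Hypothetical schema: the document is stored as TEXT because this
# database has no native JSON column type.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER, version INTEGER, body TEXT)")
conn.execute("INSERT INTO docs VALUES (1, 1, ?)",
             (json.dumps({"status": "new"}),))

def update_doc(conn, doc_id, changes):
    """Read the whole document, copy it, apply changes, and write the
    copy back as a NEW row -- the old version stays untouched."""
    version, body = conn.execute(
        "SELECT version, body FROM docs WHERE id = ? "
        "ORDER BY version DESC LIMIT 1",
        (doc_id,),
    ).fetchone()
    doc = json.loads(body)   # the copy
    doc.update(changes)      # update the copy
    conn.execute("INSERT INTO docs VALUES (?, ?, ?)",
                 (doc_id, version + 1, json.dumps(doc)))

update_doc(conn, 1, {"status": "done"})

# Every version survives, so the edit history is one query away.
history = conn.execute(
    "SELECT version, body FROM docs WHERE id = 1 ORDER BY version"
).fetchall()
```

Note this sketch still has the lost-update race discussed below the parent comment unless the read and insert happen in one transaction.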


Well, without some form of locking you have a race condition: two updates done at the same time can each silently overwrite the other.
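One common fix for that lost-update race is optimistic concurrency: guard the write with the version you read, and treat zero affected rows as "someone beat you to it, retry". A minimal sketch, again assuming a hypothetical `docs` table with a `version` column (nothing here is from the actual tool being discussed):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, version INTEGER, body TEXT)")
conn.execute("INSERT INTO docs VALUES (1, 1, ?)", (json.dumps({"count": 0}),))

def update_doc(conn, doc_id, changes):
    """Compare-and-swap: the UPDATE only succeeds if nobody bumped the
    version between our read and our write, so a concurrent writer's
    changes can't be silently overwritten."""
    version, body = conn.execute(
        "SELECT version, body FROM docs WHERE id = ?", (doc_id,)
    ).fetchone()
    doc = json.loads(body)
    doc.update(changes)
    cur = conn.execute(
        "UPDATE docs SET version = ?, body = ? WHERE id = ? AND version = ?",
        (version + 1, json.dumps(doc), doc_id, version),
    )
    return cur.rowcount == 1  # False means we lost the race: re-read and retry

ok = update_doc(conn, 1, {"count": 1})  # succeeds, version is now 2

# Simulate a stale writer that still holds version 1: its guarded
# UPDATE matches no rows, so it fails instead of clobbering the data.
stale = conn.execute(
    "UPDATE docs SET version = 99, body = ? WHERE id = 1 AND version = 1",
    (json.dumps({"count": -1}),),
).rowcount == 1
```

The same idea is why Elasticsearch exposes sequence numbers for conditional updates; pessimistic alternatives (`SELECT ... FOR UPDATE` in databases that support it) trade retries for blocking.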


One team in my company insists on storing everything related to a target in an MB-size Elasticsearch document and then doing all the aggregation client-side, because they already use ES for everything else and don't know how to use a relational database.



