> Today, IndexedDB can typically store hundreds of megabytes to multiple gigabytes of data, depending on device capacity. Chrome allows up to ~80% of free disk space per origin (tens of GB on a desktop), Firefox now supports on the order of gigabytes per site (10% of disk size), and even Safari (historically strict) permits around 1GB per origin on iOS.
Does anyone else see this as more harmful than beneficial? It would be one thing if expanded storage were a new permission that I had to explicitly grant to a site, maybe with the ability to set a budget per site. As is, now every random website that I visit can store an arbitrary amount of data on my hard drive and it's on me to stay on top of that by clearing out data regularly (but hopefully remembering not to clear data from the sites that I care about!).
I don't think we can live in a world where arbitrary sites can store whatever they want on your computer—leaving it up to you to clean up the mess periodically—and at the same time the primary copy of important data from important apps is stored in the same way. In a world where people have to regularly clean up the mess left by hundreds of other developers, even a "local first" web app has to treat the data as only safe once it lands on the developer's server, which means the primary copy is on the server, not the device. In a few cases maybe it would be beneficial to have a full copy offline (maybe your users regularly deal with unreliable connectivity), but most apps won't benefit enough to be worth the hassle.
There's no user data sovereignty here until browsers give users sovereignty over who gets to store data in the first place.
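For what it's worth, the quota and usage figures the quoted numbers describe can be queried directly from a page via the standard `StorageManager` API. A minimal sketch (the exact quota returned is browser- and device-dependent, and the API isn't available in every runtime):

```javascript
// Pure helper: summarize a StorageManager estimate into readable units.
function summarizeEstimate({ usage, quota }) {
  return {
    usedMB: +(usage / 1024 ** 2).toFixed(1),
    quotaGB: +(quota / 1024 ** 3).toFixed(1),
    percentUsed: +((usage / quota) * 100).toFixed(2),
  };
}

// In a browser, feed it the real numbers. Returns null where the
// API is unavailable (older browsers, non-browser runtimes).
async function reportStorage() {
  if (typeof navigator === 'undefined' || !navigator.storage?.estimate) {
    return null;
  }
  return summarizeEstimate(await navigator.storage.estimate());
}
```

Note that `estimate()` reports what the browser is willing to grant this origin, which is exactly the per-origin budget the parent comment wishes users could see and control.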
Ask for permission at several thresholds and gradually move those thresholds depending on frequency of use. While way too much for a first visit, a 1 GB cap is very annoying if you've used the DB 10 times per day for 5 years.
Some feedback on the amount of storage used would also be nice (again depending on frequency of use).
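The closest thing browsers currently ship to the opt-in permission model described above is `navigator.storage.persist()`, which asks the browser to protect an origin's data from automatic eviction. A sketch (browsers differ on how they decide: Chrome grants or denies silently based on engagement heuristics, Firefox has historically shown a prompt, so treat the result as advisory):

```javascript
// Sketch: request protection from storage eviction for this origin.
// Resolves false where the StorageManager API is unavailable.
async function ensurePersistence() {
  if (typeof navigator === 'undefined' || !navigator.storage?.persist) {
    return false; // no StorageManager in this runtime
  }
  if (await navigator.storage.persisted()) {
    return true; // already granted on a previous visit
  }
  return navigator.storage.persist(); // may prompt, may decide silently
}
```

Even when granted, this only prevents eviction under storage pressure; it is not the per-site budget the comments above are asking for.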
So, 6 ms each way from my house in the suburbs of Philadelphia to the closest Fly region near NYC, about 80 miles away. When measuring UI response latency, this is the number that matters. No, your database is not at the edge. But cached query results are!
1) Yes, but everyone does NOT live "100-200 ms" away, so I think we'd all appreciate it if the local-first vendor went and measured the real number rather than pulling this completely fake 100-200 ms figure out of their ass (justified in the image with a big blue line across the Atlantic from Germany to SF, no less!). Invoking the "Call of Duty" test: if you can play online shooter games, then you can get a sub-40 ms ping to a game server in your region.
2) Engineering is done with numbers. Which market of "everyone"? Half the world is enterprise workbench software, and when Slack goes down all work stops. We aren't trying to make customer-support software work in the subway. What exact market are we targeting that does?
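The "go measure the real number" point is easy to act on from the client side. A sketch using the standard `performance.now()` and `fetch` APIs (the URL is a placeholder; run this from a real user's network for meaningful figures):

```javascript
// Pure helper: median of a list of timing samples, which is more
// robust to one slow outlier request than the mean.
function median(values) {
  const sorted = [...values].sort((a, b) => a - b);
  return sorted[Math.floor(sorted.length / 2)];
}

// Time several round trips to an endpoint and report the median.
async function medianLatencyMs(url, samples = 5) {
  const times = [];
  for (let i = 0; i < samples; i++) {
    const t0 = performance.now();
    await fetch(url, { method: 'HEAD', cache: 'no-store' }); // headers only, no cache
    times.push(performance.now() - t0);
  }
  return median(times);
}
```

Note this measures full request latency (DNS, TLS, server time), not raw ping, so it will read higher than a game-server ping to the same region.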
I agree with most of the points made in the article and I think this is wonderfully written. But I disagree with the conclusion.
I want local-first software to be the future. But even mobile apps, where much of this has been possible since day one, rarely work correctly offline. Unfortunately, the tradeoff between user-experience benefits and development complexity means that teams don't do it unless they're absolutely forced to.
My pet peeve here is the Fastmail app on Android. It doesn't even open if there is no internet connection! Why? I'm pretty sure it already caches many mails locally, so why doesn't it work?
> Calendar reminders will not show a notification.
This means that notifications are generated server-side and pushed to the device. I guess this is to... save battery? Wouldn't a background process work well enough, battery-wise?
This is why sync-engine layers like RxDB or Triplit.dev exist: to remove that complexity and provide a "plug & play" system, so that each project/dev team doesn't have to figure out those same problems from scratch.
Yes I know but that’s still only part of the puzzle. And why bother when you can just… not bother? Will it make the company more money? Lead to more sales? That’s why stuff like this is hard to prioritise.
If developers ran the ship from top to bottom you’d probably see more stuff like this in use. But product justifications are difficult to come by.
(notable that the article is written by someone at RxDB so they’re not exactly unbiased in their judgement)
Kudos to the developer of RxDB: he's been ahead of the now re-emerging lo-fi trend and has put in consistent, high-quality work for years! A fantastic SDK with excellent documentation, a large plugin selection, and fair licensing (i.e. free, with some paid pro features to support RxDB development)!
One-time purchase, local-first software seems like nirvana for both developers and users: it respects users and simplifies development while providing financial compensation for developers.
Ironically, all of this was already implemented 30+ years ago: local storage, local code providing UI and data manipulation, a seldom-connected model for dial-up or Internet, automated replication & conflict resolution with servers, DoD-grade security, etc.
It was Lotus Notes. Then IBM bought it and started ramping up pricing and walling it off.
I'm convinced that, had IBM gone the other direction and opened it up, they could have owned the Internet. Instead, Notes is just a forgotten footnote and IBM is... who are they again?