All apps that you download from App Store can live offline, where they're usable without Internet or trusting some faraway web server.
You can't make a web app that can do that, and to some people it smells like Apple trying to force developers to release through the App Store.
Apple forcing local apps to distribute through the app store is a feature.
No, not in the era of "progressive web apps", which is really just a bit of branding around a set of interconnected APIs. The Cache API in particular means that a web app can be downloaded and made available offline on a permanent basis. Unless it isn't actually permanent at all, which is what Apple are doing here.
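For context on how the Cache API enables this: a service worker intercepts page fetches and answers them from a cache, falling back to the network only on a miss. Here's a minimal sketch of that cache-first logic, written as a plain function with the cache and network fetch injected so the flow is easy to follow; the name `cacheFirst` is illustrative, not part of any standard.

```javascript
// Sketch of a cache-first strategy, the pattern a service worker uses to
// make a PWA work offline. The cache and network are injected as plain
// objects/functions here; in a real service worker you would use the
// global `caches` object inside a 'fetch' event handler.
async function cacheFirst(cache, fetchFn, request) {
  const cached = await cache.match(request);
  if (cached !== undefined) {
    return cached; // serve the stored copy: works with no connectivity
  }
  const response = await fetchFn(request); // cache miss: go to the network
  await cache.put(request, response);      // store it for next time
  return response;
}
```

In a real worker you'd wire this in with `event.respondWith(...)`, and the cache it fills is exactly the script-writable storage the seven-day deletion discussed here can wipe.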
The web and the App Store are just delivery mechanisms for code with different trade-offs built into them. Apple have added an extra trade-off on the web side in the name of privacy.
The advantage of PWAs, then, seems to be the ability to dodge App Store certification, which, while onerous for developers, is not a bad thing for your clients.
Except when you have to pass some of the 30% Apple fee on to your clients.
...I guess I don't really see why the current state of things should block any future development. Browsers shouldn't ever implement new features because users today aren't expecting them to exist?
Yes, you as the user could wipe those out. But now Apple is doing it just because you didn't use it in 7 days. And the user will not blame Apple, they won't even know Apple did that. They will blame the app developer, who in the interest of privacy didn't want to push your personal Todos to a database online.
Again, just a contrived example, please don't go down the road of why a server should have been used. Let's stick to the use case described.
Phones are computers. Even if you remove all connectivity to the outside world, they still function as well as any PC from the early 90s (or earlier). A huge amount of compute power, tons of storage, etc. etc.
Whereas developing a PWA can be done on any hardware, and would be natively cross-platform. An offline PWA does not require an active connection, and in fact that is one of the reasons behind developing a PWA instead of a general web app or website.
All other browsers allow the use of local storage to optimize and enhance your experience by allowing things like pre-loading data or storing your preferences. This disappears with the decision Apple made to clear storage.
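As a concrete sketch of the preference storage described above, here are two tiny helpers written with the storage backend injected; `savePreference` and `loadPreference` are hypothetical names, and in a browser you'd pass `window.localStorage` as the backend.

```javascript
// Minimal preference persistence over a Web Storage-style backend
// (anything with getItem/setItem, e.g. window.localStorage in a browser).
function savePreference(storage, key, value) {
  storage.setItem(key, JSON.stringify(value)); // Web Storage holds strings only
}

function loadPreference(storage, key, fallback) {
  const raw = storage.getItem(key);
  return raw === null ? fallback : JSON.parse(raw);
}
```

This is the kind of state (theme, layout, pre-loaded data keys) that silently vanishes when the browser clears the site's storage.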
> In the last reported year, customers spent an estimated 54.2 billion U.S. dollars on in-app purchases, subscriptions, and premium apps in the Apple App Store.
So a roughly 30% cut goes to Apple, which is around $16 billion, and developers got the rest, around $38 billion.
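The split is straightforward to check; note this applies the 30% headline rate uniformly for illustration, whereas in practice some purchases (small developers, subscription renewals) are commissioned at 15%, so it overstates Apple's cut somewhat.

```javascript
// Rough revenue split at the headline 30% App Store commission,
// using the $54.2B consumer-spend figure quoted above.
const grossSpend = 54.2e9;   // USD, total consumer spend
const commission = 0.30;     // headline rate; some categories pay 15%
const appleCut = grossSpend * commission;     // ~ $16.3 billion
const developerShare = grossSpend - appleCut; // ~ $37.9 billion
```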
With PWAs, 100% goes to developers. As you pointed out, that's a threat and a motivation for Apple to continue ratcheting up their closed ecosystem and keep PWAs crippled on Apple devices.
Providing even a free native app via the App Store to access a service with a subscription model becomes a very risky proposition given Apple's rules, though.
This is where things get pretty shady with Apple's terms for native apps and the App Store. Take a look at Spotify's experience for a different version of the story.
As lliamander said, if they don't care, why not make it free? I don't for a moment believe the argument about creating a barrier for negative actors. They could still screen apps before allowing them into the App Store. If that screening works reliably, then the charge is unnecessary as a deterrent; if it doesn't, the financial deterrent isn't going to be enough to stop a lot of people willing to make these kinds of apps anyway.
Not for users - now there's one less avenue for developers to get them something they want.
Not for developers - now they have to jump through additional hoops to make something that works cross platform.
Who exactly does this benefit?
Do any of you have an example of a good offline-only PWA that will be affected by this?
But if you look at native apps, especially ones I use on desktop OSes, they're dominated (at least in my usage) by offline-first or offline-only apps, and for me this is a feature, not a bug. This doesn't have to mean they don't have sync, by the way; it just means that's separate from the main functionality of the app.
A perfect example of this is Dropbox: it syncs to your local disk by default. It's easy to forget how valuable this is until you go camping (or similar) and suddenly you realize you forgot to star that one directory you care about. Now your mobile phone is useless, but your laptop works no problem. And due to this being factored out into a separate app, all my files now work regardless of file type (I don't need separate offline support in every app I use, since that's the default).
There are two ideas that go together well:
* The app can work offline
* The app doesn't need a server to function
Neither of those prevent a sync function from existing.
Right now, apps can do both of those. Why don't we want PWAs to be able to do the same? Why do I have to go through Apple's walled garden in order to do so? Especially when said alternative is in a sandbox?
This sounds like a seriously poorly thought out idea. Want to clear tracking data from random websites I've been to? That's great. But you don't mess with the data stored by apps I have specifically _chosen_ to install on _my_ device.
Where is Apple's famous UX here? What legitimate argument is there for clearing data of an app the user has added to their home screen?
Just because current apps are buggy doesn't mean we should enforce those bugs at a platform level.
Plus a local note-taking app I created a while ago
Have you ever gone through the app review process? It can be frustratingly capricious, which makes it very expensive. We've had features in our app for years, displayed in plain sight, and then all of a sudden they decide to block an update because of these utterly innocuous features. No rhyme or reason, and now we've got to spend dev time fixing a "problem" that never was a problem before. And we have to delay our entire update because of it.
PWAs offer a way around that uncertainty and added cost. There's also the cost of a developer license, and the Apple hardware you have to buy to run Xcode (and probably iOS devices too, so you can test IRL).
EDIT: Also, it's probably cheaper to develop one PWA than a PWA + N native apps, even if N=2. Probably lots cheaper. Now, perhaps there's a way to build a native app that is just a wrapper around WebKit/Safari and a PWA, but you'd still be subject to Apple's walled garden. For example, think of Gab or some such website whose apps have been banned by the various app stores...
So, an offline app's size, when compared to just browsing the web, isn't a compelling difference (especially since it's downloaded maybe once a month or so).
No, it just has to run in a browser.
Or from an extracted archive (much like a native app).
They can't; PWAs can only be served over HTTPS.
Also, the entire point of PWAs is that they are supposed to have feature parity with local apps, but delivered via the browser. This change is obviously counter to that goal.
Forcing companies to give Apple 30% is not a feature.
If companies feel they can deliver a net experience in webapp that's better than an app, then so be it, it's their choice.
App makers are smart enough to know what makes sense for them.
Yes, they changed direction in 2008. That's just it, though. They changed direction.
The history of Apple is filled with examples of this dynamic. The iPhone was a group effort among many talented and influential people, and I doubt Forstall and the others driving software had the same opinion on third-party apps. They just didn't pick that battle before it made sense to. Every other computing platform at the time (including Windows Mobile, Palm, and BlackBerry) supported third-party apps; it's not like the use case was novel or difficult to see, and the web's limitations were considerable. Adding apps was a default path temporarily set aside.
Relevant quote (emphasis mine):
> Now ITP has aligned the remaining script-writable storage forms with the existing client-side cookie restriction, deleting all of a website’s script-writable storage after seven days of Safari use without user interaction on the site.
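One subtlety in that quote is worth modeling: the seven days count days of Safari *use* without interacting with the site, not calendar days. A sketch of that rule (the function name and day-counting interface are illustrative, not how WebKit actually implements it):

```javascript
// Illustrative model of ITP's seven-day cap on script-writable storage.
// The counter tracks days on which Safari was used while the user did
// NOT interact with the site; interacting with the site resets it.
const ITP_CAP_DAYS = 7;

function shouldPurgeStorage(safariUseDaysWithoutInteraction) {
  return safariUseDaysWithoutInteraction >= ITP_CAP_DAYS;
}
```

So a user who doesn't open Safari at all while on vacation doesn't run down the clock, but a daily Safari user who simply doesn't revisit one particular site does.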
If a website hasn't been used for 7 days, I'm happy for its data to disappear and save space on my device.
You might be, but maybe not everyone is. I've worked on apps based around multimedia content where downloading in advance to watch or listen later was a big deal, because a typical user also travels a lot and might well be going away for longer than a week. Even if they can get the same data again next time they're online, it might still be much slower and more expensive for them to do that on an international data plan instead of back home.
I'm not sure how much that assumption really holds any more, nor why it should necessarily continue to do so even if it has so far. Technology evolves, and so does how we use it. In the case of the web, and web apps in particular, they have evolved to satisfy a need for convenience in software distribution that many traditional desktop OSes had hopelessly neglected for a very long time, and where the developer experience for native mobile apps is still less than ideal.
I appreciate your comment about the trust issue, but the bottom line is that these technologies do serve a useful purpose for some people -- I have the customer feedback at my own businesses to make that clear -- and the experience web developers can offer on Android with PWAs will now be significantly better than what they can offer on iOS.
But why shouldn't new possibilities change the modern computing experience or the role of browsers within it? Millions of users are benefitting from new capabilities of modern browsers, even if they don't know the details any more than they know what goes into any other software they use. Why is local storage of data, or the idea of a PWA more generally, special in this respect?
No, that is trivial to do: just make an actual damn application.
What the author is complaining about is that it’s impossible to make a text document that pretends to be an application that stores data in ways they were never intended to be stored.
A webapp is "an actual damn application". Can we just dispense with the repetitive arguments about this every time anyone so much as mentions adding interactivity to a web page?
So trivial that all it needs is learning a completely new skill set and tools, signing up for a gated distribution mechanism that can kill your application on a whim if you violate any of the rules over which you have no control, and then giving a huge cut of your revenues to the rent-seeking platform owner?
The web has been more than just text documents since around the turn of the millennium. It's probably about time we stopped ignoring 20 years of very popular evolution and pretending that what might have been "intended" before a lot of people reading this comment were born should still guide what we build today.
You must not have been following things. The web platform is an application platform, and has been developing to that end for many years.
Progressive Web Apps are applications based on standard Web APIs that are designed with the intent to enable offline-capable applications with persistent offline storage of significant amounts of data.
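Those standard Web APIs even include a way to ask the browser for durable storage: `StorageManager.persist()` is a real, specified API, though Safari's support for it has lagged, which is part of why the seven-day deletion stings. The wrapper below is an illustrative sketch; it resolves to `false` wherever the API is unavailable.

```javascript
// Ask the browser to mark this origin's storage as persistent, i.e. not
// to evict it under storage pressure. Resolves to false when the request
// is denied or when the Storage API is unavailable (non-browser
// environments, or browsers that don't implement it).
async function requestPersistence() {
  if (typeof navigator !== 'undefined' &&
      navigator.storage &&
      typeof navigator.storage.persist === 'function') {
    return navigator.storage.persist();
  }
  return false;
}
```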
No it’s not. Using it like that is a lasagna of dirty hacks. The web is for structured text with hyperlinks, everything else is bullshit that doesn’t belong on the web.
First it's a bunch of dirty hacks. Then it's an informal convention. Then it's a standard. Lots of technology evolved that way.
All the stakeholders driving the web standards forward are focusing on making it a more powerful application platform.
> The web is for structured text with hyperlinks, everything else is bullshit that doesn’t belong on the web.
That's your personal opinion on what the web platform should be, not what it is. Of course it's a crappy platform in many respects. Of course a lot of people don't like the way it goes. It doesn't matter.
Are any of them outside of Chrome's WebWorker team, or the community of devs that were suckered into a model that has really never gained traction on iOS?
I'm sort of sympathetic to the devs who bought in to the solution, but this looks an awful lot like a PR pressure campaign that is unhappy with how this affects Google's disintermediation goals.