Hacker News
For developers, Safari is crap and outdated (perrysun.com)
500 points by feross on July 27, 2021 | 598 comments



> Because IE was seriously outdated, lacking support for cutting-edge web APIs and technologies enabling the modern websites and web apps we use today.

This is just false, or at best only half of the story, told to make a false equivalence. IE was cutting edge until Firefox. The biggest problem with IE was all the insanely weird shit it did, not missing features. It would “fix” missing tags in ways that then broke on every standards-compliant browser, it had ActiveX, and it had an absolutely insane number of security bugs.

Mobile Safari ain’t the IE of the modern day. It’s just fucking not. I’m sorry. Chrome is closer to IE than Safari is, because of market share AND developers developing exclusively for it, forgetting anything and everything else.

Does mobile safari still suck? Sure. But we don’t need to call it IE or constantly drone on with comparisons to IE. Let something suck in its own unique and shitty way, please.


The analogy is simple: back in the day devs got sick of IE because it was the one thing holding them back. You could easily make a site that looked beautiful on all the other major browsers, but because IE was outdated when it came to standards support, you basically had to spend all your time fighting with it.

Safari serves that role today. There are tons of things I'd love to do but can't because they're not supported on iOS Safari.


But that isn't the case. At all. IE wasn't difficult because it was behind the times; it was difficult because it was deliberately different. The 2nd pillar of Microsoft's infamous Embrace, Extend, Extinguish strategy ensured incompatibility, which doubled developer efforts.

The idea of IE being purely an "outdated" browser seems to be one perpetuated by young devs reading & misunderstanding old blogposts. The narrative is attractive because it fits the impatient desire to be an early adopter of the "shiny new thing", but it doesn't match the reality of how IE6 actually worked.

Safari is compliant. It's behind, but compliant. That makes it incredibly easy to develop for, because backward compatibility & progressive enhancement work well if you're not too lazy to develop according to well-known best practices.
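A sketch of what that progressive-enhancement workflow looks like in practice (the class names and layout here are purely illustrative): a baseline every engine renders, with newer CSS gated behind a feature query, so an older-but-compliant browser simply keeps the baseline.

```css
/* Baseline: floats, understood by every engine going back decades */
.gallery > * { float: left; width: 25%; }

/* Enhancement: engines that implement grid opt in; others ignore the block */
@supports (display: grid) {
  .gallery { display: grid; grid-template-columns: repeat(4, 1fr); }
  .gallery > * { float: none; width: auto; }
}
```

A compliant browser that lacks grid parses `@supports`, evaluates it to false, and skips the block cleanly; nothing breaks.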

What doesn't work well is the spec du jour that Google has just added to Chrome before submitting it to a standards body, while proceeding to use their monopoly position to pressure other browsers into adopting it. That's IE6 behaviour.


> The idea of IE being purely an "outdated" browser seems to be one perpetuated by young devs reading & misunderstanding old blogposts.

That’s certainly not true: it was indeed outdated for much of its life. Yes, through IE6, it did things differently; beyond that point, though, it simply fell behind. It stopped evolving in any meaningful way (with the possible exception of tabs being introduced in IE7) while other browsers continued innovating.

From IE6 through IE11, Internet Explorer wasn’t just doing things differently; it was truly behind. Tasks that could be achieved without JavaScript in other browsers required JavaScript (or worse) in IE. Eye candy like gradients? Maybe you could get away with some of IE’s custom CSS extensions, but you had to get lucky or change your design.

I may be young by your standards, but I vividly remember developing for IE5 and IE6. I even remember changing my plans for a project when Chrome was announced. IE really isn’t that old.


You’re holding on too tight to an argument for the sake of analogy. It’s cute and evocative but it’s blinding you to the bigger picture.

The issues with Google monopolising the direction of standards development are the major issue facing browsers, users and developers on the web, today.

I get that you’re disappointed Safari doesn’t support the same stuff, the same way, as Chrome. Tough. You’re attempting to develop for and deploy to a platform with backwards compatibility baked into its design and development process. If you want to use the new shiny, do so, but accept that what you’re doing carries risks and work that into your strategy. You can hope Safari will implement what you want, but unless you’re contributing to WebKit, or on a W3C committee, you don’t have much skin in that game.

As a web developer you should be considering the needs of your users first. Some of your users could be on Lynx, on that browser which comes with Haiku, on a shitty phone from 8 years ago. THAT’s the web.


That's absurd.

I'm not targeting Lynx, Haiku, or 8-year-old shitty phones. Nor are most web developers.

And the attitude that you must support those things is detrimental to the web. It makes people feel like they're doing it wrong if they don't spend all their time chasing those rare edge cases. Which, if you're trying to start something new, can kill your productivity.

I'd rather have web developers target the common platforms only, than have those developers give up and switch to native apps.


I don't think it's about such things, it's about pushing features that are developer convenient but not really user convenient.

I'm on an older macOS version with Safari 12 (too lazy to update, but that's on me), and there are sites that work just fine one day and then break the next because they changed how they load content for their infinite scrolling.

I'm sure there's a business case for the development team, but as a user, all I see is that what worked just fine before is now broken. The same happens with Firefox too: the Chrome-focused optimizations that a lot of sites seem to make don't always work with Firefox.

For me as a user, I don't see it as Safari/FF holding developers back, I see that because of some unknown benefit, I'm being corralled into using Chrome. I make my peace and just don't visit sites that decide on such changes, but the benefit for me as a user for these changes is completely non-apparent. It's very tempting to write it off as "it's convenient for the devs" due to lack of any relevant information on why such changes are made.


Personally I find very few sites that work in Chrome but not Firefox. But as a web developer I'm still irked by some of the more "fun" APIs that Firefox refuses to support, such as WebUSB, PWAs, NFC, etc.

Your issue, however, also ties into what the article said about how Apple does not auto-update Safari. You're tied to Safari 12 partially because the upgrade flow is a hugely disruptive process: you have to update your entire OS to a new major revision.

Who can say why those sites are breaking on your version of Safari? Web devs do love to use the "latest and greatest" APIs just for fun, but it could also be that their changes have improved the experience for the rest of the users on modern browsers, and for them that outweighs the loss of users still on Safari 12.

I'm not saying every use of new APIs is justified by web devs, the population of web devs is too large and varied to make any statement about the group as a whole. I am saying that the lack of modern features by Safari and Firefox _is_ a detriment to users and web devs, and it pushes developers to make native apps instead which I would argue are a worse experience for many users for certain use cases, and a worse experience for many developers as well.


A few years ago there were no sites not working in FF. Now it's my banking site (top 10 or 20, a BNP subsidiary), it's a local mountain resort's integrated payment, it's several internet shops. My 2FA token is collecting dust because websites support only Google's implementation of 2FA. Firefox itself is in shambles and constantly tries to googlify itself, be it the UI or the add-on API. The trend gets worse every day, now that FF is below 10% adoption. Soon Google Chrome will take over completely.


> I am saying that the lack of modern features by Safari and Firefox

If only Chrome implements that feature, it’s not “a modern feature” for the web. It’s an “experimental technology”. It’s only a “modern feature” when several browsers implement it.
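One hedged way to treat a Chrome-first API is to feature-detect it and keep a path that works in every browser. A sketch: `navigator.share` is the real Web Share API, but the fallback path here is purely illustrative.

```javascript
// Feature-detect before touching an experimental/partially-adopted API.
function supportsWebShare() {
  return typeof navigator !== 'undefined' && 'share' in navigator;
}

// Use the native share sheet where it exists; otherwise degrade gracefully.
function shareUrl(url) {
  if (supportsWebShare()) {
    navigator.share({ url });
    return 'native';
  }
  console.log('Share this link: ' + url); // fallback: surface the URL
  return 'fallback';
}
```

The point is that the site stays functional on Firefox and Safari either way; the experimental path is an enhancement, not a requirement.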


Not if you're comparing against other application platforms such as Android, JavaFX, etc.


>Your issue however also ties in to what the article said about how Apple does not auto update Safari. You're tied to Safari 12 partially because the upgrade flow is a huge disruptive process as you have to update your entire OS to a new major revision.

Sorry, but to be clear, my lack of update is just that -- lack of ambition to do so. I don't really have a compelling reason to update MacOS or Safari.

>Who can say why those sites are breaking for your version of Safari. Web devs do love to just use the "latest and greatest" APIs just for fun, but it could also be that their changes have improved the experience for the rest of users on modern browsers, and that for them outweighs the loss of users still of Safari 12.

This is kind of my point though -- web dev is kind of unique, as I see it, in that there are far more opaque changes than in other software disciplines. Changelogs are pretty common in every other software discipline, but I can't recall many websites ever mentioning the backend changes they make.

Web dev changes, and fast; the general feeling is that "outdated" arrives far faster in web dev than anywhere else. For example, a friend uses my old 2012 MacBook Air -- it still works perfectly, runs modern editing and authoring software (the Adobe suite), plays videos, and works on most websites just fine. Yet some sites, like imgur, just shit the bed on it in everything except Chrome because of browser compatibility. In a situation like this, when absolutely everything else runs fine except a single website, it feels hard to blame OS vendors when it's only the site that is having issues being displayed.

>I am saying that the lack of modern features by Safari and Firefox _is_ a detriment to users and web devs,

I'm not sure I'm convinced for the reason above, but it's not really possible for an end user to comment because of:

- Lack of transparency on why the changes were done in the first place

- Feature life cycles in web dev are much different from other software life cycles

- Old OSes and hardware can run comparatively contemporary software (i.e., new and modern) without issue; it's only the web that bumps into this

- In many cases, the old stack worked "fine" for users regardless of browser

This is where the impression that it's just feature chasing comes from. You can say Safari is a detriment to me, but it has the best battery life (expected), is the fastest on macOS (also expected), and has a UI I really like and prefer (likely expected, as it's made to look like the rest of macOS).

The new feature that web developers want to use from Chrome isn't just weighed against my experience on your site; it's weighed against the overall browser performance, the computer's performance, and the browser's fit with the rest of the computer.

I might visit a single site for 10-30 minutes on a given day. I'm using the web browser almost constantly for many many many sites. When one site tries to tell me my browser is insufficient/outdated and it's the only one that is causing issues, I don't feel particularly persuaded by this. It's not about raw numbers; Hacker News performs the same way for me today as it did the day I joined, and while I'm sure they've done gobs of optimizations for the amount of traffic and users they handle, from my point of view as a user on many browsers, the experience is the same.


I don't disagree with most of what you're saying, but I think we're coming at it from different perspectives. I am looking at it from the perspective of software which does not yet exist but will be built in the future. In that case, missing Safari features _are_ a detriment to users, because it prevents developers from building this new software.

From the perspective of users of existing software that seems to break for no reason, I agree. It's stupid how heavyweight imgur has become when it used to be just a fast, simple website that worked everywhere.

Maybe you're content with the functionality your system has now and need no more, but I am still on the quest for new things I can do digitally, and easier ways to do the things I already can.

Take Figma, for example. I'm not even a designer, but even I enjoy having access to a collaborative drawing app that is trivial to share with other collaborators. If browsers had not all agreed to implement Canvas and rolled it out, Figma would likely not exist. Perhaps they could have created a packaged native application for all major operating systems, but in reality that's a ton more work, and convincing collaborators to buy/install some native app just to work on a drawing together is a huge impediment for users.

What other software are we missing out on because the barrier for interacting with USB/NFC/Bluetooth/Notifications/Background Sync is too high?

In a world with native apps only, only the big players can afford to target all platforms, and only the big enough use cases can justify the expense.


> In that case, missing Safari features _are_ a detriment to users, because it prevents developers from building this new software.

> Perhaps they could have created a packaged native application for all major operating systems, but in reality that's a ton more work, and a huge impediment for users to convince collaborators to buy/install some native app so they can work on a drawing together.

> What other software are we missing out on because the barrier for interacting with USB/NFC/Bluetooth/Notifications/Background-Sync is too high.

These quite overstate the problem. The barrier to entry for developing a native app that embeds Safari is quite low, and you can extend Safari to do many things, including spawning it from Go and adding other FFIs for JavaScript functions. The barrier to entry is your willingness to learn, and you only see these complaints coming from new developers. Taking a canvas API and implementing yet another collab drawing app is not innovating; it's using an API.

On the flip side, each new API brings more and more surface area for attacks. And if we keep stacking new APIs, we don't have enough time to mature and secure the existing ones. Notice that the only example apps given are attempts to replace native apps with web apps.


This is not about my willingness to learn, it's about my willingness to invest.

Sure I can make a native app that extends Safari with the APIs I need, and I can also do that again for Android, and again for Windows and again for macOS. Or the browsers could implement features developers like me want and I only have to invest once.

Replacing native apps with web apps IS innovative.

Sure the app's functionality is the same as those before it, but the delivery and interaction paradigm is so much improved for users. Being able to invite a friend to participate on what I'm doing without them having to install a native app IS a step forward in user experience.


> I can also do that again for Android, and again for Windows and again for macOS.

This is one file at best per platform, even less now given the frameworks available. And what you learn in the process goes on to benefit your career forever.

> Being able to invite a friend to participate on what I'm doing without them having to install a native app IS a step forward in user experience.

This is entirely subjective. And web apps have their own version of this: forcing usage of Chrome, forcing account creation, forcing Facebook usage, etc. Web experiences for anything complex don’t exactly instill confidence, given the shaky experience and devs’ inability to create seamless offline experiences even with all the required transports available.

Then there’s the performance aspect of things. Things like the Unreal Engine editor running in the web, or Blender, just seem ridiculous. Sometimes you need native performance.

There are also security issues with each additional API, and the web has a bad history with security.

People are making and remaking collaboration apps every day. They’re a dime a dozen now. Apps that are truly innovative, that people use to create more things, are still native apps for good reason. Otherwise, the current set of features supported by Safari is enough for me to do my daily work, and I feel like I’m missing nothing by not using Chrome.


You’re missing my point.

I’m not against browsers having features. They can add whatever they want. Devs can target them too.

But “the web” (the platform, the standards process, the language specs, the method of their development) is all about maintaining access for the widest range of users for the longest time possible. Or it was. There’s nothing stopping you from restricting your own market to the users and devices you want to support… but insisting that all mainstream browsers support the brand-newest features because ONE of them added them does a disservice to the platform.


Someone has to propose new APIs. Google is the only one at the moment seriously pushing the web as a complete application platform, via Chromebooks, and even they are doing a half-assed job.

Firefox abandoned FirefoxOS, Apple is focused on the App Store.

What's the last interesting API proposed by someone other than Google?


> I get that you’re disappointed Safari doesn’t support the same stuff, the same way as Chrome.

In some cases, yes, but in others, no. For example, Safari still clings to their proprietary image formats. That means I need two versions of each image: something legacy or proprietary for Safari users (PNG, JPEG, HEIF) and something to appease the Google Gods (WebP) so I don't get blasted by Core Web Vitals. It's the IE PNG and SVG issues all over again. (Yes, I'm aware that Safari keeps claiming they've added WebP support--it's broken and has been broken since its inception.)
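For reference, the standard way to ship both versions is the `<picture>` element, which lets each browser take the first format it supports (the filenames here are illustrative):

```html
<picture>
  <!-- Browsers with WebP support take this source -->
  <source srcset="photo.webp" type="image/webp">
  <!-- Everyone else, older Safari included, falls back to the JPEG -->
  <img src="photo.jpg" alt="Photo" width="800" height="600">
</picture>
```

The markup degrades cleanly, but you still have to generate, store, and cache two encodings of every image, which is the duplication being complained about here.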

It's this proprietary, "we're going to do things our way and ignore the conventions" attitude that annoys me. Chrome does it, too, but at least Chrome users have the ability to choose another browser; no such luck on iOS. WebP has been around for a long time now as an unencumbered format; there's no reason to continue avoiding it other than those sweet, sweet royalties that Apple wants to collect on HEIF.

> If you want to use the new shiny, do so, but accept what you’re doing carries the risks it has and work that into your strategy.

No, I don't want to use the new shiny; I want to eliminate the old bloat--all the compatibility code required to achieve the same experience across a variety of browsers while simultaneously appeasing search engines.

I don't want to be forced to use Apple's proprietary image and video formats to achieve decent performance in Safari--formats that other browsers often can't reasonably support. I don't want to be forced to do things the "Apple way" and rely on JS when I could be using CSS that works in all other browsers.

This is the same path that IE took. Right now they're at the IE7 stage: you can get roughly the same effect that you can with other browsers, but you have to go about it differently. At the same time, they're starting to lag behind. IE7 really exemplified this lag: it took them far too long to add tabs. Proper PNG support was in a similar boat but took even longer.

> Some of your users could be on Lynx

My sites would work better in Lynx if I didn't have to spend so much time making them work in Safari--but they do work in Lynx; they're just less-than-pretty because, guess what, Safari lacks support for the CSS that I need to accommodate semantic HTML.

> on that browser which comes with Haiku

It can't be that important if nobody even knows what it's called.

> on a shitty phone from 8 years ago.

Phones from 8 years ago run sites a lot better when they don't have to download redundant code and JS polyfills to achieve what could've been done with pure CSS. Although, if they don't support TLSv1.2, they're not going to be able to connect to some of my sites anyway; tough cookies. (They'll still be able to access my static websites, where enforcing modern encryption is less of a concern.)


> That means I need two versions of each image: something legacy or proprietary for Safari users (PNG, JPEG, HEIF) and something to appease the Google Gods (WebP) so I don't get blasted by Core Web Vitals.

WebP uses VP8, which requires hardware support to run correctly. Apple's "proprietary" format is also open, and created by a standards body instead of Google. VP8 also has some patent claims on it by Nokia, even though Google has released it under a free patent license. I'd definitely second guess that if I were a legal dept. So really you're arguing that you prefer Google's proprietary format over Apple's. To each their own but know your desired format isn't as free and open as some would hope.

> Chrome does it, too, but at least Chrome users have the ability to choose another browser; no such luck on iOS.

What if iOS users view this as a feature? I know I certainly do. It's impossible these days to know with certainty which apps are running in a browser. And given Chrome supporting all the things it supports it can be a security issue. So knowing an app must be using webkit to me is a feature.

> I don't want to be forced to use Apple's proprietary image and video formats to achieve decent performance in Safari

Again, Apple's formats are open. And further this should be completely transparent to you as you should have conversion scripts in place already.

> My sites would work better in Lynx if I didn't have to spend so much time making them work in Safari

My experience is a bit different. I develop for Safari, then check it in Chrome and FF. It works most of the time, with a few exceptions for things Chrome implements to perform its magic.

> It can't be that important if nobody even knows what it's called.

That's actually the point. It supports the standards, and you get a few extra market participants you wouldn't otherwise have if some of your customers happen to run that browser. You don't need to know the name at all, so long as it supports the standard. This benefits us by giving us more browser engines and more competition, instead of the three we have today.

> Phones from 8 years ago run sites a lot better when they don't have to download redundant code and JS polyfills to achieve what could've been done with pure CSS.

The question is, is any of this needed? If it's CSS, it clearly doesn't have anything to do with the functionality of the site, right?


> Apple’s “proprietary” format is also open

No, it isn’t. It’s useless without HEVC, which is encumbered and requires royalties,[0] most of which go to Apple. VP8 is free (as in “free beer,” not as in “freedom”).

> What if iOS users view this as a feature?

They can view it however they like. I’m typing this on iOS and find it disappointing.

> Again, Apple’s formats are open.

HEVC is not open.[0]

> My experience is a bit different.

Chrome has its own issues, though, as does Firefox. Safari just seems to have more than the other two. For example, for the longest time, Safari was the only browser to support color() and DCI-P3.

> That’s actually the point.

While Safari might not have the nasty Google habit of coming up with their own standards, they do have the nasty IE habit of failing to keep up with widely accepted standards. For example, to this day, HTTP/2 server push remains broken and will crash Safari if you attempt to push a zero-byte resource.

> The question is, is any of this needed?

If you want to cater to an audience that prefers a clean, efficient UI while simultaneously avoiding creating a horrible mess of dark patterns, yes. If you want to run on as many browsers as possible without looking like a Word document, yes. If you want to appease Google so you don’t get deranked, yes.

[0]: https://en.wikipedia.org/wiki/High_Efficiency_Video_Coding#P...


> No, it isn’t. It’s useless without HEVC, which is encumbered and requires royalties,[0] most of which go to Apple. VP8 is free (as in “free beer,” not as in “freedom”).

HEVC is H.265, which is not as widespread. But either way, one benefit to Google, I guess, is that you can avoid the $2.60 device fee. But also scroll down and see how many companies were involved in HEVC vs. VP8/9.

> HEVC is not open.[0]

Source code is readily available, and the codec formats are as well; all that's keeping it from being open is a licensing fee.

> Chrome has its own issues, though, as does Firefox. Safari just seems to have more than the other two. For example, for the longest time, Safari was the only browser to support color() and DCI-P3.

Again, not my experience. I have only had to perform CSS hacks on Chrome, because Chrome adds some CSS features à la IE.

> If you want to cater to an audience that prefers a clean, efficient UI while simultaneously avoiding creating a horrible mess of dark patterns, yes. If you want to run on as many browsers as possible without looking like a Word document, yes. If you want to appease Google so you don’t get deranked, yes.

So by using Safari I am part of the audience that doesn't want a clean UI? That's the entire reason I use a Mac and Safari. So the question stands: what does CSS have to do with SEO?

How do companies create cross browser experiences these days without making sites look like "word docs"? Apple themselves seem to have quite a design and css heavy site.

Since Safari is such an issue for you, why not just block those users? Likely because you'd lose market share. So instead you try to demand your customers switch to something you'd prefer they use, to make it easier for you to make money off of them. When will you Google shills leave the rest of us alone? Also, how much are they paying you to promote Chrome these days?


I have been building websites since some time last century. I was there during the first browser wars. I was there for the first versions of those browsers! And I agree almost entirely with the complaints throughout this discussion about how outdated IE became and how lacking Safari now is.

It's true that IE6 took more flak than it really deserved because people forget how good it was for the time when it first became available. But you can't seriously claim that IE didn't fall far behind as newer browsers developed and eventually took over. You just can't. And you can't seriously claim that Safari isn't falling far behind in the same way today.


> It's true that IE6 took more flak than it really deserved because people forget how good it was for the time when it first became available.

This makes the same mistake as many sibling commenters here in that it equates "new features" with "good".

You're making out that IE6 didn't deserve flak in the early days because it had innovative new (non-standard, incompatible) features. You're making out that this was somehow a good thing. It wasn't. This was the cause of IE6 problems.

The general argument in the comments is that IE5.5/IE6, circa 2000–2004, was not problematic; that it only became problematic later due to the lack of updates or new features. In fact, the problems were present from the start: in 2004 developers were developing exclusively for IE6, putting "works in IE" badges at the bottom of their pages and excluding all Opera & Firebird users.

This was then exacerbated by the lack of updates to IE, but again: the reason this became more problematic wasn't that IE wasn't updated, it's that IE was different in the first place. In ways that are unlike Safari today.

> you can't seriously claim that Safari isn't falling far behind in the same way today

I can seriously claim it because it's fundamentally different: Safari is falling behind in a standards-compliant, consistent, detectable and progressively-enhanceable way. IE6 differed radically from specifications when released, and then proceeded to fall behind in fixing this divergence.

What I'm pointing out is that IE6 falling behind would not have been a massive problem if it hadn't first diverged radically from agreed compatibility with other browsers.

The box-model is the biggest example: specifying any padding AT ALL in CSS in 2005 meant you were making a choice between showing the specified padding correctly in IE, or in every other browser. Without brittle hacks discovered by tireless volunteers, there was no way around this: the same property did radically different things in IE vs others.

In 2021, if Safari doesn't support a feature, it doesn't support it: the property simply does nothing (neither useful nor harmful). Yes, maybe there are exceptions for edge-case bugs but these are fixed in patch releases, since (unlike IE6's different "features"), they're actually considered bugs by Apple.


> This makes the same mistake as many sibling commenters here in that it equates "new features" with "good".

No. You seem to be reading that into something I wrote but it's not what I actually said.

New features are certainly important. There is no progress in the capabilities of the Web platform without them and our requirements for web sites and applications today are far more demanding than they were a decade or two ago.

However adding new features is not the only thing that matters in a browser. The quality of implementation also matters. For years Chrome was pushing support for new CSS3 features that we take for granted today like rounded corners and gradients. Unfortunately the rendering in Chrome when you used those features often looked like some kind of poorly blended, poorly aliased, low-res bitmap from the 1980s so if you wanted a professional-looking site you still had to use images anyway.

> Safari is falling behind in a standards-compliant, consistent, detectable and progressively-enhanceable way.

No, it isn't. It has had numerous problems where it didn't even follow standard behaviour or have a good quality of implementation for features that it did claim to support. Maybe you've been lucky enough to avoid them, although I don't see how you could have worked on more than the simplest of web sites and interactive features on iOS devices without running into some of the more common problems.

> The box-model is the biggest example: specifying any padding AT ALL in CSS in 2005 meant you were making a choice between showing the specified padding correctly in IE, or in every other browser. Without brittle hacks discovered by tireless volunteers, there was no way around this: the same property did radically different things in IE vs others.

Maybe not the most convincing example. IE's behaviour on this one was much more logical and useful than the standard definition. Web developers today typically use CSS features that were added later to opt into IE's box model by choice.
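For readers who weren't there: the later CSS feature being alluded to is `box-sizing`, which lets you choose either model explicitly:

```css
/* CSS spec default: 200px of content, padding added on top => 240px rendered */
.spec-box { box-sizing: content-box; width: 200px; padding: 20px; }

/* IE5/quirks behaviour, now standard and opt-in: 200px total, padding inside */
.ie-box   { box-sizing: border-box;  width: 200px; padding: 20px; }
```

Many modern CSS resets apply `border-box` to every element, effectively choosing IE's model site-wide.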

But also not the most convincing example because you're talking about several years after IE6 came out. As many of us have said throughout this discussion, the problem with IE wasn't so much what it did at the time IE6 was released, at which point IE was completely dominant in browser market share, but that afterwards it didn't keep up with other browsers as they gained share as well and didn't improve its standards compliance as those standards started to matter more.

> In 2021, if Safari doesn't support a feature, it doesn't support it: the property simply does nothing (neither useful nor harmful).

I wish that were so. It would be preferable to what we have today, which includes both missing features and present features with poor quality of implementation. Let us know when rotating an iPhone between portrait and landscape reliably handles basic responsive design properly instead of doing its own strange scaling that often leaves some parts of a page wildly out of proportion with others. I've lost count of how many days I've wasted working around iOS Safari issues in just that one area alone.


> That’s certainly not true: it was indeed outdated for much of its life.

I left a more substantive comment deeper in this thread, but just want to correct this small point: we're not disagreeing, you've misread the line you quoted. Note the qualifier "purely". I'm not saying it wasn't outdated, I'm just saying the fact it was outdated wasn't the sole cause of problems.

(i.e. I'm saying that being outdated doesn't cause problems: being both outdated AND deliberately incompatible does. Safari is only outdated)


You're concentrating on the time period when IE was the only game in town and ignoring the 10 or so years post Firefox/Chrome (until basically they switched to Chromium) when it was behind the times and was a tremendous drain on web development.

Any attempt to say otherwise is just wildly incorrect.


At the time of their release, IE4-IE6 were the most standards-compliant browsers. IE was also the fastest at the time. Netscape was in a technological dead end, and it took considerable time until its complete rewrite (Netscape 6/Mozilla/Firefox) stabilized and caught up. The biggest problem was MS simply dropping its development (IMHO their biggest strategic blunder of the 2000s).

Back then all browsers had unstandardized behavior; there were plenty of Netscape-specific and IE-specific hacks. Some of them were intended as EEE, some of them were the result of weak standardization or mere technical limitations of their respective architectures.


> At the time of their release, IE4-IE6 were the most standards-compliant browsers

At the time of their release, Opera was the most standards-compliant browser. Beyond Opera, I'm not sure where Firebird, KHTML, etc. stood relative to IE4 specifically, but IE5.5/6.0 were well behind most contemporary browsers (let's exclude Netscape, shall we; it was in crisis at the time and promptly discontinued). Opera, KHTML, Firefox and WebKit all passed Acid2 long before IE (IE8 was the first version to do so).

It's worth remembering that Acid1 was released in 1998; IE6 was released in 2001. So this wasn't a time before web standards and cross-browser compat: they were very much a part of the discourse at the time. A discourse MS largely refused to participate in.

> Back then all browsers had unstandardized behavior; there were plenty of Netscape-specific and IE-specific hacks. Some of them were intended as EEE, some of them were the result of weak standardization or mere technical limitations of their respective architectures.

That is true, but in the other browsers it was largely due to bugs their makers were working on fixing. The EEE strategy had a heavy influence on MS's lack of motivation to get fixes out (whereas with Safari there isn't really any evidence of EEE being present).

A good counter-point is XMLHttpRequest. It was non-standard mainly because "AJAX" was a new-ish idea and standards didn't exist. There are good arguments that Microsoft's engagement with standardisation efforts for AJAX APIs was non-existent or actively hostile, but at the end of the day the various APIs they implemented were pretty easy to detect (even the ActiveX variants), and generally could be targeted in a very cross-browser-compatible way.

That was very much the exception to the rule though: the majority of the pain was layout related, and involved neither extra nor missing features, but rather features shared with other browsers that IE deliberately implemented in a different way.
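For the record, the XMLHttpRequest detection mentioned above really was about this simple. A hypothetical reconstruction (the env parameter is mine, for testability; code of the era probed the globals directly):

```javascript
// Classic cross-browser AJAX transport shim, roughly as written circa
// 2005. Tries the standards path first, then IE5/IE6's ActiveX path.
function createXHR(env) {
  const g = env || (typeof window !== 'undefined' ? window : globalThis);
  if (typeof g.XMLHttpRequest === 'function') {
    // Standards path: Mozilla, Safari, Opera, and later IE7+.
    return new g.XMLHttpRequest();
  }
  if (typeof g.ActiveXObject === 'function') {
    // IE5/IE6 path: the API lived behind an ActiveX control.
    return new g.ActiveXObject('Microsoft.XMLHTTP');
  }
  throw new Error('No AJAX transport available');
}
```

Once you had the object, the request/response API was close enough across browsers that the rest of the code could be shared.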


> At the time of their release, Opera was the most standards-compliant browser.

That definitely does not correspond to my recollections. It was only Opera 7 (2003), with the new Presto layout engine, that drastically improved standards compliance.

> Beyond Opera, I'm not sure where Firebird, KHTML, etc. stood relative to IE4 specifically

There wasn't any Firebird (nor Phoenix) at the time of the IE6 release, so how you can discuss this in relation to IE4 is beyond me. The Mozilla Suite was still in 0.x beta and a year away from official release when IE6 was released.

"The real work on KHTML actually started between May and October 1999" (wiki citation) so I don't think they had a big impact on the discussed time period.

> but IE5.5/6.0 were well behind most contemporary browsers

So let's get concrete, which browsers were better in 1999-2001?

> It's worth remembering that Acid1 was released in 1998; IE6 was released in 2001. So this wasn't a time before web standards and cross-browser compat: they were very much a part of the discourse at the time. A discourse MS largely refused to participate in.

Are you aware that IE6 is Acid1 compliant? IIRC it was the first GA-released browser to pass (discounting the Netscape 6/Mozilla 0.x betas).

> Opera, KHTML, Firefox and Webkit all passed Acid2 long before IE (IE8 was the first version).

Which is a direct consequence of the fact that Microsoft dissolved the IE development team after the IE6 release. IE8 was the first substantial update to IE in 8 years (IE7 was a minor update).


IE7 brought PNG transparency support. As a web developer in those days, I can tell you this was huge. IIRC this was right at the start of the "shiny buttons" web 2.0 design era.


I was writing, copy&pasting, linking, etc. but in the end I gave up. There's simply too much that is wrong, it would take a long list to refute every single thing, and I just don't have the time or energy to waste (Safari has taken enough already, as IE used to).

In short, it isn't about "IE vs Safari". Both have multiple versions. What was awful was that IE was behind the times. Everything else was basically fixed by pasting a few lines of code from awesome people. Safari is way more work today than IE was back in the day and doing any modern web development is a breeze today - until you need to make it work in Safari.


> Safari is way more work today than IE was back in the day

There's a bunch of good and bad arguments in sibling comments: some of them may be subjective. But this quote really caught me out. Wow.

Wow.

Presuming that you've been developing websites for more than 10 years, and that your knowledge of IE6 is based on hard experience and not just reading... what on earth are you doing that is so difficult in Safari?


I’d say trying to make a PWA with Push Notifications on iOS.


This seems to me like trying to catch a fish in space. It doesn't exist. Everyone knows it doesn't exist. So why are you trying for something that doesn't exist, when the only workarounds are inconvenient for the average user?

It’s like complaining that Xbox consoles are inadequate for not playing PS games when it’s already established that they won’t from the jump.


> IE wasn't difficult because it was behind the times; it was difficult because it was deliberately different.

That's not what I recall. As I recall, IE wasn't difficult just because it was behind the times, or because it was deliberately different; it was difficult because it had lots of bugs and inconsistencies. There were plenty of subtle incantations one had to use to make it behave consistently, and to avoid its worst bugs. We dealt with it because unfortunately MSIE still had lots of users, but once the number of users became low enough, all that bug-avoidance code was dropped by web developers.

Adding to that, MSIE's Javascript error messages were extremely cryptic. In my workplace at that time, we made a MSIE-only site (it required an ActiveX control for media streaming) run as much as possible on Mozilla/Firefox (of course the media player didn't work), just to be able to get better Javascript error messages when debugging errors on our Javascript code.


Is Safari truly "compliant"? It does so many things differently for no reason. I constantly have to write Safari-specific hacks to get things like scrolling in a modal or button appearance to work as expected.


Safari is compliant. It's behind but compliant.

Well, kind of. There are all sorts of minor glitches that only happen on Apple devices. A common complaint is iOS Safari's interpretation of vh units. It might be technically correct (because the spec is sufficiently ambiguous to argue that Apple's choice meets it) but it's still obviously broken for common practical applications that work everywhere else such as attaching content to the bottom of the visible area. The handling of the "notch" on devices that have one is likewise arguably technically correct but obviously broken.

So the objections to Safari aren't just about not supporting Google-style PWAs features. Some of them are about other big features too. Support for HTML5 media elements and the backing technologies has been very lacking by modern standards and numerous special cases have been needed over the years only to support Apple clients.
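On the vh point: the widely used workaround is to publish the real visible height as a CSS custom property and use it in place of 100vh. A minimal sketch (the helper name and the wiring are mine, but the --vh custom-property trick is a common pattern):

```javascript
// 1% of the actual visible viewport height, suitable for use as
// calc(var(--vh, 1vh) * 100) in place of 100vh, which on iOS Safari
// includes the area behind the collapsing toolbar.
function vhUnit(innerHeightPx) {
  return innerHeightPx / 100 + 'px';
}

// Browser wiring (assumes a DOM; shown as comments so the helper
// above stays testable outside a browser):
// const update = () =>
//   document.documentElement.style.setProperty('--vh', vhUnit(window.innerHeight));
// window.addEventListener('resize', update);
// update();
```

It works, but it replaces a one-line CSS declaration with a resize listener, which is the kind of tax the parent comment is complaining about.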


> There are all sorts of minor glitches

This is true but it's true of all software. And they tend to get fixed in patch releases (rather than waiting for a new major version). Even for bugs the WebKit team leave open for years, I think there's a big difference between maintaining major incompatibilities in central features for years, and being slow fixing some edge-case glitches.


> This is true but it's true of all software.

But it's not, is it? We're not talking about minor bugs here, we're talking about anomalous behaviour or unusual interpretations of the specs. Safari is far worse than any other major browser in that respect now that IE has largely faded away.


Pushing forward on spec implementation before submitting to standards bodies is how the standards bodies want developers to make things. They want a "market of ideas", and only standardize the ideas that win out.


I concur. I was just thinking about XMLHttpRequest [0], which was developed to support Outlook Web Access (years before Gmail). As developers discovered and played with it, it became the X in AJAX [1] and eventually became a standard. That standardization happened after it had proved its usefulness and been applied in many different contexts by many different people.

0: https://en.m.wikipedia.org/wiki/XMLHttpRequest

1: https://en.m.wikipedia.org/wiki/Ajax_(programming)


> They want a "market of ideas", and only standardize the ideas that win out.

In practice this means Chrome implements an idea and the idea "wins out" because Chrome does it, until Chrome does a new idea.


It's always fun when some bog standard 2010 flexbox layout works perfectly in every modern browser until someone on an old iPhone tries it. Android has its own update woes to say the least, but at least Chrome receives evergreen updates through the Play Store. Why can't Apple do the same?
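"Works until an old iPhone tries it" usually came down to old iOS WebKit builds only understanding earlier prefixed flexbox syntaxes, so the standard declarations had to be re-emitted alongside the legacy ones. An abridged, illustrative mapping of the kind tools like Autoprefixer automate (this table is hypothetical and incomplete, not real tooling output):

```javascript
// Abridged, illustrative mapping from standard flexbox declarations to
// the extra legacy declarations old iOS WebKit builds needed (the 2009
// "box" draft and the later -webkit- prefixed syntax). Real tooling
// derives this from browser support data.
const LEGACY_FLEX = {
  'display: flex': [
    'display: -webkit-box',       // 2009 draft syntax
    'display: -webkit-flex',      // prefixed modern syntax
    'display: flex',              // the standard
  ],
  'flex-direction: column': [
    '-webkit-box-orient: vertical',
    '-webkit-flex-direction: column',
    'flex-direction: column',
  ],
};

// Return every declaration needed for broad support; declarations with
// no legacy equivalent pass through unchanged.
function withLegacyPrefixes(declaration) {
  return LEGACY_FLEX[declaration] || [declaration];
}
```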


Safari is not holding back, Safari is just not implementing Google’s PWA (because it doesn’t really make sense on Apple devices where it would be far inferior to native). You can make a beautiful site in Safari and it will work everywhere (I do professionally). Chrome is the new IE (2001 era, when it was cutting edge and pushing proprietary technology) except ActiveX is PWA.


> because it doesn’t really make sense on Apple devices where it would be far inferior to native

That statement doesn't seem to make sense to me, since the App Store's limitations make PWAs relatively more interesting on Apple devices than they would be on Android -- an existing application is not inferior to a non-existent one. On Android you can just serve the native applications yourself if Google doesn't like them. On Apple you can't, so web application it is.


It makes zero business sense for small-to-medium companies to make separate Android and iOS apps. That's man-hours that will be spent on a specific platform and its app store, instead of on improving the core product.

PWAs are here to stay.


Except 0% of customers understand how PWAs work on mobile devices, 10% of webdevs understand how to make actually performant Apps, and 100% of execs think that using a PWA means their desktop site should be their mobile app (without realizing that, often times, your mobile UX likely has a vastly different context.)


0% of people know how to visit a website? :)

I should have specified B2B in my original comment though, B2C I have no experience in.


> using a PWA means their desktop site should be their mobile app

There's tons of mobile apps that never needed to be anything other than desktop sites honestly. Facebook, actually, is a perfect example.


I would argue that FB is actually a mobile app with a desktop site available as an afterthought, one that was reduced to being just a copy of the app in React. It didn't really explode until after the iPhone, and something like 85% of their users are on the mobile apps.

That said, a learning app, for example, should actually have a very different mobile experience, as a user won't want hour-long study sessions on the subway (or in line at a subway restaurant).


It absolutely makes sense to make a state-of-the-art app on a platform if it's key to the business. Also, resources are not the problem; even big companies (like banks) make low-cost web apps instead of native ones if they can. I don't see why Apple should facilitate compromised user experiences instead of setting the bar higher.


But from your own example, why on earth would you need a "state of the art app" for something like banking? You're displaying transactions, maybe getting push notifications about transactions, etc. Even the feature that you may argue requires a native app, mobile deposit where you take a picture of a check, can definitely be done with a browser. Stripe's new identity service where you take a picture of an ID does on-device, real-time image analysis and it works beautifully and simply, no app needed.


I would hate it if I had to unlock Safari every time with my face instead of having a single bank app that's locked via Face ID. I also want the bank app to only know of itself in its little sandbox, instead of knowing what I'm browsing through cookies and other web fingerprinting methods. Bank apps should absolutely be apps and not web apps.


Also a PWA can’t compete in quality with a proper native app, should a competitor ever build one you’re toast.

The question just revolves around how integral an app is to the business. Online banking probably requires an app. A restaurant that offers online ordering probably does not. A video game requires an app. An online encyclopedia does not.


This is so wrong it hurts. I know businesses that had to decide between making a mobile PWA style website or a native app for their customers to use... and the customers all wanted the website.

No sitting around waiting to download and install updates. No corporate phone-wiping security policies. All the features that mattered, even if Safari didn't look quite as nice and missed one or two.

PWA vs native app doesn't matter at all to the competition, they are gonna be there regardless (esp. B2B scenarios). Whether a website or app works better for B2C is really highly market dependent.


I’d be interested to see how the customers made an informed choice between a non existent native app and a PWA app.

Probably they made the choice after listening to an opinionated story about ‘sitting around waiting for updates’. Which nobody does, of course, because if you mind that you just turn on automatic background updates.

At least in this story Safari is good enough to have the features that matter. The world is safe.


As for an informed choice, the company (and their clients) that I am thinking of is in a crowded market. Not having to make employees download a new app ended up being a surprisingly (to us) compelling selling point.

As for safari, we supported IE11, chrome, Firefox, mobile / android chrome, desktop and mobile safari. Safari easily added about 10% to our development budget, and safari users got a downgraded experience in certain cases. Not ideal, but close enough for me to not feel bad calling it the "new IE" (and yeah, I remember developing sites for IE 5.5 and 6).

Edit: speaking of auto updates, I have it turned on myself, and just yesterday had to wait around to update my mobile bank app to virtually deposit a check: log in, get kicked out to update, wait to download and install, then log in again before I could proceed. Not sure why, but auto update isn't a cure-all.


Here’s the magic part: if you are the developer, you get to decide if the backend works with only the newest version or also some earlier ones. And if you just do nothing, the user will be just as logged in in the new version. So, unless they are really unlucky, they’ll never have to wait for an update.


I think this is wild conjecture. Users don't care as much as you think about "native performance", they care about features and price. PWAs can be iterated faster than native apps, and they circumvent the app store tax. They also let the business control the onboarding and sales funnel vs text only app storefronts filled with garbage.


Users care about 'Native experience' when they actually want to use the app. There's good reason why companies like Starbucks or IG or your Email app are not PWAs.

The word users tend to have for non-native apps isn't about performance or such, it's about UX.

That word is 'shit'.

Nearly every single PWA developer thinks about how brilliant it is that they can write for two platforms at once but forgets that it means that you ALSO have to do every single last bit of legwork that Apple and Google did for you to give a good, correct-feeling UX on mobile and that the more you do the more you pay that performance cost and build yourself a tech debt prison. And don't get me started on the nightmare that is Ionic and Angular.

PWAs absolutely have their place: Apps that you might want to use...sometimes.. with notifications. That's it. Nothing more makes any sense.


Note IG does have a PWA, but it's ass since the native app is better and supports more features than the web client.


Native performance is a feature in itself and is not about speed but about UI standards and features.

If your statement was true, there wouldn’t have to be any complaints about Apples ‘walled garden’ because there is no ‘walled garden’ for PWAs.


Apple effectively doesn't allow PWAs on iOS, so I don't follow your last point.


How does it not allow progressive web apps? Or do you mean apps that are nothing more than shells around a website?


I don’t follow this at all. How are PWAs “not allowed”?


It's also worth mentioning that Apple is a hardware company and that you need a Mac to even compile native apps for their platform.


It makes zero sense for most small-to-medium companies to make any kind of end user-facing app, period (whether native or a webapp). Most small businesses' needs don't call for a content pipeline substantially more complicated than a word processor, with some of them needing to be able to handle a simple form every now and then. Even in the case of online sales, does every business in the world really need a bespoke storefront?

How about people publish their catalog, and then, like, 3 or 4 interoperable "shopper" apps (with preferably at least one developed in the open) can emerge to do the heavy lifting? The way it works now, where Amazon and O'Reilly and AutoZone all have to hire for developers mirrors the ridiculousness of the streaming media platforms, where just because Amazon or Disney+ or whoever has the content and Netflix doesn't, people act as if it makes complete sense that we should have to enter their dumb app and use that provider's shitty title browser and video player, instead of the user picking the app that works the way they want and using it to play from whichever catalog has the content they need.


Safari can't even implement a key value store (indexedDB) correctly...

We had whole versions where it was basically stuffed, and even in the recent versions, I'm getting random errors where it just refuses to write to disk.

If they don't want to implement PWAs, fine; it's their walled garden and all that, and they can claim it's for privacy etc. Just don't half-ass all the APIs around it then! Leave them out!
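For what it's worth, the common mitigation for those intermittent write failures is a blunt retry wrapper around every IndexedDB operation. A generic sketch (the helper is mine; it papers over, rather than fixes, the underlying bug):

```javascript
// Generic retry helper of the kind often wrapped around IndexedDB
// transactions to absorb Safari's intermittent write failures.
// op: a function returning a Promise; it is retried up to `attempts`
// times with a fixed delay between tries. The last error is rethrown
// if every attempt fails.
async function withRetries(op, attempts, delayMs) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await op();
    } catch (err) {
      lastErr = err;
      await new Promise(resolve => setTimeout(resolve, delayMs));
    }
  }
  throw lastErr;
}
```

Needing to write this at all for what the spec calls durable storage is the complaint in a nutshell.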


> We had whole versions where it was basically stuffed

A co-worker asked me "uh, why does the app use 120GB for indexeddb?"

Wildest bug I've ever had to deal with. Unfortunately it wasn't even just a safari issue.


>it doesn’t really make sense on Apple devices where it would be far inferior to native

Yeah, I get that it doesn't make sense for Apple, business wise -- which is why they are not doing it. But it sure would make sense to developers.


This simply isn't true - offset-path, CSS.registerProperty... there are plenty of instances where non-PWA-related features aren't even close to implemented.


This exactly.

Google's PWA agenda (displace native apps with web apps) is in opposition to Apple's (displace web apps with native apps) on iOS.

Apple's business interests are in opposition to PWAs, and Apple would also make the same argument you are citing that native apps offer a better user experience, performance per watt, privacy, security, etc..

However, mobile Safari can support something like Amazon Luna, so it can sidestep the app store to some extent as far as games are concerned.


"Safari is not holding back"

Name a place where safari is being innovative on the web.


Was just fighting Safari a few days ago and I completely agree.

Noticed that "vw" and "vh" CSS3 units don't work in it like they do in other browsers. Upon digging, I found a Medium article explaining it - but as I was in Safari, the article's code snippets and images wouldn't load properly.

As much as I got a bit hyped to try using safari after WWDC - I just can't. It's so discombobulated regarding web standards that I wonder if it can even count as a "web browser" any more than emacs can.


The problem is also web developers themselves, the author of this piece and the Redditor they quote included.

Whatever Medium's needs are for putting images and the written word on the screen were satisfied a decade+ ago. The fact that that Medium article doesn't work is a consequence of how web developers have normalized the practice of passing the blame for their failures and lack of competence on to the browsers and browser makers by extension.

We shouldn't even use the term "web developer". The entire industry, bolstered by analogizing itself to mobile app programmers, has snowed the world and warped expectations that they are or should be anything other than experts in digital content publishing. The entire notion of "full stack" (which embodies a lie in the very term) should never have happened.

If we really want to pick bones, where the engineers working on web browsers and "web developers" are both fair targets, then the overwhelmingly pressing matter would have to be how the latter are a piss-poor substitute for anyone competent in the skills they should possess, which would involve library and information sciences and a smattering of practical tech discipline. The industry is a perfect example of the rampant "consultant effect", where people create massively complicated stacks that often don't accomplish anything their forebears couldn't manage, and then get paid bloated salaries for mastering the tools and workflows that are ostensibly meant to solve the very problems they are responsible for foisting onto the world in the first place.


> Whatever Medium's needs are for putting images and the written word on the screen were satisfied a decade+ ago. The fact that that Medium article doesn't work is a consequence of how web developers have normalized the practice of passing the blame for their failures and lack of competence on to the browsers and browser makers by extension.

The problem of displaying images and text on the web was solved in the first [EDIT: an early] HTML spec in the mid 90s. Yet web sites still manage to screw it up and make it unnecessarily dependent on JavaScript. I always thought the web would be a lot more pleasant if web "developers" would just stop programming for a bit and learn a little HTML.


> The industry is a perfect example of the rampant "consultant effect", where people create massively complicated stacks that often don't accomplish anything that their forebears couldn't manage, and then get paid bloated salaries for mastering the tools and workflows that are ostensibly meant to solve the very problems that they are responsible for foisting onto the world in the first place.

Organizations want to buy a solution to their problem instead of reorganizing themselves to develop the fundamentals required to not have the problem. It's like managing your back-pain with pain killers without actually doing any physical therapy or lifestyle changes to fix your body mechanics. It's a temporary solution that puts you on a ladder to needing ever more intensive interventions.


> As much as I got a bit hyped to try using safari after WWDC - I just can't. It's so discombobulated regarding web standards that I wonder if it can even count as a "web browser" any more than emacs can.

With a bit of effort you can embed Chromium in Emacs, so that's basically true.


100% that, happens regularly. The most recent case being lack of `scrollIntoView` options[0], so smooth animations work everywhere but Safari. You have to go and hack/implement that manually just for Safari.

[0] https://developer.mozilla.org/en-US/docs/Web/API/Element/scr...
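The workaround looks roughly like this (a sketch; the scroll-behavior check is a common proxy for options support, and the style parameter is mine so the logic is testable outside a browser):

```javascript
// If the browser understands the CSS scroll-behavior property, the
// options form of scrollIntoView is in practice also available.
function supportsSmoothScroll(style) {
  return 'scrollBehavior' in style;
}

// Smooth-scroll where supported; fall back to the boolean form,
// i.e. an instant jump, on browsers like older Safari.
function scrollToElement(el, style) {
  if (supportsSmoothScroll(style)) {
    el.scrollIntoView({ behavior: 'smooth', block: 'start' });
  } else {
    // Fallback: instant jump (or substitute a manual
    // requestAnimationFrame tween here for an animated scroll).
    el.scrollIntoView(true);
  }
}
```

In a real page you'd pass document.documentElement.style as the style argument; the point is that one missing options object forces this whole detour.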


The amount of pain we had with IE is not even comparable to the sweet issues we may have with Safari.

Those who survived those times, with the evil F12 debugger, can confirm.

Safari is a joy to develop with and it sucks I have to switch to chrome for its extensions.


Safari isn't that. Old edge, IE, and FF are that. Safari is WAY closer to chrome than IE ever was.


Back then there were many browsers and many actors involved in developing web standards. It was justified to criticise laggards.

Nowadays the Microsoft browser, Opera and others are just Chrome skins. Google is pushing various drafts and essentially dictating web standards and then even when a browser like Firefox does support those Google flavoured standards, it still gets blocked outright on new Google products or gets a limited feature set on existing ones. Or it gets hobbled by Google oopsies.

Why would anyone want to play in such a rigged game...


Yet I don't have a problem supporting Firefox, or honestly even desktop Safari for the most part. It is specifically iOS Safari that is being deliberately held back by Apple to protect their App Store monopoly.


I suspect that developers are simply angry with Apple for not letting them fiddle with the latest stuff that Google is experimenting with.

Chrome is the new IE, but with better manners, if you consider the non-standard Chrome-specific stuff that makes websites work only in Chrome.

Developers hated IE for not being standard: you had to write two versions of your CSS, and you had to make your code accommodate the box model differences (is the border included in the width of your div?) and bugs just to make the UI properly display standards-compliant HTML and CSS. You had to deal with JScript, Microsoft's JavaScript that is mostly the same but not exactly.

Due to its market share, Microsoft was able to push its own non-standard technology and pushed people to use IE if they didn't want to be left out. This is very similar to what Google is doing with Chrome, sabotaging FF through popular Google properties like YouTube [0] and leaving out features if you are not using Chrome.

Apple may not be perfect but Google is not the "good guy" here.

[0]: https://www.zdnet.com/article/former-mozilla-exec-google-has...


> I suspect that developers are simply angry with Apple for not letting them to fiddle with the latest stuff that Google is experimenting with.

That is absolute bullshit. Browser push notifications have been out for years on other browsers (INCLUDING desktop Safari), and they are the one thing Apple is dragging their feet on the most, because they know it's the most important thing preventing a dev from just doing a PWA instead of a native app.

This is absolutely Apple protecting their App Store monopoly.
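Concretely, the capability check that comes back false on iOS Safari is (roughly) the standard Push API detection. A sketch with an injectable global for illustration (real code probes window directly). Note that desktop Safari's long-standing push support is a proprietary APNs-based mechanism, not this standard API:

```javascript
// Detect the *standard* web push stack: a service worker registration
// point, the PushManager API, and the Notification API. All three are
// needed before a PWA can offer push.
function supportsStandardWebPush(g) {
  return !!(
    g.navigator &&
    'serviceWorker' in g.navigator &&
    typeof g.PushManager !== 'undefined' &&
    typeof g.Notification !== 'undefined'
  );
}
```

On browsers that pass this check, the next steps are a service worker registration and a pushManager.subscribe call; on iOS Safari (as of 2021) the check fails and the conversation ends there.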


Personally I think Apple holding the line on browser push might be the single greatest thing Apple has done for the web in the past decade.

Meanwhile Google is moving the goalposts of "modern" web development so fast that it's nearly impossible for anyone to keep up. I can't believe the hacker community is so enthusiastic about this—you couldn't get a more canonical example of embrace, extend, extinguish as applied to an open protocol.

It wasn't long ago that we had four or five really strong browser implementations. Microsoft really did try with Edge and then gave up. Now we have just three: Chrome, Safari, and Firefox. And Mozilla isn't looking particularly healthy right now...


> Meanwhile Google is moving the goalposts of "modern" web development so fast that it's nearly impossible for anyone to keep up.

For a second I thought you were talking about Apple’s UI frameworks and languages, with all the different Swift versions over the last couple of years and different UI frameworks like Cocoa, Catalyst, and SwiftUI, deprecation of OpenGL and introduction of their own graphics API, etc.


That's a stretch. If anything, Apple is too conservative.

UIKit was the only UI framework from 2007 to 2019. Then SwiftUI came, and they both work very well together. As other frameworks come along for things like graphics or device features, they don't really replace each other but stack up. On desktop, it's a similar situation.

How is the situation with web technologies? What was the hip tech in 2007? AJAX? jQuery? What about 5 years ago? AngularJS? Meteor? Bootstrap? What about now? Is it React or Vue? What about tomorrow? Is it Svelte?

They all arrange divs in the DOM, but it's a different framework each month.


What has the choice of a framework got to do with browsers?

No one is forcing you to use them. HTML and CSS still work perfectly fine, let the people with curiosity and energy experiment. Nothing is stopping you from using browser APIs with any code you want.


You can still use jQuery or plain vanilla HTML+CSS+JavaScript, and you still have access to any web technology out there. Websites written in 1999 will continue to render perfectly. Nobody will deprecate WebGL for example. Nobody forces you to learn a new framework, although the documentation for most of those is actually good unlike Apple’s.

Meanwhile for example you can’t use the latest iOS 15 API (SharePlay) without using Swift, and you need to learn a new half-baked declarative UI API called SwiftUI to write your own widgets. And let’s not talk about trying to run a 1999 executable today.

They all arrange pixels on the screen but it’s a different framework each month.


It's more like every decade, but fair enough. If you really want, you can simply use C.


Uh, what? The UI framework story is pretty simple.

There used to be frameworks specific to each platform (AppKit, UIKit, WatchKit...). SwiftUI replaces those with a cross-platform solution, but it's not all there yet.

Catalyst is basically a sort of stop-gap to let devs easily convert Cocoa Touch apps for Mac.

Long-term, SwiftUI is the only thing that will matter.


Android web browsers have had push notifications for ten years now. This is not shifting the goalpost. It just happens to be the one feature that would make millions of native apps redundant.


Have millions of native apps become redundant on Android in those 10 years? What's the situation? What are iOS users missing out on?


Yes. Not having to install another shitty spying marketing-loaded app.


Okay, so what are we missing on iOS? Do you have examples?


Off the top of my head, OffscreenCanvas, and being able to query how many processors the client device has available for web workers (navigator.hardwareConcurrency).


Just push notifications.


How is that any different to using the fully modern web app equivalent of this shitty spying marketing-loaded app?


Are you asking how going to a URL is different from installing an app?


With all the features being added to the Chrome Platform recently, given the trajectory Google is heading along with web applications, absolutely yes.


If you want the actual details then it’s best if you just google it. If you’re trying to make a point, I do know the differences, and it’s far off.


This is a personal opinion, but I'm actually really glad Apple are dragging their feet on this.

I absolutely despise presumptuous software that feels entitled to my attention as and when it chooses. I have my browser set up to wholesale reject all push notifications, I don't let the vast majority of my apps send me push notifications, and any application that feels entitled to pollute my inbox (email already has an awful signal/noise ratio as a communications medium) with useless email alerts gets uninstalled or unsubscribed from immediately. Frankly, browser notifications should never have been a thing; they're just another variant of the spamdemic that plagues the modern software landscape. I'll look at your software when I choose, and any software that starts dictating otherwise gets removed or silenced without exception. I honestly don't understand how most people stay sane with their phones and laptops pinging and flashing away like a pinball machine all day.


Firefox doesn't allow sites to even ask permission to notify you without some good reason. It's only Chrome (as always) that doesn't care about your well-being. If you want respect, drop the disrespectful relationships. (And yes, complain the hell out of it and ask for Google's head, because they deserve it, but act too.)

Push notifications are extremely useful for a few niche applications. Do not throw the baby out with the bathwater.


Push notifications allow me to spend less time on my phone/PC.

Without them, I would have to regularly check the app of interest for any important updates, losing both time and mental space with another thing to keep in the back of my mind.

Thanks to push notifications, I can defer interacting with the app to when it's actually relevant.

Also, as I argued in another comment [1], web notifications are simply better than native app ones. Push notifications are here to stay; the choice is in the implementation, not whether they'll become culturally accepted or not (they already are, iOS users are simply nagged to install native apps to have them).


What updates are actually important though? A good 80% of notifications I'd get before I started systematically disabling them by default weren't important at all, they were just pointless nagware trying to get me to engage with the software more frequently so some analytics manager can justify why they shouldn't be sacked that quarter. It doesn't matter to these people that most of the extra clicks and presses are people telling them to piss off so they can get on with actually important work, they only care that engagement is going up. Proper tail wagging the dog bullshit.

Excessive notifications are a dark pattern which needs to be resisted, not encouraged. I'll continue blocking every single one of them and looking at apps on my own schedule, I will not allow random software to interrupt my flow because it thinks it has a right to my attention. Littering is culturally accepted too, but that doesn't mean either littering or its digital equivalent should be.


I agree and am of the same opinion. I'm glad Safari is marching to a slower tune, and it definitely isn't Google's tune.


You don't want to be given the choice? Nobody is saying notifications should be silently accepted.


If Google could move as fast as they wanted, they'd serve their own interests at the expense of others, and we know where that leads us. I'm glad that they can't move as fast and it takes a while for things to get accepted.


You're not wrong, there are real upsides to push notifications. For example, the thing I love about the Apple Watch is how it lets me never check my phone, because anything relevant is pushed to the watch.

But I think you are really underselling the downsides of push notifications. They are easily the most heavily abused feature of the modern web anywhere they exist. If every single app responsibly used push notifications then sure, it would be great. Here in reality push notifications aren't some personal majordomo making sure only important issues are brought to our attention, push notifications feel more like walking down the Vegas strip having leaflets constantly shoved in your face.

I think it's premature to say that we have the answer, and just maybe Apple dragging their feet on this will help us find a solution that is better than the advertiser-friendly free-for-all that Google is pushing.


> push notifications feel more like walking down the Vegas strip having leaflets constantly shoved in your face.

Strong agreement here.

There's a significant number (single digit %) of sites I run into that ask me to accept push notifications or ask for my email before any of the content loads. I usually close the tab when that happens.


I guess there’s a benefit of not having to take out your phone to check notifications but in the end they do compete for your attention and microdisrupt you anyway. And how many notifications on average are you willing to take in? I personally want as little as possible, the most important contacts and critical alerts are enough for me, the rest can wait till I check for them myself.


Yep, and this is where it gets weird. Maybe I should be allowed to assign priorities to different notifications and then have my phone automatically prioritize what makes a "ding" based on the time of day or my geographic location. Maybe notifications should be throttled in some way so that every 30 minutes I receive a digest of notifications if there are any. Maybe they should be aggregated till I have 5 accumulated. Or maybe these are all terrible ideas because they require configuration, and users aren't going to spend the time configuring their notifications.

This whole thing really goes to the different mindsets of Apple/iOS and Google/Android fans. The Apple camp is fine waiting for functionality to get something that is decently thought out, the Google camp wants everything now and will deal with the consequences later. We've seen this repeated with copy/paste and with background processes on both platforms. Android got there first, but the earliest Android implementations had serious limitations and issues.
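The throttle/digest ideas floated in that comment can be sketched as a tiny buffer. This is purely illustrative; the class name, callback, and defaults are all invented, and a real implementation would also wire `flush()` to a timer and the OS quiet-hours settings:

```javascript
// Buffer incoming notifications and deliver them as one batch, either
// when maxPending accumulate ("aggregate till I have 5") or when a
// timer calls flush() ("a digest every 30 minutes").
class NotificationDigest {
  constructor(deliver, { maxPending = 5, intervalMs = 30 * 60 * 1000 } = {}) {
    this.deliver = deliver;       // callback that receives the batched list
    this.pending = [];
    this.maxPending = maxPending;
    this.intervalMs = intervalMs; // a timer would call flush() at this cadence
  }

  add(note) {
    this.pending.push(note);
    if (this.pending.length >= this.maxPending) {
      this.flush();
    }
  }

  flush() {
    if (this.pending.length === 0) return; // nothing to deliver, stay silent
    this.deliver(this.pending.splice(0));  // empty the buffer, hand over batch
  }
}
```
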


> “Maybe I should be allowed to assign priorities to different notifications and then have my phone automatically prioritize what makes a "ding" based on the time of day or my geographic location. Maybe notifications should be throttled in some way so that every 30 minutes I receive a digest of notifications if there are any. Maybe they should be aggregated till I have 5 accumulated. Or maybe these are all terrible ideas because they require configuration and users aren't going to spend the time configuring their notifications.”

If you haven’t tried it already, you may be interested in iOS 15’s new “Focus” feature set. Even tackles that last point on set up.


I simply solve that problem by having most notifications be silent. Only urgent ones (calls, text messages from a select group of people) are effectively "push" notifications. The rest buffers until I get to looking at my phone.


This is where Apple's paternalistic view takes over. I think Apple's design view is that you the user shouldn't have to take action to protect yourself from sites abusing this feature; the Google view is that it's a free-for-all. There are merits to both approaches and I wish users of both platforms would understand that.


Notifications on all Android browsers I've tried are opt-in.

On the subject of notifications in general, Apple lets their users be spammed more by not adding granular notification categories like on Android.

Your comment doesn't reflect reality. If Apple took notification curation seriously like you imply, I would seriously consider switching to their smartphones, but they don't.


I think you need to take a look at iOS15 features. Not on par but a great start.


> This is where Apple's paternalistic view takes over. … There are merits to both approaches…

Isn’t it generally accepted by both small children and adults that having a Dad is better than not?

Sure, teenagers rebel for a while against anyone caring about them, but they tend to grow back out of that as soon as they have to parent too.


I don't feel like that at all because on all of my devices, notifications are opt-in. I get notifications for services I want to receive notifications from (mostly IM).

The exception to this are services which trick you into accepting their notifications because they are beneficial to the experience while using them (ride hailing and delivery apps), and then use the same channels to spam you with marketing (Uber, ...). For those I simply chose to disable notifications altogether and instead let the drivers call me if I'm ever late.

E-mail fits that description better, because you almost always have to give out your address to register (even with Google SSO & co), which is then used for spam and you have to opt-out after the fact.


> I don't feel like that at all because on all of my devices, notifications are opt-in. I get notifications for services I use (mostly IM).

They aren't granular though, you either opt in to all or you get none. This is particularly annoying when you're using something like LetGo or OfferUp, because you basically need to opt in to receive messages from people interested in buying your second-hand stuff. But once you do, they take the opportunity to spam you with engagement boosting ads querying you if you have anything you want to list or telling you "we've selected these sales for you."

It's an extremely unpleasant design pattern for anyone who values their own time and attention.


>This is a personal opinion, but I'm actually really glad Apple are dragging their feet on this.

>I have my browser set up to wholesale reject all push notifications

I am struggling to understand the reasoning here. You don't like feature X but fortunately there is a configuration to turn it off. But you would still like it taken away from everybody else?

Perhaps I am misrepresenting your position, could you clarify?


I don't want to normalise endless notification spam, which is what having this feature available in Google's advertiser-friendly conception of it ultimately does. I'm already considered a bit strange by my friends for muting all group chats and disallowing notifications for things; we don't need to entrench this appalling social norm any further.

I find the idea that everyone should be instantly available at the click of the fingers to their friends, colleagues, managers, and crapware on their phones absolutely revolting. Push notifications are just interruptions as a service, out of band communications like that should only ever be used for important things but they just get endlessly abused as a form of spam in practice. The advertising industry already has enough tendrils to pry its way into our lives, I'd deprive it of yet another if it were up to me.


Isn't that what Apple's iOS notifications already do?

What's the justification for allowing it for a certain class of apps, but disallowing it for another class of apps, where the difference between the 2 categories of apps are just the technical infrastructure on which they are based?


It's a difference in philosophy for the most part. Google are an advertising company first and foremost, they'll do whatever is best for the advertising industry, and to Google allowing every Tom, Dick, and Harry on the web to shove notifications in people's faces is great. Apple take a more paternalistic view: they see push notifications in the browser as a feature that's often abused, and so remove it entirely to protect their user experience from annoying behaviour that's outside of Apple's direct control. Notifications originating from apps in the App Store can in theory be vetted much more closely and removed if they're deemed to be spammy.

Neither is intrinsically right and I think it's good that there's choice, what's winding me up is this idea that everyone should adopt the Google model unquestioningly as though Google are a standard unto themselves. I'm very much in the Apple camp, but it's not out of any love for Apple and more that I don't like the advertising industry and want to keep it as far away from my life as I realistically can. To me push notifications are like popups in the '00s, yes there's legitimate uses in limited cases but abuse is so widespread I'm going to treat every one as guilty until proven innocent.


There is a huge difference between being able to turn off feature A, which might be a very useful feature for many, and not even being able to enable feature A because of money. This case is about the latter.


I think depriving users of that functionality is more than worth it if it also deprives the advertising industry of yet another way to make everyone's life worse.


Thanks for your clarification, I understand your reasoning now


Push notifications are also a terrible feature that I don’t want on any of my browsers including desktop Safari. I don’t want developers to even have the option to show me a prompt to enable them.


I use notifications all the time in Firefox and Chrome. They're very useful for receiving updates about things like replies to comments, new videos from a certain video service and actual notifications from a few web apps.

Terrible websites are no reason to leave a feature out on one platform but not another, just a reason to disable it by default. You can already do that in every decent browser; it'll deny access prompts without even bothering you. There's no reason to take the option away from everyone because a few people don't like it; that's what settings are for.


> They're very useful for receiving updates about things like replies to comments, new videos from a certain video service and actual notifications from a few web apps.

The service formerly known as RSS/Atom.


I like having them for certain sites, but in Firefox at least, you can block sites from even requesting them if you really never want to use them.


That notifications are annoying is a myth.


Speak for yourself, sharing an open office with a person whose phone resembles a casino with all the pinging and flashing is profoundly irritating, as is having to sort through a Sisyphean list of pointless alerts to the point actually important things get missed.

I disable all notifications for everything by default, I'll look at your software on my schedule and nobody else's.


The next advancement after some of the first web browsers was pop-up blockers.

So, at the very least, someone was annoyed at “notifications”.


I’ve been using Android for some years, and notifications are a non-issue. Yet it always comes up from safari apologists.


Novice users do not even know what it means to approve a notification. I look at their devices and often see a ton of web notifications that are objectively spam (eg, "tap here to meet hot singles"). The web notification spam is a huge issue.


Don’t you think that notifications and all the associated things are what's pushing people to iOS?


Not at all. On Android you've had the ability to mute notifications by type for some time now. Every app that spams me gets its "ad group" notification group muted. I hear that iOS 15 is going to include something similar.


I hope they never implement them on mobile Safari, those things are a pain in the backside on the desktop.


In what way?


I see you are new here, would you like to receive notifications from us? You need to decide on it before even looking at our offerings.

NO

Okay, what about signing up to our newsletter?

NO

Okay, we will track you now all over the internet but if you like you can manage that.

JUST GO AWAY

Okay, btw to see the rest of the article or any of the pictures you need to create an account first.

---

Web technologists are clueless about what the Web has become. Please let browsers remain HTML and CSS viewing tools, and make an app that I can review on the App Store page before committing to install it.

I hope Apple never implements website notifications on mobile.


You could just disable the notifications? You can do this in pretty much any browser.

https://support.apple.com/guide/safari/customize-website-not...


That doesn't mean the website won't ask. Because they have a single chance, the real API is only called after they receive your consent through an HTML dialog that looks similar to the native one.


If you disable them, they are not visible. If a website shows you a modal that looks like a notification consent prompt, I hardly see what that has to do with a browser vendor, which is the subject of this thread.


Because right now, they don't render that modal on Mobile Safari since there's no notification support. If they add notification support, we'll start getting those popups.
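The "soft prompt" pattern being described can be sketched roughly like this: the site shows its own HTML dialog first, and only calls the one-shot browser API once the user agrees there. The function and storage-key names below are illustrative, not from any spec:

```javascript
// Pure helper: don't re-nag a user who already declined the in-page
// soft prompt, and never burn the real browser prompt (a native denial
// generally can't be asked again).
function shouldShowSoftPrompt(store) {
  return store["notif-soft-prompt"] !== "declined";
}

async function onSoftPromptAccepted() {
  if (typeof Notification === "undefined") {
    return "unsupported"; // e.g. iOS Safari of this era, or a non-browser runtime
  }
  // The native permission dialog only appears at this point,
  // after the user said yes to the site's own dialog.
  return Notification.requestPermission();
}
```

This is exactly why adding notification support would surface those in-page modals on mobile Safari too: the pattern lives in site code, not in the browser.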


You could then use Brave which blocks in-website ads and annoyances even on iOS.


Would you like to be notified about meetings in your calendar?

Yes

Would you like notifications when someone mentions or DMs you in your work chat?

Yes

Would you like to know when you get emails?

Yes

Also only one of the things you mentioned has anything to do with notifications.


Meetings in my calendar, maybe, but I don't need them in the browser; I use a native app because I care about resource usage.

The rest of them, definitely not. Interruptions are extraordinarily costly. I'll respond within a few hours and when I message you about something, I have the same expectations.


No No No


We could have had a passwordless internet if people behaved nicely. Instead of signing in to Gmail with a password and 2FA code, typing your e-mail address would have been enough to see your e-mails, if everyone was honest and never misused this power.


The nice thing about open standards like web push notifications vs. proprietary apps' arbitrary logic is that the open standard version puts users in control.

The browser is free to refuse every notification request AND make websites believe the notifications have been enabled. Unless smartphone OS vendors are kind enough, you can't do that with app notifications.

It seems you're arguing against the concept of push notifications itself, not whether the native or web app implementation is better.


We wouldn't need authentication then...


I agree that those patterns are annoying but installing an app is at least as many clicks and in fact I may have to search through a few apps to make sure I am installing the right one and not some ripoff that Apple tries to lure me towards with their broken search.

PWAs could be allowed to use things like the Push API only if added to the home screen, and this problem would be completely solved. You don't like browsers becoming app platforms, but I am sorry to say that that ship has sailed on desktop, and while Apple is fighting it on mobile, it's a fight they will lose eventually.


Installing an app gives you a chance to consider the option first; with websites, you are required to download and install it to take a look and see what it is all about.

On the other hand, I wouldn't have a problem with the model where PWAs' notifications and other functionalities are limited to apps added to the home screen, and the options to manage those are exactly the same as with native apps. I don't advocate that all apps must go through the App Store.


As one of the rare Safari desktop users: the most useful key-bind I made was command-shift-r to bring up reader mode. A lot of the "growth hacking" metrics have made the web near-unusable, but reader mode goes a long way towards making it tolerable.


The thought of websites sending me notifications is reason enough for me to want either an app, so that I can easily control its notifications, or to just not use the service at all.


A magazine article doesn't need to be able to send push notifications.


This is it.

This is the single most user-hostile restriction I’ve seen on iOS.


I think what developers are angriest about is that it's buggy. And no, it isn't only the lack of cutting-edge tech. It doesn't even handle CSS2 correctly in many places. You can run into bugs in every damn common CSS or JS property that even IE handles correctly. It is a true IE successor. Can you imagine how much effort it takes to make an input field on a full-screen container work properly on Safari? You need several workarounds to make it work properly. Even IE doesn't really require that…


It wouldn't be nearly as bad if it were possible to debug Safari issues on other platforms. Instead, you either have to own a Mac, or use something like Browser Stack. At least Chrome is cross-platform.


Actually, mobile Safari integrates nicely with desktop Safari. You can access mobile Safari through the developer tools in desktop Safari and debug as if it were the desktop one. It's a shame that Safari is not OSS.


Did you mean to reply to the other comment? Desktop Safari still requires a Mac as far as I'm aware...


Yes, it is unfortunate that it requires a Mac, but a Hackintosh is always an option :) Many years ago it had a Windows version, but it was discontinued.


Yeah, I don't really find installing a pirated operating system in order to run a web browser an acceptable option.


I've installed a lot of operating systems on my machines. Installing macOS was by far the MOST painful.


I don't think you can debug issues on iOS for that matter. Perhaps it changed, but at least a while ago this wasn't possible. And people say you can do work on iOS...


We are not angry about the lack of toys. We are angry about not having the possibility of using PWAs as an app model. PWAs are useless without Apple's support.

My case was a WebBluetooth based download of data from a web page. Now I have to maintain two apps. Thanks.


> My case was a WebBluetooth based download of data from a web page.

Web Bluetooth is not a web standard. It’s a Google API that only Blink supports. Firefox doesn’t support it either and doesn’t plan to:

> This API provides access to the Generic Attribute Profile (GATT) of Bluetooth, which is not the lowest level of access that the specifications allow, but its generic nature makes it impossible to clearly evaluate. Like WebUSB there is significant uncertainty regarding how well prepared devices are to receive requests from arbitrary sites. The generic nature of the API means that this risk is difficult to manage. The Web Bluetooth CG has opted to only rely on user consent, which we believe is not sufficient protection. This proposal also uses a blocklist, which will require constant and active maintenance so that vulnerable devices aren't exploited. This model is unsustainable and presents a significant risk to users and their devices.

https://mozilla.github.io/standards-positions/#web-bluetooth
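For reference, the Blink-only flow the parent comment relied on looks roughly like this. `battery_service` and `battery_level` are standard GATT names from the Web Bluetooth spec; the function names and guard logic are my own illustration:

```javascript
// Pure helper so the capability check is testable anywhere.
function supportsWebBluetooth(nav) {
  return !!(nav && nav.bluetooth);
}

async function readBatteryLevel() {
  const nav = typeof navigator !== "undefined" ? navigator : null;
  if (!supportsWebBluetooth(nav)) {
    return null; // Safari, Firefox, or a non-browser runtime
  }
  // Must be triggered by a user gesture; the browser shows a device chooser.
  const device = await nav.bluetooth.requestDevice({
    filters: [{ services: ["battery_service"] }],
  });
  const server = await device.gatt.connect();
  const service = await server.getPrimaryService("battery_service");
  const characteristic = await service.getCharacteristic("battery_level");
  return (await characteristic.readValue()).getUint8(0); // 0–100
}
```

The `null` branch is the whole dispute: on WebKit and Gecko it is the only branch that ever runs.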


Yes, and from a user's perspective I don't care. I find it eerie having the browser be an app platform.

Chrome and Android are very dominant platforms, if it is something groundbreaking that can be done in that model it would be done in Android and Chrome and Apple will adopt it. Google is not an underdog.


I have the exact opposite view. I have tons of apps on my phone that really have no business being apps, they should just be websites but with push notifications.

For example, I'm not a huge traveler (obviously not in the past year), but even before the pandemic I'd use AirBnB like once a year maybe. I don't want to download an app for something I use once a year. But before and during my trip push notifications are important because it's basically a messaging app with the owner.

I'm not talking about things like games or something else with complicated UIs, I'm talking about basic CRUD functionality where I occasionally want a push - most of my banking apps fit this as well.


I agree that many apps have no business being apps and are just low-effort attempts, however I still don't want websites asking me if I want to receive notifications on mobile.

Website installation process is very annoying already. I don't want more. Do I want you to track me? No. Do I want to sign up to your newsletter? No. Do I want to create account to see the links? Please no. Do I want to receive notifications? No.

I need to go through all that just to view a page on a website. Apps are much better; I don't have to install one before seeing what it is all about, thanks to the App Store page that includes a description, screenshots and reviews.

If there's something that I need to be notified, websites will send me an e-mail and the mail client will show me a notification about that e-mail which will result in me being notified.


> I have tons of apps on my phone that really have no business being apps

I don't like them being apps because of the amount of privilege they gain. But if every site I visit on the web gains those privileges? That's a step backwards for me.


What is the hesitation about just downloading an app for a week or two? The allergies to apps ITT are very interesting to me


And as another user I care. I don't want to install yet another app to clutter my phone just to access something once that could be a webapp. The browser is an awesome delivery mechanism for stuff I'm going to use once/rarely: no install or uninstall needed.


I haven't seen a single website that doesn't require installation.

The website installation process is much more painful than apps because first you need to install it to see it.


What website requires installation? The ones I use all work perfectly fine just from the web UI with the option to install them?


All websites (with the exception of HN, maybe?). Installation doesn't strictly mean putting the website on your home screen. Whenever you encounter a new website you are required to go through an installation process; you can't simply click a link and consume the content anymore.

For most websites you need to read and accept the tracking terms and if you are not happy with the default options you need to do custom installation.

For the majority of the websites, the next step is to decide if you like to sign up for e-mail marketing or create an account. Many will then ask you to follow them on their social media. On desktop, they will ask you to receive notifications too.

After 2 to 5 clicks you have your website installed and can start looking for the content in between the ads. They don't have many monetisation opportunities, so you will get all the ads at once.

The installation is also not very persistent; if you are not using the website daily, you will go through the same thing the next time.


I wouldn't call that "installation", just "setup". Installation implies some form of (local) persistence.

I avoid setup screens as much as possible. I close many of the websites linked to on HN but I'm not accepting these newsletter popups and neither should anyone else.

Websites will keep doing this for as long as people accept this behaviour. Depriving our browsers of useful features is not a solution; it's merely burying our heads in the sand and pretending the real problem, problematic and aggressive marketing and data trade, doesn't exist.


> Installation implies some form of (local) persistence.

You achieve local persistence to some degree through cookies and local storage.


"Installing a website" is something I might expect my 70-something year old mother to say. Oh brother.


Me and your mother might have many things in common, but I doubt that the reason for calling it "website installation" is one of them.

I would say that if you were familiar with the web technologies you would have seen the parallels.

When you type an address of a website or click on a link your browser would download the website data from a server, which is very similar to what happens when you tap "Install".

Once your website data is downloaded, the browser would execute the instructions in that data. Usually further resources would be downloaded, UI drawn, and functions executed. This is again very similar to any app that you run once downloaded.

Installation of an app implies making it ready to use. You can download an executable that runs right away, or you can have an executable that prepares the environment for use. This is again exactly the same with visiting a website: it might be ready to use right away or it might need a set-up of the environment.

In 2021, most websites need a setup process before use: you need to set up the ways you are about to be tracked and what information is to be shared with their partners. You need to decide if you would like to sign up for the newsletter, if you would like to follow their social media accounts, and more often than not you will be required to create an account. You will be prompted to set up notifications too, if the browser supports it.

Often the native app installation is less tedious and less painful than the website installation process, because the app can afford to assume that the device is a personal one and the stored data won't be erased accidentally; therefore the account creation process can be both transparent and persistent.

With native apps, you get the chance to see the app before installation (App Store pages with description, screenshots and reviews); with websites you first need to finish the installation and then you get the chance to see the website.

Anyway, I hope you see how a website could be "installed", and that this is in fact the standard modus operandi in 2021.


I understand your desire to build a web app with Bluetooth access, but there’s just too many scummy adtech/malware assholes out there that would use WebBluetooth as another way to do their nasty deeds. It cannot be done in a safe fashion, so better to not do it at all.


>We are not angry about the lack of toys. We are angry about not having the possibility of using PWA as an app model.

I, on the other hand, am angry that PWA is even considered as an app model.

A regression, if I ever saw one...


Fair perspective.


PWAs are useless.


== != === If every analogy was held up to this level of scrutiny, we wouldn’t have analogies. Something can be similar in nature without being an exact equivalence.

Like many others have stated, it's common these days for devs to create modern apps that work on modern browsers but break in weird ways on Safari. That's enough of a similarity for the analogy to convey the idea.

WebP / WebGL support aside, Safari often fails to render complex SVG animations for obscure reasons. It can't even render a hex-based CSS radial-gradient that looks the same as it does on Chromium or Gecko.

It’s really bad, and Apple benefits from that. They are holding the web back as IE once did in the past.


Shouldn't that be == !== === if we're talking about our analogies with JavaScript?

Though maybe I just proved your point about missing the forest for the trees :-P


Brendan Eich doesn't understand equality.


Indeed, there's a difference between “looks equal enough if you squint sufficiently” and “actually equal”. Apart from the difference between equality and identity.
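For anyone squinting at the operator joke above, the real semantics are easy to demonstrate: `==` coerces its operands, `===` does not.

```javascript
// Loose equality coerces types; strict equality compares type first.
console.log(1 == "1");           // true  – string coerced to number
console.log(1 === "1");          // false – types differ
console.log(null == undefined);  // true  – special loose-equality case
console.log(null === undefined); // false
console.log(NaN === NaN);        // false – use Number.isNaN instead
```
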


What does that mean?


It means Brendan Eich is opposed to same sex marriage: Brendan doesn't believe that same sex couples should have equal rights as opposite sex couples. There's something about equality that deeply disturbs and confuses Brendan, so much that he donated a large amount of money to the cause of tearing apart families by destroying legally married loving couple's existing same sex marriages in California, while Brendan and his own opposite-sex wife enjoyed all the benefits of marriage themselves. Brendan Eich believes that his own marriage is greater than, not equal to, same sex marriages of lesser people.


Biden, Obama and Clinton were against same sex marriage until they "supposedly" switched. You can't judge a person based on what's reported about them. Have you read or talked to Brendan about what he thinks and why he thinks it? Being against same sex marriage does not necessarily mean he thinks same sex couples are lesser people or that he's greater than them; it means he has an opinion, people with means and opinions use their means to argue for their opinions, and there's nothing wrong with that.

Your idea of equality is skewed by popular mainstream media. We should not strive for equality, because life is not equal; we should strive for fairness. If you want equality between me and an NBA basketball player you'll have to break their legs or something, because I can't play basketball if my life depended on it. So there, that's an example of life's natural inequality, which is fair.

Fairness means equality of opportunity, not the equality of outcome. Arguing for the equality of outcome is tyrannical and pure evil.


> Fairness means equality of opportunity, not the equality of outcome. Arguing for the equality of outcome is tyrannical and pure evil.

You're right; fairness means equality of opportunity. Arguing for removal of equal opportunity is "tyrannical and pure evil".


>Fairness means equality of opportunity, not the equality of outcome. Arguing for the equality of outcome is tyrannical and pure evil.

Equal rights means equal opportunity though, not equal outcomes.


Glad you brought that up. Let me point out that Biden, Obama, and Clinton switched from bad to good. That's called evolving. Trump used to be pro-gay, but then switched from good to bad, selling out gays to get elected. That's called devolving.

I prefer people who evolve like Biden, Obama, and Clinton did, instead of devolve like Trump did, by selling out to haters in exchange for political power.

As far as I know (and I've asked people who've known him for a long long time), Brendan has always been badly against gay marriage, with no plans of ever evolving or apologizing or even explaining himself.

And yes I've asked Brendan himself about it, and he outright refuses to explain or justify or apologize for either his hateful opinions or his hateful actions that hurt other people.

Brendan Eich went way beyond simply holding an opinion when he purposefully contributed money to the campaign to tear apart other people's marriages and destroy their families.

Brendan's money paid for TV commercials blasting lies, stereotypes, prejudices, and attacks against gay people.

Brendan's goals were satisfied: Proposition 8 passed, and marriages were destroyed because of his support, thanks to the anti-gay propaganda campaign he helped fund.

Brendan Eich knew quite well what he was doing by supporting Proposition 8, and he totally meant to do it, and he has never apologized or evolved, or even "argued his opinion" as you falsely claim he uses his means to do. He never even meant for his opinion to be known, let alone argued for it in public.

https://en.wikipedia.org/wiki/2008_California_Proposition_8

But I strongly suspect he isn't thrilled that GamerGate and the Alt-Right have made him their unwitting hero, martyr, and poster child, and that they keep spreading the lie that Brendan was fired from Mozilla because of cancel culture. Brendan actually resigned of his own free will; he has said repeatedly himself that he resigned and was not pushed out; the Mozilla board said "Brendan voluntarily submitted his resignation", and the board actually begged him to stay. This is all very well documented and not contested except by batshit crazy conspiracy theorists.

https://blog.mozilla.org/en/mozilla/faq-on-ceo-resignation/

>Q: Was Brendan Eich fired?

>A: No, Brendan Eich resigned. Brendan himself said:

>“I have decided to resign as CEO effective April 3rd, and leave Mozilla. Our mission is bigger than any one of us, and under the present circumstances, I cannot be an effective leader. I will be taking time before I decide what to do next.”

>Brendan Eich also blogged on this topic.

https://brendaneich.com/2014/04/the-next-mission/

>Q: Was Brendan Eich asked to resign by the Board?

>A: No. It was Brendan’s idea to resign, and in fact, once he submitted his resignation, Board members tried to get Brendan to stay at Mozilla in another C-level role.


Brendan Eich is the creator of JavaScript.


Yes, modern apps will randomly break on Safari. They'll also randomly break on Chrome and Firefox in other fun ways.

Safari isn't perfect but neither are the other browsers. And as far as not supporting every little thing Google decides to shove in the browser, good! I'm glad Google is working to push out new ideas for the web, but I'm also glad Safari is keeping them honest by not providing wide support of those ideas until they go through a standardization process.


I call Safari the new IE because like IE, Safari requires specific code and workarounds for itself quite often. Unlike Chrome or Firefox.

You also need an iOS device and a Mac to debug Safari, just as you needed a Windows computer to debug IE.


>This is just false, or at best only half of the story to make a false equivalence. IE was cutting edge until Firefox.

After developing during the IE 5.5/6 years there's no way I'm giving you a pass in calling IE cutting edge.


At the time IE 6 was released, it was the best browser available. The problem was that Microsoft acted as if they had “won” and stopped updating it for years, while it calcified and became the hot mess we all knew and disliked.


IE 6 for Windows was never the best browser. At that time Microsoft's own IE 5 for Mac was way better. It was written by a different team (including almost celebrity programmers like Tantek Çelik). It had way better support for CSS. It even supported PNG alpha transparency.


> At that time Microsoft's own IE 5 for Mac

Yes, it was better, but it didn’t really count, as Apple had just emerged from its near-death experience and Steve Jobs had cut the deal with MSFT to make IE the default browser for the Mac.

Apple’s marketshare was tiny back then and many people in the industry didn’t think Apple would be relevant much longer.

Ironically, after the 5-year deal had expired, Apple shipped Safari 1.0, which was better than the vaunted IE 5 for Mac.


From what I've heard, MS's whole idea was that ultimately, "the web" was broadly going to be scrapped. What people were actually going to be using, "in a few years", was going to still be IE, but the webpage would just be a thin wrapper that would load in ActiveX controls delivering actual windows apps, in the browser.

Apparently they got close enough to getting this working that a bunch of South Korean banks and such bought into it heavily, which meant you could only do online banking in that country, for a while, on either a windows phone or an actual windows machine.

However, they narrowly missed getting developer buy-in on that stateside, before the whole "XMLHttpRequest" thing went crazy, and we started building javascript apps for everything.


That I never got. You are a market leader and you just stop whatever you are doing? How did that happen? Arguably the biggest mistake of MS.


At the time, and perhaps still today though greatly lessened, Microsoft was used to both becoming the victor by crushing the opposition and then staying the victor because no one dared take a second swing. Anyone else who thought about challenging usually thought better of it and went and carved out a quiet niche elsewhere.

Besides, nobody got promoted at Microsoft for staying the course and making incremental improvements. Once there were no more balls to be knocked out of the park, developers moved elsewhere to buff up reviews and get bullet points for major projects.


What a story… It’s crazy: when no one owns something, it just gets abandoned. If there ever was a time for Microsoft to have done things differently, it was then. It might even have helped them transition to mobile better.


IE6 was wrapping up development when the US government sued to break up the whole company for giving it away with Windows. Perhaps not a coincidence they stopped work then?


At its birth, IE 6 was absolutely cutting edge.

By its death, IE 6 was absolutely stagnant and had fallen behind.


IE was the blunt and dangerous cutting edge you get when you haven't thought to buy a new knife yet.


Perhaps you missed it by a year or two, but they invented Ajax.

If that's not cutting edge, I don't know what is.


People complaining about Safari also seem not to know how much it brought when first iPhone was released. Canvas, CSS animations, etc.


No single organisation has done more to advance the mobile web than Apple. Can you think of any?

The first iPhone and “desktop class” Mobile Safari was a watershed moment and everybody started using WebKit for their mobile browsers after that point, including non-Apple devices.

Look at what the mobile web was like before the release of the iPhone. Then take a look at what it was like the year after, when people had started making their websites “iPhone compatible”.


Maybe BlackBerry? It wasn't HTML they innovated on, but I loved mine before the Samsung revolution.


> IE was cutting edge until Firefox.

I was there. I call bullshit. Competitors like Opera and IE for Mac (a completely different codebase and product) could do alpha-transparent background images, CSS sibling selectors, display:table-*, and dozens more. Those of us Web developers strong-willed enough to heed A List Apart's call for standards-compliant code were able to circumvent the lack of "cutting edge" features – typically published years earlier – by employing HTC <http://enwp.org/HTML_Components> and hiding it from other browsers with conditional comments. Nowadays this would be called a polyfill.

The reality is that IE suffered immense neglect.
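For the record, conditional comments were IE-only markup that every other browser treated as an ordinary HTML comment, which is what made them a safe delivery vehicle for IE-specific fixes. A sketch (the filenames are illustrative, not from any particular project):

```html
<!-- Every browser loads the standards-compliant stylesheet. -->
<link rel="stylesheet" href="styles.css">

<!--[if lt IE 7]>
  <!-- Only IE 6 and below parse this block: attach an HTC behavior
       as a proto-polyfill, e.g. to fake PNG alpha transparency. -->
  <style>img { behavior: url("pngfix.htc"); }</style>
<![endif]-->
```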


I recall having an orange iMac on my desk specifically for testing, because IE 5.5 on the Mac was the most standards-compliant browser on the market. Can't remember the exact year, but it was pre-millennium I think.


Exactly. In order to call something cutting edge, it should be sharp. IE was frustratingly dull.


> The biggest problem with IE was all the insanely weird shit it did, not missing features.

This depends entirely on what decade you focus on. There was a significant period of time where even the "cutting edge" version of IE was the furthest behind on implementing new standards. When you combine that with the way IE handled updates and versions, older versions of IE created huge problems for developers.

So there is an era of IE that is very similar to Safari today.


I disagree. The article author is likely referring to the period when IE 6 was the major browser on the market (2001 to 2009), a time many of us fondly remember as the web development ice age. It absolutely did lack features present in other browsers, such as SVG support, transparent PNG support, the full scope of CSS 2.1, countless standard JavaScript APIs (remember why jQuery existed?), etc. Rendering bugs added insult to injury, but the lack of features was real. Microsoft didn't release IE 7 until 2006.


> IE was cutting edge until Firefox. The biggest problem with IE was all the insanely weird shit it did, not missing features.

This is completely incorrect. IE lacked CSS2, it lacked proper PNG support, it lacked HTML5 tag support for ages, its DOM implementation was missing functions for the DOM spec that it implemented, it lacked SVG support entirely, it didn't have meaningful developer tools (no, the browser toolbar didn't count because it crashed so much) until IE7... The list goes on. It's simply revisionist to say that IE wasn't missing features. It was nowhere close to the cutting edge, even when its releases were cut.

IE lacked just as much as it got wrong, even at the time.


Thank you. I have no idea what these people are talking about. They obviously didn't actually do development back then.

> Apple took years to finally add WebRTC support to Safari, far enough behind Chrome and Firefox that it practically became a running joke among developers and even industry observers.

Really? Because no one I know uses WebRTC; they'd rather use ffmpeg and sockets in a desktop application to do serious work like that.

> But at the same time, the lack of support for key web technologies and APIs has been both perplexing and annoying at the same time.

Yeah, what we all need is Web Bluetooth.


Lots of sites use WebRTC for video and audio calls. Google Meet, Discord, Slack for example. On desktop it's far more secure to use those in-browser than via desktop apps.


Yeah, and those all suck up my RAM. Color me nonplussed that WebKit doesn't follow in Chromium's footsteps.


The criticism was that "no one uses WebRTC", not that WebRTC was inefficient. Clearly millions of people do use it, so you aren't refuting the point in any way.

Perhaps if Apple had supported WebRTC earlier their engineers could have influenced the way it works to make it better. That's the main problem with ignoring standards in an org the size of Apple - you don't get to contribute to them. I don't mind that Safari runs behind the other browsers because progressive enhancement enables me to deal with that. I do mind that Apple are failing to be a meaningful part of moving the web forwards, and are basically letting Google do whatever they like.


Millions of developers are not using WebRTC. Not sure how you read my post, but clearly you wanted to put your own spin on it.

Are millions of developers using Widevine, too? What kind of stupid argument is that?


>Clearly millions of people do use it, so you aren't refuting the point in any way.

Are forced to use it. And millions more would be forced to use it if Safari handed it on a plate as well...


The overwhelming majority don't know, or really care, that they're using it. People don't care about tech. They care about what they see. User complaints will never be "WebRTC is bad!". They'll be things like "Google Meet kills my phone battery!", or "Teams doesn't work on my iPhone!". The solution to those complaints is usually to make the underlying tech better rather than chuck it out entirely and use something else.


>The overwhelming majority don't know, or really care, that they're using it.

Which is why as devs we have a responsibility to save them from bad tech.

That said: "PWA: because people don't care about tech" would be an apt marketing slogan.


Safari may be better than Chrome for backgrounded tabs or throttling when idle, but I can assure you its RAM usage is still quite high for many things, WebRTC included. I'm currently grappling with some OOM issues even on the latest iPhone 12 because Safari will wildly spike in RAM usage for certain things, especially around video and iframes (Nothing to do with JavaScript heap, this is webkit engine level stuff.)


Discord in particular, is an example of Electron done right. It's super performant and low impact on resources. MS Teams, made by a vastly richer company, is an example of the opposite.

It's not Electron that's eating your RAM.


Visual Studio Code was also done right but it seems to get worse with patches.


So, it's not just me that noticed it's getting worse.


I don't see how that's an argument for not supporting WebRTC...


Feel free to not use those apps. But a lot of people do want to use them.


When people say that Safari is like IE they aren't talking about 20 years ago when IE was dominant in its prime, they are talking about the geriatric IE era when it was an also-ran that you had to support but didn't implement modern features such as flexbox properly etc.


Literally hundreds of millions of people are going to make calls through Teams/Slack/Zoom & WebRTC today alone, grandpa.


> Apple took years to finally add WebRTC support to Safari, far enough behind Chrome and Firefox

"Far behind Chrome" should never be used in a sentence. Ever. Chrome releases up to 40 new "standard" features once every two months, many of them are just internal Chrome APIs with a "standards spec" in the form of a draft.

Specifically for WebRTC. It was shipped in Chrome 23, in May 2012.

Guess what? Here's the timeline:

- In 2010 Google buys a company called Global IP solutions with some proprietary tech

- In May 2011 it opensources this tech

- In October 2011 publishes the draft spec

- Just 7 months later ships it enabled by default in Chrome

Do you know when the final stable spec arrived? In May 2018, six full years after Chrome enabled it by default.

That's why Safari only implemented it in September 2017, once the spec was finally in proper shape.

No idea why Firefox rushed headlong into it though.


how do specs get to 1.0? browsers heed the call to implement & do so. they work through what it takes to improve & finalize the spec.

your timeline, to me, indicates less that the spec was rushed & bad, and more that it took apple 5 years to get off their hostage-taking kick & finally face the music & do what should have been done years ago, which is implement. once apple did start to implement, 1.0 came soon after: because that is how specs work. companies implement, they find what's not right, & they improve, then make a 1.0.

I would definitely be interested in a changes over time review of webrtc. my feeling is that the core ideas & protocols were largely stable, but needed tweaking & elaboration, needed conformance suites elaborated. I could be wrong. this is in contrast to something like spdy, which evolved seemingly a good number of times in major ways, before ultimately getting parlayed into quic+http3.


It seems strange (and fascinating) that everyone has different memories of the IE days.

My problem with IE was mostly its (lack of) development.

>IE was cutting edge until Firefox

It was cutting edge up to IE 5, and IE 6 only fixed some of IE 5.5's issues. Remember, these were the early days and there were thousands of low-hanging fruit. It wasn't even a standards-compliance problem (as much as I would have liked to write one piece of DHTML that worked across all browsers); like many here have stated, M$ decided to do things differently, and I actually would not have minded that if their version had been better (some features were indeed better). But they stopped. IE 5.5 was already a relatively small change from IE 5, which was the really big release. And IE 5, if I remember correctly, came out before 2000.

I would not have been so pissed about IE if M$ had at least released some updates. It wasn't until the security issues got really serious that they decided to move their ass and do something. So from my point of view Safari, despite its slow release cycle, is still a trillion times better than IE was. I would normally have stood up for Safari on the PWA and "Safari is the new IE" issues, but the pain of the App Store led me to change my stance. And to add insult to injury, they even claimed PWAs as an alternative to native apps during the trial.

Apple used to have a very coherent strategy and message, even down to the way things were discussed in court. Now it is a pile of mess.


> Chrome is closer to IE than Safari because of market share AND developers developing exclusively for it forgetting anything and everything else.

^^^ This! Yes, exactly! ^^^

I'd have thought the world had learned its lesson with developing websites exclusively for IE, but apparently not. When you develop a website to make use of a single browser and break in (m)any others, you're purposely attacking the way the Web works. It's all designed around standards for a reason.


Both stories are correct.

IE was far ahead of its time in the beginning, bringing us advancements like XMLHttpRequest, which set the stage for websites to become webapps. Eventually, though, the entire team behind it was disbanded and dissolved into other divisions, which let the technical debt accumulate into the eventual mess.

What made it so bad is that web technologies started to progress faster and faster, constantly building pressure on IE to adapt, but it just couldn't be made to work across all of the integrations with Windows, plugins like ActiveX, historic compatibility promises, enterprise policies, and newer rendering stacks used by Firefox and Chrome. Perhaps they should've retired several years earlier or brought out Edge sooner, but the IE story has both a terrific rise and a crazy fall.


Anyone remember mobile safari's CSS 100vh 'feature'?

https://medium.com/rbi-tech/safaris-100vh-problem-3412e6f137...


Yeah, I hadn't needed to target safari as a supported platform until I was building a public facing website (as opposed to intranet tooling), and this was an amazingly frustrating feature to work around. Solidified my hatred for safari, that's for sure.
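For anyone hitting this today, the commonly circulated workaround (a sketch – the `.full-height` class name is just an illustration) pins the element to the visible viewport instead of trusting 100vh:

```css
/* Fallback: 100vh, which iOS Safari measures with the toolbar retracted,
   so content can end up hidden behind the collapsing bottom bar. */
.full-height {
  height: 100vh;
}

/* Target iOS Safari only (-webkit-touch-callout is an iOS-WebKit property)
   and fill the actually visible viewport instead. */
@supports (-webkit-touch-callout: none) {
  .full-height {
    height: -webkit-fill-available;
  }
}
```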


> biggest problem with IE was all the insanely weird shit it did, not missing features

Biggest problem with Internet Explorer was it was slow. That's largely the result of the weird shit you mention. But if Microsoft had kept IE fast, it would have denied the field to Chrome.


Was IE outdated or was it the version the laziest IT department/sysadmin kept deploying that was old?

I remember someone telling me they couldn't use anything else other than IE6 at a non-tech job years after it was deprecated and at least two major versions behind.


Is it fair to say that Safari is Apple Maps Bad?

https://www.youtube.com/watch?v=tVq1wgIN62E&ab_channel=Dames...


Could coin it Reductio ad IErum


Mobile Safari still has a bug in Web Audio when connecting an AnalyserNode to a media element source (the frequency array stays blank the whole time) that has been reported, AFAIK, since 2015 if not earlier.

Mobile Safari might not be the IE of the modern day, but the lack of support for core standards and the focus on random shitty features (like hiding full URLs in the address bar) is akin to how M$ managed IE.


Apple has its own opinions about what web standards it should implement. It’s as simple as that. Google has been extremely successful in convincing developers that browsers that don’t behave like Chrome are behaving wrong.

Don’t get me wrong, I wish Safari had more features, fewer bugs, and an evergreen release cycle. It’s clear Apple do not see themselves as competing with Chrome, largely because of iOS, and I do want them to take it more seriously. But the people who write Chrome explicitly do not care about other browsers, and think that the world would be a better place if Chrome were the only one. Between that behavior and the whole AMP fiasco, I’m surprised so much of the blame gets thrown at Apple for destroying the web. Am I supposed to be writing PWAs now, or AMP pages? If Reddit can’t figure it out, how the fuck am I supposed to?


You’re talking as if the W3C or even Mozilla don’t exist. This isn’t an Apple vs Google thing. This is an Apple vs web standards thing. That’s why Safari is outdated for developers.


W3C hasn't been involved in HTML "standards" for over ten years now (they're hosting the CSS WG, though).

Don't get me wrong. I think WHATWG (Domenic et al, and Hixie before them) worked very hard to bring HTML forward. And the idea of having a loose workshop of "browser vendors" agree on how to evolve browsers might have looked like a good one at the time. But it is nevertheless not what standards are about, especially when it drives "browser vendors" out of business or out of producing browsers altogether (such as Opera, Hixie's original employer, and even Microsoft).

Also, lending credibility to such an effort hasn't helped Mozilla either: Firefox has fallen from its position as the dominant, innovative browser to low single-digit usage.

At a certain point years ago, work done on "web standards" became a Google stealth op to own the Web even more than they already do. As it stands, WHATWG will be remembered as the ones who killed the Web, and oddly enough Apple as the last ones standing in opposition to a web owned by Google. Certainly not webdevs caring about fscking PWAs and push notifications that absolutely nobody wants.


For the most part W3C is where standards go when they’re finished (often long after developers expect them to be available everywhere, especially these days). WebUSB for example is currently a draft, not a standard. Many of these complaints are about technologies that Chrome implements that have yet to be fully standardized.


> Many of these complaints are about technologies that Chrome implements

The article mentions explicitly: WebRTC, VP9, WebP, notifications, home screen icon shortcut, local data storage, camera, microphone, USB port, and some more.

The only one in there that is not covered by the W3C, as far as I know, is USB. All the others are proper W3C standards, yet unsupported by Apple's WebKit/Safari.

Many of these complaints are about technologies that are fully standardized and supported by all other major browsers.


Notice how these are all Google technologies?

That doesn't contradict your point per se, but it basically feels to me like Google is dictating the standards right now. So I'm not sure Apple needs to play along.


It seems like the bigger issue is that Apple isn't playing a bigger part in developing web standards. If Apple doesn't like certain features, they should fight to keep them from being standardized, instead of just refusing to implement them on their own browsers. Where else are web developers supposed to look towards when trying to figure out what's standard and what's not?


How is push notifications, local storage, camera, location, microphone and USB "Google tech"?


> This isn’t an Apple vs Google thing. This is an Apple vs web standards thing.

It’s actually and Apple vs. APIs and technologies that allow fingerprinting on the web thing.

Nobody mentions that Mozilla has sided with Apple much of the time to not implement some of these same APIs and technologies.


> It’s actually an Apple vs. APIs and technologies that allow fingerprinting on the web thing.

Of course it isn’t. Apple had longstanding (years long) bugs with IndexedDB. Their PWA implementation has been buggy from day one. They don’t implement push because it threatens their native app ecosystem, not because of privacy concerns.

Yes, there are some APIs with privacy concerns. But that’s a small part of the story of Apple’s neglect.


Then Apple should focus on improving the APIs to be privacy safe (as they successfully do in their own APIs). But they are not.

It is not about privacy. It is about revenue.


> Then Apple should focus on improving the APIs to be privacy safe (as they successfully do in their own APIs).

If you follow some of the conversations on GitHub when a proposal comes up from Google, which I sometimes do, there’s no simple way to make some of these APIs private. And when Apple proposes ways to do so, Google balks at them because… their revenue is based on advertising that tracks users across the web. It’s not rocket science.

Remember, Google is the only company that ships a mainstream browser that doesn’t come with privacy settings by default. There are no features in Chrome to block trackers or limit how 3rd party cookies are used, unlike Safari, Firefox and Edge.

Apple’s market cap is around 2.5 trillion dollars; while they make revenue from apps, it’s a rounding error compared to their overall market cap or their iPhone revenue.


I don't know why you think the market cap is relevant here. But Apple's net income in 2020 was about $60B. The total App Store revenue was about $64B, out of which Apple extracted $20B in rent.

Not all of that $20B was profit, but based on what we know about the economics of running such stores, it was about $15B of profit. That is 25% of all of Apple's profit in 2020.

Apple will do anything to protect that income stream. If the way to protect it is to cripple the web, well, you can't make an omelette without breaking a few eggs.


And yet nobody is writing such articles about Firefox. There's plenty that's not privacy-invasive and that Firefox thus does implement, that Safari does not or does buggily.


Perhaps it has to do with Firefox not being the only browser(/browser backend) available on an entire class of Mobile devices.


> It’s actually an Apple vs. APIs and technologies that allow fingerprinting on the web thing.

A sibling comment says:

> The article mentions explicitly: WebRTC, VP9, WebP, notifications, home screen icon shortcut, local data storage, camera, microphone, USB port, and some more.

Some of these things enable (more) fingerprinting, but VP9 and WebP? Nah.

I have all kinds of Apple stuff--phones, laptops, accessories--but a lot of what they do is strategic in service of enlarging their profit margins (look at right to repair, for example). I don't necessarily blame them; capitalism is a race to the bottom here. But let's not cast Apple in this light of noble privacy defenders. They're doing that to make money too.


For all practical purposes, they don't.

The W3C nowadays only ratifies standards that are already in widespread use, and Firefox is no longer relevant in our browser matrix for deliveries, with single-digit usage.


>You’re talking as if the W3C

The W3C doesn't exist, hasn't mattered for 15+ years, haven't you heard? It's not even the HTML WG that matters anymore (which took over from W3C in an ad-hoc way). It's basically Google, and that's that.


Yes. From Google's point of view they don't exist.

https://webapicontroversy.com/ and WebHID timeline: https://twitter.com/dmitriid/status/1419436102373584905

There are countless other examples


Web standards stopped mattering in 2004, these days they are just whatever Chrome implements, and like Mozilla they are kept around so that Google can pretend they don't control the web.


The fact that web standards exist does not mean they will be used or implemented. If that were the case, the web would have become the semantic web by now, given how many standards the W3C wrote about that.


If we’re talking about mobile browsers, Mozilla functionally doesn’t exist. That space is just Safari vs. Chrome(ium).

And Apple are not obliged to have 100% feature parity with Chrome.


And if we’re talking about iOS browsers, Google doesn’t functionally exist. That space is just Safari.

…hence why people complain about Apple’s neglect.


Why do you say that? Firefox on Mobile exists, and I was under the impression that on Android, it uses its own rendering engine.


It exists, but its market share is negligible. According to StatCounter, Firefox has a market share of 0.5% on mobile: https://gs.statcounter.com/browser-market-share/mobile/world...

Even on desktop, Firefox is sliding towards irrelevancy, with just 7% market share.


Honestly it's weird to me that more people don't use Firefox on Android. Only mobile browser that supports uBlock Origin.


Apart from that, it's also fast, rock solid, has its UI on the bottom (which is much easier to use), syncs with desktop if you want that, etc. In general it's excellent, and I don't understand why it isn't hugely more used – unless I concede to the argument that people just don't care about privacy and Google's monopoly...


The difference in the amount of attention invested in spec support in Chrome and Safari can be explained by the difference in business models between Google and Apple.

For Google, the browser is closely associated with their revenue streams. If you profit by showing ads across web properties, you’d naturally prefer to have more control over how those properties are viewed.

The motivation can be benevolent in some aspects (to show more ads, we want people to want to browse the Web more—so we want to support fancier tech enabling fancier websites to be developed more easily) and less so in others (we want to control the viewing mechanism to ensure people keep seeing ads; we want to become the de-facto standard of browsing so that we don’t have to worry some newcomer eating at our revenue stream; the more complex the spec, the bigger our moat).

To Apple, the browser means relatively little on desktop; on the more sandboxed mobile OS it matters more due to being responsible for a lot of the attack surface.


Even worse for Apple, if there was a more capable browser on iOS, the need for their app store would be reduced.


Reddit is a pretty low bar. I don't think they care much about the site except to make it objectively worse than the app at every convenient opportunity.


> Reddit is a pretty low bar

Reddit is a multi billion dollar product and one of the most trafficked sites on the internet.


If you've ever used their mobile site, it's clear they go out of their way to sabotage the experience so that people will use the native app instead


Shit they even admit as much with the popup for the app saying "it's better in the app".


The oddest thing is that it's still better on the web than in the Android app, which is practically unusable. You click "open in app" and then it loses the thread you're trying to read. I don't even have any strong objections to the app, but I'm happy to dance through the dark UX pattern to read the posts!


I have a curated list of subreddits I like to read. The moment old.reddit and i.reddit stop working I'll drop them like a bad habit. I'm not going to use their absolutely terrible "new" UI nor their insipid app.


As of late, if you’re not logged in, many features get disabled, like the ability to expand/read more comments.


Use old.reddit.com/i.reddit.com and many of those restrictions go away.


They sabotage the desktop site as well. I don't understand how anybody can see the current atrocity as an improvement. It's borderline unusable.


Why do they continue to degrade the default mobile/desktop web experience? Promoting the app is one thing, but actively harming the user experience? Do most people not care? Insane either way. The day old.reddit dies I'll blacklist the site permanently.


Most consumer companies don't necessarily care about delivering the best UX, just gamifying it to get people to spend more money or time. Just look at Airbnb. You used to be able to sort by price, but that option is gone completely and now you have to rely on their algorithm. There is no reason they would neuter the search unless it led to more bookings.

As far as Reddit, I'm surprised the 3rd party apps using the API haven't gotten ads yet. They're heaps better than the official Reddit app.


They are degrading the web experience because it's easier to block trackers and advertisements in the web. When users are driven to the application, Reddit owns the whole experience and only people who are capable of configuring home firewalls have any hope of privacy.


Gordon Gekkos are running the show.


A bit of a fallacy don't you think?


So learning a new language and OS (ObjC/Swift and iOS) is easier than learning a new web API when you're a web developer? And it's cool because everyone simply has their own _standards_? That's... a take I guess.


Why would you lump PWAs and AMP pages together? They're hardly the same problem. PWAs are a threat to the App Store model. It's obvious that it would spawn a new category of apps that effectively enable sideloading. You've got access to all the native functions too (sensors, camera, storage). Apple doesn't want that. So they drag their feet when it comes to support.


There is a tremendous bias in favor of Chrome in developer communities. E.g., things that aren't implemented in either Chrome or Firefox (well, it's their site) aren't even listed on MDN. (The CSS section is somewhat better on this.) Which creates this whatever-Chrome-is-doing effect…


Apple's opinions are simple: If they have a good browser, then the app store - their cash cow - becomes unneeded.


Google's opinions are just as simple: if the web loses even partly to apps, their cash cow becomes unneeded (80% of Google's revenue is online advertising).


Yes, exactly. Apple’s parochial interests are in opposition to the web being useful, Google’s are aligned with the web being useful.


No. Google's are aligned with the web being useful to Google. This is not the same as "the web being useful".

And these days more often than not Mozilla (which has no stakes in either apps or online advertisement) sides with Apple: https://webapicontroversy.com


I don't buy Mozilla's/Apple's concerns for those APIs.

Sure, it seems sketchy to access USB from the web, but as a user, if I'm trying to achieve something and I can't do it on the web, I'll end up installing a native app which has way more access to my system.

As a developer I would much rather be able to just deploy cool things on the web instead of having to package up native apps for every platform just to do things like read/write NFC tags, or send notifications to users who want them.


> I don't buy Mozilla's/Apple's concerns for those APIs. Sure, it seems sketchy to access USB from the web, but as a user, if I'm trying to achieve something and I can't do it on the web, I'll end up installing a native app which has way more access to my system.

There's a third possibility: that you give up achieving that something. The "activation energy" of installing an app is higher than loading a website, so if the user's desire for achieving that something is not strong enough, the native app won't be installed.

And besides, most users know (or should know) that installing apps can be risky, while a website is supposed to be sandboxed. That is, absent exploitable browser implementation bugs, going to any website should be safe.


How is giving up on achieving something a win for users?


Take Background Sync for example.

Mozilla considers the periodic sync API "harmful" because you could use it to track users' IPs and consume resources when it's not clear they're interacting with the site.

Their stated position:

> We're concerned that this feature would allow users to be tracked across networks (leaking private information about location and IP address and how they change over time), and that it would allow script execution and resource consumption when it isn't clear to the user that they're interacting with the site. We might reconsider this position given evidence that these concerns can be safely addressed, however, addressing them for periodic background sync appears substantially harder than doing so for one-off background sync.

But you could achieve very similar behaviour by using Web Push (which they _have_ implemented) and just sending periodic push messages. So now, as an honest web developer, if I want nice background sync behaviour I have to implement some Rube Goldberg system of periodic push messages from a backend rather than just asking the system to wake me up every now and then.
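To sketch what that workaround looks like on the service-worker side (illustrative only; the payload shape, `shouldSync` throttling helper, and `syncData` are hypothetical names, not part of any spec):

```javascript
// Pure helper so the throttling logic is testable outside a browser:
// skip the sync if the last one happened too recently.
function shouldSync(lastSyncMs, nowMs, minIntervalMs) {
  return nowMs - lastSyncMs >= minIntervalMs;
}

// Browser-only wiring: inside a real service worker, each push sent by
// our own backend is treated as a "sync tick" (guarded so the sketch
// also parses outside a worker context).
if (typeof self !== "undefined" && typeof self.addEventListener === "function") {
  self.addEventListener("push", (event) => {
    // Hypothetical payload: { type: "sync-tick" }
    const data = event.data ? event.data.json() : {};
    if (data.type === "sync-tick" && shouldSync(lastSync, Date.now(), MIN_INTERVAL)) {
      event.waitUntil(syncData()); // syncData/lastSync/MIN_INTERVAL are app-specific
    }
  });
}
```

In other words: the backend timer plus Web Push ends up simulating exactly what `periodicSync.register()` would have done in one call.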


> But you could achieve very similar behaviour by using Web Push

The fact that something was implemented, shipped, and has a security/privacy hole is not an argument to implement and ship something else with a similar hole.


If the hole exists and is not going to be closed, why refuse to add other useful features that use the same hole?

It's like refusing to add a back door to a room for "security" when the front door already exists.


Why expand the surface area of the hole?

Besides, it's quite possible that the hole in WebPush does not allow for some/many scenarios that a background sync would.


Because if the hole exists at all, malicious actors will find a way to abuse it. But if non-malicious developers have to jump through too many hoops to provide useful functionality, they will just give up and users lose out.


> Because if the hole exists at all, malicious actors will find a way to abuse it.

Yes, they will. So the question remains: why expand the hole?


Because it makes no difference whether the hole is small or large. To an attacker it's the same size.


It does make a difference. You don't deliberately increase the attack surface if you can help it.


I feel like you're speaking in atitides, but not looking at this situation specifically.

If I can unlock your computer with your password or with the word "hello" and you have no intention of removing the "hello" feature, would you not agree that we might as well remove the password entirely?

How do we increase the attack surface of service workers by adding background sync, when we can get nearly identical behaviour using push?

If your goal is purely to not increase the attack surface, you might as well never add any new APIs ever.


If you're going down the analogies rabbit hole: let's say your front door is unlocked. Should you then just open all your windows because, you know, everyone already has access to your house?

> when we can get nearly identical behaviour using push?

The devil is in the details: is it nearly identical behaviour? How nearly is it identical? I personally don't know.


One wakes up the background page on a timer; the other wakes it up based on an externally controllable trigger. It's trivial to fire that external trigger based on a timer.
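That equivalence is trivial to wire up on the backend (a minimal sketch; `sendPush` here is an injected stand-in for an actual Web Push delivery call, e.g. through a library such as web-push):

```javascript
// Fire the "externally controllable" trigger on a fixed timer, which
// makes the push channel behave exactly like a periodic sync.
function startPeriodicTrigger(sendPush, intervalMs) {
  // sendPush is injected so the delivery mechanism can be swapped or faked.
  const timer = setInterval(() => {
    sendPush({ type: "sync-tick", sentAt: Date.now() });
  }, intervalMs);
  return () => clearInterval(timer); // returns a stop function
}
```

From the service worker's point of view, a push that arrives every N minutes is indistinguishable from a timer-based wakeup.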


s/atitides/platitudes


> if I'm trying to achieve something and I can't do it on the web, I'll end up installing a native app which has way more access to my system.

This is not an argument to just go ahead and implement WebUSB/Bluetooth/Serial/whatever.

> As a developer I would much rather be able to just deploy cool things

Ah yes. It would be so very nice if we could just trust all developers not to be malicious actors.


The fewer features the browser has, the more code ends up as native apps or on cloud backends instead of the user's device.

How many IoT devices could have just used Bluetooth for local interaction, but instead phone home to some central cloud server because the user experience of just hitting a web page is better than installing some clunky native app?

Should we "just go ahead" and expose everything? Of course not. But there's room for thoughtful implementation of these features that users want. A super secure platform that nobody uses provides security to no one.


> But there's room for thoughtful implementation of these features that users want.

Yes, there is. And that's exactly the position of both Firefox and Safari: don't rush in and implement stuff just because you can.


There is a rush to implement stuff though. The demand for functionality continues to grow, and if the platform stagnates developers and therefore users will move to other platforms.

It needs to be thoughtful, but the Firefox position appears to be that web apps have no business accessing Bluetooth or USB. Which, as I explained earlier, leads to users installing clunky native apps to interact with their Bluetooth/USB gadgets, or to the gadgets being deployed with phone-home functionality so that users can still access them through a web portal.

If Firefox wishes to remain relevant, they're going to have to implement the things that developers and users want. Nobody wants a dumb document browser anymore; that ship has sailed. Browsers are a ubiquitous application platform.


> Chrome explicitly do not care about other browsers, and think that the world would be a better place if Chrome were the only one.

They are gunning for a monopoly. No shame in that. The Chrome app is the biggest gateway to the internet globally, and winning that war means complete dominance as the world moves towards web applications. Using the carrot rather than the stick.

But therein lies the rub. Safari is betting on being bundled with macOS/iOS. That's a cool few million users with nary a breath of effort. I can at least empathise with Apple. Why put in the effort like a schmuck selling fried chicken, chips and skateboards door to door when I can lay back and just collect the meatball bacon cheddar pizza rolls falling from the sky?


> They are gunning for monopoly. No shame in that.

It is pretty shameful if you’re simultaneously claiming to be the biggest defender of the open web.

> Why put in the effort

It’s pretty clear why at this point: developers hate them and antitrust regulators are looking for the right opportunity to go for the throat. Apple screwed up when they killed Safari on Windows; they could have had a solid shot at being the #2 browser in the world on the merits ten years ago, and they totally blew it. Eddy Cue even wrote about this in some of the emails from the Epic discovery: https://twitter.com/PatrickMcGee_/status/1389632382244847623...


My sarcasm was missed.

In any case, I'm the furthest from supporting Google or anything they do. But they aren't better or worse than Apple Inc.

To defend Google: yes, their software has the Windows problem. They have to support an absurd number of devices. So them pushing standards is just a big whale trying to get ahead of itself. No harm in that, as long as devs don't go for bleeding-edge features and harbour any expectations of cross-browser compatibility.

I'm not blaming Google or the regulators. They can duke it out as long as the public benefits.

Safari wasn't all bad when IE was a thing. It's just that they took IE's spot by being obtuse and generally the rich kid who's a weirdo.

Thanks for the tweet link. Is there a site where leaks like this are common? I liked it more than I should have, and it was surprising how closely Apple looks at Google. I'd always thought Apple was above looking at companies serving peasants with peasant devices.


I believe I got that link from Daring Fireball originally.


All capitalists really want to be monopolists. There is some shame in that yes.

But the real shame is for the system, the laws, that allow that to happen.

Shaming Google or Apple does not do much. Legislating against monopolies like AppStore would be the solution.


> Legislating against monopolies like AppStore would be the solution

I'm sure Microsoft (Xbox game store), Sony (PSN store) and Nintendo (eShop) will be on board with banning proprietary app store monopolies owned by the hardware vendor.

