Make the Web Work for Everyone (hacks.mozilla.org)
251 points by nachtigall on July 6, 2016 | 175 comments

As an anecdata point, Stanford's CS 142: Web Applications course was previously focused on Rails; now it teaches a MEAN stack:


It requires students to test on Chrome:

> Unfortunately, Web browsers are still not 100% identical in their behavior, so Web pages may behave differently on different browsers. For this class, the reference browser is Chrome: your project solutions must work on Chrome, and the CAs will use Chrome to test them. Your solutions need not work on any browser other than Chrome. You may use a different browser to develop your solutions if you wish (Chrome, Firefox, and Safari all have very similar behavior), but please test on Chrome before submitting. We do not recommend that you use Internet Explorer for development: historically, its behavior has been quite different from the other browsers, so things that work on IE may not work on Chrome, and vice versa.

I have mixed feelings on this. Sure...open web, and everything. But in academia, Chrome is one of the more ubiquitous dev environments to be used...I remember being locked into Visual Studio. And purportedly, the class isn't just about job skills...it's attempting to teach HTML/CSS/HTTP/JS to students who may not have had real exposure to any of those concepts, on top of Angular/Mongo/Node/Express. Restricting the environment at least allows for more consistent instruction/grading.

I don't have mixed feelings about this. A course named "Web Applications" should start by explaining the fundamental difference between "Applications" and "Web Applications", the second of which is accessed via browsers, which the developer has no control over. It's a pretty crucial concept.

As far as grading, I wouldn't expect responsive design or old IE support, but it should work in all current, major desktop browsers. It's really not hard; you just have to be conscious of the fact that there are multiple browsers out there.

To use another technical analogy...I teach data analysis using SQLite...the purpose of the course is for students (most of whom have never actually programmed) to be introduced to SQL, but more fundamentally, to the concepts of databases and data joins. Ignoring the profound differences in database architecture...SQLite follows most of the SQL standard [0]...but I think there are enough syntax differences between it and Postgres that the gulf might be roughly comparable to at least the disparity in vendor prefixes on the browser side. As for SQLite vs. MySQL...well, I used to teach lessons that allowed the use of either SQLite or MySQL, and gave that up after a year.
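To make that concrete, here's the kind of minimal session such a course might use (a hedged sketch, not the actual course material): an in-memory SQLite database is enough to teach joins with zero server setup, and even this tiny example hits a dialect difference, since the `||` concatenation operator works in SQLite and Postgres but not in stock MySQL.

```python
import sqlite3

# In-memory database: enough to teach joins without any server setup.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE grades (student_id INTEGER, course TEXT, grade TEXT);
    INSERT INTO students VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO grades VALUES (1, 'CS142', 'A'), (2, 'CS142', 'B');
""")

# The join itself is portable; the concatenation operator is not:
# SQLite and Postgres use ||, while MySQL wants CONCAT().
rows = con.execute("""
    SELECT s.name || ' got ' || g.grade
    FROM students s JOIN grades g ON g.student_id = s.id
    ORDER BY s.id
""").fetchall()

for (line,) in rows:
    print(line)
```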

In addition to being SQLite focused, I tell students to just use one open-source, cross-platform client: http://sqlitebrowser.org/

Sure, you could make the argument that I'm doing a woeful job of preparing my students for jobs in which PG/MySQL/MSSQL is by far the standard...whether you're talking about web applications or dealing with legacy data files. But job viability is not in my scope...I argue that I'm teaching students the fundamentals of data and logical joins...that I use SQLite is merely an implementation detail.

That's how I feel about the CS 142 class...if students come out of that class being able to understand even an intermediate level of all those concepts...when they get out into the "real world", learning how to write for cross-browser compatibility will be well within what they can learn on the job. Just like it's trivial to learn the other variants of SQL once you've grokked SQL itself.

[0] https://www.sqlite.org/lang.html

I understand what you're saying, but I think you're confusing tools and deliverables.

If I hired someone to do data analysis, and they delivered accurate results using SQLite, I could not fault them since the deliverable is correct, whether they used SQLite or PG.

If I hired someone to build a web application, and then found out that it only worked in Chrome, that's a broken deliverable.

The specific tool (SQLite/PG/etc or Chrome/FF/Lynx/etc) used for creating each deliverable doesn't matter.

The point was made that, in one course, they're already learning JS/HTML/CSS and the full MEAN stack. Adding on the requirement that it work in each browser is just too much, especially when you consider how you'd want to grade such a thing.

FeatureA works in Chrome/FF but looks weird in Safari and is outright broken in IE. 7.346/10?

It's not "make it work in each browser", and it's not "just too much". If you spend an additional couple minutes per assignment checking functionality in major browsers (ignoring IE), you're set.

For your FeatureA, I'd ignore IE (that's a whole college course in itself), but I'd grade that 9.8/10. -0.2 because it looks weird in Safari.

As a real world analyst, I'd love a job where you just sit there and pay me to come up with "insights/deliverables" and not have to worry about implementation details.

Indeed, in my experience, the notion that this is how it's done (what analysts should do) is a chief complaint about academic-esque analysts: they can't actually implement anything, or their solution is completely unworkable for real scale/tech environments/software stacks.

But I would happily take a job where I can be paid to live in my ivory tower if you know a place that views that as the deliverable of analysis :p

Indeed, I think the comparison of someone delivering a webpage that doesn't run on other browsers is quite an apt analogy...

That's not entirely true. What about when you bring someone in and say, "here's my MySQL server that I haven't upgraded in years, analyze the data in it"?

Tangential: please stop abusing ellipses, it makes your (otherwise fine) writing unduly difficult to read.



Don't listen to the other comments. Your job is to teach the concepts. Implementation details are a separate concept to be learned on the job. You're teaching crafts(wo)men, not Unsullied.

I remember my first job not knowing the difference between #includes with brackets versus quotes, and where the IDE was searching for them. I was so pissed that my CS program didn't prepare me adequately for my first job. But the truth is this was one of thousands of "figure that shit out for yourself" that they glossed over in favor of "don't write n^n algorithms" and "cartesian products will be the death of you"; I'm currently taking over a code base where simple requests from an ORM are issuing thousands of queries.
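The ORM pathology mentioned above (one simple request issuing thousands of queries) is the classic "N+1 query" problem; here's a hypothetical illustration of it, not the actual code base in question:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO posts VALUES (1, 'p1'), (1, 'p2'), (2, 'p3');
""")

query_count = 0

def run(sql, *args):
    """Execute a query while counting how many round trips we make."""
    global query_count
    query_count += 1
    return con.execute(sql, args).fetchall()

# What a naive ORM loop does: 1 query for the authors,
# then 1 more query per author -- the "N+1" pattern.
authors = run("SELECT id, name FROM authors")
for author_id, _name in authors:
    run("SELECT title FROM posts WHERE author_id = ?", author_id)
assert query_count == 3  # 1 + N, with N = 2 authors

# The same data in a single round trip with a JOIN.
query_count = 0
run("""SELECT a.name, p.title
       FROM authors a JOIN posts p ON p.author_id = a.id""")
assert query_count == 1
```

With thousands of authors, the loop version issues thousands of queries while the JOIN still issues one, which is exactly the behavior described above.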

So, please continue to teach SQL. As for browser coding, I learned all that on the job too, and it has no place in an academic setting. It's entirely dependent on feature needs, your target market, accessibility concerns, etc. To say that some new graduate should have all the skills necessary to navigate that minefield is madness.

I wonder how many people bitching about Chrome/Firefox/IE are ARIA compliant.

This isn't the first time I've seen academia teaching students the wrong way of doing things. I had to work with some graduates with certain strange ideas about approaching the work and tried to teach them an alternative, or at least open their eyes and make them consider the possibility that what they were doing might not be the right way. Tough job. Sometimes it feels easier to hire somebody with no programming skills and teach them from scratch than to change the mental pathways of stubborn graduates.

On that note, my experience has been that the academic environment often does more damage than good to its students. It sometimes takes a monumental effort to make graduates unlearn all the wrong things they picked up in the academic environment before they can proceed further in their professional development without all that bad baggage pulling them in the wrong direction.

It's like children mistreated by their parents, who carry their psychological trauma from childhood through the years; it detrimentally affects their lives until they get professional help from a skilled counselor.

I'd been programming for nearly 20 years when I made it to university, and everything about the way they taught programming was pretty much hilariously wrong* (not the language choices or any of that; that's just noise in the grand scheme of things) but rather the approaches: the way they explained how to decompose a problem, write it, test it, etc.

Not helped by lecturers who'd either always been lecturers or lecturers who'd left industry 15 years earlier (this was 2005 so they'd have left in 1990).

It was still a valuable experience since a lot of the other stuff was valid but I took all the programming stuff with a massive pinch of salt.

* Except Charlie. Charlie was an ex-telecomms C/Unix god; he didn't like the way a lot of the programming stuff was taught, but he did point us all to where we should be looking.

>> Not helped by lecturers who'd either always been lecturers or lecturers who'd left industry 15 years earlier (this was 2005 so they'd have left in 1990).

Exactly what I'm talking about. I've met graduates with very weird ideas about how to approach problem solving, and they would eventually confirm it had been taught to them at the university as the one true method of doing things. I remember that I myself had to "accept" some of the ideas being taught to us and recite them at the examination, pretending I agreed with them. It was the only way to get through some courses, and I recall many other students suffering from wrongful instruction. Some tried to argue with the academic staff and get those wrong things fixed; many of them paid the price later by failing the examination. Apparently, those lecturers and professors didn't like having their competence questioned.

About 15 years ago we had a new lecturer come to teach our group. I remember my thoughts after a couple of his lectures: this guy must have been thrown out of every shop out there. He was completely useless, even if senior in age (around 50, I believe). He tried to teach us about computers by reading aloud from a book similar to the "Computers for Dummies" series. He was not a good reader either. From the rumor circulating at the time, he had a buddy in the hierarchy and used that to land a job at our university. It was truly pathetic.

As the common saying goes: Those who can do, do. Those who can't do, teach. Those who can't teach, manage.

> It's like children mistreated by their parents in their youth who will carry their psychological trauma from the childhood through the years and that would detrimentally affect their lives until they get some professional help from a skilled counselor.

Please, your analogy just doesn't make any sense. Just because someone is stubborn and doesn't want to relearn something doesn't make it the professor's fault. Nor does it mean he/she suffered child mistreatment. Parents beating their children leads to mental illness if the right genes are present. You are reducing child abuse to a banality.

I currently work in academia so I'm obviously biased, but I'll argue that wrong things learned in college are quite a different paradigm from child abuse. Hard to unlearn child abuse, it seems.

The long term effects may be just as devastating.

Consider that as a lecturer, your every word carries meaning. Even an innocuous remark may turn out to be something that sticks with one of your pupils for years and leads them to make a sequence of wrong decisions that eventually ruins their life.

Having access to younger minds and influencing their personal and professional development is not a joke. It's the kind of job which, if not done properly, may potentially produce the next tyrant.

Being taught the wrong thing for a couple months by a shitty professor when you're a young adult isn't comparable to your parents hitting you as a child. The professor is not your entire world, your brain is much more developed, and you're unlikely to consider all your waking actions in terms of your professional skills.

Your comment saddens me and only confirms what I've been suspecting: a lot of people including those working in academia don't take education seriously and do not comprehend the far-reaching repercussions of their actions.

Generally speaking, a person only begins to reach psychological maturity at around 30 years of age. Until that time, all incoming ideas generally fall on fertile ground, as the person is not yet capable of telling bad apples from good ones and can't always discard the wrong ideas. Sometimes it's exactly those ideas that take hold and derail somebody's life, unnoticeably, one step at a time.

I mean, I'm just having trouble understanding a substantial link between wrong information learned in an academic environment (diverse peer group, diverse set of professors, bulk of learning being self-directed, having already developed a worldview that's slightly more robust than "mommy knows everything") and trauma. Yes, it has negative economic repercussions. Sure, in very rare and extreme cases, doing something wrong due to something you learned in college might lose you a job (I say rare because it should be obvious that most work processes are wrong or out of date and there ain't a lot of firing done because of it). But to equate that one source of incorrect knowledge (as opposed to everything else you hear or see, which is true somehow?) with your whole fucking world regularly hitting you, telling you you're worthless, being absent in your life, etc., doesn't really gel with my understanding of childhood trauma.

As a solid example of this... A fellow student was working on a problem that had a key-value mapping, but instead of storing them in a hash table/map/whatever the language wants to call it, he was storing them as pairs in a list and complaining about the performance on a large data set.

I asked him why he wasn't just storing it in a hash table instead. His answer: "Because I sometimes need to iterate over it and get both the key and the value"

My brain had to sit and parse what he'd said for a few moments, before I realized what the heck he was talking about. In all of the data structure courses, he'd been taught "A hash table hashes the key, and stores the value in the corresponding slot". In that mental model, there's no way to retrieve the key, only to retrieve the value given the key.

I explained to him that real-world implementations do hash the key to pick a slot, but also store the key alongside the value, so that you can still iterate across it. You could see the look of amazement wash over him, and then he declared in frustration, "Why the hell didn't they mention that?!"
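A toy version makes the point quickly (a sketch for illustration; real implementations add resizing, better probing, etc.): each bucket holds (key, value) pairs, and keeping the key around is exactly what makes both collision resolution and key/value iteration possible.

```python
class ToyHashMap:
    """Separate-chaining hash table that keeps the key next to the value."""

    def __init__(self, nbuckets=8):
        self.buckets = [[] for _ in range(nbuckets)]

    def put(self, key, value):
        bucket = self.buckets[hash(key) % len(self.buckets)]
        for i, (k, _) in enumerate(bucket):
            if k == key:             # key found: overwrite in place
                bucket[i] = (key, value)
                return
        bucket.append((key, value))  # the key is stored, not just hashed away

    def get(self, key):
        bucket = self.buckets[hash(key) % len(self.buckets)]
        for k, v in bucket:
            if k == key:             # the stored key lets us resolve collisions
                return v
        raise KeyError(key)

    def items(self):
        # ...and lets us iterate over both keys and values.
        for bucket in self.buckets:
            yield from bucket

m = ToyHashMap()
m.put("a", 1)
m.put("b", 2)
print(sorted(m.items()))
```

If only the value were stored, `get` couldn't tell two colliding keys apart, and `items` would have nothing to yield for the key half of the pair.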

What does that have to do with academia? That's just having an overly simplified mental model of something, which all humans do when learning any new thing, whether through a lecture, a random stack overflow question, the compiler manual, or through pure experimentation.

Academia carries with it a larger amount of authority than a random programming forum. Students have the expectation that the academic staff is generally knowledgeable and wise and what they say in all probability is very important, has to be remembered and followed (even if you don't quite understand why yet but perhaps you will in the future).

I don't think "Academia" that doesn't teach critical thinking and only practices its authority over students should be called academic institution.

I agree. And yet, this seems to be the rare type. Often it's just a few people in the staff that are trying to do things the right way, for the others it's simply a nice job (with few responsibilities and seemingly no obligations). That's what I've been trying to get across. Teaching is a sensitive job with far-reaching consequences, yet more often than not people don't take it seriously and don't care what their students will carry with them into the life when they leave the walls of that academic institution.

They probably did mention that; perhaps he wasn't paying attention? Besides, unless you have perfect hashing, you're bound to have a hash collision at some point. How did he expect a hashmap to work in that case, without also keeping track of the key? If he never asked that question, he doesn't understand hashing either.

I was in the same lecture, it definitely wasn't mentioned. I'd been hacking on Perl for years before starting school, so I already "knew it in my heart" how it worked in practice. Just one of those little but critical details where something important got missed.

And yes, when you've got a complete understanding, it seems quite obvious that the key has to be stored with it. That's why I got it. He'd just learned it, though, and there were still those gaps in his knowledge that he would have carried for a long time until someone else pointed out the missing piece (or until he decided to sit down and make sure he understood everything).

If one thing a STEM professor says can lead to ruining your life, then you probably have a lot worse things going wrong in your life.


"You should apply to graduate school!" Didn't ruin my life, but it did waste 3 years and $30k.

Ok, that's a decent point lol.

And apparently hard for the GP to unlearn his academic abuse.

What/who does "GP" refer to?

>> What/who does "GP" refer to?

Grandparent. Two posts up the hierarchy from the person who used the term "GP".

Thank you.

So this is a personal attack then. Frankly, I expected better from the HN community. It looked more evolved when I was just an occasional visitor prior to joining. Rather disappointing.

>> So this is a personal attack then. Frankly, I expected better from the HN community.

It seemed like a bit of a snipe maybe (or a joke where they forgot the smiley). I went back and re-read your original "GP" comment and it seemed like you had encountered a number of students who had trouble as a result of such teaching. The snipe implied your experience was just your own personal problem, which seems false in looking at it again. Relax, it's just people you don't know responding to someone they don't know.

You have some pretty bold claims here and not much to back them up. If you gave us some examples or studies you'd be highly upvoted, in my experience.

Anyway, I appreciate your viewpoint.

>On that note, my experience has been that academic environment often does more damage than good to its students.

How bad are we talking? My small bubble doesn't constitute mountains of data, but I have friends across the US from top-tier and not-so-top-tier schools that aren't damaged goods from academia.

Better yet, are you sure it's academia that screwed up or stubborn individuals who worship the ground their professors walk on? Those are the types of people that see academia as the end all, be all of what's right.

This is really bad considering MEAN is a) much less popular (IMO it's dying down), b) much less beginner friendly, c) much less easy to work with than Rails.

And I say this as a JavaScript dev.

I was surprised by your statement of MEAN dying down but I checked out Google Trends and it seems you're right: https://www.google.com/trends/explore#q=%2Fm%2F0_v9b5j. I wonder why that is and what's taking its place?

Honestly just personal experience as someone who's heavily in the more "cutting-edge" JS world.

While Angular 1.x is still big, most new things are made with React. Mongo has gotten a ton of bad press over the last few years; it's hard to find positive things said about it any more. This has more of an effect than you'd think. Express and Node are fine though.

What I mean is that Rails, though older, is still much more popular than MEAN.

Ironically my work place uses Express and Mongo, but React!

I don't see anything dying down in that trend.

The last point on the graph is for July which is very partial data, so it's probably best to ignore it.

Without that point, the trend seems to be upwards.

I wish it was dying down - I do not have much love for any part of that stack - but Google trends does not tell us that.

Chrome, Angular, Mongo, Node, and Express? To teach HTML/CSS/HTTP/JS?

Damn. I would definitely not take that class.

Right? Angular is far from regular HTML. It makes sense to use when you want people to get started with interactivity on the web... after that, though, playing with "dependency injection" and "transclusion" is just not good for you.

> It makes sense to use, when you want people to get started with interactivity in the web...

I don't think it makes sense even then. The toolset is just way too complex both for teaching, and even for practical use for small projects.

I mean, people are supposed to learn about things like HTTP, or the idea that server-side programming involves writing programs (the normal, regular kind) that output text, which will later be sent over the wire to the browser. Or even how control flows through your application. You know, the basic, fundamental stuff, understanding of which is kind of required to do this job well. All the "we'll teach non-programmers coding with the current framework du jour" courses and classes I've seen involve typing in magic invocations that generate a shit ton of folders and files that nobody tells you what they do or, more importantly, why they're even there, and that's all for a "hello world" app.

I feel that all the current sexy web classes are aiming to produce code monkeys, not programmers. Which might be an effective way to get someone to bullshit their way into the first job - but I doubt it serves the students long term.

Personally, when I'm teaching someone server-side development, I start with simple programs that print stuff to the terminal using the regular printf()/whatever calls, and then show them that a website is what you get when you send that text to a browser instead...
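That progression fits in a few lines (a teaching sketch using only the standard library, not a claim about how the commenter's own course does it): the "web page" is just a string your program builds, and a server is just something that writes that string to a socket instead of the terminal.

```python
def render_page(name):
    # An ordinary function producing ordinary text.
    return "<html><body><h1>Hello, {}!</h1></body></html>".format(name)

# Step 1: print it to the terminal, like any other program output.
print(render_page("world"))

# Step 2: the exact same text, wrapped in an HTTP response, *is* a website.
body = render_page("world")
response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "Content-Length: {}\r\n"
    "\r\n"
    "{}".format(len(body), body)
)
# Writing `response` to an accepted TCP socket is, at its core,
# all a web server does; frameworks are layered on top of this.
```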

Thanks for the response. That's really useful.

I think that Stanford tries to be trendy, since it's located in Silicon Valley and surrounded by startups.

At least it's not Rails anymore /s

Bravo! As a Firefox user, the trend of web developers only testing in Chrome (and ending up using features that only work in Chrome) is getting really annoying. Most of the time, the fixes are really simple, like including more prefixes than -webkit.

Very happy to see Mozilla raising awareness around this issue.

Personally, I have Firefox and Chrome open side by side for development. If those two work, Safari and Edge generally will too. I'll add IE specific rules/hacks afterward.

If your app only works in Webkit / Blink, you've written a Chrome App and not a Web App.

Your CUSTOMERS do not want to see "To use this site / app, stop using the web browser you've chosen to use (or your company has chosen for you) and go install this instead!"

Here's a thought exercise: how will this trend be exacerbated by the Electron/Chromium stack? Should Mozilla develop a competing engine for "native" apps?

Mozilla is developing a competing engine for native apps. The word "embedded" is part of Servo's [1] elevator pitch:

> Servo is a modern, high-performance browser engine designed for both application and embedded use.

> Sponsored by Mozilla and written in the new systems programming language Rust, the Servo project aims to achieve better parallelism, security, modularity, and performance.

[1] https://servo.org/

EDIT: seeing Etzos's answer, I see better what you meant (de-facto "standardizing" around Electron). Positron looks like a better answer then. But long term, there's clear interest from Servo developers; see http://blog.servo.org/2015/05/01/forward/ and Ctrl+F for "Chromium Embedded Framework".

Great. I'd prefer to see Servo and browser.html replace the heavy Electron stack.

How light is Servo + browser.html? You seem to be implying it's much lighter than Electron. Any figures? I haven't looked at Servo / browser.html / Rust at all, apart from seeing a few news articles pop up on HN over the last few months.

See this demo comparing Chrome, Firefox, and Servo WebRender: https://www.youtube.com/watch?v=u0hYIRQRiws. Servo's WebRender uses the GPU, making it much faster than classic renderers (besides the CPU parallelism); think of it like a web browser using video-game technology.

There's a talk about this at https://air.mozilla.org/bay-area-rust-meetup-february-2016/ (HN discussion: https://news.ycombinator.com/item?id=11175258)

browser.html is just the current "skin" for Servo.

I think the grandparent was referring to things like startup speed and memory consumption when they said "lighter", not rendering performance.

Any tests done on integrated graphics such as those in normal laptops?

In https://air.mozilla.org/bay-area-rust-meetup-february-2016/ ,

- At 03:00: pcwalton explains how this experiment leans on the GPU-ization of our Intel CPUs since Haswell.

- At 14:50: slide says "WebRender supports OpenGL ES 2.1 and OpenGL 3.x"

- "Benchmarks" at 26:00 running on his macbook, which may fit what you are looking for

TL;DR, yes this early work is for "integrated graphics such as those in normal laptops". Or try it yourself on your laptop with a nightly build: http://blog.servo.org/2016/06/30/servo-nightlies/

It works awesomely on both integrated and discrete graphics for me. IIRC Patrick and Glenn, the people who wrote webrender, by default use integrated graphics anyway (at least one of them does).

Software rendering makes it choke sometimes (other times it works surprisingly smoothly, but it depends on the load), but that is to be expected :)

Hard to say since it's still in development. Maybe someone with more knowledge can chime in. But its focus is on parallelism and performance, so I have high hopes. Slack (a flagship Electron app) is currently using 850 MB of RAM, idle in the background. Firefox with 50 tabs open is using the same amount.

Holy crap! 850 MB for a chat client!? Just made the argument for truly native apps right there.

Servo is super long term; it is more of a testing ground than a product. It is possible that it will become a product in the future, and embedding is a really lucrative space for us to try, but no plans for a product right now.

Positron is indeed the thing you are looking for.

They are actually already doing this[1].

[1] https://github.com/mozilla/positron

Don't forget Cordova where most views are also Chrome based. (The exception being Windows, by default, and many don't seem to test on Windows.)

Obviously Mozilla tried and failed with Firefox OS to play in that market at the mobile level.

It might be nice to see an Electron- and/or Cordova-compatible view engine from Mozilla, but I'm not sure how much adoption or testing it would see unless it became the default (and even then you'd probably have a bunch of web developers revert to Chrome views simply because it's comfortable and familiar).

Mozilla actually had one for years, called XULRunner.

It's not good; people use Electron instead of XULRunner because XULRunner was barely maintained. It would randomly stop working on major Firefox releases, and would stay broken for weeks, because nobody noticed or cared that XULRunner didn't work, as long as core Firefox kept working.

Even when it did work, working with XPCOM sucked.

Positron https://github.com/mozilla/positron is the new XULRunner. WIP, doesn't actually work yet. :-(

The trouble with that is: what do you do with older versions of the browsers? Edge is okay with features, but then you have people who complain that your website looks funny in their browser because they still run IE 8 or 10.

I didn't realize Chrome was becoming the default testing environment. I find it weird, given its terrible font rendering. I tried using Chrome but just couldn't. Smaller size fonts render weak, pale, bleak, like drawn with a dull pencil. I found it rather difficult to read large bodies of text in Chrome, which was giving me quite a bit of eye strain. Not sure why anyone would want to test their pages in that, in my opinion, broken rendering engine.

I'm aware there is a setting which has to do with subpixel rendering or whatever this is called, I tried switching it on and off for no perceivable visual change. So I gave up on Chrome.

IE also has had broken font rendering since they moved to subpixel rendering several years ago, but at least it draws fonts in a strong and dark fashion making them more readable than in Chrome.

Personally, I design for Firefox which I consider the gold standard of web development. Then, at some intervals I check if things are okay in IE and Chrome and usually they are fine. Chrome was useful once in helping me spot some sort of a race condition, its developer tools also conveniently allow you to quickly bypass caching for testing purposes, but other than that I found it useless and unusable.

I mainly use Chrome, with some testing in Firefox. It used to be the other way around, but Firefox's single threaded nature just makes it very painful when you have multiple windows with multiple tabs open. Having it use only 12% of the available CPU in the system when I am doing lots of work is not acceptable especially when I need the browser to be responsive.

Yes I know about e10s and servo, but they aren't in the regular Firefox right now, and especially weren't several years ago when I was forced to change from Firefox to Chrome. I'd love to go back ...

You're probably seeing this bug https://bugs.chromium.org/p/chromium/issues/detail?id=146407... and it is indeed a sad story, especially because Chrome 21 worked just fine. Then they rewrote some gamma correction code in the Skia that shipped with Chrome 22.

Thanks for the link. Based on some of the screenshots posted over there it indeed looks like what I've been observing on my machine (version 47-something).

Not a problem really. Never used Chrome before. Only installed it out of curiosity and also for testing purposes. Within the first day of using it on my development machine it became clear I wouldn't be using this piece of software in the future either.

>I didn't realize Chrome was becoming the default testing environment. I find it weird, given its terrible font rendering. I tried using Chrome but just couldn't. Smaller size fonts render weak, pale, bleak, like drawn with a dull pencil.

I find the opposite: I never could stand Firefox's font rendering -- and I used the thing for years back in the early '00s.

I think it is in certain circles. For Google, obviously, in particular. Many of their new sites launch completely incompatible with Firefox and Edge. Some of the frameworks they push, like Polymer and Angular, have been repeat offenders in the past, which leads web apps built with them to also be broken.

Edge actually pretends it's Chrome in its UA

To be clear, it still specifies "Edge" in its UA string and can be specifically detected. And just about every browser now mentions Mozilla, Chrome, and WebKit in its UA.
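Because of that nesting of tokens, detection order matters. A hedged sketch (the UA string below is an illustrative example of the legacy-Edge format, and this is nowhere near an exhaustive matcher):

```python
def detect_browser(ua):
    # Check the most specific token first: Edge's UA also contains
    # "Chrome" and "Safari", Chrome's also contains "Safari",
    # and almost every UA starts with "Mozilla".
    if "Edge/" in ua:
        return "Edge"
    if "Chrome/" in ua:
        return "Chrome"
    if "Safari/" in ua:
        return "Safari"
    if "Firefox/" in ua:
        return "Firefox"
    return "unknown"

edge_ua = ("Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 "
           "(KHTML, like Gecko) Chrome/46.0.2486.0 Safari/537.36 Edge/13.10586")
print(detect_browser(edge_ua))  # Edge, despite "Chrome" appearing in the string
```

Reverse the order of the checks and Edge is misreported as Chrome, which is exactly why naive UA sniffing goes wrong.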

Right on. The tools we have at our fingertips as web developers are so accessible and comprehensive that there is little excuse for not testing an app across a range of user interfaces/platforms. I shiver to think how much deep DIY happened at every layer of development and deployment in previous software generations.

The problem is caused by the web spec being way too complicated.

Instead W3C should have chosen simpler primitives from which developers can build complicated (formatting) rules themselves.

Simpler primitives is also better from a security perspective.

By designing the web for average users instead of for developers, W3C has shot itself in the foot.

> The web spec

There are hundreds of web technologies, each with their own spec. None of which are actually that complicated if you read them.

> Simpler primitives ... formatting rules ... better from a security perspective.

So you mean CSS? What does that have to do with security?

> Designing the web for average users

Not sure what you're trying to say.

>There are hundreds of web technologies, each with their own spec. None of which are actually that complicated if you read them

The first part is true. There are indeed hundreds of web technologies. The second is absolutely false. A lot of them are (and have historically been) horribly complicated to implement, with lots of edge cases and strange interactions. In fact prominent web standards people have talked about such cases many times.

>So you mean CSS? What does that have to do with security?

No, he means "simpler primitives" across the board. And even CSS has to do with security (e.g. loading third party fonts, etc).

>>Designing the web for average users >Not sure what you're trying to say.

He's obviously trying to say that W3C et al piled on features to please end users and satisfy end user needs, without giving much care to making them more consistent and coherent for web developers.

Complicated to implement, yes. Complicated to read and understand why the browser is doing x? No.

I'm not sure if end users are the people W3C is trying to satisfy. Browser vendors do enough of that. If anything, they're trying to satisfy media corporations that want DRM standards, or ad corporations that want pervasive tracking.

>Complicated to implement, yes. Complicated to read and understand why the browser is doing x? No.

CSS, for one, is notorious for being complicated to understand, with tons of complex cases, especially in the layout department, where whole cottage industries of "tips" and "workarounds" for the simplest of stuff -- and I'm not talking browser incompatibilities -- thrived for decades (until, at least, flexbox and the like).

>There are hundreds of web technologies, each with their own spec. None of which are actually that complicated if you read them.

I agree with your overall point, but there is a reason sites like MDN exist. It's because some of the HTML/CSS/JS specs are crazy complicated, contain years and years of edge cases and bugs that have become standard, and are written in standard-eze (which is easy enough to read once you know it, but it can be a bit of a learning curve for someone who just wants to know the order of function parameters).

MDN will be a huge legacy for Mozilla. Of all of their projects, I think MDN will have the biggest impact on the web. It's absolutely needed for someone who just wants to know the order of function parameters.

For a more nuanced understanding of browser behavior, you have to read the spec. Worked on a high performance network app, and I think I know the XHR spec by heart now.

> Worked on a high performance network app, and I think I know the XHR spec by heart now.

I am curious, can you elaborate on this? :) What part of the performance related work involved peeking at the XHR spec so much?

For mapping applications, tiles are loaded to display data. These requests have to be handled carefully, especially when there is a lot of data, because every map movement causes 10s of requests to be fired at once. Their lifecycles have to be managed so they can be cancelled as soon as their result isn't needed (user panned away, or changed zoom level).
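
The management logic boils down to something like this (an illustrative sketch -- the names and the injected request factory are made up; real code wraps XMLHttpRequest and calls its abort()):

```javascript
// Track every in-flight tile request; when the viewport changes, abort
// requests for tiles that are no longer visible. `createRequest` is an
// injected factory (e.g. wrapping XMLHttpRequest) so the manager itself
// has no DOM dependency.
class TileRequestManager {
  constructor(createRequest) {
    this.createRequest = createRequest;
    this.inFlight = new Map(); // tile key -> request handle
  }

  load(tileKey, onDone) {
    if (this.inFlight.has(tileKey)) return; // already requested
    const req = this.createRequest(tileKey, (result) => {
      this.inFlight.delete(tileKey);
      onDone(result);
    });
    this.inFlight.set(tileKey, req);
  }

  // Called on pan/zoom: abort everything the new viewport doesn't need.
  setVisible(visibleKeys) {
    const keep = new Set(visibleKeys);
    for (const [key, req] of this.inFlight) {
      if (!keep.has(key)) {
        req.abort(); // e.g. XMLHttpRequest#abort()
        this.inFlight.delete(key);
      }
    }
  }
}
```

Aborting promptly matters because browsers cap concurrent connections per host; stale tile requests would otherwise queue ahead of the tiles the user is actually looking at.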

I couldn't agree more, and I also think it influenced the standards bodies. New ES[year] specs include lots of very MDN-esque explanations, examples, and "polyfills" for the new features.

If I may contribute to the article somehow, I'd tell people this: stop using JavaScript for basic markup. I sigh every time I visit a site whose entire navigation structure is spit out in front of me as a disorganized wall of text, and only when I enable JavaScript that mess gets cleaned up and arranged back into something organized. It's just no way to design pages. CSS is for visual appearance and styling. JavaScript is for interactivity. If you can't give your site a proper look without resorting to JavaScript then you have no business designing web pages yet and need to go learn the basics first.

Most of enterprise development is on rich web apps that couldn't work at all without javascript. Saying that people who write javascript and don't use semantic html need to "learn the basics" is like complaining about people typing emails instead of hand-writing letters in cursive.

People like yourself who want HTML fallbacks for everything have a fundamental misunderstanding about the nature of technology. At some point, it doesn't make sense for the biggest roads to have support for both cars and horse-drawn wagons.

Any road will support a horse-drawn carriage. Unless you take extraordinary measures and start building vehicles using wheels with special profiles for which custom-designed roads will be necessary where those wheels will be held on track like with trains.

Naturally horses will stumble on those, and people like you will say the others don't understand the nature of transportation and should get rid of their old horses and conventional cars.

But perhaps the problem is just with you and your newly designed fancy wheels.

Except you're a tiny and economically irrelevant minority who own cars but are intentionally disabling them due to some sort of mental block about it not being the year 1994 anymore.

If you want to be Cyber-Amish, that's absolutely your right, but much like the actual Amish, you will need to create your own society that serves your needs instead of trying to drag us back into the past with you. The Amish are not demanding that the International Space Station build a wooden module to accommodate them, nor do they buy plane tickets with the expectation that the pilot will still get them there but won't turn on the engine due to the religious beliefs of a tiny minority. Everyone has the right to unreasonable beliefs, but when they try to be smug and condescending about their own backwardness... yeah. You are the human equivalent of IE6.

> Most of enterprise development is on rich web apps that couldn't work at all with javascript.

Doesn't enterprise development stay within enterprises? As for the public facing web, even if the functionality requires Javascript, having it look like crap without Javascript does seem kind of lazy. I'm even for being lazy, I don't test everything and on every device, but when something is pointed out to me, at least I realize it's suboptimal, and don't rationalize regression into progress.


Nothing really changed to make these best practices moot.

Most of the enterprise internet ends up becoming the public facing internet. Enterprise isn't a moral agent for accessibility; it's asking "why should we spend hundreds or thousands of developer hours accommodating less than 1% of the market?"

With rare exceptions (such as supporting IE6 clones in China), there's no business case for doing so, so it isn't done.

The W3C is great, but the very page that information is on violates their standards--it's div soup and won't work with screenreaders, there are images without descriptions, and even the sidebar uses javascript to open the menu and change classes when it could use pure HTML.

So yes, those best practices are obsolete when the people recommending them can't even be bothered to follow them on their own page.

That's because CSS is not powerful enough. It doesn't help that you can't use something like flexbox if you need to support old browsers, but even the latest version of $YOUR_BROWSER has no way to say: lay a list out in one vertical column, centered vertically on the screen, if it has fewer elements than there is space for, but split it into additional columns with all elements aligned to the top if there are more elements than space on the screen. It is a fairly simple layout, and I ended up having to do it in JS, because flexbox only has a few standard modes for what to do when the content overflows, and those don't affect the margins of the element.
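
The JS ends up boiling down to something like this (a simplified sketch -- the function name and numbers are made up): measure how many items fit, then pick between one vertically-centered column and several top-aligned ones.

```javascript
// Given the container height, item height and item count, decide whether the
// list fits in one vertically-centered column or must split into top-aligned
// columns -- the decision flexbox alone can't express.
function columnLayout(containerHeight, itemHeight, itemCount) {
  const perColumn = Math.max(1, Math.floor(containerHeight / itemHeight));
  if (itemCount <= perColumn) {
    // One column, centered vertically via a computed top margin.
    const marginTop = (containerHeight - itemCount * itemHeight) / 2;
    return { columns: 1, marginTop };
  }
  // Overflow: split into columns, align everything to the top.
  return { columns: Math.ceil(itemCount / perColumn), marginTop: 0 };
}
```

The result then gets applied as inline styles or a class on the container, and recomputed on resize.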

CSS needs to be replaced with a more capable scripting language, or at least one based on constraints, but until that happens we are stuck with javascript.

> That's because CSS is not powerful enough

This kind of objection is totally missing the point.

And this isn't an argument that CSS doesn't have painful limits (it does) or even about what constitutes "enough" (within its limits, CSS offers enough possibilities that your own ideas of what the app "should" be like may be as much of a limit as the problems of CSS are).

The power of CSS is largely orthogonal to the underlying issue.

Ideally, your web site/app is still usable even with every last stylesheet completely ignored. It should work with Lynx, or with entirely non-visual user agents (screenreaders, search bots, Siri etc).

CSS should enhance via layout and other presentation rules where that's possible.

JS should provide more convenient application behavior where that's possible.

Instead, we've slipped into a space where the browser (and, frequently enough, one browser) is simply considered The VM That Lived™, just another runtime target. And that's a sign of its growing power, and that power isn't a bad thing because there are in fact some applications that don't fit the hypermedia model well and the browser's ability to play the just another runtime role opens a space for them. But the rush to get into that space is considerably overdone and apparently executed without a lot of awareness of what's been lost in moving that way.

Except the vast, vast majority of people don't care and use a normal desktop browser with JS on

My mom just bought a new phone and asks why it's still too slow and clunky like her old one. I found out she likes to click Buzzfeed-esque links from Facebook and most of those sites slow the device to a crawl because of the massive amounts of Javascript running.

People care about the downstream effects, they don't know the how or why, but they care.

In 2014 I bought a ~$50 Android while traveling Latin America. I didn't expect much more of the browsing experience than reading mostly text-based articles. But I didn't even get that. Literally half the linked articles I clicked on Hacker News led to sites which were close to or impossible to navigate because of JS shenanigans and the assumption of at least a 4G internet connection.

It is especially annoying if one claims to target a world audience, but expects them to have $500+ phones and vast data subscriptions.

The market fixes this, though. If your websites don't load on a low-end Android over 3G, you won't attract the sort of audience that uses that setup. Conversely, if you know what a large portion of your audience uses specs like that, you'll go through great lengths to ensure your sites are performant and usable.

It seems that most 'western' sites have decided to focus on people with desktops or $500+ phones and vast data subscriptions.

So it's really a Western web and not an open web. You've proven exactly what everyone on here is saying: there's so much focus on marketing and audiences that many users get completely shafted. That's sad to see given that the web is often heralded as something that gives underprivileged people access to powerful and useful information, and the ability to share it.

In your example, most of the time the "market" won't fix that though, because there's no money to be made in servicing the users with "cheap" phones.

In some places, $500 is about a half-year salary from which one has to live and feed their family. Nobody in their right mind would waste it on an electronic toy.

Indeed. I think it’s important for engineers working on any kind of end-user facing software to remember that a lot of people aren’t using cutting edge or even recent hardware. There’s lots of Core 2 Duo laptops and 3+ year old smartphones floating around in use out there and their users need to be able to use the websites and apps we make as much as the guy who buys new everything every year.

I think this is particularly easy to forget for those of us living in tech hubs where everyone is using almost-new Macbook Pros and iPhone 6Ss.


I've never seen client devs test their apps on lower-end devices.

If my users care about and/or use Lynx, then I will support it. Until then, it's just a feature that doesn't have the ROI to invest in.

Sure, sometimes you have to make tradeoffs for what you have time to invest in.

But this isn't really about supporting the world's Lynx users specifically. There's a reason I put a whole class of non-mainstream UAs in there (in particular Siri, given how likely entirely non-visual UAs are to become more mainstream at some point, perhaps soon, on top of the awareness that intermediary UAs like GoogleBot are of course ubiquitous and clearly important). And which UAs you try to support has a chicken-and-egg effect on which UAs get employed as intermediaries in accessing your site/app... and therefore which UAs you think you see using it. Usage follows accessibility perhaps even more surely than accommodation follows demand.

If that weren't enough, the decision to do the things that would mean that your site is usable in Lynx is not just a feature -- it's also a technical decision. And like other technical decisions (database, language, library, application architecture, idioms and patterns you use, and data serialization/interchange format... which is actually part of what this decision actually is), it matters to how well your product performs and how tractable it is to develop and maintain, as well as how widely it can be consumed.

If you're having to make cold hard decisions about ROI for features, chances are decent that you're better off for thinking about whether it works in Lynx and why, even if not one single user visits with that specific UA.

Fair enough in a way, but on the other hand.. will that be like JPEG2000, which browsers might support when websites use it, which might be when browsers support it?

Or put differently, how would someone who can't use or even see your thing become your user?

When I was a beginner I also tried to design all kinds of fancy, flexible and auto-adjusting stuff. With years I came to the understanding that most of it is just unnecessary and wouldn't diminish the user experience in any way if you didn't do it the fancy way and instead went with a simpler route.

CSS, like any technology, has its limits. In reality that is often good, because it forces you to rethink what you're doing and reconsider whether you really need it. If we had a technology with unlimited capabilities, a lot of developers would be lost there forever, unable to accomplish the larger goal, completely absorbed in pursuing all the secondary details they could possibly imagine.

Most of the places I see javascript being used for layout are not doing complex things that can only be done in javascript. At the very least, there should be a non-javascript fallback so the content is visible and somewhat laid out.

All mediums have limitations.

CSS is good enough. It's overdesigning that makes the web the crap it is today.

For a long time in the early 2010s, Chrome had superior developer tools built-in, while Firefox needed addons like Firebug and the Web Developer Toolbar. While these addons were good, you needed to know about them and download them separately, whereas Chrome shipped with better out of the box.

It took several years for Firefox to catch up [1], and in that time, Chrome's market share exploded, and Chrome's underpinnings, V8 and WebKit, found new uses outside of Chrome. Furthermore, Firefox was late to Android, where the Android Browser, and later Chrome, dominated.

Part of the problem is a website/webapp developed and debugged using Chrome will work with Firefox more often than not -- so some people have been cultured to not even bother checking it in Firefox. Ironically, if more things broke spectacularly, more people would test in another browser.

This PR effort is nice, but it won't reverse the tide on its own. However, once Servo is released and sees use in an embedded setting, developers will no longer be able to get by without developing (or testing) on a Mozilla engine.

[1] https://techcrunch.com/2013/03/18/mozilla-promises-to-improv...

> For a long time in the early 2010s, Chrome had superior developer tools built-in, while Firefox needed addons like Firebug and the Web Developer Toolbar. While these addons were good, you needed to know about them and download them separately, whereas Chrome shipped with better out of the box.

I agree regarding the relative quality, but developer tools have no place in a default Firefox install; the whole point of Firefox was to streamline Mozilla by moving as much as possible to plugins. It's unfortunate that they've recently started bundling stuff again (developer tools, PDF reader, pocket (whatever that is), etc.).

> Chrome's underpinnings V8 and Webkit found new uses outside of Chrome

To be clear, WebKit came from Apple (as a fork of KDE's KHTML), so this wasn't really due to Chrome devs; they were just part of the WebKit bandwagon.

Gecko (Firefox's rendering engine) was already quite widespread before Chrome existed, e.g. it was/is used in Gnome and GTK programs (Epiphany for sure, and I assume other HTML-consuming applications like Liferea); similar to the way KHTML is used in KDE programs, although it was more painful to integrate. Some of these programs were cross platform, so may have had a reasonable userbase (I don't know; I've used Linux exclusively for about 15 years).

Programs based on the XUL toolkit presumably had a much larger installed base; e.g. Thunderbird, RSSOwl, Songbird, Miro, etc.

I didn't notice any conspicuous users of Mozilla's Javascript engine except for Gnome 3, although there were other JS engines around for those who wanted them (e.g. Rhino for JVM applications). V8 certainly opened the floodgates for embedding JS though; I think I first started treating JS as a serious language when the D8 repl came out; node.js followed quite soon after.

> Furthermore, Firefox was late to Android

Well, Chrome and Android are both Google projects; we could say that Chrome was late to FirefoxOS ;)

> Part of the problem is a website/webapp developed and debugged using Chrome will work with Firefox more often than not

This is tricky; when I was doing Web dev around that time, I'd do a quick check in FF or Chrome (whichever I had open) to spot glaring problems, then switch to IE6 for my serious testing.

> but developer tools have no place in a default Firefox install

Completely agree! But the fact is, Google shipped them anyway, and so people began using them. Now Firefox has to live with the (unintended) consequence of their decision.

> Well, Chrome and Android are both Google projects; we could say that Chrome was late to FirefoxOS ;)

Sure, but marketshare! My point is, Mozilla is doubly disadvantaged by having to battle against a vertical entity (Google), AND also being very late to the game on that platform.

> I'd do a quick check in FF or Chrome (whichever I had open) to spot glaring problems

I'm pretty sure this is what most people do, and that's the problem, because most of the danger is from stuff that's only subtly broken, that a quick look won't catch. The non-web world has automated tests to help with this. Maybe we need tests that can run inside the browser script engine to test our webapps.
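
Even a tiny in-page runner would go a long way -- something like this sketch (hypothetical, not any particular framework):

```javascript
// A minimal test runner that can execute inside the page's own script
// engine, so subtly broken per-browser behavior surfaces as explicit
// failures instead of requiring a visual once-over.
function runTests(tests) {
  const failures = [];
  for (const [name, fn] of Object.entries(tests)) {
    try {
      fn();
    } catch (err) {
      failures.push(name + ': ' + err.message);
    }
  }
  return failures; // empty array means every check passed in THIS browser
}
```

Running the same table of checks in Chrome and Firefox and comparing the failure lists catches exactly the subtly-broken cases a quick look misses.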

I remember that CouchDB used SpiderMonkey before switching to V8.

Even today the Firefox dev tools are quite far behind in my experience. For example, source maps are completely unsupported in the web console [1], so you get a useless trace pointing to app.js:22312 while Chrome will tell you the target is actually indicator.js:21 right out of the box. That's a showstopper right there for any dev that's using Babel, Webpack, Browserify etc.

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=670002

Agreed. Chrome even manages to work with 2 layers of source maps for my project (TS->JS->require). Not to mention that the FF "source file" view is this huge dropdown that is terrible to use when you have hundreds of files in various folders.

> Not to mention that the FF "source file" view is this huge dropdown that is terrible to use when you have hundreds of files in various folders.

They are working on another JS Debugger prototype with a different UI, see https://twitter.com/jlongster/status/737831071379783682 and https://github.com/jlongster/debugger.html

Another thing benefiting WebKit's (and now Blink's) dominance is its portability. WebKit runs on more platforms than I care to number, even finding its way into things like pre-smartphone Symbian devices. It can run where XULRunner is impractical or undesirable and has bindings for just about any popular language, and both of these factors have proven beneficial to its takeup.

Speaking of accessibility: the "tota11y" bookmarklet by the Khan Academy people is a useful tool to check your website


That looks really cool. I don't think that accessibility will get too much traction unless the tooling for checking it is much easier.

As a side note I wish that there was a link to "installing" tota11y near the top of that site.

And ironically enough, whatever they use to set the width of the text doesn't work for me.

Web working for everyone is easy: you just don't add a crapton of pointless script bloat and use what's been working for decades - unless you need to do something special, but then you can't expect it to work for everyone.

This is what's so frustrating to me. A perfectly working web is easy. Most sites actually put more effort into making it work less. This is insane to me. Keep it simple and minimal (in size, features, frameworks, etc, not "lots of whitespace") and it will work for more users. Most sites are not special snowflakes. There are exceptions where lots of formatting/scripting needs to happen, but most sites are better off without.

Yes, but — as soon as Cascading Style Sheets existed, browsers should have had a sane base so that http://motherfuckingwebsite.com/ looked like http://bettermotherfuckingwebsite.com/

I actually think styling should all be done client-side. What I mean is: A site should display only text. Then the browser should have style done via user-defined settings that apply to all sites.

There are exceptions where the style is part of the content, but those sites are rare.

There comes the question, why not go a bit further and make it like https://thebestmotherfuckingwebsite.co/

And then we start displaying ads. Maybe add analytics? Probably a developer with 300 stars implemented scrolling better in JS than the boring old native. Then you look at the calendar and it's already 2016.

I was only familiar with the original, and enjoyed finding out about these "better" and "best" variants. But I'll note:

http://bettermotherfuckingwebsite.com/ is really quite good, and it's a single 4KB request

http://thebestmotherfuckingwebsite.co/ is 19 requests, 1.37MB, and the appearance and functionality are terrible

Yeah it's what everyone does these days, but it's everything I hate about the "modern" websites:

  * it shows no content without scrolling
  * full background photos with no relationship to the content
  * gratuitous transitions and layout changes
    with only the most tenuous relationship to the content

There's also a link to http://evenbettermotherfucking.website/ which cuts the contrast too far, but otherwise is fine, and only 5KB.

So really, the "best" variant is not "a bit further", it's a full hop/skip/jump into the deep-end of "modern" anti-patterns

I think that's what Reader Mode does in most browsers.

It's kind of odd that there seems to be a consensus that user stylesheets are out, but that "reader mode" is a thing.

One might have thought that modern browsers would come with a sane CSS baseline, making simple HTML sites with headers, paragraphs etc. look at least as good as a typical Medium.com site (or indeed, like "reader mode") - rather than insisting that every single author add a completely redundant reinvention of basic styling.

These are the 7 declarations + the bonus one suggested in the text.

    body {
        background-color: #EEEEEE; /* bonus */
        color: #444;
        font-size: 18px;
        line-height: 1.6;
        margin: 40px auto;
        max-width: 650px;
        padding: 0 10px;
    }

    h1, h2, h3 {
        line-height: 1.2;
    }
Is there any research about whether users prefer web "applications" (SPAs and everything related) or good old web pages?[1] Hackernews and sites similar to this work on pretty much all browsers and devices. So, other than clients wanting to put big CSS banners and animations and using bootstrap and The Theme™[2], is there any valid reason to intentionally bloatify a page?

Edit: Sorry for not being clear, I meant websites where SPAs are not required, and plain old websites would do. In other words, do users (unconsciously?) prefer eye-candy over readability?

[1]: The benefits which users will prefer "applications" for, like progressive enhancement with offline-first support, aren't here yet, and virtually no one is using them in real life. So please don't count on that yet.

[2]: http://adventurega.me/bootstrap/#

I think it's going to vary wildly depending on what the website actually does. Something like Gmail or Slack? I'm probably going to be happier with a (well-constructed!) SPA. If managing state across things is important to the core functionality of the site, then use the paradigm that's appropriate.

The ticketing part of Zendesk is a tabbed SPA of sorts, while the help center is a wiki-style set of webpages. It works well.

While I personally prefer HN to many similar forum software, I actually think it's not the best example of a "plain web site". Partly because of the very real (sometimes intentional) UX issues, like no native collapsing of threads, and the silly voting arrows that are almost impossible to use on any device -- and partly because HN sits (IMNHO) a bit between the simple/old and the new. A better example of a "plain web forum" might be the D-lang forums: http://forum.dlang.org

That said, I never liked web forums in general - I prefer mailing lists (and the D-lang forum, as far as I can figure out, supports usenet access - it is a web usenet front-end - so it should be possible to use a native client rather than a web gateway, which might be better).

But as the web has shown again and again, colourful backgrounds, smilies and avatars wrapped up in sloppy html and css wins every time - multimedia wins every time. Eg: slack.

I think it should be perfectly viable, and desirable, to have all of these: an open (preferably federated/self-hostable) (set of) protocol(s), a web client, and native clients. For me, the fact that I don't really think gmail is any good is kind of the only nail the "web app" coffin needs -- if Google can't make something halfway OK to work with in terms of UX -- for something that's arguably simple -- what hope does anyone else have?

As for your real question: I think wikipedia is a great example of both "it can be simple" and "complex provides value". Reading wikipedia works fine from lynx or w3m - but editing, viewing revisions, etc - benefits a lot from a richer "web app" client. (I would still prefer to edit in vim, use "real" version control and push changes - and I'm not sure what options are there for wikipedia, the project, or mediawiki, the software, to do that via apis etc -- but I'm sure it'd be possible to hook something up if it's not already there - I haven't really looked).

Everything in the web technology stack has been intertwined to the point where incomplete browser support for some JavaScript API could cause a bug with any level of severity, ranging from “site is ugly” to “nothing works at all”. (And it drives me crazy when some site can’t even show me a critical button because of its fanciness.)

There needs to be more work separating a critical base set of features from the rest. There needs to be more separation of the fancy from the functional (i.e. instead of requiring some bizarre mixture of JavaScript support in order for anything to look right, you have two choices: “turn on animation support” or “turn off animation support”, or some such division).

Maybe we should go back to XHTML and get serious about syntax enforcement. Suggested XHTML 3 rules:

- Everything is UTF-8.

- All tags must balance and nest properly.

- Any syntax errors or JavaScript errors generate a visible error message for the user telling them the site is broken. Then everything renders in the default font with default formatting.

- When something goes wrong, the site gets a HTTP PUT request with an error report, so sites can tell this happened.

That last feature would make it possible to fix things.
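
Much of the reporting half can already be approximated today with window.onerror. A rough sketch (sendReport is a made-up stand-in for whatever transport -- fetch, sendBeacon -- you'd actually use):

```javascript
// window.onerror fires for uncaught script errors; the handler can send a
// report back to the site. (The transport verb -- PUT vs POST -- is
// incidental.) `sendReport` is injected, so this has no DOM dependency.
function makeErrorReporter(sendReport) {
  return function onError(message, source, line, column) {
    sendReport('/error-report', {
      message: String(message),
      source: source,
      line: line,
      column: column,
    });
    return false; // let the browser also log the error to the console
  };
}
// In a real page: window.onerror = makeErrorReporter(postJson);
```

A real deployment would rate-limit and batch these, so a hot bug doesn't flood your own error endpoint.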

Maybe you should go back and learn why XHTML was never a good idea. You will discover a lot of interesting things, like MIME types, rendering modes, etc. Browsers did show errors when invalid XHTML was served with the proper MIME type; it made no one happy.

The web is popular because HTML is sloppy. Your XHTML3 stuff is never going to work. It failed before; it will fail again.

> Any syntax errors or JavaScript errors generate a visible error message for the user telling them the site is broken.

This would make users not use the browser that implemented it.

> When something goes wrong, the site gets a HTTP PUT request with an error report, so sites can tell this happened.

If I bother to implement a handler just to get error reports from browsers, I surely took first the easier step to open my website in such a browser to check how it looks.

The parent article says that users leave the site, not the browser, when there's a problem.

If some ad code or third-party Javascript library blows up, your site may never know.

>> Any syntax errors or JavaScript errors generate a visible error message for the user telling them the site is broken.

>This would make users not use the browser that implemented it.

I think the idea behind this is that

1) the web developer will know the site is broken and

2) the client who is paying the web developer will know the site is broken so

3) the site will be fixed before it even gets to the end user.

Making sure a website works for everyone is exactly what the company I'm working for [1] is focussing on.

We run a large grid of all different browsers, both old and new versions. You create a test where you verify certain aspects of your website and then run it on all these versions on our grid.

With the feedback we provide (screenshots/videos) you can fix any issues that may have come up across different browsers/versions.

[1] https://testingbot.com

While I understand and agree with what the article is saying, I think a primary driver for other browsers catching up was that they were forced to because developers refused to spend the time dealing with quirks and browsers that didn't correctly implement standards.

Necessity is the mother of invention.

If developers catered to the other browsers like they used to we wouldn't have seen the dramatically improved compatibility we now have.

Good article. Web standards and portability should be a much higher priority. A little off topic, but I like to use Chrome for anything touching Google, Facebook, and Twitter, and use Firefox for everything else. Another good practice is setting browsers to delete all cookies when the browser shuts down. With password managers, this is not much of a nuisance, and makes me feel like I am getting extra privacy and safety.

"it was found that there are more hard of hearing users in the United States than the population of Spain and more users who are blind and low-vision than the population of Canada."

Really? There are more than 35 million internet users with vision impairment? The previous section says there are 8 million people with vision impairment in the USA.

If they include world-wide users, I can see 35 million users being a reasonable statistic. If we're talking about just the USA though, depending on the definition of "blind and low-vision," I found figures between 8 million [1] and 22.5 million [2].

[1]- https://nfb.org/blindness-statistics [2] - http://www.afb.org/info/blindness-statistics/adults/facts-an...

They mean in the USA; it's based either on a misinterpretation of the graph below the quoted phrase or on poor wording on their part.

The graph says there are as many vision-impaired American internet users as there are Canadian internet users - not the entire population - around 20-25 million (the graph is hard to read). I'm sceptical of that number because it suggests ~10% of the USA is vision impaired, which seems too high.

Anyway the only reason I care is that I'm Canadian ;)

If the numbers include older people who undergo the usual loss of near sight vision, it isn't so hard to believe. I think a lot of web developers are young and forget that older people use the internet and that older users may have a hard time seeing the small font some webpages use.

Yeah ok, but am I supposed to find workarounds for your bugs to support your browser? Or should I just tell users we don't support Firefox until the bug is fixed?

My website has a position:fixed canvas area that works in every browser except Firefox. For Firefox I have to re-apply the fixed property regularly for it to stick; otherwise the canvas scrolls with the rest of the interface.
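A workaround along those lines might look like the sketch below. This is purely illustrative - the function name, the element lookup, and the event wiring are my assumptions, not the commenter's actual code:

```javascript
// Re-assert position: fixed on an element, as a stopgap for an engine
// that intermittently drops it. Hypothetical sketch, not the real site's code.
function reapplyFixed(el) {
  // Clearing and re-setting the property forces the style to take effect again.
  el.style.position = "";
  el.style.position = "fixed";
  return el.style.position;
}

// In the browser, one might re-apply it on every scroll event, e.g.:
//   const canvas = document.querySelector("canvas");
//   window.addEventListener("scroll", () => reapplyFixed(canvas));
```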

The bug is logged with Mozilla, but nobody over there cares. If they don't care, why should I care? https://bugzilla.mozilla.org/show_bug.cgi?id=1258911

Your users.

I really believe Mozilla, Google, Apple, and Microsoft should consider working together. A single open-source rendering engine code base would be a better result for the web than multiple competing engines. The Linux kernel takes contributions from thousands of organizations, and while it's true we have BSD and it's great, Linux as a unified open-source code base serves everyone better than when we had multiple fragmented Unix variations. There will always be room for experimental rendering engines and JavaScript runtime engines. But having a unified rendering engine, and even a unified JavaScript engine, would IMO be a huge advance!

I recommend looking up the term 'software monoculture'. What you're talking about is, at least in my opinion, one of the worst things that could ever happen to the web, and is unfortunately already a serious risk given the prevalence of other browsers that are also Chromium-based.

Competition in the software space is what encourages innovation to happen. Problems being tackled in different ways by different groups often leads to the best solution being found among them.

Which is why I applaud the Microsoft team for open-sourcing Chakra (their JS engine for IE/Edge) and their efforts to get it working as an alternate engine for Node.js.

More competition is ALWAYS a good thing. It prevents vendor lock-in, prevents bugs from becoming de facto standards, and prevents developers from trying to "optimize" for the platform, instead having the platform optimize for developers (a weird way to put it, but I mean things like preventing "performance bottlenecks in one engine" from becoming "performance bottlenecks in JavaScript").

A good example of that last one is try/catch blocks. In V8 they trigger a deopt for the whole function, making it run slower; in most other engines they don't. Because of V8's large usage, many people avoid try/catch in performance-critical code to dodge that penalty, even though it's just an issue with V8's implementation. And because of that you start seeing "avoid try/catch" as a general performance "tip" for JavaScript.
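The workaround people reached for looked something like this: keep the hot path free of try/catch so the optimizer can handle it, and isolate the error handling in a thin wrapper. The function names below are illustrative:

```javascript
// Hot function: contains no try/catch, so V8's old optimizer could
// optimize it freely.
function sumHot(values) {
  let total = 0;
  for (let i = 0; i < values.length; i++) {
    total += values[i];
  }
  return total;
}

// Thin wrapper: the try/catch lives here instead, keeping the deopt
// penalty away from the hot loop above.
function sumSafe(values) {
  try {
    return sumHot(values);
  } catch (e) {
    return 0;
  }
}
```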

The corollary is that after a best/good-enough solution is found and the underlying platform unifies and stabilizes, most of the innovation moves up in the stack. Things that used to be important and hotly debated become mundane things that you can rely on simply being there, a new space then opens up for new types of software to compete in, and the pattern repeats.

See hardware architectures -> operating systems -> browser engines -> frameworks -> services etc. The first two are effectively fully commoditized, few people get excited about them anymore, even as Linux development is more active than ever. Similarly the browser engine is now entering the unification phase, with Chromium swallowing competitors. Meanwhile the battle to define how higher level components are created on top of the browser is in full swing with rapid innovation.

Lack of choice in hardware architectures and operating systems, particularly in mobile, is a huge issue, and, I'd argue, poor support for the idea that unification is a good thing. It's pretty hard to get a device that isn't either an iPhone or running Android on a Qualcomm processor.

As Google becomes more of a looming threat on privacy, the idea that they win in the mobile space and the browser space, and we suggest that nobody should compete with them is a terrifying situation.

Having multiple implementations also means any problems in the various web interfaces and specifications are much more likely to be found.

Meaning that innovations like Servo would not happen.

You have serious misconceptions about the unix sphere, and your analogy doesn't work.

Not to mention respect for meta-standards. Trying to parse RSS feeds is an impossible job. Quite apart from the RSS/Atom dichotomy, nobody respects the standards, and some purposely violate them.

> Millions of websites have compatibility problems on one or more of the major browsers, leading to a poor user experience.

Can we just let these older browsers die out by not developing for them? Surely the users using them will update once every website they visit is broken.

Personally I only add the html5shiv and make HTML5 elements display block.

> The 18.2% of IE users running IE8 are on a browser that Microsoft no longer patches

Where does this figure come from?

I don't find this all that surprising. My work transitioned to IE 11 late last year - we were running IE version 8 before that.

I work for a large company. Opposite of agile. A huge chunk of the people who work here are non-technical. Most machines are heavily locked down and users do not have permission to install their own applications. Upgrades (like changing versions of the browser) are rolled out from our Information Services department. I imagine there are many many "large corporate" environments around the world which have similar application policies to us.

The transition to IE 11 was not all that smooth: a large number of internal sites broke (including our intranet, which runs Microsoft's own SharePoint product). Even today, some months after the upgrade, some sites still behave oddly, and you have to go through the enable/disable "Compatibility Mode" dance to get them to work correctly.

I know that :-) I still had to maintain and test our main application for some corporate users running IE 8 until a few months ago. And I can still see some IE 8 users (less than 0.5%) visiting our main public website. But that doesn't explain where this figure comes from. The percentage seems really high compared to stats I've seen elsewhere.

It looks like the article was updated. The new figure looks more sensible:

> The 2.07% of users currently running IE8 are on a browser that Microsoft no longer patches

When I think about this problem, I feel like there are 2 major priorities that tug from different ends of the spectrum on this issue.

1) I want innovation. I want new features for the web. I don't want innovation to be hampered by the priority of making features cross-browser compatible.

2) I want cross-browser compatibility. I don't want to have to develop N apps (N representing the total # of browsers my website / app supports).

Personally I don't care if X browser has feature Y. As a developer, if I find a feature I want to program into my website / application, I don't want to have to wait for N-1 other browsers to implement this feature before I start using it.

I want to be able to "declare" or send a "hint" in my HTTP headers (or meta tags?) telling the browser that my site is supported by X,Y,Z browser engines. Each browser would implement a bare bones interface that decouples the "chrome" of the browser window from the "application" window, and makes the application window interchangeable (plug n play) with other browser engines. This would make browser engines interchangeable without requiring a user to close one browser and open another.

Granted, if a user doesn't have browser engine 'X', and refuses to download / install it, that's my loss as a developer / website owner. Or of course I could write some graceful degradation into my app / website. Already I hear some folks saying "but this is what we do now". Not quite. It's a subtle but I think important difference. Most people are willing to install (or already have installed) all major browsers on their computers. The point is, currently we must develop for / test for all major browsers to get the desired reach (or to prevent a user from having to switch browsers). This alternative approach moves the needle when a user installs a browser, not when they use the browser. So if there are a substantial # of people who have both Chrome / Firefox installed on their computers, but 'X'% of those people regularly use Firefox, I can still develop ONLY for Chrome and still get the desired reach into the world's population who have internet access.

There are of course going to be pros / cons no matter what you do, but given the above priorities, and the predispositions of all actors involved (web browser developers, web browser users, website / web app developers) I really think this is going to be the most scalable, least intrusive / obstructive path forward.

Dear AndreyErmakov,

I agree with a lot of what you say but I won't upvote a comment that compares bad practices in academia to child abuse.

For the kids that are beaten, burned, neglected, abused etc: please change that.

We detached this subthread from https://news.ycombinator.com/item?id=12044758 and marked it off-topic.

Dear AnonymousUserWhoLikesToHideTheirName,

That's just hilarious. For as long as I'm saying things that conform to the majority's opinion, I will be upvoted. But when I have a different opinion, I will be downvoted until I censor myself back into conformity. Is that how things work here on Hacker News?

Have you by any chance heard of concepts like pluralism of opinions, freedom of expression and that sort of thing? I'm not sure where you come from, but in some parts of the world people are freely able to have a civilized discussion around complex and sometimes controversial matters. You may consider going abroad, living there for a while and learning how a liberal society works. It would be a great help to you.

As said, I agree with the ideas, it just so happens that that one comparison seems to overshadow the important parts of your message.

BTW: I did not downvote you.

What if it happens to be my opinion, the way I view things? I should be able to express it, right?

I can't really crop my ideas if some parts of them are not well received. They come as packages, I either tell what's truly on my mind regarding some matter or I don't tell it at all.

If I can't say what I mean, then I can never mean what I say. And then the entire dialogue becomes meaningless.

The layout of that page is borked in IE11 - the text takes up a narrow portion on the left of the browser window.

Not much better in Chrome where the body of the article is not-really-centre-aligned.

In IE the search box is also of insufficient height, meaning that the lower portion of the watermark text is not visible.

Maybe the author was being ironic.

I can't comment about the IE situation but the page looks nearly identical in Chrome for me.

I don't see any alignment issues at all actually; the text is supposed to sit just to the left of the center of the page.

Am I really seeing this debate again? Really, fuck you web professionals. In 2005 this debate was over, and the standards were understood as positive, accessibility was recognized as important.

Now what we have got is, once again, proprietary code, apps just for smartphones, and one browser taking over all the others. Bravo. Great progress here...

I wish the web could stay open.

Actually, the web is more open than it was a decade ago. Back then I had to use IE-only tricks to make my pages look okay in IE. Today I don't care anymore. It's been several years since I last used an IE trick, and I have no plans of using tricks specific to other browsers in the future. If something doesn't work out of the box in all browsers, I consider this functionality not implemented and do not use it. That's something you could do too and educate others to approach it the same way.
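In practice that usually comes down to plain feature detection: probe for the capability at runtime and skip or degrade when it's absent. A minimal sketch - the helper name is mine, not a standard API:

```javascript
// Return true only if the given global object actually exposes the feature.
// Illustrative helper; real code would often also type-check the value.
function supportsFeature(globalObj, name) {
  // `in` also sees inherited properties, which is what we want for
  // browser globals like fetch or localStorage.
  return Boolean(globalObj) && name in globalObj;
}

// In the browser, e.g.:
//   if (supportsFeature(window, "fetch")) { /* use it */ }
//   else { /* treat the functionality as not implemented */ }
```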

You are both absolutely correct. It's better than it was AND we have to be very careful to keep it that way.

I use the exact same strategy you do: if the feature doesn't work in all browsers, it's not ready yet and I don't use it. It can be painful, but I will NEVER AGAIN use browser-specific tricks or hacks because I remember a time when we pretty-much had to.

As a sometimes web dev, I don't care what the consensus is. I care what my boss tells me to do. He cares about it making money.

You have the power and perhaps the ethical responsibility to push back and advocate for an open, accessible web.

  > You have the power
hehe if only...
