[dupe] Dear Developer, the Web Isn't About You (sonniesedge.co.uk)
220 points by colinprince 4 months ago | 182 comments



Previous discussion (8 months ago, 88 comments): https://news.ycombinator.com/item?id=16773398


That doesn't seem like a good reason for it to be marked as [dupe]...


Indeed. Is there some other more active discussion somewhere?


> But the cultural focus is on developer happiness, on developer fun and ramping up of code-related skillsets. "How does this benefit users?" has suddenly gone missing from our vocabulary.

No, it hasn't. It's still 100% about how to benefit users for most teams. That is quite literally the only thing that matters and everything else is a means to an end.

I'm not going to deny the fact that there are some shoddy websites out there. There are some sites that are using the latest and greatest JS stuff when really, they should just be using HTML and CSS. Fine.

The majority of the work that gets talked about here, by software and tech startups, will not be effectively built using HTML/CSS or server-side rendering. Since we're on the topic of users, users have come to expect a certain level of interactivity. They expect that if you right click on a row, you get a drop-down menu and you can delete it. They expect that you can do inline editing of an item without being redirected elsewhere. They expect that you can create a new thing on the page without reloading and redirecting.

Let's forget about that part, though. Most software features are, at the end of the day, experimental. We don't know exactly what is going to be a hit and what isn't. If new tools allow us to iterate at 2x or 5x or 10x the rate of old tools, this is a net positive for users.

No one would denigrate a carpenter for using an electric screwdriver over a manual one. No one would complain that the carpenter only cares about their "carpenter experience". At the end of the day, the "user" wants their cabinets finished.


I can't agree. Most sites I see nowadays are not in any way more effective for their deep use of frameworks. The best example of the craziness is the blogging platform Google has that loads the content only after the page has loaded. Why?

React is amusing, as it makes it look like you have to have a complex JavaScript framework to get hot loading. But you don't. Never have. Will you need a fancy tool that knows JavaScript well? Yes. But do you need to be writing to that tool, or with it?

So, nobody would mock a carpenter for using power tools. They would question one that only worked with a 3D printer, even though that would certainly let them iterate faster.


I have to question some of this. Hot-loading is not why people use React, it's not even particularly reliable with React, and I often develop without it.

The primary reason to use React is that it's an easy API for working within the stateful component paradigm, and it goes the extra mile in protecting you from state-related bugs. It still has value even if you drop the stateful part; it's not unheard of to build completely static or server-rendered websites with React without any client-side JavaScript.

There are performance penalties when it comes to the initial page load, but you tend to win it back on subsequent user activity. Whether it's worth the trade-off depends on your use case. There are also downsides to progressive enhancement; an obvious one is that if your enhanced JS version changes appearance/layout in a noticeable way, you could end up making the website feel slower even though it's technically usable sooner. In addition, what happens to any in-progress interactions with the non-enhanced UI when the enhanced version suddenly appears?

The fact is, you just have to decide what's important. I do agree that not enough developers are giving real-world accessibility and performance enough consideration. But I'm less convinced that there's a way to approach these needs that is superior in all circumstances.


As soon as you find yourself storing a lot of state in your UI, you have already lost the complexity battle.

Granted, for some UI this is hard to avoid. But such cases are fewer and farther between than they're given credit for. Most state for a browser should be easily holdable in the address. If you find you can't do that, are you sure it's state your user actually cares about?
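For illustration, a minimal sketch of holding view state in the address; the `page` and `sort` fields are made-up names, not from any particular app:

```javascript
// Serialize the bits of state the user cares about into the query string,
// and restore them on load. Field names here are purely illustrative.
function stateToQuery(state) {
  const params = new URLSearchParams();
  if (state.page > 1) params.set('page', String(state.page));
  if (state.sort) params.set('sort', state.sort);
  return params.toString();
}

function queryToState(query) {
  const params = new URLSearchParams(query);
  return {
    page: Number(params.get('page') || 1),   // default to the first page
    sort: params.get('sort') || 'date',      // assumed default sort order
  };
}

// In a browser you would pair this with history.replaceState so the URL
// stays shareable and the back button keeps working:
//   history.replaceState(null, '', '?' + stateToQuery(state));
```

Because the address is the source of truth, a reload or a shared link reproduces the same view for free.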


Users want responsiveness instead of page reloads between every action. It’s pretty simple, and it’s what almost all customers (and their designers) ask for when getting a web app made.

The idea that there's a mythical user out there who absolutely demands server-side rendering is a bizarre figment of HN's collective imagination.


> Users want responsiveness instead of page reloads between every action.

Then implement that. There are much more lightweight ways to achieve this than going full react. Especially with modern JS this can even be implemented from scratch.

Take HN as an example. Inline replying, voting, etc. is undeniably convenient, but there's no reason there shouldn't be a full page load when going through paginated results, or when opening the comment thread of a post. The inline parts require a couple of lines of JavaScript, not a bloated js-framework-of-the-month. And the page stays completely usable without JS. A lot of stateful webapps out there aren't any more complex than this, structurally. Bonus: You get a fully working, predictable back button in your browser for free.
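As a rough illustration of how little code the inline parts need, here is a hedged sketch; the `/vote` endpoint and the `a.vote` markup are assumptions for the example, not HN's real API:

```javascript
// Build the vote URL the server already understands. Because the links
// point at a normal URL, voting still works as a plain navigation when
// JavaScript is unavailable (progressive enhancement).
function voteUrl(itemId, dir) {
  const params = new URLSearchParams({ id: String(itemId), how: dir });
  return '/vote?' + params.toString();
}

if (typeof document !== 'undefined') {
  // One delegated listener enhances every vote link on the page.
  document.addEventListener('click', (e) => {
    const link = e.target.closest('a.vote');
    if (!link) return;
    e.preventDefault();                                 // stay on the page
    fetch(link.href, { method: 'POST' })                // vote in the background
      .then(() => link.classList.add('voted'))
      .catch(() => { location.href = link.href; });     // fall back to a full load
  });
}
```

The fallback in the `catch` means a flaky connection degrades to the no-JS behavior instead of silently failing.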


I would just like to chime in to say that users expect links in webpages and not in apps. I fully expect the webpage of hacker news to behave the way you describe, because it's a webpage.

I recently used the best webapp I've ever used - LucidCharts. That was an app, and, as such, I didn't have the expectation that everything I did generated a new link, or that I could even link to certain things. BUT, they also struck the balance of providing web behaviors for things where I expected them.


Nah. For the users that like your JS-based responsiveness, properly designed server-side HTML is likely to render faster. There have been demos showing that on a good connection, a server round trip and re-render of styled HTML can be faster than AJAX and a client-side render.

And for the users on weak machines or connections, the reload version at least works, unlike the JS one.

The "users love AJAX" argument is both silly and honestly also a bait&switch. It's not really because of AJAX that people build their sites in JS frameworks.


As Rails + Turbolinks has shown, unsurprisingly, users don't care what renders your page. As you said, they just want it fast.

Server-side rendering HTML and serving it up works perfectly fast for pretty much every site that is read-only content or forms. The 3rd-party scripts running on every page are the issue.

Also, a major negative of SPAs: once a script is loaded, it's going to be ever-present on every page afterwards - enjoy explaining why the payment details page needs 20 3rd-party scripts running on it. It's not possible to unload scripts.


I know this is a cheap answer, and it tends to get a bit philosophical for me, but you have to ask the question "would that site exist without those frameworks?"

If using that framework means that the developer can churn out something much faster, and therefore has time for more features, even if they aren't perfect or fast, is the user really worse off for it?

Continuing the carpenter analogy, it's the difference between using a nailgun and a dovetail joint.

Sure, the nailgun'd version probably won't last as long, but if that means the one carpenter was able to make 5 different shelving units that all looked exactly how the customers wanted, and was able to do it cheaply enough that they all happily bought one, is that really worse than only being able to get one expensive "better" version out, and not being able to customize it per customer because of time constraints?

And I know first-hand that those kinds of trade offs happen all the time. There are tons of times where "can we afford to add feature X" is a "yes" because we are using React in the app and spinning up a new component becomes extremely easy, but in applications which aren't using a framework that is often a "no" because we need to do so much more work to get it fully integrated and working.


"If using that framework means that the developer can churn out something much faster, and therefore has time for more features, even if they aren't perfect or fast, is the user really worse off for it?"

I can only speak for one user, myself, but my answer is -- yes, the user really is worse off for it.


Can't you just not use that product then?

I get that there may be network effects that trap you, and you could be worse off because prioritising another feature meant they made one of your personal favorite features worse, but that has less to do with the technology used and more with what the developer is prioritising.


> Can't you just not use that product then?

I can and I do, for as long as it's possible.

The problem is that JS-first fashion infects web shops, and one by one, websites are being rebuilt in "modern ways", rendering them less usable than they were before. Such sites would be easy to avoid if they were the exception, but you can't do that if everyone follows this trend.


> Most sites I see nowadays are not in any way more effective for their deep use of frameworks.

It’s both amusing and sad to see modern web developers criticize Java for being “bloated” because they don’t understand the Abstract Factory pattern (fairly light-weight, 2 levels), while they themselves use infinitely more complex and heavy frameworks to render what is, more often than not, just static HTML.


The problem with OOP patterns is not that they are complex. It's that you gain incredibly little by using them.


You can gain a huge amount using OOP patterns, if your problem domain maps well to the OOP paradigm. I know this from personal experience on many projects.


Even on problems that map well into the OOP paradigm, the gain does not come from the standard patterns.

Proof of that is that in languages where there are alternative patterns (like Python), people rarely create a factory class; they use functions that return classes. They don't use singletons; they just use module-level variables. They don't create observers; they just gather callbacks in a list. And so on.

There are many less formal patterns that do add a lot of value. And there are syntactic and type-level patterns that are great, but typically OOP programs use those less than code in other paradigms. OOP has one common less formal pattern that is very useful and less used elsewhere: the abstract API. But this is the one people on the internet mostly recommend not using, because it requires inheritance.
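The same idioms carry over to JavaScript; a sketch with illustrative names (none of these come from a real codebase):

```javascript
// "Factory": a plain function that returns a class, no factory class needed.
function makeCounter(start) {
  return class Counter {
    constructor() { this.n = start; }
    inc() { return ++this.n; }
  };
}

// "Observer": just an array of callbacks.
const listeners = [];
function subscribe(fn) { listeners.push(fn); }
function emit(value) { for (const fn of listeners) fn(value); }

// "Singleton": a module-level variable, lazily initialized.
let instance = null;
function getInstance() {
  if (!instance) instance = { createdAt: Date.now() };
  return instance;
}
```

In each case the language feature carries the intent that the formal GoF pattern would otherwise encode in extra classes.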


There is no such thing as "standard patterns". There are a large number of different popular patterns which many developers have found useful for a broad range of uses.

Don't get too hung up on the implementation details in any particular language. A Python function that returns a class is just a factory by another name. The essential point is deferring the decision of which implementation class to instantiate until run time.

Inheritance is one way of handling abstract APIs, but not the only way (depending on language). And don't believe everything you read on the Internet. In some limited circumstances inheritance can be a great way to simplify code and eliminate duplication, far outweighing any additional complexity it introduces.


> OOP has one common less formal pattern that is very useful and less used elsewhere - it's the abstract API, but this one people on the internet mostly recommend not using, because it requires inheritance.

Or just use interfaces. As in "design by contract".

And that's just common sense programming wise, not some OOP specific solution.


Yup. The actual issues with the design patterns come from a) most problems not mapping well to OOP, and b) patterns that exist only to compensate for low expressiveness of a language (e.g. all the patterns that compensate for not having proper closures).
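A sketch of point (b): with closures, the Strategy pattern collapses into a plain function (the names here are made up for illustration):

```javascript
// Strategy-pattern shape: an object whose whole job is wrapping one method.
class DiscountStrategy {
  constructor(rate) { this.rate = rate; }
  apply(price) { return price * (1 - this.rate); }
}
function checkoutWithStrategy(price, strategy) {
  return strategy.apply(price);
}

// Closure shape: the "pattern" disappears into a function parameter.
const discount = (rate) => (price) => price * (1 - rate);
function checkout(price, pricing) {
  return pricing(price);
}

// Both compute the same thing; only the ceremony differs:
//   checkoutWithStrategy(100, new DiscountStrategy(0.5))
//   checkout(100, discount(0.5))
```

In a language without first-class functions, the class version is the only option, which is how the pattern earned its name.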


I agree with point B, but A is objectively wrong, borderline trolling.

You can successfully apply OOP and OOP design-patterns to a lot of problems and come up with equally functional solutions as you would with FP.

And for certain domains, better.


React has nothing to do with hot loading. It simply gives a retained-mode DOM an immediate-mode interface that is easier to program but still retains the efficiency of retained mode. Yes, it is true that having anything immediate-mode makes achieving hot loading easier (among many, many other things), but that isn't the main reason behind the framework.
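A toy sketch of that idea (not React's actual diffing algorithm): the UI is re-described from scratch each time, immediate-mode style, and a reconciler touches the retained structure only where the description changed:

```javascript
// Immediate-mode view: describe the whole UI as plain data every time,
// as if redrawing it from scratch. Field names are illustrative.
function view(state) {
  return { count: `Count: ${state.count}`, title: state.title };
}

// Reconciler: mutate the retained tree only where the description differs,
// and report how many mutations were actually applied.
function reconcile(retained, desired) {
  let mutations = 0;
  for (const key of Object.keys(desired)) {
    if (retained[key] !== desired[key]) {
      retained[key] = desired[key];
      mutations++;
    }
  }
  return mutations;
}
```

The programmer writes as if everything re-renders, while the retained structure (the DOM, in React's case) only receives the minimal set of changes.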


People use react because:

1. It is designed to be testable. It makes testing the components much easier than testing jQuery soup.
2. It has an opinionated way for developing javascript applications.
3. It will validate input types passed to components.
4. It has nicer syntax than jQuery soup.


> jQuery soup

I'm not sure if you're blaming the tool or not, but at least for 3rd parties, I want to clarify: jQuery doesn't mandate "soup". In fact, devs learned over time that the best approach to development with jQuery was the same as without it: have application state tracked as variable(s), and have routines to translate the current state into the DOM. That's the unidirectional flow that React likes, and both vanilla JS and jQuery can run with it.

But jQuery makes it EASY to use your DOM as state storage, which works well at first and quickly grows nastily complex. Meanwhile, React makes it EASY to update the DOM from state without performing unnecessary re-renderings.

So jQuery doesn't mandate, or even encourage "soup", but there's nothing guiding you away from it. React, OTOH, makes the path of least resistance be towards this particular best practice.

To your points as to why people use React, I don't think those are bad points, but those aren't my primary points. React best practices mimic the best practices of coding in general: Small, single-purpose components with minimal coupling. This allows me to bring in experience from outside React, and allows me to apply React experience outside of React.
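That framework-free unidirectional flow can be sketched in a few lines. Rendering to a string here so the idea is visible without a DOM; the field names are made up:

```javascript
// State lives in one variable; a single routine turns it into markup.
let state = { items: ['a', 'b'], filter: '' };

function render(s) {
  const visible = s.items.filter((it) => it.includes(s.filter));
  return '<ul>' + visible.map((it) => `<li>${it}</li>`).join('') + '</ul>';
}

// All changes flow through one place; never read state back out of the DOM.
function update(patch) {
  state = { ...state, ...patch };
  return render(state);     // in a browser: el.innerHTML = render(state)
}
```

This is the discipline both camps converge on; jQuery simply doesn't push you toward it, while React makes it the path of least resistance.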


> application state tracked as variable(s),

Model.

> have routines to translate the current state into the DOM

Views. Triggered by #changed notifications.

> That's the unidirectional flow that React likes,

Aka MVC.


> People use react because: ...It has an opinionated way for developing javascript applications

I really don't know what world that statement would be true in.


React, Flux, JSX, GraphQL, Jest.

It is a full tool-chain, language, architecture, API spec, and testing framework. They are all closely tied to each other.


> opinionated way for developing javascript applications

> React, Flux, JSX, GraphQL, Jest.

Opinionated implies there is "1 true way" to do things a la angular or ember.

React is most definitely not the embodiment of opinionated.

Ref the multitude of libraries which were flavour of the month before something else came along.

> It is a full tool-chain, language, architecture, API spec, and testing framework.

Ok, now I know you're either trolling or have never used any technology but React to solve problems.


Well, as a user I find JavaScript-heavy websites to be a toxic wasteland full of disappointment and failure.

Just yesterday I was trying to use the Sendgrid support site. I was logged in. There was a "click here to login and send a support request" button (and no other way to create a ticket). Clicking it just reloaded the page. There was a login link (even though I was logged in). It may as well have been a static image; you couldn't click it.

After a few minutes of trying the button finally worked. When writing my ticket the support system would scroll the page down on every single keypress so that the text field was aligned with the bottom of the browser and I was typing on the bottom few pixels of my browser and display. It was awful.

Unfortunately this "certain level of interactivity" is what I've come to expect from JavaScript-heavy websites. News sites where the content doesn't load. The re-rise of loading spinners, delays, errors, my CPU fans being exercised. While. Displaying. Text.

> No one would denigrate a carpenter for using an electric screwdriver over a manual one.

No, but I'd complain if it took an hour to open my front door, took all my strength to do so, then when I finally got into the house it just fell into a pit and I had to start again.

I hate the rise of Javascript, and I resent that it has made the web a significantly worse place.


"Well, as a user I find JavaScript-heavy websites to be a toxic wasteland full of disappointment and failure."

And copious security issues. This is why I block all Javascript by default.


As a user who occasionally uses rural (slow) internet, I wish more developers would just redirect to a new page for editing.

It often takes longer for the initial JS-heavy page to load than it would for me to click & load 2 simple HTML pages. It's so bad that I just skip using certain websites when I'm on the slow internet.


I feel you. I live in an area with not even very slow internet, just not full 4G speed, but the experience of navigating JS-heavy websites is... extremely bad. As in, often I try for 5 seconds and then just quit the tab - that bad.

I feel like a lot of devs don't even try testing their websites / progressive web apps in slow-speed situations or on low-end devices. If they tried, they would understand why, here, if you want people to try your service, you must build an app, put it in the store, and keep network messages to a minimum. Web apps are a good way to get a "Not interested, thanks anyway".


It's been a little while since I looked into this, but some devs do test for slow-connection usability. The problem is that browser tools for simulating a slow connection don't drop and delay packets semi-randomly like radio tech will. They just limit your max speeds.


Try Network Link Conditioner, a built-in part of macOS and iOS. It includes options to set bandwidth, packet loss % and latency in each direction.


iptables can also be used to imitate a radio link, e.g. by randomly dropping packets with its statistic module.


> If new tools allow us to iterate at 2x or 5x or 10x the rate of old tools, this is a net positive for users.

Honestly, I disagree with this. Iteration is great for developers, but not users. To use your tool analogy, users are attempting to use the toolbox you made to perform a task, but the contents of the toolbox and how you use it change every day. Users of the toolbox can never become expert users, because they can't ever rely on their past experience with these tools - the usage has probably changed in subtle ways.

Personally, I really hate having to re-learn how to use a screwdriver (or a blogging platform) every few days - and I'm a professional tool builder myself! It sucks even worse when I can't use your screwdriver if I only have 3G internet access.


I could not agree more. Iteration is a wonderful thing while development is taking place in-house. Frequent iteration is a terrible thing if it's being done in products that actual people are actively using.


The correlation you draw between iteration speed and net positive for users is completely false. It might be a net positive for developers, but certainly not for users. You sound like the kind of developer who has only ever worked with an electric screwdriver. Loading 2 megs of JavaScript so you don't have to redirect to open a dropdown is NOT a benefit to the user.

Relevant: http://suckless.org/philosophy/


The fact that I spent time working with the manual screwdriver is how I understand the iterative difference. The turnaround time for features is not even remotely close.

One of my first jobs was maintaining and extending an ASP.NET Web Forms application.

A dashboard-style application has to do way more than open a dropdown. People keep the page loaded all day long - the fact that it takes 2 seconds to load the first time is irrelevant.


You understand the difference it makes to YOU as a developer, not the user! We don't need more features, we need lightweight software that everyone can use, that was the point of the article.


Super anecdotal but in my experience I hear developers complaining about bloated websites an order of magnitude more than ordinary users.

It's almost like for some the argument is more about some abstract concept of artisanal purity than true care for user experience. HN basically has a front page post decrying the state of the modern web every day...

Before anyone jumps down my throat about this I should add that browsing the web with JS disabled is a truly wonderful experience on sites that support it. It would be great if everything did. I don't see that happening though.


> Super anecdotal but in my experience I hear developers complaining about bloated websites an order of magnitude more than ordinary users.

That's because developers are the only people who know what the fuck is going on, and who to blame. Regular users don't have a mental model to correctly identify the source of their annoyance. So they end up blaming "the computer". It often manifests in requests like "could you come over one day and clean my computer? I think it's full of viruses." No, it's not really full of viruses, just the websites you're using the most went through another redesign, and now consume 10x resources for zero added utility. But what can you do. I install adblock and sometimes buy them another RAM stick, so they can throw their laptop in a garbage bin a year later than they would without my help.

No, users do complain, you just have to know where to look (and actually talk to them).


> some abstract concept of artisanal purity

As opposed to what, some abstract concept of ordinary users? What the sales department wants? The idea of craftsmanship isn't that abstract to me. Performance and cacheability aren't abstract at all, that a website can get reloaded many times even for just one reader, and usually shares memory and CPU and HD cache with many other tabs, is also a really obvious observation. Stuff that makes a noticeable difference even in isolation makes a giant difference multiplied with a triple trillion, I've done the math.

> HN basically has a front page post decrying the state of the modern web every day...

That doesn't mean there isn't a problem. There are also articles decrying environmental destruction every day, not on HN but in general - should that make one care less? Would you say that because biologists and climatologists are more concerned than the "ordinary person", the ordinary person is the one who should be heeded? I'd be surprised to see scientists in a science forum talk like that.

Web development is an "art", maybe like architecture is art, and we are the artisans. If having any sort of ideals to strive towards is "abstract" to us, that says more about the people involved in web development and how much genuine excitement and care for detail was destroyed by money and marketing, but not much about web development as an art form as such.

Does a flood of crappy action movies say anything about the state of the art in movie making, at all? And yes, the "average audience" maybe likes them, but who cares? If they got something else instead, there's nothing they could do about it, and it would be better for them. I don't care if that's arrogant, but don't call it abstract :P

> I don't see that happening though.

That is like talking about sports or the weather: even if it's true, it's pointless. The question is rather, what do we think should happen, and how can we make it happen?


This is a whimsical belief.

No - users actually do want more features. They want more features than you can ever hope to deliver. Figuring out which features they will actually appreciate is the tough part. Your competitors are working very hard to build those features before you can. Your competitor's sales team is already cold calling your users.

Lightweight software is easy. As an engineer, I appreciate it, but it's not what the average user wants.


The user wants the site to actually work. The features don't matter if the site won't load because it's trying to shove multiple megabytes of JavaScript through a congested DSL or dial-up or 2G Internet connection (and especially not if that mass of JS is in turn trying to pull multiple megabytes of graphics before deciding to finally even add the actual content to the DOM). A prospective customer who can't use your website is a customer who's jumping ship to a competitor (as the article described with rather realistic examples).

And for the vast majority of websites, the users don't actually want (let alone need) that degree of interactivity. A blog shouldn't depend on JS to be usable. A storefront shouldn't depend on JS to be usable. A restaurant page (even one with a reservation system!) shouldn't depend on JS to be usable. Even something like Twitter shouldn't depend on JS to be usable. Unless you're building something like Trello that's heavily dependent on drag-and-drop as a basic workflow, there is no good reason to make JavaScript a hard dependency on using your site.

If users are forced to allow arbitrary Turing-complete code to run on their machines because your site is inexplicably incapable of running without it, then you have utterly failed. That's harsh, but that's reality. Your users want oodles of JavaScript only in the sense that a penguin wants to swim through an oil slick.


> No - users actually do want more features. They want more features than you can ever hope to deliver. Figuring out which features they will actually appreciate is the tough part. Your competitors are working very hard to build those features before you can. Your competitor's sales team is already cold calling your users.

I'll believe it when I see it. With notable exception of collaborative editing, most software on the web is essentially a subpar reimplementation of desktop software from 10 years ago, missing half of the features, but consuming 10x the resources.


"The turnaround time for features is not even remotely close."

True, but turnaround time is something that is primarily beneficial to developers, as it's all about reducing development costs.


> [...] users have come to expect a certain level of interactivity. They expect that if you right click on a row, you get a drop-down menu and you can delete it. They expect that you can do inline editing of an item without being redirected elsewhere. They expect that you can create a new thing on the page without reloading and redirecting.

I like that you speak in the name of "the users" who expect a certain level of interactivity, when that same level of interactivity on a page means many of these very same users won't ever see the page load fully, either because they can't be bothered to wait enough time on a slow connection, or because their browser/device simply won't do.

I see quite a clear contradiction here. And it kind of proves the point this post makes about too eagerly assuming users have a fast enough network and/or device.


It also shows how quickly people make the "good for developers = good for users" mental leap.


I largely agree with the points you are making, however I think there's a fundamental difference in the carpenter analogy: regardless of the type of screwdriver used there is no material difference in the UX of the cabinet, whereas that does not hold for web development decisions.


> users have come to expect a certain level of interactivity.

Users don't expect interactivity, they tolerate it. Think of websites as obstacles or puzzles users have to solve to get to what they want. They would rather not interact with those puzzles or wait for them to load at all. If you understand that - you can't justify interactivity or most front end development for that matter by claiming this is what users want.


In my experience, ironically, a lot of web tooling tends to actually bog you down, especially at scale. For example, I've seen people get blocked from shipping code due to a million lint errors after running some codemods, people wasting a month configuring webpack (and still not get it quite right), people wasting weeks cleaning up flow errors after a patch release. Bikesheds about webpack loaders and babel plugins and lint rules and typescript-vs-flow. Nasty workarounds for babel parsing bugs for non-standard syntax. Erroneously-strict mode bugs in bundling. Minification bugs. `node_modules` peer dep hoisting bugs. Bugs in `yarn`. GYP issues. Stuff that only breaks in CI. VS Code plugin configuration issues. Etc etc etc.

Large companies literally have entire teams dedicated to maintaining tooling. And don't even get me started on monorepos.

At a previous company, a co-worker who had been setting up tooling infrastructure for a project was astonished at my iteration speed when he saw that mine was written with just a micro-framework and ES5. Conversely, when I had to jump in to help get his project back on schedule, we could never figure out why the hot reloading setup had a 2 second delay on my machine (among other things).

Having worked on two projects of very comparable scope (two documentation sites for different projects), one written with the "latest and greatest" Gatsby and friends, and the other with just a dirty script, I honestly can't see any reason to do the former: it takes longer to do anything, the final payload is bigger, and the codebase is now a major version out of date, meaning a migration looming on the horizon. On the upside... GraphQL; yay, I guess?

With this point in mind, your carpenter analogy seems a bit off IMHO. Of course consumers don't care if the carpenter uses an electric screwdriver, but they do care if a knob gets loose and falls off because it was held by a single hastily drilled screw. Also, the employer should presumably be worried if said carpenter keeps spending a significant portion of his time sanding his hammer handles and replacing his workbench legs, or needs another carpenter to do so on an ongoing basis.


Carpenter here. Metaphors suck and are almost always deceptive ways to reinforce a point. Someone would denigrate a carpenter for using an electric screwdriver on finish screws where precise movements and protection of the surface area is important. You would be fired immediately.


Those are almost decorative though


> No, it hasn't. It's still 100% about how to benefit users for most teams. That is quite literally the only thing that matters and everything else is a means to an end.

I can't agree with this statement. It's 100% about how to benefit the company, which means satisfying the customer. And serving users up to the customers is the name of the game.

Most sites are heavy with trackers (you could argue that one of them might be in the user's interest, but you can hardly make the same claim for the huge number of ad-related ones) and full of devious antipatterns. Just enough value for the user, one hopes, to get the page view.


> It's still 100% about how to benefit users for most teams

I’m sorry, but pages that pull in several MB of ads and tracking to show a couple of KB of text the user actually wants are by no means constructed for the benefit of the user, any more than the line is baited for the benefit of the fish.


Pulling in those megabytes of ads didn't increase developer satisfaction either. Those are marketing decisions, not technical decisions about whether to render server-side or client-side.


The developer gets paid partly by the adverts. Getting paid is integral to my job satisfaction...

I actually don't work on any ad supported products. That improves my job satisfaction even more.


> The majority of the work that gets talked about here, by software and tech startups, will not be effectively built using HTML/CSS or server-side rendering.

The majority of work that gets talked about here is technical work which is N levels below the business use case. Without the business use case you have no idea if it can be effectively built using server side rendering. Because of that I'm not sure anyone here is really able to make this statement.


> It's still 100% about how to benefit users for most teams.

If that were true, things like Amp would never have even been proposed.

Modern web browsers are absolute marvels, but shitty developers and business people are driving people like me into apps.

I'm far less optimistic about the web now than I was 20 years ago. I'm more bullish than ever on the internet in general, but the web is a mess.


"No, it hasn't. It's still 100% about how to benefit users for most teams."

I have to say, though, that if this is the case then far too many teams utterly fail. Quite a lot of (most, I think) web sites don't appear to be designed primarily to benefit users to me (user benefit generally seems to fall to third place, behind "make it pretty" and "use the sexy new tools"). And that problem has been getting worse over the last decade or so.

"No one would denigrate a carpenter for using an electric screwdriver over a manual one. No one would complain that the carpenter only cares about their "carpenter experience"."

I absolutely would if the result is that the carpenter's work is worse for it.


> I absolutely would if the result is that the carpenter's work is worse for it.

I don't know by what metric the work is, on average, worse for it.

I've been on the web since '91 and I've been more or less an obsessive user since then. I've seen the changes.

The web - again, on average - is better to use than it ever has been. Some websites suck. Some websites are awesome. The worst websites in '95 were unusable and had broken layouts/styles, but people act like every website in '95 was some perfect embodiment of minimalist engineering.


"The web - again, on average - is better to use than it ever has been."

My opinion is very different. Fair enough -- we're different people.

"but people act like every website in '95 was some perfect embodiment of minimalist engineering."

I'm certainly not asserting any such thing at all.


The '30% of rural Americans still not on broadband' point is, to put it mildly, one of the most important points in the article.

It is so, so easy, to be living and working in big cities like Toronto (me), SF (a bunch of you), NY, Chicago, et cetera, and truly forget the context of the absolutely awful experience a lot of people have online outside of these urban epicentres.

I remember living on a farm in rural Illinois (Marengo), and the best internet we could get at the time was satellite. Which (or, worse, dial-up) is the only option for a lot of Americans, apparently.

Satellite internet in particular makes these poorly optimized web sites, with their hundreds of tiny JS files and images, take forever to load, because every connection requires a round trip to the satellite. Downloading one large file can be fast, but opening Facebook can take 3-4 minutes. (My ex-partner who still lives on that farm confirmed this for me today.)
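A rough back-of-the-envelope sketch of why that happens (the RTT, request count, and round trips per request below are illustrative assumptions, not measurements):

```python
# Illustrative arithmetic: why many small requests hurt on satellite links.
# All three numbers are assumptions for the sketch, not measured values.
RTT_SECONDS = 0.6            # geostationary round trip is on the order of 600 ms
REQUESTS = 200               # a heavy page with many small JS/image files
ROUND_TRIPS_PER_REQUEST = 2  # e.g. TCP handshake plus HTTP request/response

latency_cost = RTT_SECONDS * REQUESTS * ROUND_TRIPS_PER_REQUEST
print(f"Latency alone: {latency_cost:.0f} s (~{latency_cost / 60:.1f} min)")
```

Note that this is latency cost only, before a single byte of payload bandwidth is counted, which is why one big download can outperform two hundred tiny ones.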

The author, then, brings up the next, most important point: 'Your website had better be amazing for it to justify that length of download time.'

For those making the audacious claim here that this isn't considering the dev's time put into it, I'd argue immediately back that the devs need to be considering the actual users.

People living in the country aren't 'edge cases'. They're usually extremely hardworking people who simply don't have access to the luxury of broadband like we do. Just because we don't necessarily see these people, or travel to other countries with shitty connections, doesn't mean they don't exist, doesn't mean they're not actually a potentially significant number of people, and doesn't mean we should not be arsed to trim 1MB-2MB off our bloated image-and-video-wieldy sites, or even that it would necessarily take a lot of time in most instances.

I agree. The web isn't about us. It's about who uses it.


> It is so, so easy, to be living and working in big cities like Toronto (me), SF (a bunch of you), NY, Chicago, et cetera, and truly forget the context of the absolutely awful experience a lot of people have online outside of these urban epicentres.

To be fair, many urban companies are designing for urban and suburban users. Your advanced analytics dashboard product is not for Cletus in rural Alabama.

If 30% of your target users don't have broadband, then you should design with that in mind. If 0.01% of your users don't have broadband, then you should ignore them.


If you make your site completely unusable for 30% of people, then no matter how many try to use it, they'll end up making up a minuscule portion of your user base, which you'll then use as justification for ignoring them.


Which can be ok. There's no ethical duty to make websites for literally everyone.

Accessibility is a different issue of course, but to illustrate my point: I have never translated my websites to Chinese, making them unusable for 1/7th of the world population. I don't think it's necessarily a failure on my part.


Well, yes, you didn’t communicate it in Chinese. But did you make the text easily machine identifiable? Was the source language indicated? That would allow their browser to offer the ability to translate.

You didn’t intend it for a blind audience, but they’re certainly capable of reading the text. Did you avoid making the text actively toxic to screen readers?

Many of the whiz bang features on new sites, ostensibly because “users want features!”, hinder even this functionality. It’s fair to criticize this as a regression.

I use a browser extension that lets me click links and buttons from the keyboard. The more bleeding edge they make the sites, the less I can use this feature. That’s not an improvement.


It's not worth it to gain 30% more customers by spending 10x the original development time micro-optimizing everything.


But it would be better to gain that 30% of customers by not doing a shit job in the first place. This was the point of the presentation. Not doing a shit job doesn't take that much more effort.


A really important addition to this would be: it has to have been from the start.

The issue is that so many projects have not been designed with these ideas from the outset, that it indeed usually does take a mammoth amount of re-engineering and dev time to make these kinds of changes and optimizations after the fact.


Which is exactly why I cringe - often visibly - at any suggestion that a duct-tape-and-bubble-gum "MVP" is in any way/shape/form acceptable. That "prototype" will end up being the final product because dev teams have a tendency to accumulate technical debt like gangbusters.

To continue with the mediocre carpentry analogies floating around: a (competent) carpenter doesn't build a sturdy table by first building a shitty table and reinforcing it later.


I’m going to take a bigger view of this and you can say it’s “rather convenient”, but I’m not concerned about the experiences of a group of our society that is the biggest resource drag. I’ve had it with rural inhabitants. We already subsidize the shit out of their lifestyle, and now we’re expected to accommodate their Luddite-centric tech access? We need fewer people on the outskirts and more people in cities. The fucking planet is dying.

We should be incentivizing people to live closer, not spending 90% of our time on them.


The "fucking planet is dying" because big corporations in their big-city HQs are strangling it to death in the name of short-term profits. Rural areas don't (typically) have the sheer abundance of smokestacks and cars and refrigerators and spray cans Swiss-cheesing the ozone layer. They're certainly not helping by any means (agriculture-driven deforestation is certainly a problem, but not really one in the US, at least not anymore), but that pales in comparison to urban pollution.

But sure, go ahead and blame the dirty country folk for not being able to reliably access your shittily-designed-and-implemented website. God forbid there are people who don't feel like breathing smog all day and existing in perpetual abject misery.


> The '30% of rural Americans still not on broadband' point is, understatedly, one of the most important points of the article by far.

I didn't care for the metric. How many rural Americans are there? Is it 1% of the population? 50%?

According to the 2010 census, roughly 20% of people in the US are classified as rural. So that means 6% of all Americans are not on broadband. That metric would be a lot more useful to most people.


Yeah the world rural population percentage has dropped steadily for the last 50 years, dropping below 50% in 2007.

The rural population has actually been growing the entire time, but as a percentage it's dropped.


Do we actually care about these users? Are they going to buy the products we advertise on our websites? Are they going to subscribe to our platform?

No. So why should we optimize for people who do not matter to our bottom line?


> Are they going to buy the products we advertise on our websites?

Why wouldn't they? Physical location aside, they're not any different than you or I. They have purchasing power like everyone else.

> Are they going to subscribe to our platform?

Why wouldn't they? Because they can't load your page? Doesn't sound like their problem.

> No. So why should we optimize for people who do not matter to our bottom line?

Why don't they matter to your bottom line? Because you present them with tools and platforms they can't use, so they can't contribute to your bottom line? A self-fulfilling prophecy if there ever was one.


Well the answer could be yes. There is no reason to think they won't use those services if they had access to those services. If you have no way to access a service, then obviously you won't use a service. It's like an untapped/under-served market.


You have cause and effect mixed up. The answer to all those questions might very well be "yes" if you built a site that's actually conducive to them doing business with you.


With 30% having bad internet you'd think there would be a push in those communities to get better internet.

Then again, the lack of internet usability may create an image of the internet as not being worth it. If you have no access to the better things on the internet, then you may think that there is nothing of real value on it so you won't push for better internet.


"the lack of internet usability may create an image of the internet as not being worth it."

People having slower internet speeds is a serious problem with the web (because of modern web design), not with most of the other things the internet is used for.

Personally, as websites have become more dangerous and less usable, I've been finding the web to be a smaller and smaller space as the years go by. For the first time in my life, I can see that it's entirely possible that I will largely stop using the web.

However, that in no way means I'll stop using the internet.


I get what you're saying, but for most people the internet, and the web are nearly synonymous.


Yes, I know, which is why I push back on this every time it comes up.

That people equate the web with the internet is a very bad thing for both the web and the internet.


> We’re back to building sites that are not for everyone - huge, bloated sites, running fragile imperative code on the users local device. We have started to explicitly say "I think you should have this level of tech, processing power, and bandwidth before I think you're eligible to use my site".

This isn't in and of itself the problem, per se. Sometimes a more powerful and robust UI / UX / application is necessary. The problem is when said "solution" is applied to __every__ need. The problem is when screwdrivers are being used to pound nails.

Much like "publish or perish", there's no (ego) glory in saying "Yeah, it's relatively lo-tech, but that's the solution that best fit the business need." Rare is the praise and/or hiring based on good analysis, appropriate tools, and smart solutions. Nah. Size is all that matters. Impressing people who can't do what you do is more important than doing the right thing(s).


We have started to explicitly say "I think you should have this level of tech, processing power, and bandwidth before I think you're eligible to use my site".

I actually worked with a pair of developers who believed this wholeheartedly. It seemed to be part of some kind of design philosophy they picked up where it was the responsibility of developers to build complex web sites and apps in order to "further" technology and encourage people to upgrade for the good of... something. I'm not sure.

They picked some baseline iPhone level (two generations removed from what was current, IIRC) and decided that anyone who wasn't "smart" enough to own that model, or better, shouldn't be allowed to use the product.

Then when they were ready to deploy, it turned out that the product only worked on their phones, which happened to be the latest/greatest available at the time. So they built in a bunch of kludges and trimmed features to make it load on an older phone, and that was "good enough."

Not surprisingly, the company went out of business.


The other thing I notice too often is that the UI / UX was designed and/or built by someone with a cutting-edge high-resolution screen, likely connected to some corporate high-speed mega-pipe.

I keep an older crap laptop for a reason. I'm not sure why I feel so alone in doing so.


Don't feel alone, I do the same


I have seen this...so many times.

I've been an iOS developer since the App Store became a thing 8 or whatever years ago, and the number of people I've seen willing to test applications only on devices 1-2 generations behind, if that, is shocking.

What's more offensive, shocking, and actually confusing is that, yes, management is usually the one saying this kind of thing.

The reason that confuses me, is that you'd think, I mean, if I was in a management position at a mobile-related startup, one of the first things I'd do is have someone on my team do some market research to provide statistics on the current state of iOS device and version usage, and buy a freaking heapload of test devices, in all the different resolutions and variants.

Dismissing your users because they're not 'smart' enough to have a newer model is, as mentioned in the article, essentially blaming these people for the circumstances in their lives that leave them unable to, or is simply insulting their intelligence.

I still use an iPhone 6S. Why? I need a headphone jack. I run an independent music label and rely on GarageBand on a multiple-a-day basis. Bluetooth audio's latency is too awful to be usable when playing instruments or mixing/EQ'ing tracks, and furthermore, I'm not about to replace my $300 studio-quality headphones for some shitty bluetooth alternative, or drag some tiny dongle around I'll need to adapt the headphones.

Does this make me 'not smart'? I think the exact opposite. I couldn't use a new iPhone the way I have for the previous 6-8 generations before it.

There are dozens of reasons why people prefer to keep older models of devices. My 2011 MacBook Pro is still my daily Logic Pro driver, and I even still do a bunch of iOS development on it, because, gasp, I was able to open it up and upgrade the RAM to 16GB, swap the original hard drive for a 512GB SSD, and replace the optical drive with a 2TB HDD... whereas if I'd bought a 2017/2018 model and felt like the 8 or 16GB that came with it wasn't enough, I might as well throw it out the window for all its upgradability.

Then let's talk about software - most of the people I know actively avoid software updates on their devices because they either don't know if it will hurt their device, or they simply actively like it the way it is already, and don't want to be forced into whatever changes the devs and managers felt like this time around.

People like what works for them. And they don't like change.

tl;dr - Highly intelligent people have highly intelligent reasons to stick with their older devices and software. Hipster managers who think they're 'too cool' to see this piss me right off.


> Highly intelligent people have highly intelligent reasons to stick with their older devices and software. Hipster managers who think they're 'too cool' to see this piss me right off.

A better way to put this might be: The newness or hipness of your device doesn't add any IQ points to your IQ score. The irony is those who need to understand this the most are often the last to get it, if they get it at all.


Saying you use a 6S as if you’re using old slow technology is a bit of a stretch. The 6S from 2015 is actually faster than most modern Android phones in single core performance - including the Samsung Galaxy s9


I'm not saying a 6S is 'old technology'. My point is, it's the newest that is available/usable to me, and my path to an upgrade to 'new technology' is non-existent.

I am saying, that the impression OP gave me is they were literally throwing it on the CEO's iPhone X, they felt like it worked there, and didn't bother to even go a couple generations back.

This wouldn't be the first time I've actually seen this kind of business-destroying, frankly elitist behaviour that does not benefit business.


It’s a lot more about how fast we can get stuff done. Wordpress exists for a reason, and lots of apps are built with Redux - getting what you need done at a cheap cost, not bikeshedding new processes when the old ones work fine.


Ha! That's in line with an idea I'm considering - a modern HTML-only browser. A bit like one of those terminal browsers, but with nice proportional fonts, images, and other basic things. Entirely customizable to the user's liking. A user agent, not a webdev agent. It could also support the Gopher protocol. CSS got out of control, so sorry, no more position:fixed bars that are more and more in fashion.

Ech. But first I need to get back to work on a search engine that would _not_ index ad serving pages with possible option of filtering out JS serving pages.


Build it on links. It supports advanced terminal functionality like graphics already.


> Ech. But first I need to get back to work on a search engine that would _not_ index ad serving pages with possible option of filtering out JS serving pages.

Please do, I would be all over that :)

https://news.ycombinator.com/item?id=18708325

I already use duckduckgo and fall back to !g, I'd love to try an ad/bloat-filtering search engine first, and fall back to ddg and google from there.

As for filtering JS, I guess the main point for me whether it uses JS to render anything in the first place, or just to enhance. E.g. even HN has some Javascript, but it's fully usable without.



Here's the real problem. The JS tools got really powerful and the majority of users now experience faster dev cycles and richer features. Now users cannot go back. If we started serving smaller sites with less js people would complain about all the little features they liked that don't exist anymore.

A developer now has a choice: use the big framework where you can satisfy those now entitled users or give up some success and make a tiny site with limited use of js. I'm all for the free web but I know which choice I would make when running a for-profit company.


Ya, this kinda summarizes the crux of the problem pretty well. I'm getting a bit worn out seeing anti-JavaScript whatnots on HN. It's here to stay, guys, for better or worse (and will proceed to consume UI development _everywhere_).


Has anyone tried? Gone back to a simpler interface, fewer "little features"? I think I would like that, and others would, too.


HN seems to be doing fine


Yes, I do that for my own websites. If it has hurt the popularity of the sites, it hasn't been so severe that I've noticed.


I think it's true that many sites are over engineered for what they are. Does a site that's primarily for reading articles really need to be an SPA? Why can't your sign in fallback to being a simple form? Etc.

Then again it can actually be pretty easy to over engineer nowadays. Once you have your node/npm environment set up with webpack and all the rest you can quickly iterate a new site based on an earlier project. And if you hear of something new and cool it's only an npm command and a few hooks away.


You're right, it is easy to over engineer sites these days. It boils down to tooling for SPAs vs traditional static sites. The tooling for making static sites, even the most modern tooling such as Hugo or Jekyll, simply pales in comparison to libraries such as React, Vue, or others. The author is absolutely right that developers are attracted to shiny new objects, and the interest of the greater community is focused on SPA libraries when we should be equally (or arguably, more) focused on static site generation libraries as well.


Aaaaaannnd right there's the problem -- easier for the one or few developers, harder for the thousands or millions of users.

Where are the tools for making streamlined sites easier to build?

What happened to the old contest for the best site in 5KB [1]?

We need those brought back -- seriously.

[1] https://the5k.org/


What I've been thinking about ever since I learned about web brutalism is - how can I make simple, HTML focused websites, static? That is to say, not requiring a backend I build to generate the necessary HTML and serve it as a document to a given url?

So index.html is easy, but is there a way to make it such that I can go to my website/articles/my-happy-day, and have that page be generated, rather than a separate handwritten my-happy-day.html file?

I typically "deploy" simple shit by uploading it directly to my bluehost filesystem, or Amazon s3. The only thing I can think of is a precompile thing, and then uploading the build.
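That "precompile thing" really is all it takes: write each article to `articles/<slug>/index.html` before uploading, and a dumb static host will serve the pretty URL with no backend. A minimal stdlib-only sketch (filenames and the template here are illustrative assumptions):

```python
# Minimal "precompile" step: turn plain-text source files into pretty URLs
# by emitting articles/<slug>/index.html. A static host (S3, shared hosting)
# then serves /articles/<slug> directly. Names/template are illustrative.
import html
import pathlib

TEMPLATE = """<!doctype html>
<html><head><meta charset="utf-8"><title>{title}</title></head>
<body><h1>{title}</h1><p>{body}</p></body></html>
"""

def build(src_dir: str, out_dir: str) -> None:
    for src in pathlib.Path(src_dir).glob("*.txt"):
        slug = src.stem  # my-happy-day.txt -> my-happy-day
        title, _, body = src.read_text().partition("\n")
        page = pathlib.Path(out_dir) / "articles" / slug / "index.html"
        page.parent.mkdir(parents=True, exist_ok=True)
        page.write_text(TEMPLATE.format(title=html.escape(title),
                                        body=html.escape(body.strip())))
```

This is essentially what Jekyll, Hugo, and friends do, just with real templating and Markdown on top; the output directory is uploaded as-is.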


I'm sure you know about static site generators like https://jekyllrb.com/. Are those what you have in mind?


I have heard of them but I've yet to try them. I guess I'm trying to figure out if there's some way to basically duplicate the idea of an SPA (using client-side templating or whatever) without all the overhead. I think you're right, that static site generators fill that gap, or perhaps just server-side rendering with very little JavaScript on the frontend.


An npm command that leads to dependency hell, because JS-land has to have a thousand packages. A complicated Python codebase can get by with 20-50.


I'm mildly annoyed that images won't load properly with JS disabled due to lazy-load.js not running.


You'd think she would've thought of that on such a post. :P

EDIT: Isn't there also a way to pre-cache images with just CSS? o.o


While it's a bit long and I don't agree with everything, the article raises important reminders for web development with empathy and accessibility. This part felt especially relevant:

"[Progressive enhancement] won't work with something like React though, and I don’t cry about that. This is a distinct robust design pattern, which we feel is durable and doesn't lock us into a limited-life framework that will get replaced by something else in a year or two." (emphasis mine)

I imagine this last line gets some people reacting emotionally. I've spent the last few years in React-land (in addition to a soon-to-be-legacy stack slowly transitioning), and benefited much from the paradigm shift. It has moved the web (and web dev) forward in many ways, especially conceptually. At the same time, I can't deny that it will eventually get replaced, just like the rise and fall of jQuery. That means we must keep in mind not to be locked-in, to emphasize the core concepts which will outlive any framework du jour.

Progressive enhancement is certainly doable with server-rendered React, but it does seem like the recent trends have favored developer experience at the cost of user experience.


This article makes the assumption that every decision you make as a web developer is unaffected by co-workers.

I'm all for progressive enhancement. However, if the expectation is a web _application_ that has a certain level of interactivity designed for, with a real world deadline, and a targeted demographic of "people with supercomputers in their pockets", I can guarantee you'll make some compromises.

Bringing up Springer Nature and the BBC as examples of the Web Built Right™ is short-sighted when they are web _sites_ that serve static content. Of course you should send HTML from the server that represents the entire content of the page, apply some CSS, and sprinkle in JS. And of course, there are a lot of developers out there that would reach for the latest trending JS library + framework combo to pull it off, and they'll implement it poorly.

There's very little point in bashing "npm install exciting-tech", when I can npm install gatsby-cli, and do everything the article is crying out for in a "modern" stack. Oh, also, the BBC example? They use Node, and React, and server-side render React[0]. You can bet they're npm installing exciting-tech.

However, the web is not just static sites anymore. People expect a lot more functionality, you work with people that design and promise a lot more interactivity, and there's a lot more people coding without enough experience to reach for the tech that is the most efficient solution to a problem.

I don't think a condescending article like this is beneficial for our community. Maybe we should try educating (nicely?) on how to pick the right tech for the problem[1]. Maybe we should ditch "user" for "person" (or "surfers" lol) when talking about the consumers of our output. And finally, maybe we should have more empathy for the people who work in our trade and work together towards a better internet.

[0] http://www.bbc.co.uk/blogs/internet/entries/47a96d23-ae04-44...

[1] http://mcfunley.com/choose-boring-technology


I think the observation necessary for appreciating this article is that most sites that are built as SPAs these days aren't really web apps, they're still web pages. They're more similar to BBC and Springer Nature than to Google Sheets or LucidCharts. That's a property of the problem they're trying to solve. Building a web app where a web page would do shows total lack of interest about providing value to the user.


(1) That's a clickbait headline

(2) I think "management" deserves some criticism here too. I think the average developer doesn't want to stick 15 different third-party trackers in a page, but they don't get much empathy when they bring it up with the "business" people.


This is something I've wanted to mention for ages. A lot of modern web development seems to be about what's 'fun' for the developers rather than what works best for the company or user, and you can see it in the framework/tech obsession present in many companies and organisations. A lot of the time they don't need React/Angular/Vue/Docker/Serverless/blockchain/whatever; they just need a simple CRUD system that lets them input content and have it shown to the user in plain HTML/CSS. But that's not 'fun', it doesn't get them respect on Twitter or at conferences (or on Hacker News), and it makes it hard to hire programmers, so hey, let's overengineer everything and pretend our blog is freaking Google.

Still, over engineering sites and products because you want work to be 'fun' isn't exactly exclusive to web developers. Quite a few game designers have similar issues in that field, and I suspect there's a bit of a clash in ideals between creators and audiences overall. It's rare you find a creator of any kind who just wants to do the same thing over and over and works solely for the money, and that's the cause behind all these issues.


This is a great article. Even though the irony isn't lost on me that the layout seems to waste a lot of space and doesn't look or feel right on my mobile, the content makes up for it. :)


And this server-side website takes 20 seconds to load. Imagine every interaction with a web app taking this long.

At this point the web is documents and also an application platform. Conflating the two causes trouble - sure, simple documents don't need JS, but non-trivial webapps really benefit from dynamic UIs/data vs old-school form post interactions.


And if you disable Javascript, all the images are really blurry. This can't be for page loading speed, as they are all small images that would be not more than a few kB.


I really like this article but I struggle with implementing it now.

I've learned a lot of React recently for creating sites. I test all pages that can be static with JavaScript disabled (I use Gatsby or Next for HTML SSR), but some parts can't be static, or worse, there's still the "webpack inlines from hell" problem.

I would LOVE to replace my javascript powered pages with a templating language, and offer feature parity. But using a template language like Jinja2 for building component-based sites just doesn't work as well, unless I'm missing some magical secret sauce.

This is mostly for B2B web applications. I work on a lot of analytics dashboards that are pretty data dense. Even with react though, my sites never go over 400kb on load.

Okay, essentially my question is: what templating language/server framework can replace react, jsx, and graphql for webapps? I really want something like that.
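Not a full replacement for React/JSX/GraphQL, but worth noting that the "component" idea itself doesn't require a framework: for server-rendered pages, components can just be functions that return HTML strings and compose like any other functions. A stdlib-only Python sketch (the component names and markup are illustrative assumptions):

```python
import html

# Components as plain functions returning HTML strings; function
# composition stands in for JSX nesting. Names/markup are illustrative.
def metric_card(label: str, value: float) -> str:
    return (f'<div class="card"><span>{html.escape(label)}</span>'
            f'<strong>{value:g}</strong></div>')

def dashboard(metrics: dict) -> str:
    cards = "".join(metric_card(k, v) for k, v in metrics.items())
    return f'<main class="dashboard">{cards}</main>'

page = dashboard({"Revenue": 1200.5, "Signups": 42})
```

Jinja2 macros (or Django template includes) give the same composition with templates instead of Python strings, which may be closer to what you're after for data-dense dashboards.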


I'm not sure I understand your requirements, but couldn't you just use JSX as your templating language and render React server-side? https://stackoverflow.com/questions/25777931/server-renderin...


The last time this went up, someone made a comment about disabilities being relatively rare. I wanted to add a point here:

In the United States in 2015, 11 percent of noninstitutionalized adults reported having a disability (National Council on Disability (NCD). The Current State of Health Care for People with Disabilities).

Disability in mobility and in cognition were most frequently reported (5 percent). Data from the 2009 to 2012 National Health Interview Survey found that 11.6 percent of United States adults 18 to 64 years of age reported a disability (defined as serious difficulty with hearing, vision, cognitive ability, or mobility [walking or climbing stairs]).(MMWR Morb Mortal Wkly Rep. 2014 May;63(18):407-13.)

That makes disability more than 5x more common than being a natural blonde.


This whole conversation baffles me. The web is a sloppily-evolved designed-by-committee sofa-bed, nobody will ever feel completely happy with it, and it shouldn't be any other way.

Most of the time I work on "web applications", where the difficult part is managing the necessary complexity of the domain model and the user interfaces. JS and framework-happy architecture isn't perfect (not even close), but adding constraints like "the application must weigh less than a megabyte" or "the application must provide basic functions (?!) without JavaScript" turns what can be a fairly serious project into a clown-car pileup.

There's a faction of web people who want web applications to compete feature-for-feature with native apps. I'm closer to that camp for historical reasons, but I also understand that compromise on application features is completely worth it if the benefit is an (even partially) "open" platform for applications to build on.

But then there's this other faction of HTML purists who see the "true" web as necessarily document-based, declared with markup, typeset with CSS, and compatible with absolutely every computer system of the past 30 years. This makes perfect sense if you work for a magazine and see the web as a platform for open (and often static) content, and I'm sure some of them do understand the difficulty and the value of compromise, but I never hear that.

So now it's 2018, the "beat native apps" camp and the "documents for all" camp talk past each other through the years and nothing really changes. Maybe some people are in both camps and suffering from cognitive dissonance, who knows.

But there are good reasons why web applications don't do what Photoshop does, and good reasons why Photoshop doesn't even try to provide copy-and-pasteable URLs for every unique application state. There are good reasons why "serious" declarative application frameworks (like XAML) are a terrifying mess, and good reasons why building a CMS from scratch in C++ would be insane.


I generally agree with the author's statements, but the analogies to cars and doorknobs to me were just inaccurate. There are a number of reasons why you want round doorknobs instead of levers, one of which is security, as levered doorknobs are trivial to turn from the outside of the door.

Secondly, just because everyone has access to every website on the internet does not mean you need to build your website for everyone. If someone says their website is targeted at a specific group, they build it for that specific group. You wouldn't tell a formula 1 engineer to raise the suspension on the car he's building so it can clear speedbumps in the parking lot.

Again, the core concepts of the article are common sense, but the tone and specifics don't resonate with me.


>I generally agree with the author's statements, but the analogies to cars and doorknobs to me were just inaccurate. There are a number of reasons why you want round doorknobs instead of levers, one of which is security, as levered doorknobs are trivial to turn from the outside of the door.

Wait, what? I can't find anything about door levers having this kind of security risk. This kind of sounds like voodoo security where "worse UI = better security, because fewer thieves know how to use it".

>Secondly, just because everyone has access to every website on the internet does not mean you need to build your website for everyone. If someone says their website is targeted at a specific group, they build it for that specific group. You wouldn't tell a formula 1 engineer to raise the suspension on the car he's building so it can clear speedbumps in the parking lot.

Sure, but most sites aren't narrowly targeting people who can see and who have the latest hardware as part of their core reason-for-existence. News sites certainly aren't, and yet their mobile and screenreader experience is cancer.


>Wait, what? I can't find anything about door levers having this kind of security risk.

I specifically remember this talk: https://www.youtube.com/watch?v=rnmcRTnTNC8

At around 18 minutes in, he shows one of his employees(?) use a long bit of wire to pull the lever and open the door in a matter of seconds.


He shows the result of it, yes, but it's not that easy to have a wire go through a small hole and reach up to grip something and pull. Most outside apartment gates operate on that assumption.


There are a handful of videos where other pen-testers show the process from start to finish, and the space needed to get the wire through is not as much as you think.


Getting the wire through isn't the problem, it's controlling it afterward. Not surprisingly, the 90 minute video you linked didn't find time to show that part.


This starts off a bit get-off-my-lawn, but it's a fantastic article with some very important points. Do yourself the favor of finishing it.


Regarding the invention of the internet, they should have mentioned:

https://en.wikipedia.org/wiki/Minitel


Dear old lady shouting at clouds,

The web is not about me. But the websites I create are not for everyone. They are for my users.

My users are professionals, on desktop, with a mouse and keyboard.

Knowing who my users are allows me to do better design. I make the best experience for THEM. Not for you, nor for all your friends using different hardware.

It doesn't mean I don't care about loading time or how things are displayed on the screen. It means I do so knowing in which environments it will happen. And yes, it might mean you will be excluded.


As the article mentions, what you’re really saying here is that “my website is not for poor people, or disabled people, or even people who are trying to kill some time before their bus arrives”. Is discriminating against these people ok with you?


>> My users are professionals, on desktop, with a mouse and keyboard.

>> Knowing who my users are allows me to do better design.

>> I do so knowing in which environments it will happen. And yes, it might mean you will be excluded.

Obviously the poster here is more than okay with this. They're explicitly stating it will happen, and it's part of the plan.

The problem is when it's not part of the plan.

Imagine the author is writing an internal web tool for a corporate enterprise environment.

Imagine all of those computers and devices are standardized, and the software will only run on these devices on an intranet.

The previous poster's point is valid here. We know the users, the users are professionals, we know who they are, and we know what environments we'll be deploying in.

Therefore, beyond accessibility, of course the poster wouldn't care about people trying to kill time before the bus.


Exactly this. I would even hazard that the majority of developers are working on things for intranets in a narrow corporate context where, if it can run on non-standard gear, that’s an added liability with no significant business benefit. Businesses are only moving away from their decade-old Flash and “IE only” web apps because the marginal security risk combined with maintenance burden has begun to outweigh the costs of new development.


> Imagine the author is writing an internal web tool for a corporate enterprise environment.

Let’s not, since this is not relevant to my experience on the web in the slightest. I couldn’t care less what you did on your internal network, as long as it doesn’t affect me. If you use the same standard for your external websites, though, then we have a problem.


And then there was the internal app I saw that had a fixed width of 1400 px. Because, hey, everyone is on a desktop with that size monitor.


Dear Throwaway.

I am your potential user. A professional, on a desktop, with a mouse and keyboard.

I absolutely hate your slow, click-driven products. I have things to do, and I need the software to help me do them as fast as possible. I run a lot of software simultaneously, so your heavyweight resource hogs are actively preventing me from doing other things I need to get done. If I find a competitor that actually cares about providing value in exchange for money, be certain that I'll switch to them in a heartbeat.


It isn't slow, nor click-driven.

On the contrary, because I know my users are on a desktop, most actions can be done with keyboard shortcuts.

This means you haven't tried it. Which means you're talking about something you don't know.


Remedy?


The end result of this line of thinking is software guilds. Professionals being more accountable to their guild than to their boss. And the guild deciding who can and can't be a programmer.

It gets worse. In order to deliver on those promises, the guild now has to decide what framework to use, which UI elements are worthwhile. We're politicizing our jobs, and the only way to solve political problems is with government. Enter the web programming guild.


The flip side of this is: currently, most engineering fields require you to have some kind of professional certification (e.g. a PE) from an industry "guild", e.g. the ASCE. Software engineers don't really have a functional equivalent. We demand that professional engineers build our bridges, but we'll pretty much let anyone build our software. At some point, changing that may be the responsible thing to do.


I'll be the first one to get my guild card, but I really hope we think this thing through.


Why do developers always get blamed for the state of everything? Developers love creating efficient, optimized solutions. Developer happiness is rooted in a feeling of fulfillment that comes from creating a good product.

There are many parties involved in designing the web, and developers aren't typically viewed as the subject-matter experts on how the UI of the web should work -- just on how to implement what's already been decided.


Your point is well-taken. I think it depends on who you count as a "developer". In my view, while too many actual developers (as in those who write code) are very keen on using tools and techniques in an inappropriate way that results in a worse user experience, they aren't the biggest offenders. I think the UX and marketing people tend to be the biggest offenders.

People outside the industry don't know or care what the division of work actually is, though. To them, the group of people who collectively create the sites are "the developers". That isn't wrong, it's just not precise.


> Why do developers always get blamed for the state of everything? Developers love creating efficient, optimized solutions. Developer happiness is rooted in a feeling of fulfillment that comes from creating a good product.

You give way too much credit to the motivation of developers. Developers are just like anyone else working for a living: our primary motivation is to make money.


Do they? I know several developers that love crafting incredibly intricate and clever solutions that show off how good they are at functional programming/algorithm design etc.

One place I worked at had an API with an "unfixable" 120ms delay on every request that persisted until someone ripped out the beautiful ORM system and replaced it with 3 SQL queries.


You're absolutely right, many developers seem only to want to pigeon-hole a particular use case into the newest framework or technology they want to learn. I guess it depends on the developer.


>It is this flirty declarative nature makes HTML so incredibly robust. Just look at this video. It shows me pulling chunks out of the Amazon homepage as I browse it, while the page continues to run.

This is actually really, really bad. HTML renderers should have been designed to fail immediately on incorrect markup. That would let developers more easily write correct and robust code, rather than making the browser be the robust one. If the web were designed this way, you'd have a world where only correct html is written.

Also, it's not like we are engineering a war machine here. Machines like the Warthog are designed to keep functioning when components are destroyed. This is not necessary for HTML, as we are not sending our HTML into a war zone. Sections of your html are not actively being removed/destroyed by users.

The author later compares HTML to JavaScript. Seriously. JavaScript is a whole different ballgame. JavaScript is bad because there are many hidden states of failure that aren't apparent. In HTML there are only two states: failure or success. If HTML failed as soon as it was incorrect, you would know immediately. Nothing is hidden. In JavaScript, incorrect logic can lie buried until a specific use case is hit.

That being said, he mentions the pyramid model for the web, which is fine. I can agree with having JavaScript as the tip of that pyramid.


>> If the web were designed this way, you'd have a world where only correct html is written.

You'd also potentially be looking at a web that is far more unstable and crash-prone if you essentially treated a browser like a compiler or the like.


The creator of the HTML page will typically look at the website before he publishes it, so it's unlikely that someone will publish incorrect HTML code. HTML has no branching logic, so there are no hidden crashes; everything is explicit, similar to the safety you get with type checking.

JavaScript is actually bad for the same reasons. It tries to correct your mistakes by doing type coercion, so null + 1 + 'blah' silently evaluates to "1blah" instead of raising an error.
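To make that concrete, here's a quick sketch of the coercions at play (runnable in any JS engine; the expressions are just illustrative):

```javascript
// JavaScript never throws on mixed-type arithmetic; it silently coerces.
console.log(null + 1);          // 1       (null coerces to 0)
console.log(null + 1 + 'blah'); // "1blah" (number addition first, then string concat)
console.log('5' - 1);           // 4       ('-' coerces the string to a number)
console.log('5' + 1);           // "51"    ('+' prefers string concatenation)
console.log(undefined + 1);     // NaN     (undefined coerces to NaN)
```

None of these raise an error, which is exactly the hidden-failure-state problem described above.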


I believe the point was that if we had made stricter browsers, the web would be much more correct and standards-compliant.


> Sections of your html are not actively being removed/destroyed by users.

My adblocker does this.


Fair point.

From the developer's point of view, though, the site is incorrect without the ads. He wants you to look at the ads; he's not in the business of providing you website content for free.

Still, the adblocker actually does a surgical removal that maintains the correctness of the HTML.


> Despite what Tim Berners-Lee claims, the Web wasn't a single invention. Yes, we know that one man says that he invented it (grudging thanks to Tim), but he did so on the back of a thousand other technologies

He got that right. Wish the media would do the same.


"orangefuckface@whitehouse.gov" - that's really not necessary now is it?


Most projects are shitty because devs fail at reusing abstractions, which leads to increasing complexity as the project grows. Extracting abstractions into reusable libraries is what devs should do, not just implementing features.


Dear writer, I am forwarding this to my product lead, and going back to Jira.


Excellent talk, and we need more of this. Look at downloading a whole disk image instead of installing something on a Raspberry Pi. Next thing I expect is needing Docker to install a calculator.


Sheesh the format of this article is dreadful.


Clickbait title, but I'll support anything that encourages empathy toward users.


Hear, hear!


from http://slashdot.org/comments.pl?sid=5025417&cid=46736661

> The internet was our garden. And a beautiful garden it was. Sure, some fed agency created it, but let's face it, they used a fraction of the lot and we didn't really care for their supersecret bases they had littered about. There was so much empty space in between! And that lot we cultivated. We built a few nice trees and in their shadows we relaxed, we planted beautiful roses and yes, a few fruits and vegetables because, hey, it's always better if you grow it yourself. And ... heh, well, yeah, we had a few corners here or there where we grew that "special weed", ya know, but nobody really gave a shit, it was just us.

> We were pretty good gardeners. Well, you pretty much had to be in those days, if you didn't know your way 'round with rake and shovel, you didn't really get much out of it. Still, we were quite happy with it. So happy actually that we thought we should share that. I mean, there's so many people out there who don't even know just how great the garden is! And we invited them in. They looked around and, well, most of them didn't quite "get" it. Sure, it was nice, here or there, well, if you're into botany, that is, but it's kinda hard to get around and find your way through the jungle, and using a machete wherever you go, phew, hard work! But a few of them stayed. They didn't quite know what they do, but we handed them a few saplings and some seed and some actually managed to learn a thing or two about gardening. Sure, of course a few smartasses tried to steal our stuff, but we usually didn't have much of a problem to whack them with our shovel and get our stuff back. And, heh, yeah, we, too, went into each other's yards and played some pranks on each other, painted their roses black and the like, but it was all in good fun! And hey, they sure liked our ... ya know, "special stuff". They still had no idea how to grow it, but they were quite willing to help us share everything with everyone, as long as they got their share, too. And, well, why not, pass the blunt!

> That was about when the corporations noticed that, hey, where did all the people go? They took a look at the garden and they went batshit crazy. I mean, sure, we knew that it's great, but we never saw anyone go so insane about it. They saw it as the next big thing to make money with, and we laughed. Money? With this? Dude, you can't make money out of a system based on freedom and sharing! Everything in here is free. Yeah, in both ways.

> True. You can't make money in such a system. Unless of course you change the rules. And changing the rules, they could.

> I can't help but think that this must be how the natives of the US felt after they were "discovered". Because we had to face that there are suddenly areas in what we considered OUR garden where we couldn't go anymore. Worse, something that was the staple of our culture, going to a guy who did something great and asking him for a sapling of his wonderful tree. Became anathema. Instead of you SHOULD imitate and build on top of mine, the new creed was you MUST NOT. This rule, of course, did only surface after they themselves took from our gardens what they could possible rake together quickly. You might understand our utter disbelief and of course outrage when we noticed that turnabout is not fair game.

> Well, we have had our share of trolls and nuisances before. Long before we already had to deal with people who trampled through our gardens or were a general pest. Our solution was simple, we took our superior gardening skills and whacked them from here to next week with our shovels 'til they either learned to play nice or left for good. This didn't work out so well this time. No, not because they had the better gardeners. But they didn't need to. They had a much more powerful weapon in their arsenal: The law. First, they ensured that the laws would benefit them, and then they used it against us. And despite how despicable it may be, we have to admit that it is quite efficient to have others take care of your battles, especially when you know that you cannot win a conventional war.

> And now we're sitting here in what's left of our once beautiful garden. The once mighty jungle has been tamed and civilized, what used to be interesting and a land for explorers is now divided into lots that you may buy instead of simply use. You can get there easier now... well, if you prefer using long winding roads to a direct route, but the long winding roads are necessary so you pass by all the billboards that block your view to what's really interesting. Of course you may not step anywhere, only where you're allowed to, and don't even think about taking anything, rest assured it's for sale, not free.

> So we're sitting here now, at the edge of something we once knew as beautiful and free. We're looking at it and we wonder what we did wrong. Where did we fail? And I can only come up with one solution for when we try something like this again: Don't invite the masses in. Keep it to yourself. It's the only way how you can really keep it. And the only way you can do without a camo net over your herb garden.


Seeing the responses to this post just further adds to my conviction that the web is dead.

You can't rely on websites being futureproof or adaptable to different situations.

You can't rely on content being accessible to everyone. Nor can you count on content being available forever anymore.

Sites are blocked or throttled by ISPs and governments. Sites are expected to censor their content. Even search results are being censored.

You can't trust in even the most basic assumption of privacy or anonymity. Everything you do is being tracked and sold.

So many social networks require your phone number. So many sites need you to login or need your email.

It's all one big get rich quick market.


Very negative and condescending article, in my view. Why would you spend time and money optimizing for a tiny percentage of users? Also, they always underestimate the amount of power you gain by using frameworks like React, and the amount of features and interactivity required when building modern web applications.

I'm not going to spend weeks hand coding some progressive enhancement in vanilla JS so some angry blogger can view my stuff in a console based browser with no JS.

It's like these people think that developer time is an infinite resource. Old woman yells at cloud indeed.


The article doesn't propose "optimizing" for users with some form of limitation (connectivity, device, personal disability). It proposes not excluding them by optimizing only for the maximal case.

Your point about hand coding progressive enhancement is well taken. What we need are tooling and frameworks that make such an approach feasible. The author laments that the frameworks du jour tend to start with (and incentivize building) the resource-heavy, complex, fragile version of a site / app.


>> Why would you spend time and money optimizing for a tiny percentage of users?

I also didn't like the tone of the article, and some arguments were pretty far-fetched if you ask me. But this is actually a very valid point. It's not very hard to optimize websites for accessibility -- but it's almost never a hard requirement except for government websites.

Oh and, one thing I found particularly interesting about this site:

-------

90 requests

442.57 KB / 444.68 KB transferred

Finish: 4.43 s


Well, the page really loads in three requests, minus the images. It lazy-loads the images and player afterward (and drops them into the page without reflowing). It looks like the images come in document order, too, so the visible part of the page is fully ready to view, images included, before all the images load.

Something is a little screwy with the page, though -- the main HTML page, which is 27K, takes two seconds. Some infrastructure somewhere could be running or configured better.


I think her point is that it's negative, dismissive, and ignorant of statistics to claim that we should ignore these things.

She brought up some extremely valid and common real-life cases for this:

• being in a foreign country

• being at a place with a shitty wi-fi connection

• that 30% of rural America is still dealing with 500kbps or lower internet

Basically, the ignorance of saying 'I can't be bothered to take the time to optimize my web site' is what she's calling people out on. She's, from what I can see, stating that especially in a large organization (she uses the example of an airline ticket site), the statistics seem significant enough that, depending on your number of users, their locations, and their activities, you could be looking at a significant number of users who simply ignore your web site or services because you couldn't be arsed to optimize your web images, or to make the site work without having to display them. (Certainly no text-in-images, at the very least!)

I work as a senior iOS dev at one of the big banks in Canada on their iOS app. We won't stop supporting iOS 9 until that number falls under 2%. That still represents more than 10,000 active users for us. These 10k+ users rely on us for mobile banking, and we could lose them as customers if we don't support their devices, because some other bank will be happy enough to keep supporting it if we don't, and it just means more for clients for them.

The author clearly gave another example of this in the anecdote she presented about the individual who could not even log in to the airline's website at the airport itself, and ended up having to schedule a flight with a different company. They lost at least one customer there, and who knows how many more, all because they can't be arsed to check to see how the page behaves on a crummy connection, and optimize it thusly.

I remember living on a farm in rural Illinois, and the best internet we could get at the time was satellite. Which (or, worse, dial-up) is the only option for a lot of Americans, apparently.

Satellite internet, in particular, causes these poorly-optimized web sites with hundreds of tiny JS files and images to take forever to load, because satellite internet pays a round trip to the satellite for every connection, meaning downloading a single large file can be fast, but opening Facebook can take up to 3-4 minutes. (My ex-partner who still lives on that farm confirmed this for me today.)

In absolutely no way is this woman claiming 'developer time is an infinite resource.' In fact, I think she's saying 'dear developer, the web isn't about you.' It's about the users.

Plan your development for these kinds of cases to start, and you won't be spending 'infinite resources' on it, because it was always considered.


> people turning off JS (Yes people do this. Yes it's totally valid, and their right as a user).

Fine, and they can go somewhere else. I'm sorry but I'm not about to consider this TINY percentage of users the same way I don't consider the TINY percentage of people accessing sites I work on from horribly outdated browsers.

The next sentence applies to web APPS, not blogs/news sites/etc.

People that say websites should gracefully fallback to non-JS web app when a user accesses the website with JS disabled are crazy. FULL STOP. PERIOD. You either end up dumbing down the entire experience for everyone OR maintaining 2 websites. Neither of which is financially sound for the vast majority of companies/developers given the percentage of people who don't use JS.


A similarly tiny percentage of your users are blind. Another tiny percentage are dyslexic. How many different tiny percentages can you justifiably invite to "go somewhere else"?


A lot. Many developers don't even develop with blind people in mind at all. There are standards for sites that want to cater to those people, but most sites don't follow those standards.


For a business, at least in the United States, this is basically a financial projection problem. Unlike brick and mortar stores, there is no requirement to make web content accessible to handicapped people.


>there is no requirement to make web content accessible to handicapped people.

Yes there is: the Americans with Disabilities Act. Many private businesses in the States have been - and are currently being - litigated against under said act.

The mandate of the ADA is to provide "full and equal enjoyment" of a public accommodation’s goods, services, facilities, and privileges. Places of public accommodation include, "place[s] of exhibition and entertainment", "places[s] of recreation", and "service establishments". The vast majority of websites are therefore "place[s] of public accommodation", are bound by the requirements of Section 508, and - since roughly this time last year - are required to meet WCAG 2.0 AA (https://www.access-board.gov/guidelines-and-standards/commun...).

In the States businesses are also required to meet the requirements of various human and civil rights laws enacted by states and cities.

And if your company has an international presence, it's also covered under whatever legislation exists in those jurisdictions.

Basically: hell yeah there are requirements to make web content accessible to disabled people. Almost whatever you do, wherever you do it, will be covered to some degree by legislation designed to prevent discrimination.

Of course the existence of financial/technical incentives are good reasons for businesses to make sites accessible, regardless of any legal requirements.


Why does there need to be a requirement? Making a website accessible to individuals using screen readers is the right thing to do; it shouldn't need to be mandated.


> it shouldn't need to be mandated.

When have you ever known a for-profit business to care if it's not mandated? They only care if it affects the bottom line. As a developer at a company I don't have a ton of say over what we spend time on accessibility-wise, and even if I were able to get that code in, the QA time to test it would never be approved.


I'm not saying there does need to be a requirement.

Businesses are not charity. If it doesn't make financial sense to make their website accessible, then they won't unless they're forced to. I'm making no value judgments, this is simply the reality on the ground.


> Businesses are not charity […] I'm making no value judgements

… except that you appear to assign a negative financial value to accessibility compliance, when in fact the opposite is true.


>there is no requirement to make web content accessible to handicapped people.

Unless that web content is used or maintained by the United States federal government.



