No, it hasn't. It's still 100% about how to benefit users for most teams. That is quite literally the only thing that matters and everything else is a means to an end.
I'm not going to deny the fact that there are some shoddy websites out there. There are some sites that are using the latest and greatest JS stuff when really, they should just be using HTML and CSS. Fine.
The majority of the work that gets talked about here, by software and tech startups, will not be effectively built using HTML/CSS or server-side rendering. Since we're on the topic of users, users have come to expect a certain level of interactivity. They expect that if you right-click on a row, you get a drop-down menu and you can delete it. They expect that you can do inline editing of an item without being redirected elsewhere. They expect that you can create a new thing on the page without reloading and redirecting.
Let's forget about that part, though. Most software features are, at the end of the day, experimental. We don't know exactly what is going to be a hit and what isn't. If new tools allow us to iterate at 2x or 5x or 10x the rate of old tools, this is a net positive for users.
No one would denigrate a carpenter for using an electric screwdriver over a manual one. No one would complain that the carpenter only cares about their "carpenter experience". At the end of the day, the "user" wants their cabinets finished.
So, nobody would mock a carpenter for using power tools. They would question one that only worked with a 3D printer, even though that would certainly let them iterate faster.
There are performance penalties when it comes to the initial page load, but you tend to win it back on subsequent user activities. Whether it's worth the trade-off depends on your use case. There are also downsides to progressive enhancement: an obvious one is that if your enhanced JS version changes the appearance/layout in a noticeable way, you can end up making the website feel slower even though it's technically usable sooner. In addition, what happens to any existing interactions with the non-enhanced UI after the enhanced version suddenly appears?
The fact is, you just have to decide what's important. I do agree that not enough developers are giving real-world accessibility and performance enough consideration. But I'm less convinced that there's a way to approach these needs that is superior in all circumstances.
Granted, for some UI this is hard to avoid. But those cases are far rarer than they're given credit for. Most browser state should be easily holdable in the address bar. If you find you can't do that, are you sure it's state your user actually cares about?
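For what it's worth, a minimal sketch of what "state in the address" can look like. Names here are illustrative; `URLSearchParams` is standard in browsers and in Node:

```javascript
// Serialize view state (e.g. a filter and a page number) into a query string,
// and parse it back out on load -- so reload, the back button, and shared
// links all restore the same view.
function stateToQuery(state) {
  return new URLSearchParams(state).toString();
}

function queryToState(query) {
  return Object.fromEntries(new URLSearchParams(query));
}

// In a browser you would then sync it with the address bar on every change:
//   history.replaceState(null, "", "?" + stateToQuery({ filter: "open", page: "2" }));
```

No framework needed; the URL is the store.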
The idea that there’s a mythical user out there who absolutely demands server-side rendering is a bizarre figment of HN’s collective imagination.
Then implement that. There are much more lightweight ways to achieve this than going full react. Especially with modern JS this can even be implemented from scratch.
I recently used the best webapp I've ever used - LucidCharts. That was an app, and, as such, I didn't have the expectation that everything I did generated a new link, or that I could even link to certain things. BUT, they also struck the balance of providing web behaviors for things where I expected them.
And for the users on weak machines or connections, the reload version at least works, unlike the JS one.
The "users love AJAX" argument is both silly and, honestly, a bait-and-switch. It's not really because of AJAX that people build their sites in JS frameworks.
Server-side rendering HTML and serving it up works perfectly fast for pretty much every site that is read-only content or forms. The 3rd-party scripts running on every page are the issue.
Also, as a major negative against SPAs: once a script is loaded, it's going to be ever-present on every page afterwards. Enjoy explaining why the payment details page needs 20 3rd-party scripts running on it. It's not possible to unload scripts.
If using that framework means that the developer can churn out something much faster, and therefore has time for more features, even if they aren't perfect or fast, is the user really worse off for it?
Continuing the carpenter analogy, it's the difference between using a nailgun and a dovetail joint.
Sure, the nailgun'd version probably won't last as long. But if that means that the one carpenter was able to make 5 different shelving units that all looked exactly how the customers wanted, cheaply enough that they all happily bought them, is that really worse than getting only one expensive "better" version out, with no per-customer customization because of time constraints?
And I know first-hand that those kinds of trade offs happen all the time. There are tons of times where "can we afford to add feature X" is a "yes" because we are using React in the app and spinning up a new component becomes extremely easy, but in applications which aren't using a framework that is often a "no" because we need to do so much more work to get it fully integrated and working.
I can only speak for one user, myself, but my answer is -- yes, the user really is worse off for it.
I get that there may be network effects that trap you, and you could be worse off because prioritising another feature made one of your personal favorite features worse, but that has less to do with the technology used and more with what the developer is prioritising.
I can and I do, for as long as it's possible.
The problem is that JS-first fashion infects web shops, and one by one, websites are being rebuilt in "modern ways", rendering them less usable than they were before. Such sites would be easy to avoid if they were the exception, but you can't do that if everyone follows this trend.
It’s both amusing and sad to see modern web developers criticize Java for being “bloated” because they don’t understand the Abstract Factory pattern (fairly lightweight, 2 levels), while they themselves use infinitely more complex and heavy frameworks to render what is more often than not just static HTML.
Proof of that is that in languages where there are alternative patterns (like Python), people rarely create a factory; they use functions that return classes. They don't use singletons; they just use module-level variables. They don't create observers; they just gather callbacks in a list. And so on.
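A rough sketch of those lighter-weight alternatives (shown here in JavaScript, where the same idioms apply; all names are made up):

```javascript
// "Factory": a plain function returning whichever class fits, chosen at run time.
class JsonStore { save(data) { return JSON.stringify(data); } }
class CsvStore  { save(data) { return data.join(","); } }
function storeFor(format) {
  return format === "json" ? JsonStore : CsvStore;
}

// "Singleton": a module-level variable instead of a class with a getInstance().
const config = { retries: 3 };

// "Observer": just an array of callbacks.
const listeners = [];
function subscribe(fn) { listeners.push(fn); }
function notify(event) { for (const fn of listeners) fn(event); }
```

Each one defers the same decision the formal pattern defers, without the ceremony.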
There are many less formal patterns that do add a lot of value. And there are syntactic and type-level patterns that are great, but typically OOP programs use those less than code in other paradigms. OOP has one common, less formal pattern that is very useful and rarely seen elsewhere: the abstract API. But that's the one people on the internet mostly recommend against, because it requires inheritance.
Don't get too hung up on the implementation details in any particular language. A Python function that returns a class is just a factory by another name. The essential point is deferring the decision of which implementation class to instantiate until run time.
Inheritance is one way of handling abstract APIs, but not the only way (depending on language). And don't believe everything you read on the Internet. In some limited circumstances inheritance can be a great way to simplify code and eliminate duplication, far outweighing any additional complexity it introduces.
Or just use interfaces. As in "design by contract".
And that's just common sense programming wise, not some OOP specific solution.
You can successfully apply OOP and OOP design-patterns to a lot of problems and come up with equally functional solutions as you would with FP.
And for certain domains, better.
1. It is designed to be testable. It makes testing the components much easier than testing jQuery soup.
3. Will validate input types passed to controllers
4. Has nicer syntax than jQuery soup
I'm not sure if you're blaming the tool or not, but at least for 3rd parties, I want to clarify: jQuery doesn't mandate "soup". In fact, devs learned over time that the best approach to development with jQuery was the same as without: have application state tracked as variable(s), and have routines to translate the current state into the DOM. That's the unidirectional flow that React likes, and both vanilla JS and jQuery can run with it.
But jQuery makes it EASY to use your DOM as state storage, which works well at first and quickly grows nastily complex. Meanwhile, React makes it EASY to update the DOM from state without performing unnecessary re-renderings.
So jQuery doesn't mandate, or even encourage "soup", but there's nothing guiding you away from it. React, OTOH, makes the path of least resistance be towards this particular best practice.
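To make the "state → DOM" idea concrete, here's a minimal sketch of that unidirectional flow in plain JavaScript (names are illustrative; the pure `view` step is separated from the DOM write so the flow is easy to see):

```javascript
// Single source of truth.
let state = { items: [] };

// Pure step: translate the current state into markup.
function view(s) {
  return s.items.map((item) => "<li>" + item + "</li>").join("");
}

// Impure step: write the markup into the page (browser only).
function render(s) {
  document.querySelector("#list").innerHTML = view(s);
}

// Every change goes through one door, which re-renders from the state.
function setState(patch) {
  state = { ...state, ...patch };
  render(state);
}
```

The DOM is never read back as state. jQuery or vanilla JS can follow the same discipline; it's just not the path of least resistance there.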
To your points as to why people use React, I don't think those are bad points, but those aren't my primary points. React best practices mimic the best practices of coding in general: Small, single-purpose components with minimal coupling. This allows me to bring in experience from outside React, and allows me to apply React experience outside of React.
> have routines to translate the current state into the DOM
Views. Triggered by #changed notifications.
> That's the unidirectional flow that React likes,
I really don't know what world that statement would be true in.
It is a full tool-chain, language, architecture, API spec, and testing framework. They are all closely tied to each other.
> React, Flux, JSX, GraphQL, Jest.
Opinionated implies there is "1 true way" to do things, à la Angular or Ember.
React is most definitely not the embodiment of opinionated.
Ref the multitude of libraries which were flavour of the month before something else came along.
> It is a full tool-chain, language, architecture, API spec, and testing framework.
Ok, now I know you're either trolling or have never used any technology but React to solve problems.
Just yesterday I was trying to use the Sendgrid support site. I was logged in. There was a "click here to login and send a support request" button (and no other way to create a ticket). Clicking it just reloaded the page. There was a login link (even though I was logged in). It may as well have been a static image; you couldn't click it.
After a few minutes of trying the button finally worked. When writing my ticket the support system would scroll the page down on every single keypress so that the text field was aligned with the bottom of the browser and I was typing on the bottom few pixels of my browser and display. It was awful.
> No one would denigrate a carpenter for using an electric screwdriver over a manual one.
No, but I'd complain if it took an hour to open my front door, took all my strength to do so, then when I finally got into the house it just fell into a pit and I had to start again.
It often takes longer for the initial JS-heavy page to load than it would for me to click & load 2 simple HTML pages. It's so bad that I just skip using certain websites when I'm on the slow internet.
I feel like a lot of devs don't even try testing their websites / progressive web apps in slow-speed situations or on low-end devices. If they tried, they would understand why, here, if you want people to try your service, you must build an app, put it in the store, and keep network messages to a minimum. Webapps are a good way to get a "Not interested, thanks anyway."
Honestly, I disagree with this. Iteration is great for developers, but not users. To use your tool analogy, users are attempting to use the toolbox you made to perform a task, but the content of the toolbox and how you use it changes every day. Users of the toolbox can never become expert users, because they can't ever rely on their past experience with these tools - the usage has probably changed in subtle ways.
Personally, I really hate having to re-learn how to use a screwdriver (or a blogging platform) every few days - and I'm a professional tool builder myself! It sucks even worse when I can't use your screwdriver if I only have 3G internet access.
One of my first jobs was maintaining and extending an ASP.NET Web Forms application.
A dashboard style application has to do way more than open a dropdown. People keep the page loaded all day long - the fact that it takes 2 seconds to load the first time is irrelevant.
It's almost like for some the argument is more about some abstract concept of artisanal purity than true care for user experience. HN basically has a front page post decrying the state of the modern web every day...
Before anyone jumps down my throat about this I should add that browsing the web with JS disabled is a truly wonderful experience on sites that support it. It would be great if everything did. I don't see that happening though.
That's because developers are the only people who know what the fuck is going on, and who to blame. Regular users don't have a mental model to correctly identify the source of their annoyance. So they end up blaming "the computer". It often manifests in requests like "could you come over one day and clean my computer? I think it's full of viruses." No, it's not really full of viruses, just the websites you're using the most went through another redesign, and now consume 10x resources for zero added utility. But what can you do. I install adblock and sometimes buy them another RAM stick, so they can throw their laptop in a garbage bin a year later than they would without my help.
No, users do complain, you just have to know where to look (and actually talk to them).
As opposed to what, some abstract concept of ordinary users? What the sales department wants? The idea of craftsmanship isn't that abstract to me. Performance and cacheability aren't abstract at all, that a website can get reloaded many times even for just one reader, and usually shares memory and CPU and HD cache with many other tabs, is also a really obvious observation. Stuff that makes a noticeable difference even in isolation makes a giant difference multiplied with a triple trillion, I've done the math.
> HN basically has a front page post decrying the state of the modern web every day...
That doesn't mean there isn't a problem. There are also articles decrying environmental destruction every day, not on HN but in general - should that make one care less? Would you say that the fact that biologists and climatologists are more concerned than the "ordinary person" means the ordinary person should be heeded instead? I'd be surprised to see scientists in a science forum talk like that.
Web development is an "art", maybe like architecture is art, and we are the artisans. If having any sort of ideals to strive towards is "abstract" to us, that says more about the people involved in web development and how much genuine excitement and care for detail was destroyed by money and marketing, but not much about web development as an art form as such.
Does a flood of crappy action movies say anything about the state of the art in movie making, at all? And yes, the "average audience" maybe likes them, but who cares? If they got something else instead, there's nothing they could do about it, and it would be better for them. I don't care if that's arrogant, but don't call it abstract :P
> I don't see that happening though.
That is like talking about sports or the weather, even if its true, it's pointless. The question is rather, what do we think should happen, and how can we make it happen.
No - users actually do want more features. They want more features than you can ever hope to deliver. Figuring out which features they will actually appreciate is the tough part. Your competitors are working very hard to build those features before you can. Your competitor's sales team is already cold calling your users.
Lightweight software is easy. As an engineer, I appreciate it, but it's not what the average user wants.
I'll believe it when I see it. With the notable exception of collaborative editing, most software on the web is essentially a subpar reimplementation of desktop software from 10 years ago, missing half of the features but consuming 10x the resources.
True, but turnaround time is something that is primarily beneficial to developers, as it's all about reducing development costs.
I like that you speak in the name of "the users" who expect a certain level of interactivity, when that same level of interactivity on a page means many of these very same users won't ever see the page load fully, either because they can't be bothered to wait enough time on a slow connection, or because their browser/device simply won't do.
I see quite a clear contradiction here. And it kinda proves the point this post makes about too eagerly assuming users have a fast enough network and/or device.
Users don't expect interactivity, they tolerate it. Think of websites as obstacles or puzzles users have to solve to get to what they want. They would rather not interact with those puzzles or wait for them to load at all. If you understand that - you can't justify interactivity or most front end development for that matter by claiming this is what users want.
Large companies literally have entire teams dedicated to maintaining tooling. And don't even get me started on monorepos.
At a previous company, a co-worker that had been setting up tooling infrastructure for a project was astonished at my iteration speed when he saw that mine was written with just a micro-framework and ES5. Conversely, when I had to jump in to help get his project back on schedule, we could never figure why the hot reloading setup had a 2 second delay on my machine (among other things).
Having worked on two projects of very comparable scope (two documentation sites for different projects), one written with the "latest and greatest" Gatsby and friends, and the other with just a dirty script, I honestly can't see any reason to do the former: it takes longer to do anything, the final payload is bigger, and the codebase is now a major version out of date, meaning a migration looming on the horizon. On the upside... GraphQL; yay, I guess?
With this point in mind, your carpenter analogy seems a bit off IMHO. Of course, consumers don't care if the carpenter uses an electric screwdriver, but they do care if a knob gets loose and falls off because it was held by a single rushedly-drilled screw. Also, the employer should presumably be worried if said carpenter keeps spending a significant portion of his time sanding his hammer handles and replacing his workbench legs, or needs another carpenter to do so on an ongoing basis.
I can't agree with this statement. It's 100% about how to benefit the company, which means satisfying the customer. And serving users up to the customers is the name of the game.
Most sites are heavy with trackers (you could argue that *one* might be in the user's interest, but you can hardly make the same claim for the huge number of ad-related ones) and full of devious antipatterns. Just enough value for the user, one hopes, to get the page view.
I’m sorry, but pages that pull in several MB of ads and tracking to show a couple of KB of text the user actually wants are by no means constructed for the benefit of the user, any more than the line is baited for the benefit of the fish.
I actually don't work on any ad supported products. That improves my job satisfaction even more.
The majority of work that gets talked about here is technical work which is N levels below the business use case. Without the business use case you have no idea if it can be effectively built using server side rendering. Because of that I'm not sure anyone here is really able to make this statement.
If that were true, things like Amp would never have even been proposed.
Modern web browsers are absolute marvels, but shitty developers and business people are driving people like me into apps.
I'm far less optimistic about the web now than I was 20 years ago. I'm more bullish than ever on the internet in general, but the web is a mess.
I have to say, though, that if this is the case then far too many teams utterly fail. Quite a lot of (most, I think) web sites don't appear to be designed primarily to benefit users to me (user benefit generally seems to fall to third place, behind "make it pretty" and "use the sexy new tools"). And that problem has been getting worse over the last decade or so.
"No one would denigrate a carpenter for using an electric screwdriver over a manual one. No one would complain that the carpenter only cares about their "carpenter experience"."
I absolutely would if the result is that the carpenter's work is worse for it.
I don't know by what metric the work is, on average, worse for it.
I've been on the web since '91 and I've been more or less an obsessive user since then. I've seen the changes.
The web - again, on average - is better to use than it ever has been. Some websites suck. Some websites are awesome. The worst websites in '95 were unusable and had broken layouts/styles, but people act like every website in '95 was some perfect embodiment of minimalist engineering.
My opinion is very different. Fair enough -- we're different people.
"but people act like every website in '95 was some perfect embodiment of minimalist engineering."
I'm certainly not asserting any such thing at all.
It is so, so easy, to be living and working in big cities like Toronto (me), SF (a bunch of you), NY, Chicago, et cetera, and truly forget the context of the absolutely awful experience a lot of people have online outside of these urban epicentres.
I remember living on a farm in rural Illinois (Marengo), and the best internet we could get at the time was satellite. Which (or, worse, dial-up) is the only option for a lot of Americans, apparently.
Satellite internet, in particular, makes these poorly optimized web sites with hundreds of tiny JS files and images take forever to load, because satellite internet pings the satellite for every connection. Downloading one large file can be fast, but opening Facebook can take 3-4 minutes. (My ex-partner, who still lives on that farm, confirmed this for me today.)
The author, then, brings up the next, most important point: 'Your website had better be amazing for it to justify that length of download time.'
For those making the audacious claim here that this isn't considering the dev's time put into it, I'd argue immediately back that the devs need to be considering the actual users.
People living in the country aren't 'edge cases'. They're usually extremely hardworking people who simply don't have access to the luxury of broadband like we do. Just because we don't necessarily see these people, or travel to other countries with shitty connections, doesn't mean they don't exist, doesn't mean they're not actually a potentially significant number of people, and doesn't mean we should not be arsed to trim 1MB-2MB off our bloated image-and-video-wieldy sites, or even that it would necessarily take a lot of time in most instances.
I agree. The web isn't about us. It's about who uses it.
To be fair, many urban companies are designing for urban and suburban users. Your advanced analytics dashboard product is not for Cletus in rural Alabama.
If 30% of your target users don't have broadband, then you should design with that in mind. If 0.01% of your users don't have broadband, then you should ignore them.
Accessibility is a different issue of course, but to illustrate my point: I have never translated my websites to Chinese, making them unusable for 1/7th of the world population. I don't think it's necessarily a failure on my part.
You didn’t intend it for a blind audience, but they’re certainly capable of reading the text. Did you avoid making the text actively toxic to screen readers?
Many of the whiz bang features on new sites, ostensibly because “users want features!”, hinder even this functionality. It’s fair to criticize this as a regression.
I use a browser extension that lets me click links and buttons from the keyboard. The more bleeding edge they make the sites, the less I can use this feature. That’s not an improvement.
The issue is that so many projects have not been designed with these ideas from the outset, that it indeed usually does take a mammoth amount of re-engineering and dev time to make these kinds of changes and optimizations after the fact.
To continue with the mediocre carpentry analogies floating around: a (competent) carpenter doesn't build a sturdy table by first building a shitty table and reinforcing it later.
We should be incentivizing people to live closer, not spending 90% of our time on them.
But sure, go ahead and blame the dirty country folk for not being able to reliably access your shittily-designed-and-implemented website. God forbid there are people who don't feel like breathing smog all day and existing in perpetual abject misery.
I didn't care for the metric. How many rural Americans are there? Is it 1% of the population? 50%?
According to the 2010 census, roughly 20% of people in the US are classified as rural. If, say, 30% of those lack broadband, that means about 6% of all Americans are not on broadband. That metric would be a lot more useful to most people.
The rural population has actually been growing the entire time, but as a percentage it's dropped.
No. So why should we optimize for people who do not matter to our bottom line?
Why wouldn't they? Physical location aside, they're not any different than you or I. They have purchasing power like everyone else.
> Are they going to subscribe to our platform?
Why wouldn't they? Because they can't load your page? Doesn't sound like their problem.
> No. So why should we optimize for people who do not matter to our bottom line?
Why don't they matter to your bottom line? Because you present them with tools and platforms they can't use, so they can't contribute to your bottom line? A self-fulfilling prophecy if there ever was one.
Then again, the lack of internet usability may create an image of the internet as not being worth it. If you have no access to the better things on the internet, then you may think that there is nothing of real value on it so you won't push for better internet.
People having slower internet speeds is a serious problem with the web (because of modern web design), not with most of the other things the internet is used for.
Personally, as websites have become more dangerous and less usable, I've been finding the web to be a smaller and smaller space as the years go by. For the first time in my life, I can see that it's entirely possible that I will largely stop using the web.
However, that in no way means I'll stop using the internet.
That people equate the web with the internet is a very bad thing for both the web and the internet.
This isn't in and of itself the problem, per se. Sometimes a more powerful and robust UI / UX / application is necessary. The problem is when said "solution" is applied to __every__ need. The problem is when screwdrivers are being used to pound nails.
Much like "publish or perish", there's no (ego) glory in saying "Yeah, it's relatively lo-tech, but that's the solution that best fit the business need." Rare is praise and/or hiring based on good analysis, appropriate tools, and smart solutions. Nah. Size is all that matters. Wowing people who can't do what you do is more important than doing the right thing(s).
I actually worked with a pair of developers who believed this wholeheartedly. It seemed to be part of some kind of design philosophy they picked up where it was the responsibility of developers to build complex web sites and apps in order to "further" technology and encourage people to upgrade for the good of... something. I'm not sure.
They picked some baseline iPhone level (two generations removed from what was current, IIRC) and decided that anyone who wasn't "smart" enough to own that model, or better, shouldn't be allowed to use the product.
Then when they were ready to deploy, it turned out that the product only worked on their phones, which happened to be the latest/greatest available at the time. So they built in a bunch of kludges and trimmed features to make it load on an older phone, and that was "good enough."
Not surprisingly, the company went out of business.
I keep an older crap laptop for a reason. I'm not sure why I feel so alone for doing so.
I've been an iOS developer since the App Store became a thing 8 or whatever years ago, and the number of people I've seen willing to test applications only on devices 1-2 generations behind, if that, is shocking.
What's more offensive, shocking, and actually confusing is that, yes, management is usually the one saying this kind of thing.
The reason that confuses me is that you'd think, if I were in a management position at a mobile-related startup, one of the first things I'd do is have someone on my team do market research on the current state of iOS device and version usage, and buy a freaking heapload of test devices, in all the different resolutions and variants.
Dismissing your users because they're not 'smart' enough to have a newer model is, as mentioned in the article, essentially blaming these people for the circumstances in their lives that leave them unable to, or is simply insulting their intelligence.
I still use an iPhone 6S. Why? I need a headphone jack. I run an independent music label and rely on GarageBand multiple times a day. Bluetooth audio's latency is too awful to be usable when playing instruments or mixing/EQ'ing tracks, and furthermore, I'm not about to replace my $300 studio-quality headphones with some shitty Bluetooth alternative, or drag around some tiny dongle to adapt the headphones.
Does this make me 'not smart'? I think the exact opposite. I couldn't use a new iPhone the way I have for the previous 6-8 generations before it.
There are dozens of reasons why people prefer to keep older models of devices. My 2011 MacBook Pro is still my daily Logic Pro driver, and I even still do a bunch of iOS development on it, because, gasp, I was able to open it up and upgrade the RAM to 16GB, swap the original hard drive for a 512GB SSD, and replace the optical drive with a 2TB HDD. If I'd bought a 2017/2018 model and felt like the 8 or 16GB that came with it wasn't enough, I might as well throw it out the window for all its upgradability.
Then let's talk about software - most of the people I know actively avoid software updates on their devices because they either don't know if it will hurt their device, or they simply actively like it the way it is already, and don't want to be forced into whatever changes the devs and managers felt like this time around.
People like what works for them. And they don't like change.
tl;dr - Highly intelligent people have highly intelligent reasons to stick with their older devices and software. Hipster managers who think they're 'too cool' to see this piss me right off.
A better way to put this might be: The newness or hipness of your device doesn't add any IQ points to your IQ score. The irony is those who need to understand this the most are often the last to get it, if they get it at all.
I am saying that the impression OP gave me is that they were literally throwing it on the CEO's iPhone X, felt like it worked there, and didn't bother to even go a couple generations back.
This wouldn't be the first time I've actually seen this kind of business-destroying, frankly elitist behaviour that does not benefit business.
Ech. But first I need to get back to work on a search engine that would _not_ index ad-serving pages, with the possible option of filtering out JS-serving pages.
Please do, I would be all over that :)
I already use duckduckgo and fall back to !g, I'd love to try an ad/bloat-filtering search engine first, and fall back to ddg and google from there.
A developer now has a choice: use the big framework where you can satisfy those now entitled users or give up some success and make a tiny site with limited use of js. I'm all for the free web but I know which choice I would make when running a for-profit company.
Then again, it can actually be pretty easy to over-engineer nowadays. Once you have your node/npm environment set up with webpack and all the rest, you can quickly iterate a new site based on an earlier project. And if you hear of something new and cool, it's only an npm command and a few hooks away.
Where are the tools for making streamlined sites easier to build?
What happened to the old contest for the best site in 5 KB?
We need those brought back -- seriously.
So index.html is easy, but is there a way to make it such that I can go to my website/articles/my-happy-day, and have that page be generated, rather than a separate handwritten my-happy-day.html file?
I typically "deploy" simple shit by uploading it directly to my bluehost filesystem, or Amazon s3. The only thing I can think of is a precompile thing, and then uploading the build.
EDIT: Isn't there also a way to pre-cache images with just CSS? o.o
"[Progressive enhancement] won't work with something like React though, and I don’t cry about that. This is a distinct robust design pattern, which we feel is durable and doesn't lock us into a limited-life framework that will get replaced by something else in a year or two." (emphasis mine)
I imagine this last line gets some people reacting emotionally. I've spent the last few years in React-land (in addition to a soon-to-be-legacy stack slowly transitioning), and benefited much from the paradigm shift. It has moved the web (and web dev) forward in many ways, especially conceptually. At the same time, I can't deny that it will eventually get replaced, just like the rise and fall of jQuery. That means we must keep in mind not to be locked-in, to emphasize the core concepts which will outlive any framework du jour.
Progressive enhancement is certainly doable with server-rendered React, but it does seem like the recent trends have favored developer experience at the cost of user experience.
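The pattern itself is framework-agnostic. A minimal sketch (not React, and with hypothetical route and element names) of the server-rendered-first approach: the server emits a form that works with zero JavaScript, and a separate client script, when it loads, upgrades that same form to fetch results in place.

```javascript
// Server side: render a search page that is fully usable without JS.
// A plain GET submit reloads the page with results; no client code required.
function renderSearchPage(query, results) {
  const items = results.map((r) => `<li>${r}</li>`).join("");
  return `<form method="get" action="/search">
  <input name="q" value="${query}">
  <button type="submit">Search</button>
</form>
<ul id="results">${items}</ul>`;
}

// Client side (enhancement layer, shipped as a separate script).
// If this never loads or runs, the form above still works as-is.
/*
document.querySelector("form").addEventListener("submit", async (e) => {
  e.preventDefault();
  const q = new FormData(e.target).get("q");
  const res = await fetch(`/search?q=${encodeURIComponent(q)}&fragment=1`);
  document.getElementById("results").innerHTML = await res.text();
});
*/
```

The trade-off the thread describes lives entirely in that second half: it is optional, so a failed script download degrades to a working page rather than a blank one.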
I'm all for progressive enhancement. However, if the expectation is a web _application_ that has a certain level of interactivity designed for, with a real world deadline, and a targeted demographic of "people with supercomputers in their pockets", I can guarantee you'll make some compromises.
Bringing up Springer Nature and the BBC as examples of the Web Built Right™ is short-sighted when they are web _sites_ that serve static content. Of course you should send HTML from the server that represents the entire content of the page, apply some CSS, and sprinkle in JS. And of course, there are a lot of developers out there that would reach for the latest trending JS library + framework combo to pull it off, and they'll implement it poorly.
There's very little point in bashing "npm install exciting-tech", when I can npm install gatsby-cli, and do everything the article is crying out for in a "modern" stack. Oh, also, the BBC example? They use Node, and React, and server-side render React. You can bet they're npm installing exciting-tech.
However, the web is not just static sites anymore. People expect a lot more functionality, you work with people who design and promise a lot more interactivity, and there are a lot more people coding without enough experience to reach for the tech that most efficiently solves the problem.
I don't think a condescending article like this is beneficial for our community. Maybe we should try educating (nicely?) on how to pick the right tech for the problem. Maybe we should ditch "user" for "person" (or "surfers" lol) when talking about the consumers of our output. And finally, maybe we should have more empathy for the people who work in our trade, and work together towards a better internet.
(2) I think "management" deserves some criticism here too. I think the average developer doesn't want to stick 15 different third-party trackers in a page, but they don't get much empathy when they bring it up with the "business" people.
Still, over engineering sites and products because you want work to be 'fun' isn't exactly exclusive to web developers. Quite a few game designers have similar issues in that field, and I suspect there's a bit of a clash in ideals between creators and audiences overall. It's rare you find a creator of any kind who just wants to do the same thing over and over and works solely for the money, and that's the cause behind all these issues.
At this point the web is documents and also an application platform. Conflating the two causes trouble - sure, simple documents don't need JS, but non-trivial webapps really benefit from dynamic UIs/data vs old-school form post interactions.
I've learned a lot of React recently for creating sites. I test all pages that can be static with JavaScript disabled (I use Gatsby or Next for HTML SSR), but some parts can't be static, or worse, there's still the "webpack inlines from hell" problem.
This is mostly for B2B web applications. I work on a lot of analytics dashboards that are pretty data dense. Even with react though, my sites never go over 400kb on load.
Okay, essentially my question is: what templating language/server framework can replace React, JSX, and GraphQL for webapps? I really want something like that.
In the United States in 2015, 11 percent of noninstitutionalized adults reported having a disability (National Council on Disability (NCD), The Current State of Health Care for People with Disabilities).
Disability in mobility and in cognition were most frequently reported (5 percent). Data from the 2009 to 2012 National Health Interview Survey found that 11.6 percent of United States adults 18 to 64 years of age reported a disability, defined as serious difficulty with hearing, vision, cognitive ability, or mobility (walking or climbing stairs) (MMWR Morb Mortal Wkly Rep. 2014 May;63(18):407-13).
That makes disability more than 5x more common than being a natural blonde.
There's a faction of web people who want web applications to compete feature-for-feature with native apps. I'm closer to that camp for historical reasons, but I also understand that compromise on application features is completely worth it if the benefit is an (even partially) "open" platform for applications to build on.
But then there's this other faction of HTML purists who see the "true" web as necessarily document-based, declared with markup, typeset with CSS, and compatible with absolutely every computer system of the past 30 years. This makes perfect sense if you work for a magazine and see the web as a platform for open (and often static) content, and I'm sure some of them do understand the difficulty and the value of compromise, but I never hear that.
So now it's 2018, the "beat native apps" camp and the "documents for all" camp talk past each other through the years and nothing really changes. Maybe some people are in both camps and suffering from cognitive dissonance, who knows.
But there are good reasons why web applications don't do what Photoshop does, and good reasons why Photoshop doesn't even try to provide copy-and-pasteable URLs for every unique application state. There are good reasons why "serious" declarative application frameworks (like XAML) are a terrifying mess, and good reasons why building a CMS from scratch in C++ would be insane.
Secondly, just because everyone has access to every website on the internet does not mean you need to build your website for everyone. If someone says their website is targeted at a specific group, they build it for that specific group. You wouldn't tell a Formula 1 engineer to raise the suspension on the car he's building so it can clear speed bumps in the parking lot.
Again, the core concepts of the article are common sense, but the tone and specifics don't resonate with me.
Wait, what? I can't find anything about door levers having this kind of security risk. This kind of sounds like voodoo security where "worse UI = better security, because fewer thieves know how to use it".
>Secondly, just because everyone has access to every website on the internet does not mean you need to build your website for everyone. If someone says their website is targeted at a specific group, they build it for that specific group. You wouldn't tell a Formula 1 engineer to raise the suspension on the car he's building so it can clear speed bumps in the parking lot.
Sure, but most sites aren't narrowly targeting people who can see and who have the latest hardware as part of their core reason-for-existence. News sites certainly aren't, and yet their mobile and screenreader experience is cancer.
I specifically remember this talk: https://www.youtube.com/watch?v=rnmcRTnTNC8
At around 18 minutes in, he shows one of his employees(?) using a long bit of wire to pull the lever and open the door in a matter of seconds.
The web is not about me.
But the websites I create are not for everyone. They are for my users.
My users are professionals, on desktop, with a mouse and keyboard.
Knowing who my users are allows me to do better design. I make the best experience for THEM. Not for you, nor for all your friends using different hardware.
It doesn't mean I don't care about loading time or how things are displayed on the screen. It means I do so knowing in which environments it will happen. And yes, it might mean you will be excluded.
>> Knowing who my users are allows me to do better design.
>> I do so knowing in which environments it will happen. And yes, it might mean you will be excluded.
Obviously the poster here is more than okay with this. They're explicitly stating it will happen, and it's part of the plan.
The problem is when it's not part of the plan.
Imagine the author is writing an internal web tool for a corporate enterprise environment.
Imagine all of those computers and devices are standardized, and the software will only run on these devices on an intranet.
The previous poster's point is valid here. We know the users, the users are professionals, we know who they are, and we know what environments we'll be deploying in.
Therefore, beyond accessibility, of course the poster wouldn't care about people trying to kill time before the bus.
Let’s not, since this is not relevant to my experience on the web in the slightest. I couldn’t care less what you did on your internal network, as long as it doesn’t affect me. If you use the same standard for your external websites, though, then we have a problem.
I am your potential user. A professional, on a desktop, with a mouse and keyboard.
I absolutely hate your slow, click-driven products. I have things to do, and I need the software to help me do them as fast as possible. I run a lot of software simultaneously, so your heavyweight resource hogs are actively preventing me from doing other things I need to get done. If I find a competitor that actually cares about providing value in exchange for money, be certain that I'll switch to them in a heartbeat.
On the contrary, because I know my users are on a desktop, most actions can be done with keyboard shortcuts.
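A desktop-only assumption does make shortcuts cheap to support. A small framework-free sketch (hypothetical action names) of a shortcut table that can sit alongside the click-driven UI rather than replace it:

```javascript
// Map key combos to actions. The same actions stay reachable by mouse,
// so the shortcuts are an accelerator for professional users, not a requirement.
function createShortcutHandler(bindings) {
  return function onKeydown(event) {
    const combo =
      (event.ctrlKey ? "Ctrl+" : "") +
      (event.shiftKey ? "Shift+" : "") +
      event.key;
    const action = bindings[combo];
    if (!action) return false; // let the browser handle everything else
    action();
    return true; // caller decides whether to event.preventDefault()
  };
}

// Usage: wired up once, e.g. document.addEventListener("keydown", handler).
const handler = createShortcutHandler({
  "Ctrl+s": () => console.log("save"),
  "Delete": () => console.log("delete row"),
});
```

Keeping the table pure (it only inspects `ctrlKey`/`shiftKey`/`key` fields) makes it trivial to unit-test without a browser.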
This means you haven't tried it. Which means you're talking about something you don't know.
It gets worse. In order to deliver on those promises, the guild now has to decide what framework to use, which UI elements are worthwhile. We're politicizing our jobs, and the only way to solve political problems is with government. Enter the web programming guild.
There are many parties involved in designing the web, and developers aren't typically viewed as the subject-matter experts on how the UI of the web should work -- just on how to implement what's already been decided.
People outside the industry don't know or care what the division of work actually is, though. To them, the group of people who collectively create the sites are "the developers". That isn't wrong, it's just not precise.
You give way too much credit to the motivation of developers. Developers are just like anyone else working for a living: our primary motivation is to make money.
One place I worked at had an API with an "unfixable" 120ms delay on every request that persisted until someone ripped out the beautiful ORM system and replaced it with 3 SQL queries.
This is actually really, really bad. HTML renderers should have been designed to fail immediately on incorrect input. That lets developers more easily write correct and robust code, rather than making the browser robust. If the web were designed this way, you'd have a world where only correct HTML is written.
Also, it's not like we are engineering a war machine here. Machines like the Warthog are designed to keep functioning when components are destroyed. This is not necessary for HTML, as we are not sending our HTML into a war zone. Sections of your HTML are not actively being removed/destroyed by users.
You'd also potentially be looking at a web that is far more unstable and crash-prone if you essentially treated a browser like a compiler or the like.
My adblocker does this.
From the developer's point of view, though, the site is incorrect without the ads. He wants you to look at the ads; he's not in the business of providing you website content for free.
Still, the adblocker actually does a surgical removal that maintains the correctness of the HTML.
He got that right. Wish the media would do the same.
> The internet was our garden. And a beautiful garden it was. Sure, some fed agency created it, but let's face it, they used a fraction of the lot and we didn't really care for their supersecret bases they had littered about. There was so much empty space in between! And that lot we cultivated. We built a few nice trees and in their shadows we relaxed, we planted beautiful roses and yes, a few fruits and vegetables because, hey, it's always better if you grow it yourself. And ... heh, well, yeah, we had a few corners here or there where we grew that "special weed", ya know, but nobody really gave a shit, it was just us.
> We were pretty good gardeners. Well, you pretty much had to be in those days, if you didn't know your way 'round with rake and shovel, you didn't really get much out of it. Still, we were quite happy with it. So happy actually that we thought we should share that. I mean, there's so many people out there who don't even know just how great the garden is! And we invited them in. They looked around and, well, most of them didn't quite "get" it. Sure, it was nice, here or there, well, if you're into botany, that is, but it's kinda hard to get around and find your way through the jungle, and using a machete wherever you go, phew, hard work! But a few of them stayed. They didn't quite know what they do, but we handed them a few saplings and some seed and some actually managed to learn a thing or two about gardening. Sure, of course a few smartasses tried to steal our stuff, but we usually didn't have much of a problem to whack them with our shovel and get our stuff back. And, heh, yeah, we, too, went into each other's yards and played some pranks on each other, painted their roses black and the like, but it was all in good fun! And hey, they sure liked our ... ya know, "special stuff". They still had no idea how to grow it, but they were quite willing to help us share everything with everyone, as long as they got their share, too. And, well, why not, pass the blunt!
> That was about when the corporations noticed that, hey, where did all the people go? They took a look at the garden and they went batshit crazy. I mean, sure, we knew that it's great, but we never saw anyone go so insane about it. They saw it as the next big thing to make money with, and we laughed. Money? With this? Dude, you can't make money out of a system based on freedom and sharing! Everything in here is free. Yeah, in both ways.
> True. You can't make money in such a system. Unless of course you change the rules. And changing the rules, they could.
> I can't help but think that this must be how the natives of the US felt after they were "discovered". Because we had to face that there are suddenly areas in what we considered OUR garden where we couldn't go anymore. Worse, something that was a staple of our culture, going to a guy who did something great and asking him for a sapling of his wonderful tree, became anathema. Instead of you SHOULD imitate and build on top of mine, the new creed was you MUST NOT. This rule, of course, only surfaced after they themselves took from our gardens what they could possibly rake together quickly. You might understand our utter disbelief and of course outrage when we noticed that turnabout is not fair game.
> Well, we have had our share of trolls and nuisances before. Long before we already had to deal with people who trampled through our gardens or were a general pest. Our solution was simple, we took our superior gardening skills and whacked them from here to next week with our shovels 'til they either learned to play nice or left for good. This didn't work out so well this time. No, not because they had the better gardeners. But they didn't need to. They had a much more powerful weapon in their arsenal: The law. First, they ensured that the laws would benefit them, and then they used it against us. And despite how despicable it may be, we have to admit that it is quite efficient to have others take care of your battles, especially when you know that you cannot win a conventional war.
> And now we're sitting here in what's left of our once beautiful garden. The once mighty jungle has been tamed and civilized, what used to be interesting and a land for explorers is now divided into lots that you may buy instead of simply use. You can get there easier now... well, if you prefer using long winding roads to a direct route, but the long winding roads are necessary so you pass by all the billboards that block your view to what's really interesting. Of course you may not step anywhere, only where you're allowed to, and don't even think about taking anything, rest assured it's for sale, not free.
> So we're sitting here now, at the edge of something we once knew as beautiful and free. We're looking at it and we wonder what we did wrong. Where did we fail? And I can only come up with one solution for when we try something like this again: Don't invite the masses in. Keep it to yourself. It's the only way how you can really keep it. And the only way you can do without a camo net over your herb garden.
You can't rely on websites being futureproof or adaptable to different situations.
You can't rely on content being accessible to everyone. Nor can you count on content being available forever anymore.
Sites are blocked or throttled by ISPs and governments. Sites are expected to censor their content. Even search results are being censored.
You can't trust in even the most basic assumption of privacy or anonymity. Everything you do is being tracked and sold.
So many social networks require your phone number. So many sites need you to login or need your email.
It's all one big get rich quick market.
I'm not going to spend weeks hand coding some progressive enhancement in vanilla JS so some angry blogger can view my stuff in a console based browser with no JS.
It's like these people think that developer time is an infinite resource. Old woman yells at cloud indeed.
Your point about hand coding progressive enhancement is well taken. What we need are tooling and frameworks that make such an approach feasible. The author laments that the frameworks du jour tend to start with (and incentivize building) the resource-heavy, complex, fragile version of a site / app.
I also didn't like the tone of the article. And some arguments were pretty far-fetched if you ask me. But this is actually a very valid point. It's not very hard to optimize websites for accessibility -- but it's almost never a hard requirement except for government websites.
Oh and, one thing I found particularly interesting about this site:
442.57 KB / 444.68 KB transferred
Finish: 4.43 s
Something is a little screwy with the page though -- the main HTML page, which is 27K, takes two seconds. Some infrastructure somewhere could be running/configured better.
She brought up some extremely valid and common real-life cases for this:
• being in a foreign country
• being at a place with a shitty wi-fi connection
• that 30% of rural America is still dealing with 500kbps or lower internet
Basically, the attitude of 'I can't be bothered to take the time to optimize my web site' is what she's calling people out on. From what I can see, she's stating that, especially in a large organization (she uses the example of an airline ticket site), the statistics seem significant enough that, depending on your number of users, their locations, and their activities, you could be looking at a significant number of users who simply ignore your web site or services because you couldn't be arsed to optimize your web images, or to make the site work without having to display them. (Certainly no text-in-images, at the very least!)
I work as a senior iOS dev at one of the big banks in Canada on their iOS app. We won't stop supporting iOS 9 until that number falls under 2%. That still represents more than 10,000 active users for us. These 10k+ users rely on us for mobile banking, and we could lose them as customers if we don't support their devices, because some other bank will be happy enough to keep supporting it if we don't, and it just means more for clients for them.
The author clearly gave another example of this in the anecdote she presented about the individual who could not even log in to the airline's website at the airport itself, and ended up booking a flight with a different company. They lost at least one customer there, and who knows how many more, all because they couldn't be arsed to check how the page behaves on a crummy connection and optimize it accordingly.
I remember living on a farm in rural Illinois, where the best internet we could get at the time was satellite. Satellite (or, worse, dial-up) is apparently still the only option for a lot of Americans.
In absolutely no way is this woman claiming 'developer time is an infinite resource.' In fact, I think she's saying 'dear developer, the web isn't about you.' It's about the users.
Plan your development for these kinds of cases to start, and you won't be spending 'infinite resources' on it, because it was always considered.
Fine, and they can go somewhere else. I'm sorry but I'm not about to consider this TINY percentage of users the same way I don't consider the TINY percentage of people accessing sites I work on from horribly outdated browsers.
The next sentence applies to web APPS, not blogs/news sites/etc.
People that say websites should gracefully fallback to non-JS web app when a user accesses the website with JS disabled are crazy. FULL STOP. PERIOD. You either end up dumbing down the entire experience for everyone OR maintaining 2 websites. Neither of which is financially sound for the vast majority of companies/developers given the percentage of people who don't use JS.
Yes there is: the Americans with Disabilities Act. Many private businesses in the States have been - and are currently being - litigated against under said act.
The mandate of the ADA is to provide "full and equal enjoyment" of a public accommodation’s goods, services, facilities, and privileges. Places of public accommodation include, "place[s] of exhibition and entertainment", "places[s] of recreation", and "service establishments". The vast majority of websites are therefore "place[s] of public accommodation", are bound by the requirements of Section 508, and - since roughly this time last year - are required to meet WCAG 2.0 AA (https://www.access-board.gov/guidelines-and-standards/commun...).
In the States businesses are also required to meet the requirements of various human and civil rights laws enacted by states and cities.
And if your company has an international presence, it's also covered under whatever legislation exists in those jurisdictions.
Basically: hell yeah there are requirements to make web content accessible to disabled people. Almost whatever you do, wherever you do it, will be covered to some degree by legislation designed to prevent discrimination.
Of course the existence of financial/technical incentives are good reasons for businesses to make sites accessible, regardless of any legal requirements.
When have you ever known a for-profit business to care if it's not mandated? They only care if it affects the bottom line. As a developer at a company, I don't have a ton of say over what we spend time on accessibility-wise, and even if I was able to get that code in, the QA time to test it would never be approved.
Businesses are not charity. If it doesn't make financial sense to make their website accessible, then they won't unless they're forced to. I'm making no value judgments, this is simply the reality on the ground.
… except that you appear to assign a negative financial value to accessibility compliance, when in fact the opposite is true.
Unless that web content is used or maintained by the United States federal government.