In the late '90s the majority of web developers seemed to believe that the only proper way to link to a webpage was by running some messy JavaScript code.
No, the majority didn't. Bad developers did stuff like this, and some are probably still using the same tricks right now.
The majority realised how stupid it was to do so, and therefore didn't create links like this.
I've seen some crazily bad website code in my time, including sites outputting their HTML using nothing but hard-coded document.write() calls. That doesn't mean the majority of developers were following suit.
Agreed. Also, I think it's naive to assume it's always the developer's fault when something like this happens. In the '90s, more often than not it was some douchebag marketer clamouring about people 'leaving' their site and convincing management to open new windows for outbound links.
There's always been a divide between what developers encourage, and what clients ask for and want. This is one of the better arguments FOR developers creating javascript-heavy single page sites (for themselves, or demos)... if they had to rely on their clients they'd probably be years behind the curve.
Most of the OP's points seem to be cherrypicked from the typical "don't do this" lists. Bad developers will be bad developers. By the time we can invent languages or software that turn bad developers into good developers, I think we will be ready to hang up our keyboards and pick up art.
Perhaps annoying websites that didn't use real links were more "memorable", because they caused issues and required workarounds. Personally, though, I've seen more of them that I would have liked to.
I'm going to take this opportunity to list a few of my biggest web developer peeves:
1. Forms that auto change fields
99% of the time when using a computer, Tab goes to the next field, so it's muscle memory to press Tab after filling out each field. Then some dumbass web dev decides that for the zip code, once I enter 5 digits he'll automatically move me to the next field. This has at least two problems: either my muscle memory has already moved me one field too far, or I made a mistake on one of the last digits and Backspace does nothing because focus has been moved to the wrong field.
2. Credit card or phone number fields that require a certain format
Many sites insist on 1111222233334444 or 1112223333. Enter 1111 2222 3333 4444 or 1111-2222-3333-4444 and they complain. Just strip everything but the digits on submit or on the server! Stop making me do things the computer can do so easily.
3. Auto formatting fields
Stop trying to be clever with a form that shows one field that looks like (111)223-3333 but doesn't let me type the (, ), or -. Apply #2 above. Stop messing up my keyboard usage by making it different from everywhere else I type.
4. Forms that ask for city, state and zip code
Ask for the zip code first and then fill in the city and state (then let the user change them).
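Peeves 2-4 above share a fix: let the code do the reformatting. A minimal sketch — the function names are my own and the ZIP table is a made-up stand-in for a real ZIP-code database or service:

```javascript
// Peeves 2/3: accept whatever format the user typed and strip everything
// that isn't a digit, instead of rejecting spaces and dashes.
function normalizeCardNumber(input) {
  return input.replace(/\D/g, '');
}

// Peeve 4: look up city/state from the ZIP and pre-fill *editable* fields.
// This table is illustrative only; a real site would query a ZIP database.
const zipTable = { '90210': { city: 'Beverly Hills', state: 'CA' } };

function fillCityState(zip) {
  const match = zipTable[zip];
  // Return defaults the user can still override, or null if unknown.
  return match ? { city: match.city, state: match.state } : null;
}

// All of these normalize to the same 16-digit string:
normalizeCardNumber('1111 2222 3333 4444'); // "1111222233334444"
normalizeCardNumber('1111-2222-3333-4444'); // "1111222233334444"
```

(Per the contractual caveat mentioned below, some processors forbid altering the number, in which case only client-side validation hints are an option.)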
Unfortunately, your credit card format gripe may not be the webmasters' fault, but the lawyers'. I worked on credit card processing applications once upon a time. We were contractually obligated by our clearinghouse (First Data Corp) both to only submit credit card numbers in the exact format they were provided by the user, and also without spaces or dashes. So while it's a stupid limitation, it may not be the fault of web developers everywhere.
This was true a decade ago and may have changed, but that industry can be pretty slow to update.
How about we just say: Adobe Flash. Worst thing to ever happen to the web. It was the MIME killer. And it was all for nothing. Adobe themselves killed it, it was so bad.
Designers and game devs loved Flash, but web devs hate(d) it. The only cool thing about it from a web dev perspective was Actionscript, but this was before Javascript was cool. You could also do long polling with it, but not many people did – anyway, this was really something that game devs loved, not web devs.
"Swf" stands for Shockwave Flash; SWFs are notoriously large. One of the hallmarks of Flash files was the ubiquitous loading screen.
Forms that hook keyboard events to do different actions from what you might expect.
For example, I was using a payment form recently. There was a <select> list for choosing my country, now I like to use keyboard as much as possible so I filled out the form by tabbing between the fields.
The normal way a select field should work is that arrow keys navigate the list and enter selects the current value.
Unfortunately the developer had hooked the enter key to submit the entire form.
Took me about 5 attempts to make the payment because I kept instinctively hitting Enter to select the country before the form was completed.
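A small guard avoids hijacking Enter inside a <select>. This is my own sketch of the idea, not code from any particular library:

```javascript
// Decide whether Enter should submit the form, based on the focused
// element's tag. Inside a <select>, Enter should keep its native
// "pick the highlighted option" meaning.
function shouldSubmitOnEnter(tagName) {
  return tagName.toUpperCase() !== 'SELECT';
}

// Browser wiring (illustrative):
// form.addEventListener('keydown', function (e) {
//   if (e.key === 'Enter' && !shouldSubmitOnEnter(e.target.tagName)) {
//     e.preventDefault(); // let the select handle Enter itself
//   }
// });
```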
i've always asked for country and zip code first. then ajax-filled (if i have a db table for it) the city/state/province into fields below.
addresses are strange. humans think of addresses in little-endian terms because to them the rest is implicit, while computers and filtering works better big-endian where no assumptions can be made in advance. meh.
It's much harder than it looks. "Forms that auto change fields" are all I want when I'm filling in payment details using my phone; instead I have to leave the keyboard and attempt to hit a tiny input every now and then.
If it does make for a better experience on mobile, they could change behaviour based on viewport size, although that might confuse someone who expected it to behave the same on mobile and desktop. And anyway, don't you have a tab key on your phone's soft keyboard?
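If auto-advance really does help on phones, one compromise is to gate it behind a viewport check. A sketch only; the 480px breakpoint and the function name are arbitrary choices of mine:

```javascript
// Only auto-advance to the next field on small screens, and only once the
// field is actually full, so desktop Tab muscle memory is left alone.
function shouldAutoAdvance(viewportWidth, enteredLength, fieldMaxLength) {
  return viewportWidth <= 480 && enteredLength >= fieldMaxLength;
}

// In a browser you'd derive the width test from a media query, e.g.:
// const small = window.matchMedia('(max-width: 480px)').matches;
```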
This is bullshit. One-page JS apps exist because there are a few big advantages to them. There are also disadvantages.
Take my site: https://circleci.com. Doesn't work in IE8 or less, and it has some bugs. But when you click a new page, it happens immediately. Properly immediately.
Inside the app, when you click a new page, it takes 1 AJAX request to load. Not 50 JS/CSS/img assets. No white flash. Just 1 click and it loads instantly.
Now, that's not to say that the majority of sites like this couldn't be done better. They could, and so could we. We need to sort out our SEO story for a start. But we made a conscious decision to do this because it makes sites fast, and much easier to write.
If you're saying that the only way to build a fast website is to use single-page design, you're wrong. Having a reasonable amount of images and CSS, as well as efficient server-side components, does way, way more for noticeable performance than being single-page.
But I think the real issue at hand isn't about being single-page, it's about using single-pageness as an excuse for ignoring certain usability and architectural issues.
No white flash. Just 1 click and it loads instantly.
Unless you're using IE9.
Also, I couldn't click any of the top links on my Galaxy S.
It's not the only way, but once you optimize everything in a static app, the only place to go is one-page.
> But I think the real issue at hand isn't about being single-page, it's about using single-pageness as an excuse for ignoring certain usability and architectural issues.
Agreed. Funnily enough the architectural issues are easier with one page apps, IMO. You just expose a REST API, do your asset caching, and don't need any HTML caching at all!
> Unless you're using IE9. Also, I couldn't click any of the top links on my Galaxy S.
Thanks for the reports. I think those are orthogonal from the one-page JS app though, and are more due to building a PaaS with 2 people. We'll fix it though!
Funnily enough the architectural issues are easier with one page apps, IMO. You just expose a REST API, do your asset caching, and don't need any HTML caching at all!
Let's say I want to build a new search engine. How do I crawl your website without a Google-esque budget? More importantly, if all websites were built like that, how could anyone build a new search engine?
That's also architecture. How do things interact? What is possible and what is not? What kind of things would happen if technology became popular?
The real issue here is not that the page is not crawlable right now. I know there are ways of addressing this. The issue is that it's not semantic and that you need to do extra work (separate from normal development) for search engines.
With absolutely no offense intended, your search engine is very low in my priority list. I care about how my customers experience Circle.
If you were to create a new search engine, and it didn't take into account one-page-apps, I'd say you've built the wrong architecture too.
Finally, it's not that we don't have the time, it's that we don't have the time _right now_. Web apps suddenly became hard in the last 3 years - people expect a lot more than they used to. But our priority must be to deliver the kind of experience that customers desire. Technology should be our servant, not our master.
It's not about my search engine, though. It's about any new search engine. It's about the future possibilities.
I care about how my customers experience Circle.
If your customers care about privacy and security and browse with JavaScript disabled by default, they will be annoyed. If they have disabilities and use a screen reader, they're likely screwed. If they get lost in your docs and decide to use "site:circleci.com" query to find the page they need, they will get nothing. If they would want to use a mini-crawler to download your docs for local usage, they will get nothing.
That is user experience, and those things are objective. They either work or they don't. On the other hand, the performance gains and development-time gains of single-page apps are debatable.
If you were to create a new search engine, and it didn't take into account one-page-apps, I'd say you've built the wrong architecture too.
There is no reliable way to build a search engine that would crawl through websites created mostly in an imperative language. Even unreliable ways to do that are complicated enough that Google and Microsoft fail miserably at them.
Presentation does not trump content, and the spread of this attitude will destroy the web. If nobody can reference and repurpose that content in ways you never thought of, it ceases to be more useful than print or even television.
CircleCI is an application, a tool, not a collection of documents. It has no use being crawled, since there isn't much content, and what you have is not for sharing. This applies to 90% of web apps.
When I visit your site, a blurb of text flashes for a minute then turns into your headline/call-to-action. Also, "Continuous Integration for Web Apps" jumps to "Continuous Integration (3 extra spaces) for Web Apps" after half a second. I'm on your side but there are very obvious deficiencies in your client-side app.
Agreed. There's only so much you can do with 2 people, and we've chosen to prioritize fixes to our platform over the UI. Difficult choice, but I don't think having a 1-page app affects it except positively.
The "Continuous Integration for Web Apps" flash is due to our Kissmetrics client-side A/B testing. An interesting point is that if we were doing server side A/B testing, we probably wouldn't see that. On the other hand, once I saw that for the first time, I realized it drew my eye to it, and figured it's not a bad thing. But maybe I'm compensating.
We have a Clojure backend; the front end uses Sammy.js for routing and history and Knockout.js for data-binding, written in CoffeeScript and HamlCoffee, delivered using dieter (which is Sprockets for Clojure).
(PS: like the sound of hacking on dev tools in clojure? We're hiring!)
The downside of building sites that are exclusively client-side rendered is that it stops working if the user has Javascript turned off.
That's why I personally use the strategy of rendering server-side only upon initial load, then hooking in client-side routing/templating libraries afterwards. This solves the issue of search engine visibility and client JS support in one fell swoop. (Mustache templates are great in this regard since there's libraries supplied for most languages.)
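To illustrate the "render server-side on first load, re-render client-side afterwards" idea: with a logic-less template language, the same template string works in both places. The render function below is a toy Mustache-style stand-in of my own, not the real mustache.js API:

```javascript
// Toy Mustache-style interpolation: replace {{name}} with view.name.
// A real project would use mustache.js (or equivalent) on both sides.
function render(template, view) {
  return template.replace(/\{\{(\w+)\}\}/g, (_, key) => view[key] ?? '');
}

// The server renders this for the initial page load; the client router
// reuses the same template for subsequent in-page navigation, so non-JS
// users and crawlers still get complete HTML.
render('<h1>{{title}}</h1>', { title: 'Docs' }); // "<h1>Docs</h1>"
```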
By the way, just a nitpick: your site seems to have an empty margin on the right, which creates a horizontal scrollbar. I'm using Chrome Version 21.0.1180.89 on OS X.
We made an explicit choice not to support non-JS users with our product. For simplicity, we don't support non-JS users in our marketing material/web site. We do intend to do something better there, for example by having a no-script pitch for why you should enable JS on our site.
We have a very pragmatic attitude. When people complain, we start to prioritize it (thanks for the bug report/nitpick!). If no-one complains, it can't be important. We haven't heard any complaints from non-JS users yet :)
I agree on some points, yet I believe the author exaggerates during most of his article.
PushState, for example, is not a bandaid, and for those not-so-modern browsers there are hashtags (let's be frank: 99.9% of users don't care about how beautiful or ugly their URIs are and even less about one particular character).
In many cases such as WYSIWYG editors, which he mentions, Javascript is the only way (HN users might prefer Markdown, but for most average users, that's already too complicated). New HTML5 input types are awesome, but they are in this case the advanced fragile technologies that currently lack a broad support base.
I particularly disagree on the last two points: These days, Javascript and CSS is minified, sprites are used on basically all popular sites and content is gzipped. Many people even use a CDN for their JS frameworks. Loading time optimization is a topic more important than ever before. The same goes for mobile versions: I see a lot of people even offering a mobile-optimized version of their portfolio.
He is certainly right saying that these aspects are all very important, but I don't really see the lack thereof in practice. Any examples, perhaps?
> PushState, for example, is not a bandaid, and for those not-so-modern browsers there are hashtags (let's be frank: 99.9% of users don't care about how beautiful or ugly their URIs are and even less about one particular character).
Keep in mind that using the history API (push|replace)State is not so much about making the URL pretty as it is about page refresh semantics.
With a hashtag, having a page refresh load the correct content requires at least one more round trip to the server to fetch the content represented by the hash, and relies completely on your JavaScript not breaking.
With a URL "fixed" by (push|replace)State, the server can respond directly with the content, speeding up the response time and removing a point of failure.
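The difference in a nutshell, assuming a hypothetical /docs/:page route of my own invention:

```javascript
// With hash routing, the server only ever sees "/"; the client must fetch
// the real content after load, and nothing works if the JS breaks.
function hashUrl(page) { return '/#/docs/' + page; }

// With the History API, the address bar holds a real path that the server
// can answer directly on refresh or when the link is shared.
function pushUrl(page) { return '/docs/' + page; }

// Browser side (illustrative): navigate in-page *and* record a real URL.
// history.pushState({ page: 'setup' }, '', pushUrl('setup'));
```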
Can the server be set up so pushed states work with any arbitrary path, as you can do with hashtag addresses? Can you get the same level of flexibility in handling bad URLs? I'm still struggling to get pushState working, since I implemented hash routing as a slash-delimited query rather than a reflection of any actual directory structure on the server (i.e. domain.com/#category/group/item loads an item, with various graceful fallbacks if it's not found, such as loading the group instead, or suggesting items that do exist). If I can avoid a 404 in most cases, I think the user will be happier. I'm not sure I can get the same level of flexibility with pushState.
The server has nothing to do, per se, with pushState. That just manipulates the client. If on reload/next visit you don't want the link to break (which seems like a reasonable requirement), just set up a route in whatever web server you're running that maps to the URL schema you're generating in the client.
Any thoughts on avoiding generic 404s in case of typo or removed content, in the hope of returning something more useful? Can we use context info from the URL given to personalize the 404 response, or maybe route to a different resource instead for certain contexts? This all seems so simple with client-side hash routing.
[Also, would that mean that I would need to maintain a list of every single possible route in the whole site?]
You can do whatever you want. You have the URL and can return any arbitrary bytes. The way you look at the hash in JS on the client would be the same way you look at the URL on the server.
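Concretely: the slash-delimited scheme from the hash can be parsed identically on the server. A sketch under that assumption; the names are mine:

```javascript
// Parse "#category/group/item" or "/category/group/item" into parts,
// leaving missing segments null so the caller can fall back gracefully
// (load the group, suggest existing items, etc.) instead of serving a
// generic 404 — no exhaustive route list needed.
function parseRoute(path) {
  const parts = path.replace(/^[#/]+/, '').split('/').filter(Boolean);
  return {
    category: parts[0] || null,
    group: parts[1] || null,
    item: parts[2] || null,
  };
}

parseRoute('#widgets/gadgets/sprocket'); // same shape on client or server
```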
In the sense of overall web architecture, it is a bandaid. You're using an imperative language to change the state of the page, and then you're using an imperative language to fake moving from one page to the next. If you want the whole thing to be crawlable, you also need extra code that generates the new "page" as if it weren't the result of client-side manipulations.
Did you ever ask yourself what it would take to go the opposite way and efficiently represent client-state changes by page navigation? Not that much, especially with proper technologies built into the browser.
In many cases such as WYSIWYG editors, which he mentions, Javascript is the only way
There is nothing wrong with using WYSIWYG editors written in JavaScript. However, they shouldn't interfere with standard functionality, such as spell-checking.
It wouldn't hurt if such a popular control were built into browsers either, rather than being re-implemented hundreds of times.
I particularly disagree on the last two points: These days, Javascript and CSS is minified, sprites are used on basically all popular sites and content is gzipped.
That doesn't stop people from writing inefficient JS code or simply loading too much stuff. Open this with FireBug, for example: http://www.smashingmagazine.com/ . That is one of the leading websites on webdesign. What is there to expect from corporate pages and various product (i.e. movie or game) websites?
Try browsing on Kindle (not Fire) to see which websites really are efficient and which ones aren't.
I think the author is missing out on what you mentioned, specifically that we're attempting to bridge the gap between size and complexity, and new and stable.
As someone who is constantly not using the features I'm learning about due to IE7/8, I mess with things that definitely won't work on anything other than Chrome Canary. Why? Because it's a fun and brave new frontier.
I really think you can't complain about the web being NOT perfect. It's full of dinosaurs and people are still running on oil. And besides, this newfangled green revolution has a ton of kinks we need to take care of before we can make the switch.
Transitional. The web is in a state of flux always.
I agree strongly with this article, but I think it overstates some of its points.
Yes, if I go to a web design community like /r/web_design on Reddit I'll see pages of unusable web pages with hundreds of HTTP requests, 2-8MB of assets and "support" for legacy browsers via Modernizr. Some people view this as the future, but I view it as a site built by an entry-level developer that will be unusable by most of its clients. If you ever want some free comment karma on Reddit, find any submission, complain about the file size of the site and watch the upvotes from conflicted developers fly.
However, these things have context.
If this website were for a typical business then yes, you'll have "bugs" flying in from everywhere from the client. They'll not be able to view it on IE7 because that's the only browser their IT department will allow, their users will also be running this site on an ancient computer that cannot handle a dozen jQuery plugins running on a single page and they'll start making demands that you, the developer, will find unreasonable because "that's not how HTML5 works!".
However, if you're creating a new business with a set market of people with newish machines running modern browsers then you can afford to use "new" technology and to sacrifice your precious page size for some creativity. For some markets it is a good thing if a site will heavily utilise JavaScript. For some web applications, it's almost necessary to rely on JS.
Regardless, a good developer will be able to get a modern site working well in legacy browsers even if the user is on IE6. I'm no senior-level developer, nor am I solely a front-end developer, but I can at least do that much on a large-scale client site without any trouble. If I can do it then I fail to see how so many developers can.
Yes, developers will not learn. The average developer is happy to throw as much jQuery at a problem as they can, and they're happy to go nuts because they own a decent computer, use the latest browsers and have a fast connection. The best front-end developers I've ever known understand these problems and can provide similar or better results in an optimised way.
My humble home page now uses just html with very light css, no javascript. Renders fine on mobile (blackberry and WAP on an old Samsung B130 I keep just for testing). Loads fast. Suits my content. And the client is me, of course.
You can become a web developer with pretty much no knowledge of anything else (how a computer is built, how it all works, etc.), so it's no big surprise that there are many badly designed and/or coded websites.
Flash websites and one-page JavaScript clusterf*#ks come from developers who want a cheap, cool-looking effect to cover the lack of real value in their product. They also want a limited set of technologies in their stack (Backend -> Server app -> JSON -> Web app) because it gives them a false feeling of coding their projects faster.
Web development is like playing football or any other mass-market sport: everyone can do it, but only a few can achieve good results.
This makes the article sound like a bullshit linkbait from Captain Obvious.
I think we don't see countermovements because they are boring. The exciting things that get shared online and get talked about are new but fragile techniques that you can't actually use in your day-to-day work. The well-trodden, stable way to build web applications are used every day without incident or fanfare. There is no need to write a blog post about graceful degradation or javascript-aided navigation that doesn't break the back button anymore because it's standard practice.
Speaking as a web developer I can tell you the days of my "bad" code lasted all of one year as I was learning.
After that, the bad behavior this article describes that I happened to code was crap I didn't want to do but was forced to in order to satisfy a client's desires.
Most of the horrible decisions that occur in software development are not the fault of a developer - look to their managers or clients for blame. In my case at least, I argue quite vociferously for adhering to standards. I almost never win.
Yes, but there are lots of new developers and not all of them learn as fast as you do. Their managers are always looking at a way to cut cost and the easiest way to do that is to hire cheap, inexperienced people and then train them on the job.
The crux of this is that there is increasingly a fundamental difference between "websites" and "web apps".
If your site is essentially a collection of documents, then I expect it to work with standard browser navigational tools, be easily linkable and function without Javascript being enabled, it should also work in old browsers.
If you are doing an application, on the other hand, it's far more reasonable to expect an up-to-date browser, JS, etc. I almost don't want to know that it's running in my browser, because the navigational toolbar in my browser is probably a bad fit for whatever it is your application does.
Of course there are blurry bits and grey areas here. But as a rule, the front end of your blog is a website and the "admin panel" is a web application.
Face it - we have SO MUCH MORE POWER now than we did just a few years ago, and so many more tools for managing and abstracting that complexity. In fact, there seems to be a trend of abstracting complexity while focusing on lightweight codebases. Sounds like progress to me.
"100% Flash" is not that bad - I'm using FBSD at work and never bothered to configure flash plugin (it's possible! really!) so "100% Flash" sites are just completely under my radar. I think sites which use flash exclusively for navigation, as in top/left/right menu are worse. I can see those, but to be able to do anything (like go to other page!) I need to use google. Thankfully, I didn't see such a page in quite some time.
I believe there is indeed a countermovement that the author overlooks. It's a generation of tools beyond HTML, CSS, and JavaScript directly where content is distributed in slide decks or a similar presentation-like format. It's great for readability, it's relatively interactive, and it's flat. Surely you're not making an app in PowerPoint, but the Web is not the only/best way to consume content.
For examples see Issuu, Slideshare, and especially Prezi.
I'd be lying if I didn't mention that I miss the days of HyperCard stacks and MacAddict CDs.
Oh, and for the most-distracted among us... there are always books.
I have to agree with this. Abusing Javascript and CSS has become fairly common, and it's annoying. Example: altdevblogaday.com does not open in firefox. It just shows a blank page. But if I disable Javascript, it will work as expected. Go figure.
But why does it have to be that way? Those are just some blog posts that I want to read. Plain simple text. There isn't anything interactive there. It's not a web app. Why do they need some obscure javascript that messes up everything and makes the website unusable?
TL;DR - All of these problems could be found and solved by proper testing.
I enjoyed parts of this article and was left with one overarching theme - web developers really need to learn proper testing techniques (and their employers/clients need to learn why QA is important).
How many ad agencies have taken to calling themselves interactive agencies? How many of these have dedicated QA teams with different devices and a good array of testing tools? How many of the developers who work for companies like this get time to test their sites? How many hear "if you were better at your job, you wouldn't need to test sites"?
I'd also suggest that business schools start teaching marketing students to build simple web apps. A pint says that many of these problems stem from young marketers who want to be edgy, despite having no clue where the edges lie. My University has an Online Marketing course, but they don't teach anything more advanced than building a spreadsheet with multiple worksheets in Excel. Maybe this just proves that marketing is too important to leave in the hands of marketers...
One of my biggest pet peeves is still being perpetrated today: Using a Javascript onclick event to open a new page instead of (or even alongside), you know, an anchor tag.
If you've ever tried to middle click a bunch of links on, say, a big corporate news site and had your current tab change unexpectedly, now you know why.
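The fix is to keep a real <a href> and only intercept plain left clicks, so modified clicks and middle clicks stay with the browser. A sketch; the helper names (including loadPageViaAjax) are hypothetical:

```javascript
// True when the browser, not our script, should handle this click:
// any non-left button, or a modifier key that means "open in new tab".
function browserShouldHandle(click) {
  return click.button !== 0 || click.metaKey || click.ctrlKey || click.shiftKey;
}

// Browser wiring (illustrative):
// link.addEventListener('click', function (e) {
//   if (browserShouldHandle(e)) return; // middle/ctrl-click opens a tab
//   e.preventDefault();
//   loadPageViaAjax(link.href);         // hypothetical SPA page loader
// });
```

Because the href is real, the link also still works with JS disabled and remains crawlable.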
Let's take Twitter, for example. Right now I'm dealing with a weird bug. I have about a dozen followers, but none of them show up in my English-language dashboard. But if Twitter recognizes that I'm in a Spanish-speaking area, it translates all of the UI and the followers show up. Here's the funny bit: if I set my default language to Spanish, the followers never show up!
Dear Twitter,
I love the service. You guys seem to be a great bunch. I use bootstrap a lot. But, if your most basic functionality is buggy, how do you expect people to use the service (and not abandon you)?
I know, I know. You are having front-end issues. But you seem to be stuck in an endless re-factoring loop. Please fix it. And stop trying on languages as if they were hats. Pick one, and live with it.
Most web sites suck - or at least implement one of the flaws described in the article.
But by the standard laid out in the article most smartphones suck, most desktop apps suck, etc.
But web sites, like smartphones and apps, aren't built by a single amorphous group of people called "developers". No, no, no, the development process is defined by executives, marketers, IT, designers and yes developers. Saying "Do Web Developers Ever Learn?" is misleading. It's like placing the blame on the engineering staff at Nokia for creating sub-par phones - when there's plenty of blame to place on Nokia's executive management.
A broken and stupid product is evidence of a broken and stupid company culture. Bad developers can be replaced. But a broken culture can force good developers to do bad things.
"And yes, I know what PushState is. It's a bandaid on a gaping wound, which doesn't get used most of the time anyway"
This is precisely where I finally understand that the article is merely an opinionated piece (i.e. it doesn't state facts) and a pretty misinformed one at that.
I'd also comment on his speed/mobile concerns: The web is both big and small, fast and slow, dynamic and static. Most developers have a hard time accomplishing both aspects of the duality, but at least they try.
You can praise the small, slow, and static sites, because of what they're good at, but let us also point out that they aren't optimized for desktop browsers to date, don't utilize the speed of their upper clientele of users to offload processing and handling to the front end, and also don't accomplish as much as possible without the need of a state change.
I like what you said regarding complexity, because let's be realistic: as web developers, our jobs are constantly shifting from technology to architecture to agile to scrum to etc.
The author complains about past mistakes in hindsight, then tries to pin the same problems on modern web apps. There's always a new frontier in web development; if you can't handle the fact that certain APIs become deprecated or that certain features a dev used aren't completely implemented, then do it better yourself. Update older tutorials and evangelize the correct approaches to problems; then you'll do your part to remove some of the misinformation and outdated solutions that others are using.
Well said. Responsive design has been around for a while, and I've seen it used well. But not every site has a need to support every combination of big/small/fast/slow devices.
I agree with everything he mentioned up until he starts criticizing today's standards and practices; he ignores the issues you run into in today's environment. To expand on what you've said:
media queries vs two separate projects: a responsive, flexible one-site-fits-all approach vs fewer resources served for a smaller mobile version and reduced complexity(?).
web fonts / asset management: this is a growth issue or a design issue, focusing on either OECD/fast-internet users or server load. It is somewhat of an issue, but not something to cry over, really.
His complaint about his textarea box is ill-founded. He's using Posterous's system, and should build his own if he doesn't like it so much. Isn't it a free service?
modal / GUI decisions: usually client/manager driven. Can't blame me on this one. I'm all about pages and lightboxes.
His complaint about mobile browsing is justified, but it isn't any modern developer's fault. My first 10 sites were all responsive. Now some clients don't want it. How is this my fault? It does take extra time and consideration, unless a client wants only a very simple mobile site.
A lot of it has to do with short-sighted decision making. Attempts to control some visual aspect of the application take precedence over long-term concerns, such as usability or overall architecture (e.g. being semantic and searchable).
Also: biased technology evangelism and special interests influencing web standards and popular opinion. These two are probably the root issues.
Completely agree with you on the bad flash and links point. Not sure I agree with you on the webapps points though. Would have been nice to have some depth on why you feel PushState is a "bandaid" rather than a full solution.
You also seem to have not taken into account how new HTML5 is. Sure generally a semantic solution exists now but when the vast majority of sites were written this wasn't the case.
It's increasingly difficult to build a solid foundation for a modern progressively enhanced web app in pure HTML/CSS. It's still perfectly doable, but difficult enough that a lot of developers simply don't want to do it, justifying it by saying they save a lot of time. (Usually that's short-term savings and long-term losses, but I don't want to debate that right now.) PushState does not address this critical issue.
To put it a different way, current web standards encourage people to abandon the page model entirely. When they need SEO, the thought is "oh, damn, I need to re-implement my wonderful JavaScript on the stupid server". That is backwards and is not progressive enhancement.
What we really need is an update to the core standards (not JavaScript) that allows for building dynamic applications in a declarative way, without abandoning the page model. Such updates are possible.
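For what it's worth, the pushState style being debated here tends to look roughly like the sketch below. This is a minimal illustration, not anyone's actual implementation; `render()` is a hypothetical client-side renderer. The point is that the real `<a href>` links still work when the script never runs:

```javascript
// Pure helper: only hijack unmodified left-clicks, so middle-click,
// ctrl/cmd-click and shift-click still open tabs and windows normally.
function shouldIntercept(evt) {
  return evt.button === 0 && !evt.metaKey && !evt.ctrlKey &&
         !evt.shiftKey && !evt.altKey;
}

// Browser wiring (commented out so the helper stays testable anywhere):
// document.addEventListener('click', function (evt) {
//   var link = evt.target.closest('a[href]');
//   if (link && shouldIntercept(evt)) {
//     evt.preventDefault();
//     history.pushState(null, '', link.href);
//     render(link.pathname);               // hypothetical renderer
//   }
// });
// window.addEventListener('popstate', function () {
//   render(location.pathname);             // back/forward support
// });
```

Even done carefully like this, it only papers over navigation; it doesn't give you the declarative, page-model-preserving standard the comment above is asking for.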
Arbitrary limitations imposed on the GUI, such as opening things in modal windows.
I'd like to see some additional examples of this arbitrary limitation in the GUI. It seems to me that many design choices can be classified as limitations. If the user doesn't understand the reasons behind the design choices, then the user might mistake them as being arbitrary.
Fixed-width columns. They drive me up the wall; I've got a 1920x1080 screen because I like fitting lots of text on the screen, not because I want 75% of my screen to be white margins.
Links/buttons that open on middle-click. I middle-clicked because I wanted to open in a new tab, this page should stay where it is.
Sites that don't work at a DPI that isn't 72.
Those are the most immediate things that annoy me.
Maybe a better solution would be to progressively add more columns as the screen resolution allows, keeping each column at a fixed width. Doing this would be more challenging, though, than just using a single fixed width column that supports most of your users.
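As a rough sketch of that idea (the breakpoint values here are illustrative assumptions, not taken from the thread), CSS multi-column layout plus media queries can add fixed-width columns as the viewport grows:

```css
/* Keep each text column near a readable fixed width and add columns
   as the viewport allows. 80em/120em are illustrative breakpoints. */
.content { column-gap: 2em; }
@media (min-width: 80em)  { .content { column-count: 2; } }
@media (min-width: 120em) { .content { column-count: 3; } }
```

The trade-off is that multi-column body text can force awkward vertical scrolling back up to the next column, which is probably why most sites settle for the single fixed column.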
I have a problem with some of Google's web apps, where they don't let me middle-click a link to a separate document. I use the heck out of tabs and I can't stand when they mess with that ability.
I CAN middle-click the following Google link types:
Same functionality as right-clicking and choosing "Open in New Tab". This limitation occurs when a link either has no href value or some other element is being used as the clickable element; in both cases the action can only occur through a JavaScript onclick event. This is something that in most cases could be taken into account by always starting with an anchor tag and then adding styles and javascript events.
>This is something that in most cases could be taken into account by always starting with an anchor tag and then adding styles and javascript events. //
Would that really take it into account? Surely you'd be loading a new link without the state that the client has created, you'd need to push the state to the server and construct the links based on that - at which point it seems you're duplicating effort at the server and client.
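To make the anchor-first suggestion concrete, here is a hedged sketch (the `docLink` helper and the class name are illustrative, not Google's real markup or API): render a real `<a href>` first, then layer the JavaScript behaviour on top, so middle-click always has an href to fall back on.

```javascript
// Minimal HTML escaping so href/label values can't break the markup.
function escapeHtml(s) {
  return s.replace(/&/g, '&amp;').replace(/</g, '&lt;')
          .replace(/>/g, '&gt;').replace(/"/g, '&quot;');
}

// Emit a real link; JS enhancement hooks onto the class afterwards.
function docLink(href, label) {
  return '<a class="js-doc-link" href="' + escapeHtml(href) + '">' +
         escapeHtml(label) + '</a>';
}

// Later, in the browser, something like:
// document.querySelectorAll('a.js-doc-link').forEach(function (a) {
//   a.addEventListener('click', function (evt) {
//     if (evt.button !== 0 || evt.ctrlKey || evt.metaKey) return; // keep tabs working
//     evt.preventDefault();
//     openInApp(a.getAttribute('href')); // hypothetical in-app opener
//   });
// });
```

This doesn't resolve the client/server state-duplication problem raised above, but it does mean a middle-click degrades to an ordinary page load instead of doing nothing.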
Now we have advanced frameworks like Bootstrap and Zurb Foundation, plus HTML5 Boilerplate, that provide boilerplate CSS, JS, and other components. Open source is working very, very well, yet people still focus on the negative and post rants on blogging platforms like Posterous and WordPress, which basically contradicts what you're saying, because those sites use the very fonts and techniques you called "bad". We live in a sub-par world with continuous deployment running free. Are people really going to complain and complain over, literally, words? Because code is made up of letters! The energy is better spent contributing to the cause.
This phrase caught my eye. Have any concrete examples? I generally feel like continuous deployment is a good thing but am open to having my mind changed. Like anything else, it can be abused & misused but assuming there's a quality gate (tests, etc.) then why the generalization?