Hacker News
U.S. Web Design Standards 1.0 (usa.gov)
426 points by molecule on Mar 18, 2017 | 109 comments

Great! Even if you disagree, just the existence of a Gov standard will:

1) Create a minimal common language for all the sites that citizens are forced to use. Now we can all use these patterns in our own sites knowing that more users will recognize them. Remember that software isn't intuitive, but familiar: http://www.asktog.com/papers/raskinintuit.html

2) Force software developers for public services to put up a minimal decent interface, instead of just a list of features.

The best page is their Design Principles: https://standards.usa.gov/design-principles/

State websites drastically improved in Hawaii once the company I was working for (which provides https://ehawaii.gov and many of the state's sites) managed to get a centrally defined standard set in conjunction with the CTO. It meant we could get everyone on the same page, speaking the same design language and really thinking about how they presented their departments' information.

Wow, Hawaii's website has got to be the best state website. Very modern and clean layout! Great job.

I know some states don't even have responsive websites, or they do but some subdomains still have outdated designs. I wonder whether how well a state maintains its website is an indication of the rest of the state's quality. I know next time I move I'm going to pick a state based on the weather anyway (thinking Florida). I'm very surprised Florida's website is so outdated. Mobile web design has been around for quite a while, and I feel like their website is a bit of a disservice to people wanting information on the go.

It seems, though, that states care more about their websites than cities do... My local city's website here in Ohio isn't even mobile friendly.

Edit: Well, I found one subdomain that isn't mobile friendly (https://emrs.ehawaii.gov/emrs/public/home.html), but it still looks like most of their sites are.

It was really cutting edge when it was released around 4 years ago. Mobile first, statically generated content (regenerated every hour or so).

The design standards for the state were drawn up in collaboration between the CTO's office and my employers at the time (at no expense to the state). First and foremost: Mobile first, responsive.

That EMRS website appears to be the same old same old. It has looked that way for at least 7-8 years now, as it already looked like that when I joined the company (2010, I think). I guess they haven't agreed to a redesign yet (most of the agencies jumped on the redesign opportunity, especially given there was no financial impact on them).

Amazing work!

Good to see that some state governments are making progress on the internet front.

It is even better: they also have code guidelines, easing the choice of technologies (open source!) and configuration.

Fewer opportunities for that crazy developer who wants to implement your customer-facing site in Common Lisp.

> Fewer opportunities for that crazy developer who wants to implement your customer-facing site in Common Lisp.

Somewhere, an LM-2 who has been faithfully serving requests since 1981 just cried a single, perfectly simulated electronic tear.

Surely this very system (HN) shed a tear somewhere between brackets as well.

I think it is more about a general lack of trust; I think (my opinion) that neves was talking about choosing a technology for religious rather than practical reasons.

Please tell me this is sarcasm! It's hard to tell. If it isn't, then all I have to say is that Hacker News is written in Arc, the Lisp dialect of Paul Graham, who got rich off of a customer-facing site written in Common Lisp...

*Fewer opportunities for that crazy developer who wants to implement your customer-facing site in Common Lisp.*

Yeah, that would never work: http://www.paulgraham.com/avg.html

Nothing about this forces any government projects to do anything. It's just recommended guidelines.

> Create a minimal common language for all the sites that citizens are forced to use.

Hopefully, but this is just for the federal government. Each state may have its own standard or no standard at all. I have a feeling people spend more time interacting with state government websites...

Government standards are often used as blueprints in the private sector as well, not by enforcement but by contractual agreements with contractors. They may not always be at the bleeding edge, but they're usually very thorough, and have an ecosystem around them, such as seminars and case law.

> Each state may have their own standard or no standard at all.

All the better that this is being given away for anyone to use.

Doesn't necessarily matter. Usually the easiest path to resolving the next change request will be taken, i.e. patching onto the existing crappy thing from the 90s. The only thing that can change it is someone shouting. I suspect most software (not the glitzy things on HN) is driven by shouting. Shouting Driven Development. If the customer doesn't care enough about the UI to raise their voice, then the UI won't be prioritized.

This is the beautiful aspect of this project: now the easiest path for a new project is to use the "official design". No need to argue about it.

I don't know if I agree with Jeff; the iPad is definitely more intuitive, because it mimics real-world interaction better than the mouse.

There's no explaining needed: touch it, move it. It's direct manipulation vs. indirect.

I'm sure that when we have weight and texture feedback, it will be even more "intuitive".

You forgot Revolutionary. Breakthrough. Cool. Beautiful. Faster. Zippy. Precise. Awesome. Magic!


I don't understand what your comment has to do with what I wrote, but yeah, Apple uses a lot of adjectives to sell their products.

> Remember that software isn't intuitive, but familiar:

Well put, and in practice largely correct. But the literal-minded me wants software also to be DOCUMENTED. E.g., at my Web site, each page has a Help link that explains the page, hopefully in simple but clear terms.

But, wait, there's more:

At my Web site, a user also never has to guess what an icon does -- there are no icons. Civilizations had lots of icons, and eventually we got the Roman alphabet, which I believe is a huge step forward. So, for links I just use words in the Roman alphabet.

The pages never do anything sudden, e.g., cover up what a user is reading with a pull-down, pop-up, or roll-over -- there aren't any.

For your

> Remember that software isn't intuitive, but familiar:

I do exploit that: the pages all use just the simplest, old HTML, and I've written not even a single line of JavaScript (JS). Microsoft's ASP.NET emits a little standard JS for me, maybe having to do with positioning the cursor or some such. But, really, all the pages just use the most standard and elementary HTML controls that 2+ billion people already know very well.

For more, the fonts are all relatively large, say, 24 pixels high. The colors all have high contrast, and the roughly 8% of men who are partially red-green color blind will still do just fine.

There is just one Web site with just one user interface (UI), the same for workstations, desktop, laptops, tablets, pads, smartphones, watches, Dick Tracy's watch, Flash Gordon's magic decoder rings, and any toys in Cracker Jacks boxes that can run a Web browser up to date as of about 10 years ago.

Each Web page is exactly 800 pixels wide. The layout is not by HTML div elements but by tables and is always the same for all users.

There are always both vertical and horizontal scroll bars, and the pages should still be usable in a window as narrow as 300 pixels.

For more, how the UI works is simple and intuitive, so much so that a 7-year-old with a cheap smartphone who knows no English (the text is all very simple English with no alternatives) should be able to do fine with three minutes of instruction or 15 minutes of experimentation.

So, one of my key thoughts was to exploit, with just basic HTML, your

> Remember that software isn't intuitive, but familiar:

So, I stayed with just the simplest HTML knowing that 2+ billion people were then already quite "familiar".

Where can one see my Web site? At present, nowhere! Just now my development computer is sick from data corruption -- e.g., crucial parts of Microsoft's .NET are not available to my software -- so my software won't run even on my development computer. Right, bummer.

But I've put together a nice list of parts for a new development computer, also powerful enough for a good first Web server for my Web site. I intend to order the parts and plug them together soon! Then I'll get the software and data onto the new computer, make some tweaks to the code I have in mind, load a lot of data, and go live.

> there are no icons. Civilizations had lots of icons, and eventually we got the Roman alphabet which I believe is a huge step forward. So, for links I just use words in the Roman alphabet

icons continue to be a form of universal language around the world for helping people to navigate where they need to go: this way to the bathroom, that bathroom has changing table, no smoking in this area, stop!, play/pause, radioactive, scald-danger, tent-camping, etc…

they don't communicate without ambiguity but they do serve a very functional and well-appreciated purpose for rapidly communicating information which would otherwise require more words and more thought.

adopting familiar icons around a website and providing a way to remove any ambiguity can enhance the experience; eliminating them altogether is somewhat of an overreaction

The advantage of iconography is that it doesn't require you to use a single language. Many signs in the US, which is largely monolingual, tend to be very wordy. In places like Canada, where bilingualism is mandatory, it's easier to put up a cigarette with a red slash through it than to have two signs, one that says "NO SMOKING" and the other "PAS DE FUMER" (that said, they sometimes do have the latter situation).

It's fairly well known that a lot of iconography is actually poorly understood without prior specific instruction. Do you know what this road sign means: https://upload.wikimedia.org/wikipedia/commons/4/49/Zeichen_... ? How about this one: https://upload.wikimedia.org/wikipedia/commons/d/d1/Chile_ro... ?

For your first icon, it means to be cautious and watch out for tanker trucks that are able to drive over bodies of water, as one of them might take an odd route to transition from driving over the river onto your road.

For your second icon, it means to watch out for windsocks that have been blown off their poles.

I never would have guessed either of those two! Ah, I love the Roman alphabet!

Your first icon looks like a truck with a tank carrying some liquid, maybe water. The second one looks like a net for catching butterflies.

For both, I wouldn't know the context.

For stop signs, restroom signs, etc. I agree.

But here I am talking about the subject of the OP, Web page design.

Yes, I have finally learned what the icons from Adobe's Acrobat PDF viewer and my favorite Web browsers do, but those are not issues of Web page design.

Looking at Web pages on the Internet and designed by others, usually I have to make at best just wild guesses at what the icons do, hover and wait for some explanatory English text, usually too small to read, or just use the TIFO method -- try it and find out.

I can say that in all my usage of Web pages, I have yet to find an icon on a Web page that I both (A) already understand and (B) have seen on more than one Web site. So, to me, the fact that icons are at times and in places useful on Web sites is cancelled by the fact that on Web sites the icons are not very standard. Also, for icons, I can't pronounce them, spell them, write them down, type them in, look them up in a dictionary or at Google, etc. The icons are usually little drawings, but usually I can't see the drawing clearly enough to guess what it is trying to represent. There is an icon getting popular that is a rectangle with an arrow through it -- I don't know what it means, and I wouldn't be able to look it up in a dictionary.

In my Web pages, the user interface is supposed to be so simple that any links, e.g., Help, Home, Inputs, Results, need just one very common English word.

Maybe some Web pages really need icons instead of words.

For icons more generally, e.g., for road signs, restroom signs, applications software other than just Web pages, that's a more complicated subject. For some of the icons for some applications software, I believe that some well written technical documentation is missing but needed.

Personally, I always go with icon and text if I can. I will still use a plain icon with no text in a few cases, if it makes sense:

  • For the more universal ones like + and - and trashcans
  • If I am absolutely constrained for space and don't have time to redesign
  • If the target audience can handle it or perhaps already knows it

I definitely agree that icons alone are a bad design choice in most cases. I would go with plain text rather than a plain icon if I had to choose.

>adopting familiar icons around a website and providing a way to remove any ambiguity can enhance the experience; eliminating them altogether is somewhat of an overreaction

I think the point was that icons familiar to almost everybody are quite rare on the internet. Having no icons at all may be excessive, but as a matter of personal taste I find layout and grouping clearer than icons in complex UIs.

> At my Web site, also a user never has to try to guess what an icon does -- there are no icons. Civilizations had lots of icons, and eventually we got the Roman alphabet which I believe is a huge step forward. So, for links I just use words in the Roman alphabet.

I mean, an alphabet is a huge step forward over freeform pictures, but the Roman alphabet offers basically nothing over its immediate ancestor the Greek alphabet except being a little better suited to ancient Latin.

Ah, you got me! I know the Roman alphabet, but I know only a few letters from the Greek alphabet, a few I picked up in math and physics. When I type in math, say, into TeX, sometimes I use some Greek letters, but I use only the ones that I am familiar with and, e.g., know how to pronounce! :-)

I'm really surprised all the typography options require downloadable fonts (Source Sans Pro and/or Merriweather).

I'm on a 1 Mbit throttled connection right now, and it's really noticeable, even on these pages: the font loading takes a while, and suddenly the whole page jumps around and re-renders.

Apart from that, though, it's a great guideline.

I'm really glad they put so many color choices in there and took care to show how to combine typography with color. Many guidelines I've seen have two accent colors, and when you come to implement the site you straight away have to deviate from the guideline and invent new things.

It does suck, especially if agencies rehost the fonts and you end up downloading the same ones over and over. Hashed font caching solves this problem pretty well, and that seems like the answer instead of forcing everyone to use a common set of fonts.

Maybe there could be room for innovation here? Signed font sets with known hashes (maybe piggybacking on HTTP ETags?), so over time your browser builds a cache of "well known fonts".

Things like Source Sans Pro are not unusual.

Why not modify the CSS syntax? For example, what is now

    src: url("path/my.otf") format("opentype");

could become

    src: url("path/my.otf") format("opentype");
    src: url-hash("path/my.otf", "md5:86fb269d190d2c85f6e0468ceca42a20") format("opentype");

This way, thanks to the double declaration, older browsers can download the file as usual, while a modern browser detects the hash and can look the file up in its cache. Note that I chose md5 here only as an example; any hashing scheme could be used.

Obligatory: https://hillbrad.github.io/sri-addressable-caching/sri-addre...

(List of problems you need to solve to introduce hash-addressable caching to Web)

What if major operating systems included the popular fonts from the Google Fonts library in their updates?

Then you can use them:

   src: local('Font Name'),
        url('font-name.woff2') format('woff2'); /* fallback if not installed */

Please don't. People download questionable fonts from the net, e.g. ones that have been stripped of everything non-alphanumeric, and then complain to you "your website looks bad"; or they have official but fossilized versions of fonts... I always strip local() sources when embedding webfonts for this reason.

You're eventually going to run into trouble attempting to use md5 on a government website. FIPS 140-2 (and soon 140-3) is a standard that forbids the use of that particular piece of crufty, broken cryptography.

Fonts are the only type of vendor resource where this actually somewhat works – Google Fonts has a rather large market share that allows it, and sites usually just depend on the font, not on a specific version of it.

With js or css libraries, there are usually multiple CDNs and every website uses a different patch version anyway.

Fonts are also the only resource that you have the option to install directly into your operating system's font cache, knowing that your browser will prefer locally cached options. A tool like SkyFonts can be used to automatically install the top 50/100/etc. fonts from Google Fonts directly into your system's fonts directory.

The trade-off is that system-font fingerprinting means you potentially open yourself up to tracking, if you are concerned about your privacy on the web. (I think a good idea would be for one or more of the operating systems and/or browser vendors to directly encourage installing the top so-many Google Fonts, because the more people do it, the fewer the privacy concerns.)

I'd just put all the static libraries/resources on a common CDN for Federal government sites.

Web fonts mean no platform-specific fonts. That's a very right thing for a government to do. It comes at the price of download size, but it's probably still worth it.

Why is it "very right" for the government to use a platform-independent font instead of a font that loads quickly and provides users with access to relevant information quickly?

Or just `font-family: sans-serif;` in CSS.

I disagree entirely. Government sites should be snappy and accessible to all above all else. Unnecessary font downloads give people with poor internet connections a worse experience for questionable utility.

> Government sites should be snappy and accessible to all above all else

Which is a bigger accessibility failure?

- The page takes longer to download

- You can't see any text on the page:

    - The font needs to download
    - The font might not download at all

Browsers support two wonderful defaults, `sans-serif` and `serif`, that will use a default, system-installed, user-controllable font that doesn't need to be downloaded and is guaranteed to work for any user!

{ font-family: sans-serif; } /* or serif */

You don't even need to use css at all to make text visible on a page.
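For what it's worth, the two positions aren't mutually exclusive: a stylesheet can name the webfont first and let the generic families catch the failure cases. A minimal sketch (the family names are just the ones from the standards' typography page; the specific fallbacks are my own choice):

```css
/* Webfont first; generic system fallback if the download is slow or fails. */
body {
  font-family: "Source Sans Pro", Helvetica, Arial, sans-serif;
}
blockquote {
  font-family: Merriweather, Georgia, serif;
}
```

Users on slow connections at worst see the fallback fonts; nobody ever sees no text at all.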

Been using this on a govt. site I've been redesigning for 2 years, and I could be doing so for another year or more.

We've gone from one govt. design standard to another, and I'm not sure if the site will ever get redesigned or finished, as it has to get signed off by 100 govt. VP types. I'm burnt out by the bureaucracy; they need someone to get this thing approved, vs. "what do you think" and "what do you think" and "oh, what do you think"... each person having a different opinion and nothing getting done but revising it again. What version are we on now? Oh, number 599.

Keeps me gainfully employed thankfully, but redesigning a website should not take 3 years or more.

This is great at this point because it's short and simple, but I could imagine it becoming terrifying in a few years if it does not remain in good hands. The natural thing to do is point out what could be added, and pretty soon it could become a massive burden to make sure you're compliant with the standards. Long term, I suspect this could increase the weight (cost and time of delivery) of government software projects. If only we could set some predefined limit on how long and complex it can become. My libertarian side is coming out here, but the government is really good at adding cool things, not so good at taking them away, shrinking them, or even maintaining them.

I really wish libertarians would -- from time to time -- point at actual, unambiguous government overreach. All I ever see is "seat belts are great, but making them mandatory is the nanny state gone wild, and soon you won't be allowed to leave the house".

I think the myth of regulatory strangulation is just a fantasy of people who see a stack of paper more than an inch thick and for the life of them cannot imagine why it would take so many words to make air travel (etc.) safe.

There is a fantastic overview of this by Matt Levine at Bloomberg.[1] (The following is all quoted text):

I am working on a tentative theory of regulation. It goes like this:

1) There are two kinds of regulations: custom regulations and bulk regulations.

2) A custom regulation is designed to accomplish a particular goal. You want people to do something, so you write a rule mandating that they do it and punishing them if they don't. For instance, if you want U.S. companies to keep jobs in the U.S., you might write a rule to mandate that, and to "impose a 'very major' border tax on companies that move jobs outside the U.S." That is an example of a custom regulation, and it is good because it keeps jobs in the U.S.

3) Bulk regulations are the kind that you buy by the yard, ones that you measure by quantity rather than purpose. They don't have a purpose, really; they are just generic "red tape." These are the regulations that presidents frequently announce they will cut in half, or freeze with an executive order. They're the regulations that come not from a reasoned desire to achieve a particular goal, but from a pure impulse to regulate. Bulk regulations are bad because they prevent businesses from doing business-y things without accomplishing anything good.

4) All regulations are custom regulations.

5) All discussion of "regulation" is about bulk regulations, which do not exist.

[1]: https://www.bloomberg.com/view/articles/2017-01-24/metrics-f...

I love Matt Levine, but he's wrong that "bulk regulations" don't exist. They are called licenses, and you have to have them to operate many types of businesses. See hair-salon licenses, landscaping licenses, etc. There is zero reason for licenses in many low-skill sectors, but they exist and are regularly codified into state or federal law as a requirement of doing business. These licenses are enforced by rent-seeking gatekeepers, who effectively drive out competition and raise prices for consumers. This is very much "bulk regulation", and it is terrible for everyone.

How do you enforce uniform education about, say, hygiene without a license?

You say there are zero reasons, but I think there had to be at least one once, and probably still is.

Do you need 1000 days of training to properly dress hair? Because that's the average training required to acquire the license.

Cosmetology licensure requirements usually specify hours, not days.

Even still, cosmetology is a surprisingly complex practice which really does require a lot of hands-on training to become proficient. You should talk with a licensed cosmetologist sometime to get the ins and outs of their education.

Or talk to someone who actually uses a cosmetologist's services regularly. It's very easy to gloss over the depth and complexity of a field when you don't even need its services.

But somehow, millions of women manage to wash and dry and style their own hair at home with no trouble...

This time of year and all, the tax code comes to mind. It's my only real window into government bureaucracy, but if it's anything to go by the idea that the government can produce a tangled mess from its competing incentives and constituencies is plausible to me.

While we're on the subject of really great federal websites take a look at the Federal Register. I think you'll come away with a wider and more empathetic perspective on the issue of regulation.

Agencies have the power to create rules, and they create a lot of them. The Federal Register was 81,611 pages in 2015. Granted not all of that was new rules.


I seriously love this website, and I love the Federal Register. The typography of the PDFs, the searchability of the website, the open APIs. It's really fantastic.

Government overreach happens constantly. Just ask the people who recently had biometric eye scans for domestic air flights.

>18F specifically does not recommend using Bootstrap for production work because:

>It is difficult to adapt its opinionated styles to bespoke design work, and

They have a tool on their website to generate a theme (Bourbon and PureCSS do not), and Sass functions and mixins for customizing elements. Bourbon's and PureCSS's components are just as opinionated (and there are fewer of them).

>Its CSS style places semantic layout instructions directly in HTML classes.

Sure, but you can just use the Sass mixins instead, allowing you to use the grid system without adding a single class to HTML.
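For instance, a semantic grid via Sass mixins might look like this (a sketch against Bootstrap 4's Sass source; the mixin names make-row, make-col-ready, and make-col come from that version and may differ in others, and the class names here are hypothetical):

```scss
// Semantic grid: layout lives in the stylesheet, not in HTML classes.
.page-layout {
  @include make-row();
}
.page-layout__nav {
  @include make-col-ready();
  @include make-col(4);   // 4 of 12 columns
}
.page-layout__content {
  @include make-col-ready();
  @include make-col(8);   // remaining 8 columns
}
```

The HTML then carries only the semantic class names, which was exactly 18F's stated objection to class-based grids.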

Seems they knew this about Bourbon:

>Bourbon is a Sass mixin library that has extensions for a robust semantic grid (Neat)

The same is true of Bootstrap, so they recommend against it?

It sounds like the author wasn't familiar with Bootstrap.

I imagine you're right. There's so much hissing and booing at the mention of Bootstrap that a lot of otherwise good developers never touch it.

Despite the fact that, you know, bootstrap is just fine.

Their wonderful tag line at the bottom of the page footer:

We’re from the government, and we’re here to help.

The full original quote being: "The most terrifying words in the English language are: I'm from the government and I'm here to help."

[0] https://en.wikiquote.org/wiki/Ronald_Reagan

I love this:

> The UI components are built on a solid HTML foundation, progressively enhanced to provide core experiences across browsers. All users will have access to the same critical information and experiences regardless of what browser they use, although those experiences will render better in newer browsers. If JavaScript fails, users will still get a robust HTML foundation.

I guess it remains to be seen just how robust this really is, but it's a fantastic goal to see explicitly embraced for modern websites. I skimmed some things in Lynx and w3m (elinks failed with an SSL error; rumor has it that it doesn't support SNI), and it honestly looks better than I remember a lot of sites looking in Lynx ~20 years ago, let alone the average modern site. Sure, part of that is imagemaps and frames going out of style, but modern sites haven't necessarily replaced them with more graceful constructs.
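The pattern the quote describes is straightforward in practice. A minimal sketch (element names and the /search endpoint are hypothetical): a form that works as plain HTML, with JavaScript layered on only if it loads and runs:

```html
<!-- Works with no JS at all: the browser submits to /search normally. -->
<form action="/search" method="get" id="search-form">
  <label for="q">Search</label>
  <input type="text" id="q" name="q">
  <button type="submit">Go</button>
</form>
<script>
  // Enhancement layer: if this script fails to load or throws,
  // the form above still submits the old-fashioned way.
  document.getElementById('search-form').addEventListener('submit', function (e) {
    e.preventDefault();
    fetch('/search?q=' + encodeURIComponent(document.getElementById('q').value))
      .then(function (r) { return r.text(); })
      .then(function (html) { document.body.innerHTML = html; });
  });
</script>
```

The key property is that the JavaScript only intercepts a path that already works, rather than being the only path.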

> An official website of the United States government Here's how you know...

> This site is also protected by an SSL certificate that's been signed by the U.S. government...

Oh really?

DST Root CA X3 - Let's Encrypt Authority X3 - standards.usa.gov

Kudos for using Let's Encrypt though!

Maybe some info in the cert is signed with its own key, so the cert is technically signed by the US government.

The main page developers will care about:


Some things are great, and show a pulse on the industry, but some aren't. For example, Bower is prohibited, but yarn isn't even mentioned or encouraged, much less required.

Though they do give devs the freedom to choose any framework, with a very nice pros/cons list of the popular ones.


Issues/PRs welcome! https://github.com/18F/frontend

Is yarn that big? I'm really curious as to why you think it should be 'required'.

I had a manager once who had a prior career in government. One of their many cynical internal catchphrases was "the conveniences which you have requested are now mandatory".

Anyone with common sense and a knowledge of history knows exactly where this is going. As things continue to move online, internet access and associated technology standards begin to be declared a "public necessity" or some other nonsense along similar rhetorical lines.

You don't even need to look very far for examples. UK accessibility laws (not that they're an unqualified evil, simply a legislative 'gateway drug'), references to 'digital haves and have-nots' from a few US election cycles ago, etc.

I'd really hoped to be closer to retirement before this sort of thing started happening.

Edit for clarity: I'm not saying this is immediately going to turn into some draconian thing, but as a founding member of a standards body[0] with a narrowly-defined intent, I've seen how easy it is for something like this to become a de-facto industry standard that non-experts use to judge things against, even when it's not appropriate.

[0] http://www.php-fig.org/

These are nice analogies and all, but can we please stop drawing grand philosophies out of something just because the people doing it happen to be in a government office somewhere?

This isn't a gateway drug. This isn't a battle between liberty and tyranny. These are web standards, they're generally a good thing, and they're generally overwhelmingly beneficial for everyone. That's it.

Freedom isn't lost overnight.

"The price of liberty is eternal vigilance."

Are you saying that legislation to make sure people with disabilities or low incomes still have access to services when all the alternatives are shut down is a bad thing?

That's pretty far from the mark, but being made to provide multiple versions of content to accommodate all ranges of accessibility for anything published on the web, not just vital or government services, could be a pretty difficult thing.

Content without captions spurred a lawsuit by the National Association of the Deaf against Netflix a few years ago. What if your personal website hosted some dynamic content where an uncaptioned sound bite appeared, and you ended up being faced with legal action?

What if the scope were broadened so that all websites required content compatibility modes for color blindness or spectrum sensitivity? Would that responsibility be shared by the browser makers as well? Would Google pull ads from businesses whose websites were not in compliance?

And in the Netflix case, the judge found that it would be “irrational to conclude [that] places of public accommodation are limited to actual physical structures.”[1] That's quite an interesting interpretation.

[1] http://www.slate.com/articles/arts/culturebox/2012/07/closed...

The foundational law has been around for over 25 years. It doesn't require anything of your personal website any more than it would require your personal home to have wheelchair ramps.

> What if the scope were broadened so that all websites required content compatibility modes for color blindness or spectrum sensitivity?

The law already requires reasonable measures to provide access to people with disabilities. If a business website's colors prevent a customer from using the site, they could already make it a legal matter. But that doesn't happen often; it's much easier for everyone for a customer to contact the business, explain the problem, and for the business to make a change that corrects it (more likely, some number of customers explain and eventually something is done).

But standards and expectations do change over time; what's considered reasonable partially depends on technical possibilities, their general availability, and cost. It's reasonable to expect web sites to work with screen readers because they're readily available on almost all the computers people use; if screen readers didn't exist, it would not be reasonable to expect web sites to provide a purely aural interface for the vision impaired.

> the judge found that it would be “irrational to conclude [that] places of public accommodation are limited to actual physical structures.”[1] That's quite an interesting interpretation.

It's an interpretation that makes sense. The purpose of the law is to ensure people with disabilities have access to goods and services, it matters not whether such access involves concrete, paper, or ones and zeros.

> It doesn't require anything of your personal website anymore than it would require your personal home to have wheelchair ramps.

The only slippery slope that wheelchair users approve of :)

> What if your personal website hosted some dynamic content where an uncaptioned sound bite appeared, and you ended up being faced with legal action?

Then I'd have to take the content off my website or use a service that automatically captioned it. To suggest anything else is to say that my website is more important than the accessibility of the internet to disabled users, which is nearly nonsense. A personal website is utterly trivial in comparison.
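For what it's worth, captioning is cheap to wire up once you have a caption file; HTML5 video supports it natively via the <track> element (the filenames here are made up for illustration):

```html
<video controls>
  <source src="soundbite.mp4" type="video/mp4">
  <!-- kind="captions" covers speech plus non-speech audio cues;
       a .vtt file is plain text and can come from an auto-captioning service -->
  <track src="soundbite.vtt" kind="captions" srclang="en" label="English" default>
</video>
```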

Businesses will very often do the minimum possible work to make money, which leaves anyone who can't access content due to their disability having to pay more than people without disabilities, or leaves them completely shut out of the service just because the business doesn't want to pay to make it accessible. That's clearly and obviously unfair, and not something I want to happen in the society I live in, if only because I realise that I might have an accident one day that leaves me disabled and puts me in the position that disabled people are in now.

Of course that's not what he's saying.

Can we please apply the Principle Of Charity as the hosts have requested of us here and not resort to these sorts of rhetorical tricks?

I'm not suggesting the GP is saying having accessible services is bad. I'm asking if he believes legislation to ensure services are accessible is bad. As someone who believes web developers are often lazy and don't just do these things because they should, I think the laws are important and necessary; otherwise people will be left out from accessing what they need. I'd love to hear a counter-argument if there is one.

> As someone who believes web developers are often lazy

There is some of that; however, what I've found more of is "We don't want to provide you the resources in manpower/budget/programmer time to make things accessible".

That, and the laws are rarely enforced; even WCAG is rarely implemented.

I'd love it if there were a clear bar for accessibility that was legally enforced, as then I could say to clients "You have to do this to comply with the law", which is a much clearer "sell" than "You should make this accessible to <foo|bar> and it'll cost <fizz|buzz>", since clients largely don't care.

Now someone will argue that I should just make everything accessible anyway, but I have to bill for that time/work/testing, which means I'll be more expensive than competitors who don't, and clients are incredibly price-sensitive.

It's a hard thing to balance.

The internal product I'm currently building will be accessible: not just "passes a WCAG checker" accessible (which is a horribly low bar) but really accessible. I'm even planning to buy a couple of screen readers (which are way expensive, btw) and use them as part of my integration testing.

> a de-facto industry standard that non-experts use to judge things against

What purpose does a standards body have if not to set industry standards? Even if it's a government standard, would it not be better than 99% of the specifications a one-off contract would include? I mean, if it becomes outdated, the contractors can always make their case, and they'll probably be able to point at Google etc. not following the standard either.

Also: slippery slope fallacy.

That's because the UK government wants to replace as many manual processes as possible with the web and make a load of people redundant.

The introduction to the colors section talks about communicating warmth and trustworthiness. How does a mostly blue, grey, and white page communicate those feelings? Personally, I have always associated blue and grey with cold and isolation.

This is really well done. Code is well laid out, well documented. Will make it easy to use this for training new college hires for sure. Didn't expect something like this out of the government. Nice to be surprised here.

This is great. Also worth noting: in the UK, accessibility features for web pages are required by law (EQA, DDA). In the US, the RAA of 1998 is the similar legislation. Most government agencies and private-sector businesses are required by law to provide accessible web pages or face lawsuits and/or fines.

TL;DR: all sites of reasonable size (govt and otherwise) should follow WCAG 2.0.

WCAG: https://www.w3.org/TR/WCAG20/

UK - Equality Act 2010 (EQA): http://www.legislation.gov.uk/ukpga/2010/15/contents

UK - Disability Discrimination Act 1995 (DDA): http://www.legislation.gov.uk/ukpga/1995/50/contents

US - Rehabilitation Act Amendments of 1998 (RAA): https://www.congress.gov/congressional-report/105th-congress...

It says on the site: "This site is also protected by an SSL (Secure Sockets Layer) certificate that’s been signed by the U.S. government."

But when you look at the certificate, it shows "Issued by: Let's Encrypt Authority X3".

From the installation: Note: Using npm to install the Standards will include jQuery version 2.2.0. Please make sure that you’re not including any other version of jQuery on your page.

Isn't that pretty much precisely what peer dependencies are for?

peerDependencies were deprecated as of NPM 3.

They're not exactly deprecated, they just don't install automatically anymore. As I read it (and use them in npm 3), they're a way to specify "you need this other library installed also" without actually installing it.

In a lot of cases ("we just need some flavor of jQuery 2"), and I'm not sure whether this us-web thing is an example, it's handy. The thing you're distributing isn't standalone; it's a thing people will use in an environment that already has some jQuery installed, so you're fine.

The way I see it, if you're using a library that you can specify very permissively, peer dependencies are quite handy, and in this particular case they seem useful. If I'm building some webapp, I need to know that my UI library wants jQuery to work, but I have a choice of not using that part (it's probably used for DOM manipulation, so if I'm just npm-installing it for the CSS, whatever) or making sure I meet the dependency with the other stuff I'm using. If not installing by default is a problem, there's an npm-install-peers package that does that.
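For illustration, a peer dependency is just an entry in the library's package.json; a minimal sketch (the name and version range here are hypothetical, not what the Standards actually ship):

```json
{
  "name": "some-ui-library",
  "version": "1.0.0",
  "peerDependencies": {
    "jquery": ">=2.2.0 <3"
  }
}
```

With npm 3+, this won't install jQuery for you; npm just warns if the host project's jQuery doesn't satisfy the stated range.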

Is this supposed to be used by non-US gov sites? Can any old private site use it? It's public domain in the US, so assuming the answer to that question is yes, would it make sense for a private site to use it?

Legally, I'm pretty sure public domain is public domain (it's not just PD 'in the US'). You can do what you want with it without fear of copyright. And it's a good style guide, but the US government is a pretty big brand to compete with for an identity; it doesn't really seem like a good idea to just adopt their style guide wholesale.

However, using this as a framework to build your own style guide sounds like a pretty decent plan.

Unfortunately, some other jurisdictions don't have a concept of "public domain" works. That's why the page indicates "... this project is in the public domain within the United States." in its license [1].

But you answered the main point of my question, so thanks!

[1] https://github.com/18F/web-design-standards/blob/develop/LIC...

I have an older Firefox that I'm running, and the Web Design Standards page is all borked up. I'm wondering how many people who don't keep up with browser upgrades will now have problems.

How old? I have a copy of Firefox 3.6.27 from 2012; the main content is below the sidenav, probably because it's in an <aside>, support for which wasn't added until Firefox 4. That's okay: it's still usable, and the content is all there and readable.

But there are some other things that are more fundamentally borked: the accordions won't operate. I'm for progressive enhancement, and while requiring JavaScript for a site to work well (or in some cases work at all) is okay, there should still be some provision to "fail safe" so the content is available when the specific JavaScript fails for some reason. (I don't think it's because they're using <button>; technically, using that outside a form was against old HTML specs, but I think browsers still routinely supported it.) For accordions, my preference would be to use <details> and <summary> elements and a polyfill to make them work in browsers that don't support them; they'll "fail safe" and display all the content in browsers that don't support them and don't run the polyfill.
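A minimal sketch of that fail-safe accordion pattern (the element names are standard HTML; the class name is made up):

```html
<!-- Supporting browsers get a collapsible accordion out of the box.
     Browsers that don't know <details> render everything expanded,
     so the content is never hidden when JS or support is missing. -->
<details class="accordion-item">
  <summary>Section heading</summary>
  <p>Accordion body content, readable even without JavaScript.</p>
</details>
```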

24.6 ESR from June 2014. The sidenav is now on top of the main content, so the main content slides under the sidenav and is hard to read. I understand that it's a 3-year-old browser, but it's on a Win7 machine, like lots of other web users have.

Just an FYI, but that version is probably full of pretty serious security issues. 24.1.1 has 78 published CVEs[1], and that list isn't complete.

I'd update!

1. https://www.cvedetails.com/vulnerability-list/vendor_id-452/...

I realized that Firefox 3.6 not supporting <aside> wouldn't really affect the layout; it's basically a <div> and it's floated left, something 3.6 can handle. The main content has a class containing width: calc(100% - 250px);, and Firefox didn't support calc() until version 4, but that wouldn't explain it happening to you in 24.6 ESR.

Maybe older Firefox has a bug in calc(); disabling that property in a newer browser is enough to shift the content below the sidenav, because it falls back to width: 100%;. I see what you mean about hard to read: if I disable the calc() width, the sidenav stays put but the main content scrolls on top of it. That's a little confusing, because the sidenav staying put is due to position: sticky;, but caniuse.com says Firefox didn't support that until version 26 (version 32 without having to enable a flag).
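The usual defensive pattern for exactly this is to declare the plain fallback width first, then the calc() width; a sketch (the class name is hypothetical):

```css
/* Browsers that don't understand calc() ignore the second declaration
   and keep the first; ones that do understand it use the second. */
.main-content {
  width: 100%;               /* fallback for old Firefox et al. */
  width: calc(100% - 250px); /* leaves room for the 250px sidenav */
}
```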

Anyhow, I do think government guidelines in particular should handle older browsers better than this page is demonstrating. Part of the point of using frameworks and libraries is that they can maintain and aggregate all the little solutions to differences between browser implementations and between older and newer browsers.

This is a really interesting question, given the very long tail of old browsers out there. There's downloadable data about the browsers visiting gov't websites (all anonymized by Google Analytics)[1], but that doesn't solve the "when do you stop supporting" question, particularly for old versions of IE.

[1]: https://analytics.usa.gov/

Also notice the very inclusive picture on their landing page template.

So the General Services Administration (GSA) forked Bootstrap. =)

Why does this standard not fall under the purview of N.I.S.T.?

So these are the web design standards the government abides by?

What will happen is that adherence to these standards will be embedded in federal government software contracts moving forward. It's a good start and written by a common sense group of people.

And so the Hero Unit becomes a US standard.
