This is so interesting. These modern sites clash really hard with my mental image of "Government technology." And it led me to this, which is maybe even more interesting: https://federalist.18f.gov/
I must applaud your legislators for their foresight when they wrote the federal copyright charter. All works by U.S. federal employees are in the public domain, forming one of the planet's largest bodies of free online resources. The cultural impact of this collection is considerable, globally as well.
In my country, all tax-funded government works are copyrighted. It's an ongoing battle to make each department release their works under a free license, and massive bodies of government-owned digitized cultural works sadly remain accessible only to those who pay for them.
> Doesn't apply to works by contractors paid by the government though.
Depends on the terms and conditions of the contract. I keep trying to get people to ensure they secure rights for the government by default, but it's a constant struggle to get people to sweat the details :p
We should always demand more from our governments. But we should also celebrate them for doing good. The politicians and government workers are people too, and while different people have different motivations I know that I am not alone in valuing praise from others highly.
If all someone hears is complaints, I think they are probably more likely to burn out. I don’t have any source to back this claim, but it’s how I think it is.
I have very high standards, but expecting public information to be public wouldn’t even be meeting those. It would be meeting one of my most basic standards.
To reject and fight FOIA requests takes _effort_, more so than simply complying with them. So no, I will not applaud shit. Our government needs to get past “basically good” before I praise them for frosting like this.
But feel free to embrace your feel good powwow approach. It’s done us so well so far.
Hello, are your web analytics also publicly available?
The EU has http://ec.europa.eu/ipg/services/analytics/ but I recently discovered that I cannot access any data as a citizen. Legally, I'm convinced this data should be public domain.
If this is meant for web sites, I have a question and I ask this on HN a lot. Why does the government need to provide ANY fonts? I have a real problem with websites pushing their own fonts on users. I don't think it's their place. Users should select their preferred font in their own browser. Why would a US government site feel the need to use particular fonts?
If this is for publishing in .pdf format I can understand. If it's for web, I just don't get it.
Because ~no users curate their fonts, and making your site look better for the ~all users who don't curate their own fonts is the better, more pragmatic choice by such a large margin that it isn't really worth much thought.
Does this negatively affect your experience? Can't you override fonts if you want to?
Edit: note, I'm not the person the question was directed at, and have no connection to the federalist project at all.
Every time I've visited a site where I thought, oh that's a terrible font, it's been basically one of these reasons:
* Improper selection of the default font for a constrained display, e.g. tabular
* Or the more popular "Let's create a font because we can. It's branding."
There are of course other considerations involved, like the increasing diversity of rendering devices, but that doesn't change the fact that any web site font issue is nearly always the fault of the web site operator.
I can't recall any instance of "oh, that website with a great custom font is so appealing I'm going to regularly and voraciously consume its content". The opposite of that is true, though.
I know there have been a couple of occasions where I dropped into the web dev tools to see which font a site was using, because they used beautiful, fitting fonts.
Properly selected, fonts contribute to the user experience so seamlessly that most users don't even realize that the font used is something out of the ordinary.
This is similar in effect to users liking a site more when it is faster, but attributing the improvement to any number of other things that haven't actually changed.
This is an interesting perspective, and one that could only exist in the past few decades.
For millennia, of course, the written word would appear in the particular style of an individual scribe, and might take on an entirely different look when copied by another. Gutenberg’s movable type introduced the concept of a text appearing exactly the same across multiple copies, something that has persisted for centuries through lead type, phototypesetting, and into PDFs. Only with the advent of information technology in the 20th century did the idea arise of text appearing in formats other than those chosen by its publisher, advanced by technologies such as TeX and HTML.
In the latter’s case, however, the original idea of a platonically structured document to be interpreted per the user’s preferences has long since been superseded by a return to the concept of the publisher defining the presentation. Client-side (user) CSS never caught on, while server-supplied (author) CSS took over.
I would argue that a font is a part of an organization's brand. I think publishers like Airbnb and Google have built theirs around Cereal and Roboto, respectively.
Some of us do, and that's probably the best lifehack I've found so far.
Enforcing the same font and size across all pages is ergonomic and does lower the cognitive overhead of recalibrating your sight-reading for every new page you visit. Just my two cents...
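For anyone curious, a minimal sketch of the user-stylesheet trick (the face named here is just an example of something installed locally, not a recommendation):

    /* Sketch: user-origin !important rules win over the page's own styles,
       so this pins one typeface across every site you visit. */
    * {
      font-family: "Public Sans", sans-serif !important;
    }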
Although we are part of the government, we operate as a business unit that must pay for itself to exist. To do this, we have to charge other agencies for our services. My group is "cost-recoverable," meaning that we are no longer losing money and can reinvest back into the product. You will notice our GitHub activity picked up recently, and even more significant changes are coming soon. So, I'd say we are in the best shape we've ever been.
I was at 18F for almost 4 years. In that time, many of the projects I worked on were internally-facing. The US Government runs thousands of internally-facing websites and web applications. Many of them are entirely internal to single agencies, others are for inter-agency collaboration. Many of them require PIV or CAC cards for auth. And yes, many of them - whether card-authenticated or not - are utterly horrific.
This is a big part of why the USWDS project was created. 18F is tiny, and there's no way it could address even 1% of .gov by itself. Fortunately, there are many people all across government who want to improve UX on sites and apps. USWDS is a good starter kit that significantly reduces the cost of such projects for everyone, not just 18F.
^ what Yoz said (Hi Yoz!). Also within the last 6 months login.gov added support for PIV/CAC cards so while that product only handles the sign on experience, there is some hope for internal sites improving across the board. The tools exist at least.
I’ve found Boston’s city websites [1] to have a nice user experience and fantastic branding that ties in not just all the city’s web properties, but also some official postal mail flyers from the city, printed placards and signage at City Hall, and more.
I haven’t noticed this strong of a branding in other cities I’ve lived, but would love to see examples of others.
Boston even maintains its code on GitHub! [2] Their digital department also has a roadmap of their initiatives, and it's communicated in what I think is such an easily digestible manner [3]
The USDS (US Digital Service) was born in the fire of the healthcare.gov fiasco, and several industry veterans from Google stepped in to fix the infrastructure issues they saw within the US Government. Currently, @Matt_Cutts is leading the org. I can only think that your comment brings a smile to his face.
IIRC the USDS is also in part based on GDS[1] who did (and continue to do) a really good job at dragging parts of the government kicking and screaming into the future.
Does anyone know what it's like working for the USDS? Curious what it's like to work for the Government but under the leadership of presumably a bunch of ex-Googlers.
I really wanted to work for the USDS for a while. I'd still like to, but the invasiveness of an SF-86 which is required for some reason even if you don't need a security clearance, drug testing (I'd pass, but I'm not okay with requiring one anyways), and the fact that I'd have to move to D.C. make the sacrifice a big thing to weigh against the public good you'd be able to do. It's a tough call.
USDS's requirement to be in DC is a tough ask, but necessary given the work they do. At 18F, we travel to DC a couple of times a year, but we are mostly remote. There are significant hubs in SF, NYC, DC, and CHI, but the remote culture is the most successful I've seen.
I'm fine with the SF-86 once every 10 years. It's moving to DC Metro that's the deal-breaker.
And even if I were okay with both things, I'd rather fill out the SF-86 to get a real clearance, and work for one of the military-industrial-congressional complex "Beltway Baron" companies for higher pay, more job openings, and no 4-year term limit.
There are also a couple contractors that came out of the healthcare.gov rescue (notably: Nava [1] and Ad Hoc [2]) trying to do the same kind of work. I recently left a long career at Google to work for Ad Hoc, and we get to work alongside the great folks at USDS trying to make things better.
Hit me up if you're interested in getting into this space!
I always enjoy reading the job posting in the monthly whoishiring thread. I also like that Matt is active in the comments answering questions. For example, April: https://news.ycombinator.com/item?id=19545383
The Forest Service worked with 18F on this Open Forest project to enable people to buy Christmas tree permits online, which uses United States Web Design System components: https://openforest.fs.usda.gov/christmas-trees/forests
https://openforest.fs.usda.gov/christmas-trees/forests has a pretty heavy dependency on Javascript (read: there's no fallback to anything non-javascript, so I get a white page). It feels like that goes against some of the things that 18F is about.
Pages don't need a fallback to non-Javascript in 2019. Accessibility technologies now fully support a JS enabled web, with accessibility standards following suit.
That leaves only those who have voluntarily disabled JavaScript (<1% of users), but fortunately those users are typically aware of how to resolve the issue of their own creation.
I've worked on public facing government websites (not 18F). We simply don't support this edge case, and our legal department supports our legal right to do so (in particular in relation to ADA requirements).
I wholeheartedly disagree. This is a public-facing government website. It ought to degrade gracefully in order to reach the broadest possible audience. Or, write a decent site to start with, and you wouldn't have to worry about degrading. The entire page is nothing but a shell for a web app written in JavaScript that doesn't need to be a web app written entirely in JavaScript.
> It ought to degrade gracefully in order to reach the broadest possible audience.
You mean under 1% of users that intentionally broke their browser?
Seems unreasonable to dedicate resources to that, that could be better spent on 99% of our users. Why should the 1% get special treatment? And what other parts of their browser can they disable that we need to support, perhaps no CSS? Maybe IE5? Maybe they only render XHTML? Etc.
> You mean under 1% of users that intentionally broke their browser?
I'd say: configured their browser to work like a browser instead of like a platform to run arbitrary code from the Internet.
Ideally we should be able to trust most of the web sites we visit. The last few years have shown us this is a bad idea; here are my top two reasons:
- security: while I'm personally less concerned with reasonable ads, there are a number of problems with ad technology, like infectious ads and creepy tracking.
- a bigger problem for now, IMO: poorly written web apps that make the machine noticeably slower.
There is still one important scenario that benefits from supporting a fallback to "classic" HTML for web sites (and apps): Bandwidth constrained environments. GMail's plain HTML version is an existing example (a link to it shows up if the JS version takes too long to load).
Anyway, for public facing sites and apps, you may already be doing most of the necessary work for SEO purposes. Letting humans access the version that you're showing to search engine spiders shouldn't be a huge burden.
> Pages don't need a fallback to non-Javascript in 2019.
Strongly disagree. Having your page completely break instead of degrading gracefully puts up a barrier to those who cannot run JavaScript (for example, users with older, weaker computers).
People on "older, weaker computers" are likely running a browser we also don't support (<IE10) on an Operating System we don't support, and they'll likely see a TLS error before even hitting the load balancer (we don't support SSL or TLS 1.0).
It is unlikely that there exists a subset of users with a modern enough computer to even hit our web servers that is under-powered to the point of not handling JavaScript. Our analytics definitely don't show this.
> It is unlikely that there exists a subset of users with a modern enough computer to even hit our web servers that is under-powered to the point of not handling JavaScript. Our analytics definitely don't show this.
Do your analytics correctly register hits from clients that don't support JavaScript? (I'm thinking of survivorship bias.)
Thank you for the response. I wasn't trying to suggest that it was a requirement, simply that it seemed odd that there would be no fallback whatsoever. I guess I'm behind the times.
> Pages don't need a fallback to non-Javascript in 2019. Accessibility technologies now fully support a JS enabled web, with accessibility standards following suit.
I think we should broaden the definition of accessibility to making sure websites aren't unnecessarily annoying or invasive for normal users either ;-)
(And yes, making web applications is part of my job.)
> Pages don't need a fallback to non-Javascript in 2019. Accessibility technologies now fully support a JS enabled web, with accessibility standards following suit.
This is wrong, of course, because it requires that people buy expensive hardware to use the latest accessibility technology, which is not generally available. It's like demanding that people buy electric-powered wheelchairs instead of making your building accessible to normal wheelchairs.
> We simply don't support this edge case, and our legal department supports our legal right to do so (in particular in relation to ADA requirements).
And I'm sure that ignoring poor people enables you to sleep very well at night.
If you disable a tool which you know is used to build many websites nowadays, then you really should expect a deprecated experience. You're in the minority, and honestly I don't know why you should expect web developers to cater for you.
Because things like Blogger and Google Groups shouldn't require JS just to view text. There is a large, large, enormously large body of work out there that doesn't need JavaScript to achieve its goal, like showing text or images, but is more-or-less broken without JavaScript. That's the tragedy. I don't expect a web app, such as Google Maps or Google Docs, to work without JS, but I should be able to read blogs and newsgroups.
That situation does not normally come up, because I stop trusting that site. If a page that just displays documents requires javascript to do so, then clearly the developers were not worthy of my trust.
There are legitimate uses of javascript, but I'm not going to enable it just to look at a document.
First, as a person who makes web sites and apps, I believe it's unprofessional to fail to account for users who can't or won't run JS. JS abuse is rampant. I choose not to let sites run JS by default because I don't trust most sites not to have some sort of compromise or malware.
Second, it's fine with me if some site wants to forgo my patronage, or provide "a deprecated experience", but not the government.
I certainly do expect my federal, state and local government to follow best practices and provide working websites that I can use without running JS.
I can't think of a single function of government that requires JS, thank God.
> I certainly do expect my federal, state and local government to follow best practices and provide working websites that I can use without running JS.
Best practices is a moving target. What made sense in 1999 doesn't make sense in 2019. The web simply requires JS, CSS, and HTML today. If you disable any one you aren't compatible.
There's no actual argument for why websites should spend significantly to support a tiny subset of users that intentionally break compatibility for ideological reasons. It is unfair to our other >99% of users who we'd have more time for.
The old arguments such as accessibility aren't correct any longer: accessibility devices specifically support JavaScript (text contrast, HTML organization/order, aria tags, video subtitles, etc remain highly important).
If you really insist on a JavaScript free world you are of course welcome to visit a government office in person, pick up, and mail back a paper form. The website is merely a convenience we offer to you.
Otherwise you'll need an IE 10 or newer browser, on an Operating System that supports TLS 1.1 (Windows Vista or newer), JavaScript, CSS, and HTML.
> Best practices is a moving target. What made sense in 1999 doesn't make sense in 2019.
Sure, but it still makes sense to use JS sparingly. Running untrusted remote code in your browser is a huge nest of attack vectors. I don't think that, in 2019, running JS from the open internet willy-nilly can be described as "best practices", despite the prevalence of it. We're not there yet. If three hundred million people jump off a bridge I'm still not going to do it too.
> The web simply requires JS, CSS, and HTML today. If you disable any one you aren't compatible.
That's, like, your opinion, man.
You're trying to insist that your concept of the Internet is the concept of the Internet. It's a self-fulfilling prophecy. But it's not quite true yet, eh?
> There's no actual argument for why websites should spend significantly to support [non-JS users]
Right, they shouldn't spend more because the tech they use should provide for non-JS users out-of-the box without additional overhead. If devs have chosen NOT to use tech like that then they are at fault, not the user, eh?
> for ideological reasons.
What about for security reasons?
> It is unfair to our other >99% of users who we'd have more time for.
But the reason you have to "spend significantly" to do the right thing is that you chose to use and deploy crappy JS frameworks, not that some people refuse to run your crappy frameworks. This is classic "blame the user".
Now, this is your prerogative if you're doing your own site/app, but the government doesn't get to exclude some people from service just because they don't run JS. Speaking as a techno-elitist, that's techno-elitist BS.
> If you really insist on a JavaScript free world you are of course welcome to visit a government office in person, pick up, and mail back a paper form.
AH-whaaaa? Rather than fallback to plain HTML+CSS you're content to let the user fallback to hard copies and physically transporting their meat-puppet? To save costs? On web development? Where's the sense in that?
> The website is merely a convenience we offer to you.
Well, no. It's an INconvenience you offer me. If you're offering convenience to most people but deliberately excluding some that seems to me to go against the egalitarian spirit of our American government, no? "Unfair"?
> Otherwise you'll need an IE 10 or newer browser, on an Operating System that supports TLS 1.1 (Windows Vista or newer), JavaScript, CSS, and HTML.
I run Dillo. A government website that doesn't look decent and work right when accessed with the Dillo browser is just broken and sad in 2019.
>First, as a person who makes web sites and apps, I believe it's unprofessional to fail to account for users who can't or won't run JS. JS abuse is rampant.
If you do web for a living, you should also know that fallbacks and graceful degradation and server-side rendering all come with a cost. Both monetarily and in terms of complexity.
I would bet that the vast majority of government websites are nothing more than simple forms or informational pages.
Yes, I'll grant you that if you've designed an interactive, multipage form, then it is costly to rebuild it to degrade. However, I'd argue that the form was probably unnecessarily technically complex, and that starting with a degraded option in the first place isn't additionally costly (and better protects the agency from ADA lawsuits).
In the very rare cases where we're talking about a true webapp (i.e. Google Docs, or I'll even grant you GIS (mapping) visualizations, even if it's possible to degrade those), then yes, decisions need to be made on what minimal technical requirements are required. E.g. a government agency offering an application that only works in Chrome would be a non-starter.
While I agree with you that most if not all sites should offer some sort of no-JavaScript fallback, the development resources required to offer such a thing are just generally impractical when it's such a small minority of users. Government or not, they still have to choose between spending their limited departmental resources on an extremely small minority, or on the greater userbase as a whole.
> the development resources required to offer such a thing is just generally unpractical
Yeah, iSnow made the same point, but it doesn't scan. "Boss, I can make it work w/o JS but it will cost more..." Huh? Doesn't that sound just like what an unprofessional developer would say?
The fact that so many popular JS frameworks don't do the right thing is part of the JS abuse in my opinion (same goes for accessibility.) Lazy developers wrote half-assed frameworks and other lazy developers chose to use them and then people start to believe that adding JS somehow makes it hard or expensive to do without JS when really they are just doing it wrong in the first place.
> when it's such a small minority of users.
The population of the US is just under 330M, so if, say, 0.5% can't or won't run JS to interact with taxpayer-funded government services that's about 1.5M people. Those folks (of whom I am one) should not be disenfranchised, so to speak, because the gov hired unprofessional developers. The government shouldn't do that, and they certainly shouldn't try to tell me that I'm some out-of-date digital neanderthal for caring enough about web insecurity to disable JS, eh? Lousy devs (as demonstrated by the fact that they can't provide a non-JS web experience/fallbacks at an affordable rate) are precisely the ones the JS code of whom I have no wish to run, and I certainly don't want my tax dollars going to pay them to screw me out of access to the service also paid for by my tax dollars.
> choose between spending their limited departmental resources on an extremely small minority, or the greater userbase as a whole.
Or they could use tech that works for everybody automatically for the same cost, eh?
Disabled users don't get a choice, that's their lives. People who voluntarily disable a core browser component do. Apples and oranges. You should (and legally need to) worry about accessibility.
When I can't access the page content I read the page source. Disabled users can do the same. There are other workarounds like graphic captcha solvers. There's nothing unsolvable about accessibility, it's just annoying.
Some do, and you're also going out of your way to require javascript from your site that just displays documents.
The "modern web" is dramatically worse (slower, more compute intensive, less consistent, less secure) than the web used to be. This is mostly because of the tendency to force javascript into places where it was never necessary.
At this point, there is no reason to support <IE11-style JS, but JavaScript-less browsers are still A Thing and always will be. Search engines, Opera Mini, various accessibility tools, people with extremely low bandwidth, and weirdos who choose to disable JS are all factors.
Yes, if you're making a webapp, there's no good way to do it without JS and you can just be upfront about that. Also, people who can't use ES6+ are going to zero over time, so it's fine to just write ES6. But you should also have a no-JS fallback version of an information page (anything that's not a webapp) because that's an important usecase that won't drop to zero over time.
Before I submit a bug report-- is there a good reason that the zero isn't dotted in this font?
For example, executive order numbers have both "O" and "0" appearing in them. But I'm sure there are stronger examples in various gov't depts. where those characters can occur in sequence and create errors. Not to mention any code snippets that already appear in government web sites.
I suppose the extra ink could be a concern, but we should be moving away from paper printouts anyway. And as we do that, extinguishing a small dot in this great nation's zeros would be an important symbolic first step in combating climate change.
I’m wondering that myself. It makes a BIG difference to me when strings appear out of context. With zeros and Os together it can be very confusing, and the dot or slash makes a big difference.
I wouldn’t put it in the standard numerals, but it could work in the lining/tabular figures. If nothing else, this would be a good place to use an OpenType stylistic set: https://typofonderie.com/font-support/opentype-features/
Usually these are available as “alternates”; they can be enabled in the system text dialog on macOS, and also through CSS. (Not at a desktop right now so I can't check.)
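Roughly, in CSS it would look like this (a sketch only: whether the dotted zero is exposed as the standard "zero" feature or as a stylistic set such as "ss01" is an assumption, so check the font's actual feature tags):

    /* Sketch: opt into a slashed/dotted zero plus tabular figures.
       "ss01" is a guessed stylistic-set tag, not confirmed for Public Sans. */
    .case-number {
      font-variant-numeric: slashed-zero tabular-nums; /* high-level: "zero" + "tnum" */
      font-feature-settings: "ss01" 1;                 /* low-level fallback for a stylistic set */
    }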
What is the advantage of this over Open Sans? Is it less encumbered legally? Or are there other advantages? Or is it simply another alternative, which is also great.
Yes, Open Sans is under the Apache License while Public Sans should be in the public domain (although this is a bit unclear and is being debated in the GitHub issues).
While that's a great reason in the US, internationally, unless it's under CC0 or similar, the Apache License is actually preferable to public domain (assigning things to the public domain is not trivial, or even possible, everywhere).
Sure. In any country without a concept of releasing something into public domain the fact that you did so would be irrelevant and the Apache License would apply, in any country where you successfully released it into public domain that already gives all the rights granted by the Apache License. The restrictions of the Apache License obviously wouldn't apply in those countries.
Though if you try such legal tricks you may as well use the CC0 [1], it's designed to emulate public domain as closely as possible in a way that's valid everywhere.
Assigning new works to the public domain is a pain in many jurisdictions, but if a work originates in the US as public domain, it is in the public domain everywhere.
Note that this doesn't apply to non-USG-made derivative works, though.
The older I get, the more skeptical I become. Kudos to the author for releasing their project; it looks like a lot of effort went into it, so I hate to sound overly negative, but I just can't help wondering: is there a practical reason for the US government to develop its own font?
Are sans-serif fonts really the department where the US government can make a difference, or is this just a designer who happened to get a job with the US government and designed a font because they'd always wanted to and now had the opportunity?
The way I understand the "appeal to worse problems" fallacy is: "Y is worse than X, so why work on X if we haven't solved Y yet?"
In this case, I see X as a solved problem (perfectly appropriate Sans Serif typefaces exist) and Y as unsolved, whereas the fallacy treats both X and Y as unsolved.
I'm not the parent, but now you have a disagreement (it's not solved; solvedness is a continuum) not a fallacy.
I don't think it's in good faith to assume that because parent said "perfectly appropriate Sans Serif typefaces exist" that the implication is that solvedness is a binary; the implication could just has easily been that the problem is sufficiently solved so as to not matter, which is a disagreement about where to draw a line, not an abuse of logic.
I sympathize with the premise of your argument but disagree with your conclusion in this case. High-quality typefaces are fundamental to any digital product within the government, commercial, and public sectors. In my opinion, this is a great contribution to the commons.
Why bother having an architect of the Capitol, then? Why bother building government buildings out of stone when wood is cheaper? Appearances matter: would you want to do your taxes with Comic Sans?
These have existed well before someone said "let's create Public Sans!". It would be nice to know what the reasoning was. The Github page only goes into the details of how Public Sans is different from Libre Franklin.
As a taxpayer I personally am pleased that this is at least one place the government isn't subsidizing some corporation for private gain at my expense. Even if a foundry licensed the font to the government for free, it would be free advertising for that entity. This is one area where I think it makes excellent sense for the government to have done something on its own. The surprise is that it actually looks decent.
Many (most?) of those typefaces are commercially licensed, and can't serve as a universal "default" for typography across all the USG's web properties. Additionally: almost none of them are in OS/browser font stacks, so using them would incur logistical problems in addition to licensing.
There are two problems with that reasoning. First: it is clearly contradicted by the inclusion of Source Sans Pro and Merriweather, both licensed under the SIL OFL and developed by third parties, in the USWDS list of components. Second: creating another font that isn't in operating systems and WWW browsers does not solve the latter problem.
Switching to Public Sans solves none of the problems you're pointing out.
According to the Public Sans's Github page, "Source Sans Pro" was the USWDS default, which is a SIL font (so no licensing issues) and is also not in the OS/browser font stack.
I don't understand your second point. Public Sans isn't in OS/browser font stacks, either. What makes its inclusion more likely than, say, a 15 year-old open source font like DejaVu Sans?
I took the parent comment to be saying, "look, there are all these preexisting sans typefaces, why not just take one of the ones on this list". The answer: because they'd have to arrange to pay, and to source them from the commercial services that make them available.
I agree that there are other typefaces they could use! I was mostly commenting on how a list of faces that includes Avenir, Univers, and FF Meta was probably not the best summary of the available options.
I've been wondering why all these new fonts tend to have a strong focus on sans-serif faces (Fira has no serif variant; Source Serif Pro came later than the other Source fonts). Aren't there a lot of situations (namely, lengthy legal documents) where you'd prefer a serif font over a sans serif?
Primarily because they focus on interfaces, not publishing. Serif is traditionally used for legibility of long text. Sans always wins for actual interface design (at least those with some manner of complexity) because it's much less visually distracting when used in a variety of weights and sizes. Interfaces tend to have much more mixed usage so we usually go with sans.
Generally, just using the system font is fine for most UIs, and the only good reasons to change it are:
1. You have a specific branding style you need to achieve (this is becoming rare)
2. Your interface has pixel perfect user created content where the difference in font to font usage between OSes / Browsers would cause actual breaks in layouts from user to user.
I run into the latter often, and so we end up picking a neutral-style font like this because it's the closest thing to a generic "system font" like Roboto or San Francisco that has a friendlier license.
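For comparison, the usual "just use the system font" approach is roughly this (a sketch; the exact stack is a matter of taste):

    /* Sketch: each OS/browser resolves this to its own native UI face,
       so no webfont download is needed at all. */
    body {
      font-family: system-ui, -apple-system, "Segoe UI", Roboto,
                   "Helvetica Neue", Arial, sans-serif;
    }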
Thanks for the link! I've been looking for a free-as-in-beer, good-looking font family with tabular numbers and had pretty much given up hope. Inter UI is fantastic!
Not really. Serifs used to be too detailed for displays and still are, except on hidpi displays. So for digital you go sans, and people really associate serifs with something archaic, which institutions don't want to look like.
For print, sure, serifs are good, but again it depends. There are many highly readable sans serifs (with a big x-height).
If you're just looking for a font for your Microsoft Word or other office document software, you're most likely not going to bother downloading new fonts when Word has plenty of fonts available easily.
I would imagine tech geeks and artists are generally the people who are downloading new fonts, and those people are probably spending most of their time using sans-serif fonts or monospace.
I agree that it's worth looking at all public structures of font selection as well. The highway department is in this right now: Highway Gothic is old and difficult to read, yet instead of a public domain, free-to-use sign font, they are battling over the use of Clearview, which comes with a cost because it has a copyright attached to it.
There are multiple needs here and it seems the only action comes from the "sexy" side of things.
Good choice to base this on Franklin Gothic. Given that IRS tax forms already use it, it's effectively been part of the US government's "brand identity" for a long time.
The question neither this page nor the GitHub answers is... why? I understand a lot of corporations did their own fonts to escape font licensing. Are there fonts the government is currently licensing and trying to get out from under? Where will I see this font? Is there a reason other open source fonts weren't already adequate?
A typeface is a key part of an organization's visual identity.
A typeface expresses many things, you can imagine it as different sliders along, say, 100 different dimensions, similar to songs expressing combinations of feelings.
An off-the-shelf typeface will express things that are close to exactly what you're looking for, but almost never exactly 100%. Commissioning a font gives you exactly the visual identity you're looking for, zero compromises.
Additionally, distinctiveness/uniqueness has its own value too -- corporations commission typefaces so no other brand will share the same identity. For a large nation-state, that carries the same value.
(Although, unlike a company like Microsoft or IBM, I'm not sure if the US Government can prevent any private company from using it?)
In this case, the most powerful nation-state on Earth established its exact visual identity by forking a font from an Argentinian dude whose last pic on Instagram is of Ernesto "Che" Guevara x)
I could totally see a local government org being put through the wringer for every bit and bob of a website. If there is already an established font choice, that's one less widget they are being charged for.
Interesting that this web site uses a Let's Encrypt certificate. I suppose it saves the maintainers from being gouged by a beltway bandit charging thousands to monitor and then update an ordinary CA-issued certificate.
(I work for 18F but don't officially represent it here.)
The site is hosted via Federalist (https://federalist.18f.gov/), a SaaS which runs on top of its sibling Platform as a Service, cloud.gov (https://cloud.gov), which is based on Cloud Foundry. (Think of cloud.gov as "government-compliant and -operated Heroku" and you're not too far off.)
One of the benefits of using cloud.gov is that it automatically brokers and renews certs from Let's Encrypt... Math is math, and there's no sense spending taxpayer money where we don't have to! Here's the page about it, including a link to the broker source at the bottom:
https://cloud.gov/docs/services/cdn-route/
I'm very curious about your thoughts on the NTE clause of employment stating you can only work there a maximum of four years. And that you are initially hired for a set two-year term.
Different strokes (literally). I quite dislike it. In my mind this font lacks symmetry and consistency needed to be easily readable. Especially how the curved base appears to drop below the flat base! It hurts my eyes.
You mean how e.g. the base of "o" drops below the base of "r"? That's standard optical compensation because if you make them level, the curved base seems to sit higher than the flat base. The amount of compensation needed differs depending on point size.
It definitely doesn't. His argument kinda relies on the idea that the government interacting with a religious organization in any manner is a violation. This is just not the case
Does it? To my admittedly untutored eye the first impression is Helvetica aka Switzerland, although it's not quite as stripped down: https://en.wikipedia.org/wiki/Helvetica
I can only imagine the hours of meetings and deliberation that went into developing a one-size-fits-all, cross-agency font that meets all sorts of weird political and accessibility guidelines.
Still, it only has to be done once! I'm a huge fan of what the USWDS has done so far. I love the model of an internal tech resource for the government.
I am really impressed with 18F and how the federal government has really stepped up their game on the digital/open source front.
It does make me wonder, though, whether the usage rights of these projects should be restricted to US taxpayers, since that is who is ultimately paying for this work.
You could say the same thing about the UK government's equivalent, which I would argue is somewhat of an influence on 18F (though I'm not an insider). They actively open source their work, this being a great example of it: https://github.com/alphagov
I don't think that 18F and US taxpayers lose out on anything by open sourcing this. It builds a lot of good will and they are probably using systems that have been contributed to by others as well.
It goes further than just things developed by GDS themselves, all government projects for which GDS is responsible (i.e. they conduct audits and assessments at alpha, beta and live) are also open source. Here's one I worked on: https://github.com/dvsa/mot
Things generally don't have to be open source until you get to your beta assessment, as the alpha assessment is really just checking that you have a clear plan for getting to beta and that they agree with your approach, so you don't fail for not being open source at this stage. The project I'm working on currently (an app to let companies wishing to perform road works see all road works being performed across the country, hopefully preventing things like two different companies digging up and resurfacing the road in a short space of time, etc.) is about to hit beta, so it will be open-sourced shortly.
US taxpayers also have a high chance of benefiting from the work of non-US taxpayers who use this to reduce their expenses. Much like I benefit from workplaces that aren't mine offering sick days - fewer sick people working in the world means _I_ get sick less. Fewer people spending money recreating open work means they can spend that money elsewhere (or reduce the costs of what they do) and those network effects might well give me a benefit.
It's a fork of https://github.com/impallari/Libre-Franklin so it'd be a bit against the spirit of things (if even legally permitted) to relicense it to a non-free license. It'd also limit the US taxpayers who wanted to use it in ways that required the full freedom to share with non-US taxpayers.
Fantastic font. I've long seen Helvetica (literally meaning "Swiss" with one added letter) used in official US forms, communication and the like. If this becomes the "US brand" and replaces Helvetica as a national font, it's a really great choice.
Just installed it as my system UI font under Linux (previously I was using B612, and before that CMU Sans). Took me by surprise how readable it is even at small sizes.
I just wish there was a monospaced variant/equivalent so I can run it in my terminal/Emacs windows.
Improved for what use case? This is not a font I'd expect anyone to write code in (to each his own, however.) Context will determine how a particular character is read.
Governments frequently publish references in which accurate literal transcription is necessary; law, regulation, and (legal) code references, identifiers, and the like.
1lO05S and other commonly-confused characters are best clearly distinguished.
If you are reading conversational english, you are correct that 99% of the time I/O/0/5/S/etc. should be easily apparent.
But if this is to be the de facto font used by all government agencies, then you get cases with part numbers, serial numbers, legal-case numbers, scientific studies, anything/everything at NASA.
And then there is the military.
Removing ambiguity and uncertainty should be the point.
Is it legal for the federal government to specify a font license other than “This is released into the public domain”? They automatically give up all copyright upon publication, which ought to exhaust any licensing terms.
If a license is granted to a work and the US government accepts that license and then publishes derivatives of that work, the US government releases its derivatives into the public domain without regard for the original licensing, as they have complied with the terms of the license to them and have no legal right to encumber their fork with any restrictions. Licenses depend on copyright, without which they are unenforceable, and the government is prohibited from copyrighting any work. The license clause should, therefore, be invalid and unenforceable upon any derivatives of this work, regardless of the accidental inclusion of it.
My idea is being hashed out on their GitHub, though sadly I suspect they’ll need lawyers to advise them rather than HN commenters such as I: https://github.com/uswds/public-sans/issues/30
IANAL, but I believe the whole idea behind the GPL is to make sure the GPL supersedes whatever you might claim otherwise. So if the federal government can make a derived work public domain (or Crown copyright in the UK) instead of GPL, that's a loophole in the GPL!
(I know this work is not GPL, but a similar idea applies.)
Public domain code ain't GPL-compatible though unless it's explicitly released under a "license" that functions as a public domain grant (e.g. CC0, 0-clause BSD, WTFPL, etc., though WTFPL has some legal wording issues IIRC), the reason for this being that not all jurisdictions recognize the concept of "public domain" and thus require explicit license terms.
I don't know any jurisdictions that don't recognize the concept of “public domain”, but there are some that define “public domain” differently than it is defined in the US.
In the US, “public domain” means “without copyright”, while in some other countries it means “without authors’ rights” (authors’ rights are a superset of copyright). The problem here is that some authors’ rights (both in the US and in those countries) are “inalienable”, meaning they can't be given or taken away (although there might be exceptions listed in the law).
Public domain may be incorporated into a GPL project if it is relicensed at time of commit under GPL by a committer whose country honors public domain to the degree necessary, but it may not coexist in a dual-license scenario with GPL, and that commit’s relicensing has no implied effect on the public domain repository from which the content was drawn.
No fvar (variable font) version mentioned on the website? (If there isn't one, it's well worth converting your different masters into a single variable-font master instead.)
Why would we go beyond step 2? As a taxpayer, I could buy the argument that the govt needs a new font, and that, having made one, it should make said font available for free so taxpayers can get the benefit.
I don't see why my taxes should fund a public CDN, though.
It's possible (probable, even) that the government would want to use a CDN for static assets (like fonts) for the same reasons that sufficiently-large private orgs would want to use a CDN for static assets.
In that case, it's also possible (probable, even) that some particularly-lazily-developed smaller sites would simply point to those same CDNs instead of hosting things themselves (much like how quite a few sites point toward CDN-hosted versions of jQuery and FontAwesome and such).
The only remaining step would then be for that taxpayer-funded CDN to cache any and all taxpayer-funded static assets.
Why not? It all depends on how important as an infrastructure CDNs will become. I know that in the States you prefer things to be privately held, but it's not too absurd imagining a future where CDNs are an essential and fundamental service to every day life for the vast majority of the population.
Windows does not have crap font rendering, Windows has advanced hinting support that varies by typeface and is infinitely customizable. Other operating systems apply a global hinting that typically discards the font designer's own preferences for how the font should look (especially at smaller scales).
Fonts without that manual hinting look like crap on Windows because Windows does not force through its own rendering/hinting/kerning overrides.
And all that is probably beside the point: if you're referring to how the fonts are rendered within a webpage, all the major browsers use their own custom rendering via one (finely tuned) set of parameters and interfaces for macOS/Linux/etc. and another (rudimentary) one for Windows. And most webfonts do not include any sort of hinting instructions in the payload, so even if the original typeface had manual hinting and would render well on a proper type engine normally, when served over the web, even to a browser doing things right, it'll still appear horrible.
In case you didn't know, Microsoft is really big on typography, and has commissioned a number of incredibly beautiful typefaces that render perfectly under Windows (and other operating systems).
I don't think I've seen a single typeface that actually looks good on Windows 8 or later, though. Even Linux does a better job (with or without subpixel rendering).
Maybe I just haven't viewed enough fonts on recent Windows versions to have encountered the better examples, though; are there any fonts (to your knowledge) that do look especially good on Windows (and/or better than they do on macOS or a FreeType-using Unix-like OS)?
It doesn't matter if Microsoft is really big on typography if it looks like garbage on Windows. I agree with the OP; most open fonts look absolutely horrible on Windows.
This is exactly what I mean. Also, if you try to make them look better with that lousy TrueType wizard, it tries to make you choose what looks better without any context, and if you pick some wrong choices you fuck up the whole font rendering in the OS, and there is no clear way to revert to defaults... nasty experience nonetheless.
"I disagree, I like sans [for long prose] better" comments miss the point; this is not an anecdata competition, it's about how many people prefer each one (and also it would be worth considering how strongly they hold that preference, since mildly pissing off even a slight majority of your readers/customers is better than really pissing off a smaller percentage). Even if you're skeptical of the studies Mayer looked at, Google (literata) and Amazon (bookerly) seem to be in the serif camp. Do you think Amazon made a mistake, too?
Funnily enough, Medium uses a serif font, and I enjoy reading it more, and it inspired me to do the same on my site. But I use Georgia, so I don't need any fancy web fonts or whatnot.
Web sites should really use the user's preference and not override it with either serif or sans serif (size too). HTML has presentation freedom for a reason.
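In practice, "use the user's preference" mostly means getting out of the way (a sketch):

    /* Sketch: generic family + rem sizing defer to the reader's browser settings. */
    body {
      font-family: serif;  /* or sans-serif: resolves to the user's configured default face */
      font-size: 1rem;     /* 1rem tracks the user's chosen base size (commonly 16px) */
    }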