Why don't websites immediately display their text these days? (superuser.com)
235 points by laurent123456 on Feb 7, 2013 | 132 comments



Agreed that this is a terrible nuisance.

Webmasters please: use a CDN, have a download budget, use YSlow, smarten up how you use caching, push your scripts to the end, and so forth. Nobody wants to stare at a blank screen for 30 seconds while you're loading up all of your bells and whistles. If you want to keep reader engagement you've got about 5 seconds from load until the text appears, max. Probably a lot less than that.

And the same goes for web apps, btw. In fact, web apps are worse because people aren't passively engaged. Your content site takes 20 seconds to load? I bail out. Your web app takes 20 seconds to load? I'm spending the entire time cursing you under my breath.

Anybody who delivers anything over the web should be stone cold brutal about doing anything possible to decrease load times.
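For the "push your scripts to the end" part, a minimal sketch (file names and paths are made up):

  <body>
    <article>
      <h1>Headline</h1>
      <p>The text is in the HTML, so it paints before any script work starts.</p>
    </article>
    <!-- non-essential scripts come after the content; defer/async keep them from blocking -->
    <script src="/js/analytics.js" async></script>
    <script src="/js/widgets.js" defer></script>
  </body>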


A lot of this can also be alleviated by being smart about using a web app's login screen to preload resources that will be needed on subsequent screens. It's relatively simple to include a bit of JavaScript on the login page to fetch webfonts for the rest of the app before they are needed.

Furthermore, if you use JS to load your webfonts, you can have the site display in "web safe" fonts until your custom fonts are loaded.

Example: https://developers.google.com/webfonts/docs/webfont_loader#S...
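Roughly like this, assuming the Google-hosted loader script and a placeholder font family:

  <script src="//ajax.googleapis.com/ajax/libs/webfont/1/webfont.js"></script>
  <script>
    // the loader adds wf-loading / wf-active classes to <html> as the font comes in
    WebFont.load({ google: { families: ['Open Sans'] } });
  </script>
  <style>
    /* web-safe face immediately, custom face once the loader reports success */
    body { font-family: Arial, sans-serif; }
    .wf-active body { font-family: 'Open Sans', Arial, sans-serif; }
  </style>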


IMO the websafe -> webfont transition generally looks crappy. You get a brief window where you see the plain font, you start reading, then all the text changes size ever-so-slightly, which moves the entire page around. Jarring. At least as bad as having a bunch of images that aren't fixed size that reorganize the page as they load.


All the more reason to use userContent.css to override site fonts. Size and face are legible.
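For anyone who hasn't tried it, a minimal userContent.css sketch (it goes in the Firefox profile's chrome/ folder; the face and size here are just what I find legible):

  /* force a legible face and size on every site; !important is needed to beat page CSS */
  * {
    font-family: Georgia, "DejaVu Serif", serif !important;
    font-size: 16px !important;
  }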


Absolutely agree :) Add in a social-plugin blocker (if you're not a disable-js person), and the internet moves like 3 times faster. Just preventing e.g. someone's twitter feed from loading often makes the page respond a couple seconds sooner, which is really unforgivably bad.


> Furthermore, if you use JS to load your webfonts, you can have the site display in "web safe" fonts until your custom fonts are loaded.

Google "FOUT." Some web designers consider this to be worse than the original problem because it is unsettling to the viewer to see the fonts shift and morph every time you reload the page or hit Back.

There are downsides to both methods and that is why each browser treats the problem differently.


That's a really handy little trick, thanks!


Other answers are trying to trick the machine, while the obvious and safe fix is to serve plain html. Just like HN by the way, which is the only bearable site from China on a 3g connection.


For this particular problem, it doesn't matter whether the site is serving "plain HTML" if the CSS specifies a webfont that the user doesn't already have.

So you have two alternatives for a good experience:

1. Use only fonts users have preinstalled.

2. Make sure that you handle downloading webfonts in an intelligent way.

For many sites, 1 is fine. But for others which try to set themselves apart with distinctive design, 2 is necessary.


No, 2 is never necessary. You can set yourself apart with design just fine. The text, actual information people are trying to retrieve, is not part of the design. Specifying a font for the ordinary text is a perfect example of designers prioritizing self-obsessed wankery over the user.


The idea that typography has nothing to do with design is ridiculous on its face. I get that you don't like webfonts and you are welcome to that opinion, but it's silly to suggest that font selection and the placement of text isn't properly the job of a designer.


I didn't suggest typography has nothing to do with design. Placement of text is not at issue here, font face choice is. There are many font faces already available. They were created by designers. Ones who specialize in creating fonts in fact. They are much better at it than web designers. Breaking a website so it is entirely unreadable because you feel a non-standard font is prettier is something a bad designer does. It does not help the user, which is the entire point of a web site.


> I didn't suggest typography has nothing to do with design. Placement of text is not at issue here, font face choice is.

Choice of typefaces is part of typography. You can't place text without a face.

> There are many font faces already available. They were created by designers. Ones who specialize in creating fonts in fact.

There are about 10 fonts you can rely on people having (assuming you don't care about Linux users!). Several of these are gimmicky and generally not a good choice for anything. You really don't have all that great a selection.

> Breaking a website so it is entirely unreadable because you feel a non-standard font is prettier is something a bad designer does.

The site isn't entirely unreadable. It just takes a little bit longer to display. A very similar thing happens when a resource takes a while to load and leaves the body text the same color as the background for a long time.

I'm a big advocate of having lean sites that display fast. When I'm asked to work on a site, the first thing I suggest is usually optimizing it so everyone sees it quickly (I had the biggest smile when I managed to cut one down from 20 seconds to 1.5 seconds on a slow-ish connection). But even I think just blanket damning custom fonts is really extreme. They really can add a lot to the feel and readability of a site, and they don't always suck.


>Choice of typefaces is part of typography.

That doesn't make it part of the discussion.

>You really don't have all that great a selection.

Now go count how many books, newspapers, magazines, etc, etc have their primary text in something other than one of the few common "standard" fonts. You don't need a big selection. The desire to make your site look weird is not actually a necessity, it is a desire.

>The site isn't entirely unreadable. It just takes a little bit longer to display

It is until it loads all the way, which can take quite a long time, especially on a mobile network. And yes, it is actually fairly common for some elements to time out and not load: images, CSS, JavaScript. If you use a non-standard font for the main text, then it becomes one of those things that can fail to load.

>But even I think just blanket damning custom fonts is really extreme

So do I. Which is why I never did that. I am only saying that using a custom font for the primary textual content of the site is a choice that has purely negative impact on the user.


> Now go count how many books, newspapers, magazines, etc, etc have their primary text in something other than one of the few common "standard" fonts.

That would be most of them. The LA Times uses Ionic, the New York Times uses Imperial, the Chicago Tribune uses Mercury Text, Entertainment Weekly uses Scout — I'd actually be really hard-pressed to think of a newspaper or magazine that sets its body text in Arial or Verdana.

> It is until it loads all the way, which can take quite a long time, especially on a mobile network. And yes, it is actually fairly common for some elements to timeout and not load, images, css, javascript. If you use a non-standard font for the main text, then it becomes one of those things that can fail to load.

This is kind of a browser-dependent thing. I would argue that any browsers that display this behavior are flawed and should be fixed. However, even on browsers that don't handle this gracefully, you can work around it and handle it gracefully yourself. So I agree that if this happens, you have done something wrong, but using a distinct font was not the mistake — coding your site in such a way that your body text might never display was the core problem.


>That would be most of them

Really? Of the millions of books, most of them use a special font? The hundreds of thousands of little local papers all have their own special font? Do a tiny minority who use a "virtually indistinguishable from Times" really constitute most of them? Is a font that <0.000001% of people can distinguish from Times really creating a unique and distinct design?

>I'd actually be really hard-pressed to think of a newspaper or magazine that sets its body text in Arial or Verdana.

I'd be hard pressed to think of one that uses comic sans too, that reflects on comic sans, not the necessity of unique fonts.


> Really? Of the millions of books, most of them use a special font?

They use something a fair sight better than what's available to web designers. Whether a font is "special" or not is kinda subjective. I would not consider Ubuntu to be any more "special" than Garamond. But they certainly do not feel constrained to use a handful of fonts Microsoft licensed back in the '90s.

> The hundreds of thousands of little local papers all have their own special font?

Nobody said anything about "their own special font." Any paper big enough to hire a designer probably uses something more than Arial and Times New Roman.

> Is a font that <0.000001% of people can distinguish from Times really creating a unique and distinct design?

I guess that depends. Do you think more than 0.000001% of the population could distinguish Georgia from Times? Many people couldn't name a font if you put a gun to their head, but that doesn't mean different fonts don't matter at all. (If you do want to argue that fonts don't matter at all beyond serif and sans serif, you're welcome to that opinion, but I'm not really interested in getting into that.)

>> I'd actually be really hard-pressed to think of a newspaper or magazine that sets its body text in Arial or Verdana.

> I'd be hard pressed to think of one that uses comic sans too, that reflects on comic sans, not the necessity of unique fonts.

We have just dismissed about 30% of the fonts available to web developers under your criteria. Do you still feel like they have a good selection?

Also, again, nobody said unique fonts. A font does not have to be unique to fit a design better than another font.


Your argument can be extended to most of website design. The identity of a design and the impact on accessibility are two factors that constantly have to be weighed against each other in web design. Do you choose a slightly gray text color which improves the overall appearance of the text but reduces the contrast? Do you include images? There are hundreds of excellent text fonts out there and the more you work with them, the more you realize that they do have different characteristics other than the serif/non-serif distinction. Only a handful of fonts are installed on most computers, so the decision becomes: do you choose a font that doesn’t really fit the design in order to save a moment when loading the page, or do you go with a web font, optimize the loading procedure (CDN…) and get a coherent design? Can you express more of the visual identity by removing an image and including a web font instead?


>Your argument can be extended to most of website design.

Certainly, as it should be.

>Do you choose a slightly gray text color which improves the overall appearance of the text but reduces the contrast

No, you do not. Good example.

>Do you include images?

Including images does not make the site inaccessible. Bad example.

>Can you express more of the visual identity by removing an image and including a web font instead?

We're talking about the primary textual content of the site. Not a header, or a menu item, or a title. Very few people are crazy enough to render the entire text of their site as an image.


The point is, in web design there are a lot of different aspects that have to be balanced. One of them is accessibility & speed, another is visual design and identity. The choice of a web font over one of the ~6 standard fonts affects both of these, and there is a choice to make.

This choice can be: I use a web font because this allows me to convey 80% of my visual identity with the text, I can put the text (and content) as the central element of the page and include no images on the page except for a small logo, leading to a fast loading page. For this, I accept that the user has to wait a little bit until the text is displayed when he/she visits the page the first time and the font isn’t cached yet. On a decent internet connection this might not be noticeable but on a bad 3G connection it might take the page load from 4s to 6s. Do the benefits outweigh the risks/disadvantages?

I do agree though that this decision is often not made consciously and a lot of times a web font is slapped upon a site just because we can. Way too often, visual designers create pictures of websites in Photoshop with no idea about the implications. On the other hand, I have seen that the rise of web fonts has made designers shift away from creating designs that are basically a frame with the visual identity where they slap in some random content. A lot of sites now treat the content as the main element of the site, reducing all other elements to a minimum. I would argue that this change greatly improves the quality of web sites and that the rise of web fonts is one of the main drivers of this change.


>use a web font because this allows me to convey 80% of my visual identity with the text

You lost me here. You convey absolutely no visual identity like that. People are reading the text, it is not window dressing. I'm sure marketing would be shocked to hear they don't need to worry about logos or colours or branding any more, because they can convey 80% of their identity just by using a font where the a has a 1% longer tail.


>No, you do not. Good example.

This. This is the reason people are reacting poorly to what you're saying. Pretending that a sweeping statement like that can cover every possible website is _absurd_. The same goes for webfonts.

The question you're answering was not meant to be answered by you. It has to be weighed by each site designer. Saying otherwise is offensive to the people who take the time to answer those kinds of questions differently for each project they work on.


>This is the reason people are reacting poorly to what you're saying

I didn't see anyone react poorly. There is nothing bad about polite disagreement. I enjoy hearing other people's views, and having them try to convince me of the merits of their views.

>Pretending that a sweeping statement like that can cover every possible website is _absurd_

I obviously disagree. Can you give me an example of where deliberately choosing to make your website harder to read is a good choice? And that would be considered good web design?


I specified that #2 is sometimes necessary for a specific end (distinctive design). You seem to be arguing that one should never seek that end. I disagree, but that's an entirely different argument.


>You seem to be arguing that one should never seek that end.

Not at all. I am saying it is in no way necessary to load a non-standard font for the primary textual content in order to achieve the goal of a distinctive design. Loading a font to use for your branding is perfectly reasonable; users aren't there to read that anyway. Punishing the user by making the content not be there, because you think distinctive design requires all your text to be unreadable_font_x, is just bad design.


So all of a sudden I've become a design Nazi, telling people what their website can or can't include for design.

The alternative to not allowing custom fonts is that people go back to using text in images to display their custom fonts which sucks when you are dealing with multiple languages or just want to change the text for something else. Or something like Cufon [http://cufon.shoqolate.com]. Or use flash.

I think I prefer webfonts, the developers just need to be smarter about how they are used and the implementation for including them.


Your post is both nonsense and repulsive. I suggest you read what you are replying to rather than spewing hyperbolic nazi accusations at people.


Specifying a font for the ordinary text is a perfect example of designers prioritizing self-obsessed wankery over the user.

It's a common phrase used throughout the US and Europe incidentally, and used in the context of what you stated I don't think it is repulsive, but chc has already explained that to you.

I don't think my post is nonsense, but just like you, I am entitled to my opinion. Text is not just for reading, it is also used for design, so my opinion is that web fonts have a place on the web; the developers just need to be smarter in the implementation, as I said. With all the sites done these days with HTML5 and CSS3, using techniques such as parallax scrolling, I don't think half of them would have the effect that they do with "Arial" or "Sans Serif". The web goes beyond reading articles on HN or NYT, believe it or not.


You may be either very young or not an American English speaker, but the phrase "X Nazi" is American slang denoting somebody who is overly strict about whatever X is. It comes from the "Soup Nazi" episode of Seinfeld, in which a soup shop owner would evict customers from his shop with the declaration "NO SOUP FOR YOU" if they didn't follow a very precise procedure for ordering. It is not actually an accusation of Nazism.


That is why I said hyperbolic. It is still a very distasteful response, especially considering what it was replying to. I just read my post again, and I still can't find the part where I told "people what their website can or can't include for design".


You said any designer who chooses a body text font (something designers have been doing for hundreds of years, incidentally) is guilty of "self-obsessed wankery." The difference between that and telling people what they can or can't include is pretty slim.


No, I said a designer who chooses to make their website less usable for the user by forcing a download of a non-standard font (something that hasn't existed for hundreds of years, incidentally) is prioritizing their self-obsessed wankery over what is best for the user.

The difference between that and telling people what they can and can't include is absolutely massive. I am telling web designers (people whose job is to make web content accessible to users) that making content inaccessible is bad design. I am not saying "you can not be a bad designer". You are still allowed to do a bad job. And the rest of the world is allowed to point out that you are doing a bad job. Calling someone a nazi for pointing out that bad design is bad design is not constructive, or productive.


Though the last time I checked it didn't use gzip, which would improve things radically in that case I'd imagine.



It's not only reader engagement (and of course bounce rate which overlaps somewhat) but page speed is a factor in SEO nowadays too: so it's reader acquisition as well.


Webapps can look at using some of the newer HTML features for offline capability and caching. There's the HTML App Cache [1] and IndexedDB [2] for example.

[1] https://developer.mozilla.org/en-US/docs/HTML/Using_the_appl... [2] https://hacks.mozilla.org/2012/02/storing-images-and-files-i...
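A minimal App Cache sketch along the lines of [1] (file names are invented; the manifest has to be served as text/cache-manifest):

  <!-- index.html -->
  <html manifest="app.appcache">

  CACHE MANIFEST
  # app.appcache -- listed assets come from the local cache after the first visit
  /css/app.css
  /js/app.js
  /fonts/brand.woff

  NETWORK:
  *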


That only happens the second time you load a page.


No one needs a CDN to serve text quickly; that is massive overkill.

CDNs in general are massive overkill for most sites. What a CDN does is reduce the latency imposed by the speed of light. Is the speed of light really why it takes a site 10 seconds to show text? No.

As you point out, caching and proper javascript management is the key to a site that shows text quickly. CDNs do provide caching, but are a complicated and expensive way to do so.


CDNs do a whole lot more than that. They allow common resources to be shared between sites. If 10 sites self-host jQuery, you download it 10 times. If they all pointed to the same CDN, you'd download it once, reducing your latency to just the time it takes to check the cache on your device.
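i.e. the familiar pattern of every site pointing at the same hosted copy (the version here is just an example):

  <script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>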


This is not a feature unique to a CDN; the same result could be achieved by simply hosting jQuery on a single server and pointing all your websites to that copy.

My point is that when people talk about using "a CDN", usually what they are really talking about is caching. Whether that caching happens at the origin or the edge makes a very very tiny difference (the lightspeed difference).


This stuff is seriously irritating to me as well.

The other thing that is irritating is websites that take so long to render they freeze up my mouse and computer, rendering interaction with their website useless for 10 seconds. It's not like I'm on an old machine either. I tend to show my appreciation by clicking the little "x" on the tab that corresponds to the webpage.

There really isn't any good excuse for this stuff: I've seen quite a few sites that clearly use heavy javascript that render almost instantly. There is no reason that a blog or simple site should take a long time to render.

There seems to be a trend of making things "awesome" with no consideration of user-experience and usability, and frankly, the designers and programmers who do this stuff make the good programmers look bad and ultimately hold back the progress of the web.

And I don't use Google Fonts either because I'd rather have my fonts installed locally.


You're a programmer, surely you must realize this is a Chrome bug? Not a bug in any specific web page?

Personally it affects one of my PCs but not the other. I've tried to fix it a couple of times, but because it's very infrequent it's hard to test a fix. Google 'Chrome Freeze' to see the frequency with which this happens.

This isn't being caused by web pages, it's a bug in Chrome that they can't seem to fix. It used to be caused by having two versions of Flash installed, but who knows what it is now. It gets worse/better over time.


You'll thank your lucky stars with the knowledge that I am not a professional programmer and I'm definitely not a front-end or much of a web programmer. Besides, there are others page down who seem to experience this problem, presumably some of them are programmers who aren't aware of this bug?

With this information, I am now more mystified at why some sites get burned by this bug while others do not, and, at least by your logic, why a programmer wouldn't be well aware of this bug; if they are, why aren't they exploring options, and why aren't these programmers testing their functionality on Chrome (all browsers)? Surely these sites run slow on FF as well, even if there is no freezing up like I am describing?


Ah ok, I totally misread your profile.

Those sites won't freeze at all in FF. It's something going wrong in the way Chrome calls Flash or something.


With Chrome I find that all the slow downs / freezes I've noticed have been while flash is running in one or more of my tabs. Flash + Chrome just doesn't seem to be a great combination.

BTW, you can disable flash in chrome by going to: chrome://plugins


No, there are plenty of JS heavy pages that can all but bring Firefox to its knees as well, and - if the machine is heavily loaded already - basically freeze the entire machine. I surf using both Chrome and Firefox at different times on different boxes, and I see this behaviour across both.


It's all about monetization. If you look at the source of a typical media website you'll see that most of what gets downloaded and most of the JavaScript executed on page load is adverts, social crap and analytics stuff.

It's just how we pay for those services.


Nowadays most major analytics and ads are served asynchronously.


The hitching frequently happens because adding iframes to a document is damned expensive, and requires the browser to concoct a whole new DOM per iframe. While analytics don't tend to make heavy use of iframes, advertisements absolutely do.
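One mitigation, sketched very roughly (this is not any ad network's actual API, and the slot id and URL are placeholders): create the iframe only after the page has painted.

  // defer the expensive iframe work until the main document has finished loading
  window.addEventListener('load', function () {
    var ad = document.createElement('iframe');
    ad.src = '//ads.example.com/slot?id=123';   // placeholder URL
    ad.width = '300';
    ad.height = '250';
    document.getElementById('ad-slot').appendChild(ad);   // assumes a reserved slot in the markup
  });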


I was working on a page the other day for optimisation - 97/100 score now.

The main issues that remain are to do with FB and Google:

- serve images from a consistent domain
- leverage browser caching
- minify javascript (that's a Google problem, I mean, really?!)
- ga.js and plusone causing reflows
- script serialisation

Doesn't appear there's much to be done about these from an end user perspective but they're not major slow downs as the page is well above the median load speed for similar pages.


Yeah, PageSpeed by Google always points out Google services as being un-optimized and slow.


Yes, but that doesn't help much against the JavaScriptional (and rendering) bout of hyperactivity that ensues shortly after the first HTTP response and later on while using the site.

The complexity of embedding all that stuff is enormous. Just block it all for a day and see what that does to the perceived speed of the site.


I have found the biggest culprits are the social widgets and fancy JS libraries. Ads and analytics rarely slow down a site (for me). There are exceptions though: a few days ago I bumped into a site that had 6 different analytics packages. That's just crazy. Also, outside widgets can load their own analytics (I think Disqus does that). Things can quickly turn dirty.


The main reason I ever see this sort of thing is because my (modest, but not crazily slow) machine doesn't have enough memory—it only has 1GB, which definitely isn't enough for multiple tabs with typical bloated websites these days. If I've got gmail, youtube, and a few other tabs open, my machine will happily thrash its little disk drive to accommodate them.... ><

[This is especially true if you're using chrome, which tends to use more memory per tab than other browsers because of its process-per-tab architecture (which is nice because of the isolation and controllability it gives, but it does make it more difficult to share memory between tabs).]

If I were in charge (I'm not because it's a work machine), the first thing I'd do is buy more memory.


I'm running an i5 w/ 6 Gig of RAM. I'm pretty obsessed with keeping my machine clean as well, so I'm not entirely sure if it is the slow machine, but idk really.


This happens to me sometimes in Opera, Chrome and Firefox, and it's extremely irritating.

I remember the good old days when dual-core processors were just starting to become mainstream, and browsers would just use one core. The browser could crash all it liked, but the system would always be responsive. But now browsers take advantage of all cores, meaning that if they crash, they can block your entire system.

I wish there was a way of telling the browser to strictly use only K cores on Windows. I'd instantly switch to a browser offering that feature.


Not a Windows user here, but I remember that you can do something like this using Task Manager. Just right click on the process and see if you can set the CPU affinity or something similar for your browser.


Fantastic tip, thanks! Right click the process and "Set Affinity..."


What OS/browser are you using? I have no idea what you are talking about. The only time I have ever seen anything like what you are describing is when maybe something is messed up with flash or an external pdf viewer. But even that doesn't freeze up the mouse.

Also I am a pretty heavy browser user, I have 20+ tabs open on average and a bunch of plugins.


Usually use Chrome on a Win7. I've seen this happen about once a day or every other day. Comment got upvotes so I guess I'm not alone in this experience?


This happens to me with Chrome on Mac OS.


This doesn't happen to me either. Same environment. Unless you can provide a reproducible test case your comment is baseless.


I also experience that randomly, using Chrome on Win7 and Snow Leopard. The freezing can last up to 30 seconds; sometimes I think my system has hung.


If a web page can freeze your whole OS, I'd think there's something wrong with your OS...


I use firefox and chrome. Does this also happen to anyone using ff?


Happens a lot to me on Firefox / Windows 7.


Are you sure it's not just flash (from e.g. ads) acting up? On my win7 laptop the only time I saw pages freeze was due to a plugin conflict where somehow flash managed to install itself twice as a plugin; disabling one of these would usually fix it. Or, you know, just try disabling flash altogether and see if there's a difference.


Google apps locks up frequently on my atom netbook.


The blogger "javascript gears" loading animation from Google is even worse.


the blogger "javascript gears" is one of the best examples of "javascript gone wrong" on the internet. if you just want to display static text, use HTML. if you want to offer functionality, use front end javascript. whenever i see something like "use front end javascript to show non interactive text" then i know something went terribly wrong.


This gets especially bad on mobile.

I can sit on a mobile connection under bad conditions, barely capable of pushing 10kb/s. But you know what? The raw article text isn't going to be more than those 10kb.

Anything exceeding that is fluff, and when you need to download 5MB of webfonts and god knows what else to read those 10kb, something is horribly wrong.

Good presentation is good, I don't disagree with that, but when you go down that route, make sure that the actual content gets the priority in loading and rendering, and not fluff.


Have you noticed how TechCrunch is a step ahead of this mess?

It shows the text quickly, but then hides it, only to show it again, unchanged, a few seconds later. It must be a clever trick to make you get disappointed with their web devs and to anchor your attention there, so that you won't be as critical of their editors when you finally get to read that blinkentext.


They probably use something like Google's WebFont Loader[1] to avoid the delay in text visibility when using web fonts.

[1] https://developers.google.com/webfonts/docs/webfont_loader


That's not entirely true - this is what I had to see during the wait http://i.imgur.com/8NVrkLZ.png?1


When web fonts came on the scene, I was enthusiastic to finally see headlines rendered as images, or even worse, those sIFR Flash headlines, disappear.

But now people are using custom fonts even for body copy, and with this whitespace problem, it's just as bad.

Webdesigners will always find new ways to sacrifice usability.


custom fonts shouldn't be used for body copy?

Shouldn't technology be bending to our will? Not our will to technology?


Not currently. Many custom fonts look atrocious in the 10-14pt range in Chrome on Windows due to aliasing issues, and that's such a large slice of any particular market that you can't really do that.

It would be nice to get that sorted out, but for the time being, you have to be very careful about your body fonts.


>custom fonts shouldn't be used for body copy?

No.

>Shouldn't technology be bending to our will? Not our will to technology?

It has nothing to do with technology, it is about user experience. Pushing a crappy font on me because you think it is pretty does two things: it makes you feel good about yourself, and it makes it harder for me to read your site. This is the same narcissistic designer syndrome that gave us Flash intros back in the day.


Erm... I think you are just not a designer yourself and don't understand the difference custom fonts can make to the 'readability' and overall feel of a website.

Using custom fonts for headlines and then a 'helvetica' variant to mix in on copy only gets you so far.

I'm not a designer, nor a narcissist, but I've seen body copy fonts used well.


I thought web browsers were supposed to use a built-in font until the web font resource(s) were finished loading?

When I use Opera, the pages render and then a second or so later, the font faces change. It's a little jarring, I admit, but at least the text is rendered.


It was my impression too. According to one of the comments on SuperUser, Webkit based browsers like Chrome don't show any text until the font is loaded.


Correct; Chrome doesn't do it, to prevent the FOUC (Flash of Unstyled Content) that so many designers toiled so hard with JavaScript hacks to work around.

There are several things you can do with varying degrees of effectiveness:

1) data-uri encode your fonts and shove them into your CSS files directly (a quick sketch follows this list). By the time your CSS shows up, your fonts are loaded. Downside is that your CSS grows substantially as a result.

2) Use a CDN (ideally a grown-up CDN that does geo-targeted delivery) and get your fonts downloading to the client ASAP by declaring them as early as possible. Use aggressive caching headers.

3) In all cases, consider using font subsetting to reduce the size of the font package you have to deliver.

4) In all cases, be aware of how browsers load fonts, to ensure that you don't end up downloading, say, a massive .ttf file just to throw it away and download a .eot file that is a quarter the size.
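A hedged sketch of option 1 (the family name is made up and the base64 payload is truncated; in real use it makes up most of the stylesheet's weight):

  /* fonts.css -- the face travels inside the stylesheet, so text and font arrive together */
  @font-face {
    font-family: 'BrandFace';
    src: url('data:application/font-woff;base64,d09GRg...') format('woff');
    font-weight: normal;
    font-style: normal;
  }
  body { font-family: 'BrandFace', Georgia, serif; }

  /* for option 2, serve the separate font files with long-lived headers instead, e.g.
     Cache-Control: public, max-age=31536000 */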


The problem with these strategies is that most of the fonts being used are from places like FontDeck and TypeKit, and are paid for by the impression (or a monthly/annual license fee including X impressions). That means that the font requests are tracked (and requests have increased latency) and aren't cacheable. Actually purchasing the font for unlimited web use (something that's normally prohibited in the font's standard usage license) can be godawful expensive compared to the foundry's monthly plans, at least over the short term (1-2 years).


This is really just cost analysis, though. Page speed is money at some point. There are foundries that will sell you Web font licenses for dozens of dollars; unless there is a specific need for a high dollar font, it is very possible to do it on the cheap.

Designers are picky, but the boss pays the bills at the end of the day, and chances are he will settle for a slightly different font if it saves him thousands of dollars per year.


Hahahaha oh god give me a few minutes to stop laughing. The designers are so stick-in-the-mud about everything being perfect that they make the site invisible until everything is loaded, and then the user gets to stare at a blank page for 20 seconds, and it's not an accident, there's even a term for the situation.


jakobe said it best above ( http://news.ycombinator.com/item?id=5181226 ):

  > Webdesigners will always find new ways to sacrifice usability.
I even see some of them defending the practice in this discussion... 'f the users -- it looks good/strong branding/etc.'


While I agree with the quoted part, custom fonts aren't necessarily problematic. But you have to make sure they render well and don't delay the page.


I too am all for nice web design that puts the user first. It's definitely possible and positive for the web; it just takes (sometimes a lot of) extra work and people are too lazy to test everything.


Yeah. "Progressive enhancement" is a dirty term depending on who you talk to.


> Webkit based browsers like Chrome don't show any text until the font is loaded.

Imagine how fun that is while loading pages on a flaky cellphone connection.

Especially when you know what's going on, it really makes you hate a website in ways I'm sure they won't be able to imagine.


I'm a web developer who implements designs given to me. Most often in modern times they include custom fonts. I always just ignore this and go with the closest system-available equivalent font.

Reasons:

- Delayed rendering of text (as per OP's link)
- Adds a considerable extra payload to download before the page is rendered
- UX and responsiveness on a site is way more important than the first impression of a nice font, IMO

Yes, I agree fonts can help readability, but the standard fonts are not that bad.


>fonts can help readability

Yes, this is true. But it seems that many custom fonts are not designed for being rendered in a browser.

Recently, I have seen pages using fonts which look blurred, fuzzy or have a weird shape. These fonts are used for the main content (copy text). There must be people who notice that this is hard to read, but nobody seems to say anything. Some days ago I was told that "it looks expensive".

I am also affected by occasional browser and OS freezing due to webfonts. The freezing may last for an eternity (subjective) and is accompanied by an intense feeling that I have lost control. A user experience can't be worse.


Oftentimes there is a modern web-friendly replacement. I've got an idiot designer who specified Grotesque MT, designed in 1926. It breaks up and doesn't scale well over pixels. Even the Photoshop mockups looked jittery. I've swapped it for Adelle Sans. He wouldn't even have noticed; they are close enough.


It could be that the fonts you saw render better on one platform or another. Windows and OS X render text differently enough that it happens sometimes.


The text is hidden on purpose by the frontend programmer because he wants to avoid a phenomenon called "Flash Of Unstyled Text" (FOUT) that happens with embedded fonts (@font-face). For more info: http://paulirish.com/2009/fighting-the-font-face-fout/


That is incorrect. As the article you link to states:

> Webkit takes a very different approach, and very intentionally. They believe it's better to keep the text invisible until the font is ready. This way, there is no moment where the text flashes into its newly upgraded self.

Google Chrome, the subject of the original linked post, is built on WebKit and thus inherits this behavior.


> That is incorrect.

It looks correct to me, you're both saying the same thing. It's not the site's frontend developer, it's the browser's, but it's for the same reason.


It's funny that the example site doesn't use a single built-in font.

I can understand the desire for a nice menu or heading font but the whole website?

Here's that page:

http://portableapps.com/apps/development/notepadpp_portable

Looks like it uses Ubuntu font-face for everything. It's a nice font but Arial would have been fine too.

Also looks like the font is hosted on a third party site googleusercontent.com so I guess they are hoping you've cached it before.


On this system (Win 7, 8 GB RAM) with Firefox 18.0.2 I see that missing text effect even on reload.

In Firebug I can see that Google returns a 304 for the Ubuntu font files, but the browser only requests them about 3 s into loading the page. Really irritating.

After installing the font locally, it's gone.


The font files aren't that big either, and as you note are using googleapis.com.

The page is 600k and the fonts are ~160k; primed cache page-weight though is ~6k.

I wonder if the use of the Ubuntu font really makes any difference to their users; an A-B test would be interesting.

As Ubuntu is free, wouldn't it be useful if the browser offered to install the font locally?


Having a strong brand is not something you can measure via A/B testing, it's a long-term commitment.


Yes.

This wasn't clear.

>I wonder if the use of Ubuntu font really makes any difference to their users //

What I meant was I wondered if using Ubuntu had any negative effect. Personally I think it adds to the site but then I've Ubuntu font installed.


It is extremely difficult to deal with this issue on the developer's side.

My solution has been to embed the font, in the most commonly used format, directly into the CSS. Then I gzip it and put it on a CDN. My blog now loads in under 200ms almost anywhere, and there is no flash of unstyled text.

But I realize achieving those load times may be hard for a heavy web application/site.


"Give thanks to nature, the bountiful, because it has made necessary things easy to procure, while things hard to obtain are not necessary."

-- Epicurus


Arguably, things are promoted to being necessary if they are easy to procure. Otherwise they remain optional.

-- V.Pupkin :)


Wow, from an evolutionary point of view that actually makes sense.


More like: if you require rare things, you are less likely to live long enough to reproduce.


How fast does it load on a cellular connection in the country it is hosted? It would be great if it loaded in <200ms on fast cell networks.


I'll try that tonight when I have access to my Mac (remote-debugging with dev tools).

Until then, I can tell you approximately that the DNS lookup takes 5 seconds, the server/CDN handshake takes 1 more, and the loading/painting is near-instantaneous.

Will post solid numbers tonight.


To everyone saying CSS3 webfonts are anathema to good sense and usability:

As engineers, we're sensitized to performance implications[0]; the added value of not using Arial is a more difficult thing to quantify. Many businesses perceive that value to be significant - behaving as if it doesn't exist isn't useful or pragmatic.

Still, on mobile you'd have to be completely insane to incur download time and latency x2 for the CSS and font files before showing any text. If you need it that badly, take the hit and force a FOUC ("flash" in this case a misnomer). When reading HN on the go I frequently wish CSS had never been invented.

Most desktop sites though can happily get away with a bit of bling type. I'd wager no one would notice if they were more careful with what they show prior to the font loading (hiding the bullets from lists, for example). In the battle between web performance and web functionality, the onus is as always on making intelligent conscious choices that represent the right tradeoffs for your objectives.

[0] http://at.cantl.in/nerd-stuff/2012/11/29/fast-page-start.htm...


Yes, that's on my list too, but the top of my annoyance list are sites that play an advertisement (audio and/or video) that can't be disabled. The advertisement always starts before any useful content appears.


There is actually a very simple solution for this: use the JavaScript include method and place the code at the end of the body tag.

This is what happens technically: the CSS will get loaded, and the custom font will not be found. The browser will then use the fallback font. As the rest of the page gets loaded, so will the JavaScript function. This will download and include the custom font, therefore changing the fallback font to the custom font.

The change of font may be quite disturbing, but hey, at least people with slower connections can read in the meantime.
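Something like this, as a rough sketch (the stylesheet path and family are placeholders):

  <body>
    <!-- body text renders in the fallback family declared by the main CSS -->
    ...
    <script>
      // runs last: pulls in the @font-face stylesheet after the content is already readable
      var link = document.createElement('link');
      link.rel = 'stylesheet';
      link.href = '/css/webfont.css';   // contains the @font-face rule for the custom family
      document.getElementsByTagName('head')[0].appendChild(link);
    </script>
  </body>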


Is there a way to block such annoyances? Something like NoScript for CSS (or actually for third-party connections, since the problem is not actually CSS but slow font/ad/whatever servers)?


RequestPolicy for Firefox blocks all requests to other domains. I sometimes think it should do sub-domains too. When you use this you will see just how many assets websites load from or place elsewhere.


Indeed, RequestPolicy, AdBlock+ and NoScript are the essential default plugins, both from a security and usability perspective.

But sometimes the content will not load at all, because to show an article's text, the page uses Google's ajax, webfonts, a cdn, and the blog's master site @ wordpress.com or blogger.com etc. In that case, the content has to be really interesting for me to allow all those third-party connections.


I adore RequestPolicy in principle, but after a year I just couldn't take it anymore. NoScript was better, it took me five years until I was fed up with constantly whitelisting every new site I visited and playing guess-which-sites-are-essential roulette.

I really, really wish there existed versions of RequestPolicy and NoScript with large and comprehensive default whitelists. Until that day, Ghostery is all that I have the patience to put up with.


Yes. If a website won't even show basic page text without allowing various things I will leave immediately.


Check the RequestPolicy preferences page; you can do sub-domains. Fair warning: it's a lot more fiddly to configure your whitelist that way.


Thanks. I'll try and remember this next time I think I need it.


Ghostery blocks everything third-party, CSS or JS.


horrible bloated javascript frameworks. if your page is not interactive, it should be done in just css. if you are loading 5 js files just to display a couple paragraphs of text, there is a problem.


I completely agree. Google's Blogger service is one of the worst offenders. I've been to so many blogspot.co.uk sites where absolutely nothing loads if Javascript is disabled. Once you enable Javascript, often a page of text with no visible interactivity loads. Or the interactive elements are for the sidebar widgets, not the content of the page.

Javascript is simply being ridiculously overused and few people seem to call for it to stop.


I ran into this issue as well when working on https://starthq.com

The font file is loaded the first time any text using the font is displayed, and I am using Font Awesome for the icons, which are only visible when opening a drop-down menu. As a result there was a very annoying flicker the first time the menu was displayed.

The somewhat hackish solution was to include one of the icons that uses the font as part of the logo in the upper left of the page, which is rendered on initial page load.


An equally hackish alternative would be to have a floated div way off screen using the font to force loading (works for image carousel pre-loading too, but with a pretty obvious load time hit)
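Something like this (the icon class is whatever glyph the menu actually uses):

  <!-- parked off screen so the icon font downloads during the initial page load -->
  <div style="position: absolute; left: -9999px;" aria-hidden="true">
    <i class="icon-ok"></i>
  </div>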


I've always wondered why it takes forever to load most pages. I'm running Debian on an SSD with 32G of RAM and a video card that can handle anything, yet Iceweasel, regular FF and Chrome all take forever to load pages, like I'm using 1990s internets. Blocking all JavaScript and tweaking the guts of FF to load faster still results in Netscape Navigator, 33.6kbps-era speed to render text.


Anyone remember that in the old IE days, you had to mouse over to see the image loading on buttons?


You were still able to preload them even in the "old IE days", it was just a bad practice :)


Why? Because they're too busy deciding what ads to serve you. ba dum-bump tish!


It certainly looks that way from the user's perspective. You're sitting there, waiting for the actual text to load, yet the ads are already blinking and jumping all over the place.

An all too common experience.


What's REALLY annoying is websites where lots of layout has loaded, and no text (or only some of the text) shows pending a font load which could take seconds. This doesn't look "fast," it looks janky.


Most people would consider this a significant performance improvement on "nothing shows up for several seconds".


tl;dr web fonts



