
A 12pt Font Should Be The Same Size Everywhere - kickingvegas
https://github.com/kickingvegas/12pt-should-be-the-same-everywhere/blob/master/absoluteMeasurementDPI.md
======
abalone
"My appeal is to software and hardware developers to ensure that a 12 point
font will be rendered the same size everywhere, regardless of screen size and
density."

This is not a very well-reasoned appeal. If it were implemented, every Keynote
presentation would be unreadably small when projected onto the big screen,
literally the same size as the presenter holding up a piece of paper for the
audience to read with their binoculars. Likewise, no more than a few
characters at a time would be visible on a phone.

There is simply no serious counterargument; nobody can say making things
unviewable would be a step forward.

The best you can do is complain that the _name_ of the units is misleading.
Fine. So what about introducing a new unit that is truly faithful to a real
world measuring stick? You can argue for that all you want, but guess what?
Nobody in their right mind would actually use it, outside of exotic scenarios,
because generally you want to avoid the above effects. You want your user
interface to be visible to users at different screen sizes meant to be viewed
from different distances.

Thus, even if we were to introduce a new "TruePoint" or "TrueInch" unit, it
would be largely... and I beg you to excuse the pun here... pointless.

~~~
Jach
I agree that a point's _physical rendering_ size shouldn't always be the same
(a point should be more general), but I still think there is some utility in
having things defined in real-world units show up as real-world units when
measured on glass at the same z. An inch is an inch, and if it's too big for
the display it should be clipped and if it's 100 feet away it should be
unreadable. It would be useful for classes of devices--instead of special
casing for resolutions, ratios, and densities, special case for
desktops/laptops, phones, tablets, home projectors, conference projectors,
etc.

But what's really missing is the explicit presence of a projection surface and
the ability to adjust the factors that go into the calculation when
projecting. It would be nice to have a unit that lets you say "I want this
item to be as tall as the subjective inch if a person holds a ruler one foot
away from them." That way no matter how close or far away I'm looking at the
item (assuming the display is large enough--too small and too far will get it
clipped, too few pixels will get it blocky), if I hold a ruler one foot away
from me the item will be "as tall" as the ruler's inch mark. (Edit: I'm having
fun trying to visualize a powerful enough 64x2 display built on a mountain.
Where I grew up, there is just a single letter. From afar:
http://upload.wikimedia.org/wikipedia/en/d/d9/Little_mountain_utah.jpg
And from on top of it:
http://2.bp.blogspot.com/-QeeaDcCiD4o/TexykEujKiI/AAAAAAAAHRA/DQSDreeonko/s320/2011+1914.JPG)

~~~
lloeki
> "I want this item to be as tall as the subjective inch if a person holds a
> ruler one foot away from them."

Sounds like you're looking for arcseconds.
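
A quick sketch of the geometry lloeki is alluding to (Python, with the numbers from Jach's example): the "subjective inch" is just a visual angle.

```python
import math

# Visual angle subtended by an object of a given size at a given
# distance (both in the same unit; inches here).
def visual_angle_deg(size, distance):
    return math.degrees(2 * math.atan((size / 2) / distance))

# The "subjective inch": a 1-inch mark viewed from 1 foot away.
print(f"{visual_angle_deg(1.0, 12.0):.2f} degrees")  # ~4.77 degrees
```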

------
crazygringo
Well yes, 12pt should be the same everywhere, but points are a terrible unit
of measurement for anything computer-related.

Points are useful on paper only. They should only exist in the context of word
processing and the like, where we expect things to be printed in an actual
physical size, with 72 points to an inch.

For computer interfaces or web documents, we just need some kind of
measurement that is a relatively known proportion to the computer or browser's
interface. Fortunately, CSS "px" does that rather well -- it doesn't indicate
physical pixels, but rather logical pixels, and we all know what "16px" text
looks like relative to our OS, and it works great. And happily, people rarely
use "pt" in CSS, and in my opinion it should never have been an option in the
first place.

~~~
endtwist
That's like telling me points are a terrible unit of measurement for print
because we use different sizes of paper and points should be relative to the
paper size.

The _whole goal_ of points is that a point is a point is a point, regardless
of whether I'm printing on a business card or a billboard. Why can't we have
that uniformity for digital devices, too?

~~~
Scene_Cast2
Just curious, but why not inches / cm instead of points?

~~~
andos
An inch is too big. A centimeter is too late.

~~~
repsilat
Real metric users would tell you to use millimetres instead of centimetres
anyway. Powers of 1000 and all that.

(Real metric users would also spell it "centimetre", but if adopting the
American spelling will expedite a changeover I think it's a fair trade.)

~~~
andos
A millimeter (which I rather often call “milímetro” or “millimètre”) is also
too late. By the way, the SI uses powers of ten.

------
wmf
You seem to be taking it as given that the size of stuff should match between
different displays, but you don't give any reason why that's desirable. I
understand the argument for 1:1 WYSIWYG in DTP, but not for UI. I haven't been
doing much DTP in the last decade.

Some people prefer larger UI and some prefer smaller — partly due to
differences in vision and partly just preference — and today's industry lets
them choose. Your proposal takes away that choice and thus is guaranteed to
anger people.

~~~
nkwiatek
First of all, whether or not something offends people is rarely the best
reason to do something.

Secondly, "choice" is a generous way to describe it. It's more that sizes
happen basically of their own will, whether the user likes it or not. At
least if inch measurements were respected, the designer could be deliberate
about a decision. This would not invalidate user stylesheets or zooming; I
can't fathom a single disadvantage here.

~~~
wmf
I'm more concerned about UI widgets where user stylesheets and zooming tend to
not exist (and zooming would produce blurriness if it was used).

------
PaulHoule
scary... my experience is that the more platforms try to get this kind of
thing right, the more they get it wrong.

back in the 90's, photoshop would try to do all sorts of gamma correction on
images you were editing with the consequence that, if you didn't turn it off
and make sure it was always turned off, you'd get your colors wrong 100% of
the time.

a system like that only works if all the pieces are correctly configured, and
the consequence of a misconfiguration is that instead of having something
that's 4.1 inches on one platform and 5.2 on another, you have something
that's 12.7 inches on one platform and 1.4 on another.

~~~
gcb
Gimp had the only cheap, effective, backward-compatible solution:

There was a setting that showed a ruler on the screen; you could scale it to
match a ruler you were holding against the screen.

It disappeared after a while...

~~~
ralph
I don't think this would be necessary if folks ensured their X server's
Screens' resolutions were configured correctly.

    
    
        $ xdpyinfo | grep -A 2 '^screen #'
        screen #0:
          dimensions:    1920x1080 pixels (483x272 millimeters)
          resolution:    101x101 dots per inch
        $
    

Then all X clients would know how many pixels represented a physical
measurement on the screen.
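
As a sanity check of the xdpyinfo output above, the advertised resolution follows directly from the reported pixel and millimeter dimensions (a minimal sketch):

```python
# Derive DPI from the screen dimensions xdpyinfo reports.
MM_PER_INCH = 25.4

def dpi(pixels, millimeters):
    return pixels * MM_PER_INCH / millimeters

print(round(dpi(1920, 483)), round(dpi(1080, 272)))  # 101 101
```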

~~~
gcb
And how do you configure it? Most LCD manufacturers put wrong info on their
labels and in their DDC data.

~~~
ralph
One uses

    
    
        Option "DPI" "96 x 96"
    

in an xorg.conf or similar file; this overrides DDC from the monitor. Note,
not all monitors have the same density of pixels in both directions.
[https://wiki.archlinux.org/index.php/Xorg#Display_Size_and_D...](https://wiki.archlinux.org/index.php/Xorg#Display_Size_and_DPI)
has more.

~~~
gcb
but my monitor does not have the real DPI the label on its back says...
configuring by DPI is moot.

i read a little more on the link you provided... and the closest to the gimp
old way of doing that is putting the width and height of the visible area of
the monitor in mm. i think that might work rather well if it's correctly
implemented.

~~~
ralph
I wasn't suggesting reading a label. One measures the visible picture in real
life, e.g. ruler, and calculates the DPI. CRTs didn't have a label on the back
and, anyway, one could adjust the picture size.
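
The calculation ralph describes is trivial; a sketch, with a hypothetical measured width:

```python
# DPI from a ruler measurement of the visible picture, rather than
# from a label or DDC/EDID data.
def measured_dpi(pixels_across, measured_inches):
    return pixels_across / measured_inches

# e.g. a 1920-pixel-wide picture that measures 19 inches across
print(round(measured_dpi(1920, 19.0)))  # 101
```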

------
ChuckMcM
It would be nice; it won't happen of course, but here's to hoping.

The challenge isn't typography, it's people. People who want their document to
be as wide as their phone on a phone and as wide as their tablet on a tablet,
and not quite as wide as their screen on their desktop or laptop. If you force
them to compute their 'zoom factor' they get annoyed. People who give them
what they want, get their business. And typography continues to suffer.

~~~
gcb
That's not the point of the article. Whether or not it would be used badly
isn't what's being discussed.

The point is that even if it could cure cancer, it's simply not possible on
any display today.

------
brittohalloran
Github repo as blog post?!?! How's that for some transparency in your edits:

https://github.com/kickingvegas/12pt-should-be-the-same-everywhere/commits/master

~~~
jherdman
And if you want to get really fancy, there's Github Pages --
<http://pages.github.com/>. So, you get all of the benefits of the repo
approach, but your pages can be all sexy and stylish.

------
dpark
This sounds like a good idea at first[1], but then you realize that text is
often accompanied by graphics that are not vectorized. Resolution independence
when coupled with raster images is a _hard problem_. You can render the text
at the desired size fairly easily, but you can't resize a bitmap arbitrarily
without it looking pretty terrible. This is why resolution independence hasn't
happened for displays despite lots of attention. It's also why Apple just
doubled everything to keep it simpler.

[1] Actually, it might not even seem like a great idea at first if your first
thought is to contrast your phone screen and TV screen.

------
px1999
I don't think I see the point, aside from historical reasons and printing
(though that one's already been solved quite thoroughly...)

The beauty of electronic displays is that they can be tailored to your
specific wants and needs. If I want to view your text, I should be able to
view it at a size that's comfortable to me on whatever device I want to read
it on. If I'm not reading it, and am instead waiting for updates, I should be
able to scale it down and put it somewhere in my workspace that doesn't take
up too much room or distract me.

To have an edict that says 12pt must always be 0.4233cm or 0.16667in (quite a
ridiculous measurement) achieves nothing. What if I want to display the
resource you built for display at 12pt on a 24 inch monitor? A 6 inch phone? A
projector? What if I have bad vision and want it larger? Suddenly, 12pt needs
to get multiplied by some arbitrary factor and it's lost all of its usefulness
again.

My short answer is that it's not necessary unless you're dealing with
something static and physical.

~~~
asdfsdafds
The basic solution is fairly simple (and I'm surprised there's not much
mention of it here): use something other than "point". In an ideal world,
there would be three measures:

Points/inches/mm/etc. - used _only_ for displaying content which must match
the physical world. For the most part, this means "page preview" and when you
want to display some image at "actual size".

Arcseconds/radians/etc. - used for displaying content that is intended to fill
a certain field of view. The vast majority of content.

Pixels - used for fitting to pixel grids and 1:1 display, or other scenarios
where scaling artifacts are not acceptable.

There is a considerable amount of fiddling to do with this, especially with
regards to user-configurable scaling (which would be extremely important for
arcseconds/etc., given that field of view depends on distance to device), but
the basic need for some angular measure seems obvious.
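
A sketch of how such an angular unit could map to device pixels, assuming a per-device-class viewing distance and a known density (all numbers here are illustrative, not from any spec):

```python
import math

# Convert a desired visual angle into device pixels, given an assumed
# viewing distance and the display's density.
def angle_to_pixels(angle_deg, viewing_distance_in, dpi):
    physical_in = 2 * viewing_distance_in * math.tan(math.radians(angle_deg) / 2)
    return physical_in * dpi

# The same 0.3-degree glyph height on a phone (12", 326 dpi) and a
# living-room TV (120", 40 dpi):
print(round(angle_to_pixels(0.3, 12, 326)))  # 20 px on the phone
print(round(angle_to_pixels(0.3, 120, 40)))  # 25 px on the TV
```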

------
xmmx
This will lead to more harm than good. Sure, designers can use it to their
advantage to create great pages, but some people who don't account for this
will make their websites so their 12mm font looks great on their 4000px
monitor but everything looks fuzzy on my 1024 px screen. And oh god the
scrolling when I try to use my ipod touch to browse the site.

~~~
gcb
that's what they do already. i'm totally missing your point: how does this not
happen with px? and em and % are also based on px...

------
homer-simpson
For your reading pleasure, here is an (imho totally flawed) argument by a
Mozilla developer who thinks otherwise:

http://robert.ocallahan.org/2010/01/css-absolute-length-units_12.html

------
lstroud
Shouldn't we just switch to dimensional measurements like millimeters, inches,
etc., and expect the operating system to use whatever metric it chooses (pt,
px, em, etc.) to display it at an appropriate size?

I think the problem with this is that developers (me included) have enjoyed
using resolution to change the physical size of components in order to gain
screen real estate. Perhaps, shifting to a zoom/scale setting would be a
better approach than tying ourselves to resolution.

Humorously, it would be kinda like a print guy measuring font size in fibers
and the size of the type changing based on fiber density. :-P Then again, they
probably do that.

------
wnoise
No. I want to be able to make slides that work on projectors. The only unit
able to handle this would be "angle based on standard viewing distance", which
will require markedly different "actual lengths" for different devices.

~~~
Cushman
It has been pointed out before that the CSS `px` is such a unit:
<http://inamidst.com/stuff/notes/csspx>

(Although I seem to recall the discussion here was skeptical of the idea.)
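
For reference, the CSS 2.1 reference pixel is defined as the visual angle of one pixel on a 96 dpi device viewed from an arm's length of 28 inches; a quick check of what that angle comes to:

```python
import math

# The CSS 2.1 "reference pixel": the visual angle of one pixel on a
# 96 dpi device held at the spec's nominal arm's length of 28 inches.
ARM_LENGTH_IN = 28.0
PX_SIZE_IN = 1.0 / 96.0

angle_deg = math.degrees(2 * math.atan((PX_SIZE_IN / 2) / ARM_LENGTH_IN))
print(f"{angle_deg:.4f} degrees")  # ~0.0213 degrees
```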

------
splicer
A point is not necessarily 1/72 of an inch. TeX, for example, uses 1/72.27 by
default. See <http://www.oberonplace.com/dtp/fonts/point.htm> for details on
the different definitions of points. Even once you've settled on a definition
of point as a unit of length, there's still the problem of defining what a "12
pt font" means; does it refer to the cap height, the length of an em-dash, or
something else?
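
To make the difference concrete, here is what "12 pt" comes to under the two common definitions splicer mentions (a small sketch):

```python
# Physical height of 12 pt under the PostScript/DTP point (1/72 in)
# versus the traditional TeX point (1/72.27 in).
MM_PER_INCH = 25.4

def pt_to_mm(points, points_per_inch):
    return points * MM_PER_INCH / points_per_inch

print(f"{pt_to_mm(12, 72):.4f} mm")     # 4.2333 mm (PostScript point)
print(f"{pt_to_mm(12, 72.27):.4f} mm")  # 4.2175 mm (TeX point)
```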

------
silon3
For the web, users must be able to override font size independently of layout
(all fonts, not just some of them).

The zoom-text-only feature in Firefox makes it much better than Chrome for me
(even IE has a really poor man's implementation of this).

------
teilo
"Font-hinting becomes less necessary at 200 PPI and higher."

Umm, no. Font hinting is still important at 300 DPI, so it's also still
important at 200 PPI.

------
gcb
i was fighting with the same thing the other day... then i realized, i do not
look at my 24" monitor from the same distance as my 4" phone screen.

So, which size should the font be if i'm designing for both those screens? pt
sure isn't the answer.

this being only tangentially related to the topic :) so, back on topic, yes, i
wholeheartedly agree that pt should mean what it means. it's just absurd that
it doesn't. and you are not even accounting for TVs, which cut off random
portions of the displayed image (overscan) for no reason, making it even
harder to calculate the real dpi.

~~~
ajross
Ironically there _is_ a straightforward answer: arc distance. What angle does
the font (or pixel, or whatever you're trying to scale) subtend from the
position of your eye? This always works. It's sometimes hard to define for
some devices (e.g. TVs, which might be used at widely varying distances), but
even then the world has come up with standard conventions (e.g. assume you're
10' from your 40" TV, assume your phone is about 12" away, assume the controls
in your car are at a 30" arm length...).

And, of course, it's not implemented anywhere. So we all suffer with "dpi" and
"px".
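
ajross's convention can be sketched directly: fix the angle once, then solve for the physical size per device class (the 12" and 120" distances are the ones he assumes above):

```python
import math

# Physical size an element needs on each device to subtend the same
# visual angle from the assumed viewing distance.
def size_for_angle(angle_deg, distance_in):
    return 2 * distance_in * math.tan(math.radians(angle_deg) / 2)

# The angle a 12 pt (1/6 inch) glyph subtends on a phone held 12" away:
angle = math.degrees(2 * math.atan((1 / 6 / 2) / 12))

print(f'phone at 12": {size_for_angle(angle, 12):.2f} in')   # 0.17 in
print(f'TV at 120":   {size_for_angle(angle, 120):.2f} in')  # 1.67 in
```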

~~~
Someone
According to CSS 2.1, px _should_ be defined in terms of an angle:
<http://inamidst.com/stuff/notes/csspx>.

~~~
ajross
That's fun. Obviously that's not how it works; 1px is 1 physical pixel
everywhere I've ever seen. Think of how much would break if something decided
to do this "right" ...

~~~
cma
On retina displays it is two pixels.

~~~
thebigshane
I haven't paid too much attention to this but shouldn't it be a square of four
pixels? That would be true double resolution, right?

Or is there some pixel trickery like those olpc displays?
[<http://wiki.laptop.org/go/Display>]

~~~
reitzensteinm
px is a measure of length, not area.

A 12 px high font is 24 px high on a retina display.

A 12 px by 12 px box is 144 px^2, or 12 x 12 x 2 x 2 = 576 px^2 on retina.
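
reitzensteinm's arithmetic as a one-line check (the scale factor of 2 is the retina assumption from this thread):

```python
# px measures length, so a 2x device-pixel ratio doubles each side
# and quadruples the area.
scale = 2                   # retina device-pixel ratio
side_px = 12 * scale        # 24 device pixels per side
area_px = side_px ** 2      # 576 device pixels total

print(side_px, area_px)  # 24 576
```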

------
drivebyacct2
Huh? I sit at a different distance from my monitor than my phone is from my
face. I buy high resolution monitors, not because I want the "OMG the circle
is a smoother circle" sense of a Retina display but because I want greater
density. If we're nitpicking about "pt" in particular, fine. But if we're
saying we should design UI elements and text specifically so that it's always
the same size, count me completely out.

