

Dots Per Inch - bootload
http://www.tbray.org/ongoing/When/201x/2010/06/07/DPI

======
KirinDave
I'm not sure what logical fallacy I'm committing (perhaps genetic fallacy?),
but whenever Tim Bray talks about photography, images, typography, or really
_anything_ involving visual appeal I close the window.

Look at his blog. Look at his photography. Look at his outputs. He is a
competent writer with technical knowledge, but when he starts making
aesthetic arguments I think he's out of his depth. Especially when he uses
other people's informed decisions to justify his uninformed ones (e.g., JDD's
choice of monitor being good enough for him).

That'd be fine if this article were a researched technical argument, but it's
not. This article amounts to "120-ish seems good enough for me, and I got this
stuff from JDD, and I seem to remember old printers being good at..."

~~~
jacobolus
Seriously. Where does he come up with nonsense like:

> _I suspect that unless you’re preparing high-quality print work, there’s no
> appreciable benefit for a photographer, and perhaps some downside, in an
> ultra-high-res display._

Count me among (amateur) photographers who would just love to have a 300 DPI
display on which to work on photographs.

Here for example is one quite obvious appreciable benefit: I can see at full
resolution what my prints will look like with varying amounts of sharpening
applied, in a roughly accurate simulation. Previously, if I opened an image up
at 100%, the proper amount of sharpening for print would make an image look
extremely “crunchy” on screen, because 100% on screen has pixels three times
as big as a 300 PPI print resolution. If instead I looked at the image scaled
to 50% on a typical display, I could no longer notice any difference from
fine-resolution sharpening, because the screen display would be interpolated.
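
The scaling arithmetic behind that can be sketched as a quick check (the ~100
PPI monitor figure is an illustrative assumption, not a measurement):

```python
def print_preview_zoom(screen_ppi, print_ppi):
    """Zoom factor at which one print inch occupies one screen inch."""
    return screen_ppi / print_ppi

# On a typical ~100 PPI monitor, a 300 PPI print previews at true size
# around 33% zoom; at 100% zoom each print pixel is drawn three times
# too large in each dimension.
zoom = print_preview_zoom(screen_ppi=100, print_ppi=300)
magnification_at_100pct = 300 / 100

print(round(zoom, 2))            # 0.33
print(magnification_at_100pct)   # 3.0
```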

More than that though, I’d love to have a 300 DPI tablet-scale device to show
off work on. Photographs would be absolutely gorgeous.

~~~
sesqu
So your argument is that 300 PPI displays are great for processing photos you
intend to have printed at 300 DPI. I have no problem with this claim, and I'll
extrapolate to "better for >= 300 DPI prints".

So, I guess the obvious question is: at what DPI do various prints come out?
And since almost all electronic displays have far less than 300 PPI, would you
agree that web designers do themselves a disservice by getting a
high-resolution display?

~~~
jacobolus
Well, personally I send a lot of prints to <http://whcc.com/>, which involves
uploading a 300 PPI image to an ftp server and filling out a web form, and
then receiving a nice quality print by UPS 2-day ground a couple days later.
If I was printing on an inkjet, I might print at even higher resolution, but
after 300 PPI the improvements get harder and harder to notice.
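
As a rough sketch of what 300 PPI implies for image dimensions (the print
sizes below are illustrative, not anything a particular lab requires):

```python
def pixels_for_print(width_in, height_in, ppi=300):
    """Pixel dimensions and megapixels needed for a print at a given PPI."""
    w, h = round(width_in * ppi), round(height_in * ppi)
    return w, h, w * h / 1e6

# An 8x10" print at 300 PPI needs a 2400x3000 image, i.e. 7.2 megapixels.
print(pixels_for_print(8, 10))   # (2400, 3000, 7.2)
```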

As for web designers: (a) they presumably aren’t the same as “photographers”
that Tim Bray was explicitly talking about so I’m not sure they’re really
relevant to my comment, but (b) it’s a lot easier to simulate lower-resolution
displays than higher-resolution displays. That is, if you had a 300+ DPI
monitor, you could approximate the appearance of both a traditional (say, 100
DPI) computer screen, and an iPhone, Android Phone, high-resolution netbook,
or whatever else.

------
jonah
DPI [Dots Per Inch] != PPI [Pixels Per Inch] - this is a common
misunderstanding when discussing screen resolution; the two are not directly
comparable.

DPI has to do with the printed image - how many of the smallest dots of ink
can fit along an inch on paper. The resolvable image detail depends on this as
well as other factors like the dithering algorithm.

PPI defines how many groups of pixels (each normally made up of an R, G, and B
sub-pixel) fit in a linear inch. Even this is complicated by things like some
of the current OLED screens, which have twice as many green sub-pixels as red
and blue ones.

More: <http://www.google.com/search?q=dpi+vs+ppi>

------
feverishaaron
People confuse PPI (pixels) and DPI (dots) all the time. Yes, a press might
run at a higher DPI than 300, but a 300-400 PPI image is generally considered
sufficient for even coffee table book-quality printing.

<http://www.tildefrugal.net/photo/dpi.php>

~~~
bonaldi
It's not quite as clear as that. People aren't confusing DPI and PPI --
they're the same thing. Pages like the linked that suggest otherwise are
muddying the problem, because they're misunderstanding how printing works by
ignoring line screening and halftoning.

A 300ppi screen that fits 300 pixels into an inch is exactly the same as a
300dpi printer that can fit 300 dots of its laser into an inch of drum.

The difference comes because screens can directly vary the colour of an
individual dot. Printers can't -- ink is either present or it is not. The way
you get around that is by varying the size of the dot, in a process called
halftoning.

Halftone dots are very small, and are arranged in patterns you'll recognise if
you look at newsprint photographs close-up. You measure the size of the screen
in LPI -- lines per inch, the number of lines of dots (at max size) that can
fit in an inch.

Because of the need to vary the size of the dots, on a 300dpi printer you end
up with a very coarse line-screen of about 60 LPI. This looks rotten, and so
printers go up to resolutions of 1200+dpi. This means you can have a
linescreen of around 150 lpi, which is good enough for most printing. Each line
roughly equates to 2 pixels, and hence a 300 ppi image is sufficient for
printing, regardless of the resolution of the printer or platesetter.

But this doesn't mean there's a difference in ppi or dpi: they're still
exactly the same.
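
The line-screen arithmetic above can be sketched directly (a simplified model:
square halftone cells, each addressable dot in the cell on or off, plus the
all-white state):

```python
def line_screen(printer_dpi, lpi):
    """Halftone cell edge (in printer dots) and grey levels per cell."""
    dots_per_cell_side = printer_dpi / lpi
    grey_levels = dots_per_cell_side ** 2 + 1   # every dot count, plus white
    return dots_per_cell_side, grey_levels

# A 300 dpi printer at a coarse 60 lpi screen: 5x5 cells, only 26 grey levels.
print(line_screen(300, 60))    # (5.0, 26.0)
# A 1200 dpi printer at 150 lpi: 8x8 cells, 65 grey levels, and at roughly
# 2 pixels per line, a 300 ppi image is enough to feed it.
print(line_screen(1200, 150))  # (8.0, 65.0)
```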

~~~
sesqu
If I understand this right,

  * DPI is print resolution, a property of prints and displays
  * PPI is image resolution, a property of printed and displayed images
  * LPI is halftone minimum printing resolution, a property of printers

Those are absolutely not the same. DPI is looking at what you got; PPI is
looking at what you wanted. LPI, then, is looking at what you can do, and each
is measuring a different technological solution to "image".

PPI is taking a digital image and saying "I want it yay big", LPI is saying
"I'll do it by drawing at least this many differently sized dots of entirely
different colors", and DPI is going "This is how many dots I'm counting you
did it with".

edit: Apparently, some manufacturers of small LED displays like to report DPI
as the number of LEDs per inch - that is, 3x the native resolution for images.
That's a good example for people like me, who know little about printers.

~~~
bonaldi
No, DPI and PPI are literally the same thing. (Photoshop used to call what is
now PPI "DPI"). "PPI" comes from people wanting a way to talk about the dpi of
a digital image without including all the baggage and confusion of dpi, which
too few people understand.

There's no _technical_ difference, however.

~~~
sesqu
If they're the same thing, which thing is it? The number of pixels (virtual
representations; samples) per inch of display, or the number of dots (physical
representations; approximate components) per inch of display?

~~~
bonaldi
You've got a false distinction there. It's the number of constituents that can
be packed in per inch: whether those constituents are print dots, physical
pixels, or utterly notional (as in displayed images).

~~~
sesqu
No, that's just the _per inch_ part. I still need to know what the
constituents being measured are, as logical pixel vs. physical dot makes all
the difference:

I don't care about the dot density, as long as there are enough of them; I
care about the dots/pixel, and deeply about the actual inches. I ran into this
problem as a kid, printing some images that had dpi metadata that the printer
took literally, but I didn't know how to fix it back then so I ended up with a
thumbnail print. Attaching ppi (!) metadata to images is stupid anyway, since
the actually relevant information would be the inches (or the dots/inches, for
designer decrees).

~~~
bonaldi
"I care about the dots/pixel" Yes, but those are two different dots you're
talking about. What you actually care about is how many printer dots you get
for each image pixel.

There is a distinction, and I'm all for precision in language, but it's
nothing more than a language distinction for the sake of clarity.

It is not a technical distinction, is my point, and people like Bray aren't
_wrong_ to call this a 300dpi display. The worst you can say is they are
slightly less precise in their speech than is optimal.

~~~
sesqu
Ah. So your position is that there is a distinction, but it's not lexically
recognized. My position is it is lexically recognized, but not by enough
people.

It seems to me that a lot of people do draw a line between pixels and dots,
and that a lot of people don't (but usually mean pixels). I don't know if
there are proper authorities for this, but I consider the distinction
important enough to warrant specific language. My dictionary here says that
pixels are onscreen dots - I used to think the same, but now I consider pixels
to be the samples in memory. I think you are with the dictionary, since you
used the term "physical pixel". My adopted distinction meshes well with my
ability to change the resolution of my desktop, measured in pixels, but not
the resolution of my display, measured in dots.

I agree that Bray can call it a 300dpi display - I didn't criticize him for it
- and I explicitly translate that to a 300ppi natural resolution display. As
noted above (according to Wikipedia), some manufacturers are now exploiting
this by reporting such displays as 900 dpi, counting the subpixels as dots.
Therefore I consider it important to state natural resolution ppi numbers for
devices with natural resolutions, such as LCDs, and dpi resolutions for
devices that don't have a natural representation of a pixel, such as CRTs.
Representation numbers, ppi and dpp, I have yet to see anyone provide (maybe
because there are too many resolutions already).

tl;dr: It's confusing people, so make the distinction if you can.

~~~
bonaldi
Almost the opposite: my position is that there is a distinction but it's
purely lexical, and the two terms are effectively synonymous. That's why it's
not quite right to say that people "confuse" them all the time (as my
grandparent did); it's more correct to say they're not as precise as you might
want.

You yourself show the interchangeable nature of dots and pixels when you say
you measure the size of your display in dots, even though (assuming it's not a
CRT) it's very much measured in pixels, regardless of the resolution you've
got your desktop set at. (This whole thing came about because of people saying
Bray was wrong to measure his screen in dots, after all.)

tl;dr: you can insist on a distinction between dpi and ppi, but it doesn't
really get you anything, and it definitely doesn't help people understand
printing!

~~~
sesqu
For clarification: LCD displays like mine have red-, green-, and blue-tinted
light sources, arranged such that one of each combines into roughly a square.
Other displays work similarly, except they don't have any squares. I consider
each light source a dot, and when viewing images at 1 pixel per square
(natural device resolution), I consider the square a pixel for convenience. So
inches are what I measure the size of my display in, natural pixels per inch
are what I measure the resolution of _my_ display in, and dots per inch are
what I would measure the resolution an arbitrary display device in, if it did
not have a natural mapping between pixels and dots.

So dpi is equivalent to 3×ppi for LCD under my definition, and both 1×ppi and
3×ppi dpi counts have been sighted in the wild. Using my definition is like
using IEC mebibytes: when you see dpi you don't know which is being measured,
but when you see ppi you can be sure, since ppi is only used by people like
me.

As such, I do not consider dots and pixels interchangeable; pixels are what
I'm trying to represent, dots are what I'm representing them with. If I'm
dealing with a printer, I acknowledge that I'm not going to get neat square
pixels no matter how hard I try; instead, I'll end up with round, scattered
dots. So I look at the dpi of the printer and the ppi of the ideal
representation of my image (in inches), and get something like 6 dots/pixel. I
consider this sufficient to render a close approximation to a tiny square, and
proceed (actually, I don't usually look at that number, since printer
resolution is never a problem for me).

This lets me focus on the one aspect of printing that actually matters to a
computer: megapixels. As long as I keep the ppi ratio of my layout high, and
trust that the printer has high dpi and proper scaling, I get to ignore
stupidities like jpeg dpi metadata. In theory, anyway - occasionally print
houses request specific dpi, which can be considered a ppi hint and used to do
rescaling with the algorithm I want.

Disclaimer: I've never done any print or layout work.

------
czhiddy
> Given the nature of the Android ecosystem, I think we can be pretty sure
> that if it turns out that Apple bet right and the extra resolution is a good
> selling point, we’ll be in an amusing arms race around DPI PDQ.

Once you surpass the limits of human vision, where individual pixels can't be
resolved at 12-24" distances (whether this is 300dpi or 400dpi, or whatever),
there simply is no benefit to going higher. There are significant costs to
increasing DPI, such as memory and GPU bandwidth. The hardware/manufacturing
limitations are probably costly as well (does _anyone_ manufacture displays in
volume with > 350 DPI?).
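
A back-of-envelope sketch of the memory cost (the 2x3" screen size and 32-bit
framebuffer are illustrative assumptions):

```python
def framebuffer_mb(width_in, height_in, dpi, bytes_per_px=4):
    """Size in MiB of one framebuffer for a screen of given size and DPI."""
    w, h = width_in * dpi, height_in * dpi
    return w * h * bytes_per_px / 2**20

# Doubling DPI quadruples the pixel count, and with it memory and fill rate.
lo = framebuffer_mb(2, 3, 163)   # older-iPhone-class density
hi = framebuffer_mb(2, 3, 326)   # doubled density
print(round(lo, 2), round(hi, 2), round(hi / lo, 1))   # 0.61 2.43 4.0
```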

~~~
abeppu
Just to be explicit about it, 20/20 vision corresponds to visual acuity that
can distinguish features subtending 1 minute of arc. The angle subtended by a
pixel at 326 PPI viewed from 12 inches is arcsin((1/326) / 12) ≈ 0.88 minutes.
[http://en.wikipedia.org/wiki/Snellen_chart#.2220.2F20.22_.28...](http://en.wikipedia.org/wiki/Snellen_chart#.2220.2F20.22_.28or_.226.2F6.22.29_vision)
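
That arithmetic can be checked in a couple of lines (at these tiny angles,
arcsin vs. arctan makes no visible difference):

```python
import math

def pixel_arcminutes(ppi, distance_in):
    """Angle subtended by one pixel, in minutes of arc."""
    return math.degrees(math.atan((1 / ppi) / distance_in)) * 60

# 326 PPI viewed from 12 inches: just under the 1-arcminute acuity limit.
print(round(pixel_arcminutes(326, 12), 2))   # 0.88
```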

But I think the psychophysics get more complicated when you throw in time and
movement.

------
davidmathers
_my nice 42" 1080P TV at home, at a laughable 52DPI_

Yes, it would be laughable if it was 6 inches from your face during normal
usage.

------
tspiteri
As stated in the article, laser printers do have much higher resolutions. But
ink dots are either there or not; you cannot have as many different colors in
one dot as you can in one screen pixel. So comparing laser printer resolution
to screen resolution is not so simple.

Also, it would be interesting to know at which point, as screen resolutions
get higher, font smoothing becomes pointless.

~~~
ugh
_But then ink dots are either there or not, you cannot have as many different
colors as you can in one screen pixel. So comparing laser printer resolution
to screen resolution is not so simple._

A 300 ppi screen is a 1200 sub-ppi screen. Those subpixels are not as
versatile as ink dots in some ways (a green subpixel can’t suddenly turn red)
but more versatile in other ways (as you already said, an ink dot is either
there or not – it can have a few different sizes, but it’s arguably easier for
a subpixel to change its brightness).

------
yish
DPI is a pointless term when comparing different media. One of the comments
mentions dots per radian of viewing angle, which I think makes much more
sense. I don't need/want my TV to have 300 DPI, but 30 DPI would actually be
pretty awesome there.

~~~
Yaggo
Indeed. Any medium (regardless of dot size) will appear razor sharp when
viewed from far enough (so that one can't see individual dots).

Of course, if there are enough dots in a given physical area, the human eye is
incapable of seeing a single dot even from the closest possible focusing
distance. This is what I would call a "retina display". Not sure if 326 DPI is
enough for that, but it is surely enough for the iPhone's screen.

<http://en.wikipedia.org/wiki/Optimum_HDTV_viewing_distance>
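
Under that definition, the crossover can be sketched as the distance at which
one dot subtends one arcminute (roughly the 20/20 acuity limit; the DPI values
below are illustrative):

```python
import math

ONE_ARCMIN = math.radians(1 / 60)

def retina_distance_in(dpi):
    """Viewing distance (inches) beyond which single dots can't be resolved."""
    return (1 / dpi) / math.tan(ONE_ARCMIN)

# 326 DPI becomes unresolvable past ~10.5 inches; a 52 DPI TV past ~5.5 feet.
print(round(retina_distance_in(326), 1))       # 10.5
print(round(retina_distance_in(52) / 12, 1))   # 5.5
```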

------
jrockway
Interestingly, according to the Wikipedia page, the OpenMoko phones had some
of the highest-res displays, and they are already many years old. (I got one
to play with, and the display was pretty nice. But the radios were crap, which
kind of made it useless as a phone.)

------
ZeroGravitas
I'm always amused at people advertising advances in technology, via the medium
of the old technology.

Recent examples have been adverts in standard definition for high-definition
devices or services. They seem to involve lots of slow-mo and highly intricate
images that, by their very existence on your SD screen, suggest you don't need
an HD screen to view them.

Similarly DAB digital radio was advertised on standard FM radio with lots of
space-age swooshy noises.

For high-dpi screens it seems water droplets are all the rage.

------
houseabsolute
Interesting that you should mention the typography of your blog because I
always have trouble reading it and have been wondering for some time what
about it makes it so hard.

~~~
mcav
Line spacing (particularly between paragraphs) doesn't match, and the headers'
negative margins make it look uneven to me.

------
Derbasti
I think that Apple is following Android here instead of the other way around.
Still, I like the idea of having more pixels at my disposal. We will see what
the app store will make of that.

------
staunch
My laptop (Dell D830) is 1920x1200 in 15.4": 147 PPI. I love it. I'd buy
2560x1600 at 15.4" if I could.

