
Can a Human See a Single Photon? (1996) - adenadel
http://math.ucr.edu/home/baez/physics/Quantum/see_a_photon.html
======
sillysaurus3
Would there be any interest in a series I've been thinking of putting together
about all the fantastically counterintuitive ways your eye fools you without
you even noticing? Example:

[https://media.giphy.com/media/3o7btPOMufN5FziFWg/giphy.gif](https://media.giphy.com/media/3o7btPOMufN5FziFWg/giphy.gif)

The fact that colors influence colors around them means that there are far
more than 256^3 viewable colors: the spatial axis is like a fourth channel.

Another fun one: take a paper plate and poke a hole in it, then hold a
toothpick in front of the hole, from the bottom, pointing upwards. It'll look
like it's coming _down_ from the top! It's very hard to explain in words, but
basically the hole acts as a pinhole camera, so everything appears
upside-down. Your brain tries to compensate and flips it right side up, but it
can't quite do that with the toothpick-and-hole scenario. So when you push the
toothpick up from the bottom, it looks like it's coming down from the top.

Lastly: [https://i.imgur.com/epXYhJd.png](https://i.imgur.com/epXYhJd.png) If
you cover up the top half of the image, the bottom looks gray. But the
strawberries look red.

A fun puzzle is, how do you see an individual pixel? Can you think of any
common object that would help you do this? (No magnifying lenses, for example,
and using a phone is sort of cheating.) Answer below.

...

Answer: pluck a hair out of your head and put it on your screen. The line
along the edge of the hair will turn into a stairstep pattern, which is just
another way of saying you can see individual pixels.

~~~
tbabb
Perhaps a nitpick, but 256^3 doesn't really represent anything about human
perception-- it's just how we store colors in computer memory. The human eye
can perceive a far wider range of colors than any screen can (currently)
represent. It's just as valid (and probably more accurate) to use floating
point to store color (rather than bytes), and in many high-fidelity
applications this is actually how it's done.
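
A minimal sketch of the storage difference (my own illustration, not from the comment): quantizing a channel to 8 bits collapses nearby intensities that a floating-point representation keeps distinct.

```python
# Toy illustration: 8-bit color channels quantize to 256 levels, so two
# nearby intensities can collapse to the same stored value, while a
# floating-point channel preserves the difference (and can exceed 1.0
# in HDR pipelines).
def to_8bit_and_back(value):
    """Quantize a [0, 1] float channel to an 8-bit level and back."""
    level = round(value * 255)
    return level / 255

a, b = 0.500, 0.501                                # two nearby intensities
print(to_8bit_and_back(a) == to_8bit_and_back(b))  # True: same 8-bit level
print(a == b)                                      # False: floats differ
```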

The bit about colors "looking different" in different contexts also isn't
really an "extra axis". The example you give is just your brain attempting to
separate out the illumination from the scene colors. The range of actual
colors you can perceive isn't increased by this.

Fundamentally human color perception is three-dimensional, because humans have
three classes of color receptors on their retinas. There are also rods in the
retina (the cells that are color-insensitive and detect only brightness), and
in principle they could inform color perception, but this does not seem to be
the case-- the intensity of light only shifts things around in the original
three-dimensional space.

There are also a very, very few people (women only) with genetic mutations
that give them _four_ different color receptors in their retina. However,
there is no evidence that their brains can take advantage of this extra
information and allow them to perceive an extra "axis" of color. It is much
more likely that the "mutant" receptors are lumped in with the stimulus of one
of the others, which will shift their perception of color, but not expand it.

In short: There are three axes of color perception. Optical illusions don't
change that.

~~~
sillysaurus3
Optical illusions reveal that we have more than three axes of color
perception. The first gif and the last photo are examples of this, and the
Renaissance artists had a firm appreciation of it. (See da Vinci's journals,
specifically the chapters on color.)

Part of the reason I want to put together this series is to rekindle this
knowledge. It seems very much "lost": your comment represents the status quo
of color science, but if you were an ambitious artist in 1450, you'd run
experiments and determine that much of what we take as fact is actually quite
a lot more complicated in practice.

The eye, and especially our perception of colors, is so complicated that
entire tomes barely scratch the surface. And Munsell's discoveries were only
made in 1900, barely over a hundred years ago. There's still a lot more to
discover.

To be clear: it's true that our three receptors imply three axes of color
perception. But when you assemble an image as a whole, the entire portrait
results in an experience quite different than any individual color.

EDIT: _That the perception of color is influenced by surrounding colors is
true, but that this adds a "fourth channel" is [citation needed]._

One of the most important aspects of color science is that you have to be
willing to believe the possibility that some strange ideas are true.

In this case, I can dispel that illusion, but only if you're open minded to
it:

Consider a painting. Why choose a certain shade of yellow? To produce an
effect.

The above images demonstrate that _where_ you put that shade of yellow causes
a very different effect.

Now, if you accept as an axiom that the only reason to choose a color is to
produce an effect, then that means all three primary colors (red, green, blue)
are different axes in your ability to cause an effect. But the fact that the
arrangement of colors causes different effects means that the spatial
arrangement is a fourth "lever" that you can use to change the experience.
That implies it's accurate to call this phenomenon a fourth channel.

(I'd post this edit as a reply, but HN isn't having it right now.)

One interesting area of science to investigate is the frequency spectrum of
natural images. Our brains are tuned to see certain spatial frequencies more
than others, e.g. blades of grass. And the reason colors produce different
effects depending on where they are in relation to each other is that it helps
us resolve different shapes in an image.

[http://web.mit.edu/torralba/www/ne3302.pdf](http://web.mit.edu/torralba/www/ne3302.pdf)

[https://www.cs.cmu.edu/~efros/courses/LBMV07/presentations/0...](https://www.cs.cmu.edu/~efros/courses/LBMV07/presentations/0208Gist.pdf)

~~~
tbabb
The statement I take objection to is this:

> The fact that colors influence colors around them means that there are far
> more than 256^3 viewable colors: the spatial axis is like a fourth channel.

That the perception of color is influenced by surrounding colors is true, but
that this adds a "fourth channel" is [citation needed].

~~~
just2n
I'm not sure I buy that it means there are more absolute viewable colors
there, but rather that there are merely different interpretations of the same
colors. We do have fairly strong confirmation that our brain doesn't interpret
color absolutely but adjusts based on luminance (see: all examples of
shadow illusions) and patterns (see:
[https://www.youtube.com/watch?v=mf5otGNbkuc](https://www.youtube.com/watch?v=mf5otGNbkuc)).
Perhaps that's what is meant by a "fourth channel"?

------
splittingTimes
Let me up the ante: not only can we detect single photons.

A pioneering study at the interface of biology and physics found that
isolated rod photoreceptors of frogs are sensitive to differences in a light
source's statistical properties, e.g. coherent (laser) vs. thermal (sun, light
bulb) vs. sub-Poissonian (resonance fluorescence of quantum dots) light
statistics:

"Measurement of Photon Statistics with Live Photoreceptor Cells" (2012) [1]

"The results indicate differences in average responses of rod cells to
coherent and pseudothermal light of the same intensity and also differences in
signal-to-noise ratios and second-order intensity correlation functions"
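
As a rough illustration of the statistics involved (a toy Monte Carlo of my own, not the paper's method): coherent light has Poissonian photon counts with second-order correlation g2(0) near 1, while thermal light is super-Poissonian with g2(0) near 2.

```python
import random

# Toy Monte Carlo of photon-number statistics (illustration only, not
# the paper's method). Coherent light gives Poissonian counts with
# g2(0) ~ 1; thermal light, modeled here as Poisson counts driven by an
# exponentially fluctuating intensity, gives g2(0) ~ 2.
random.seed(0)

def poisson_count(rate):
    """Count events of a rate-`rate` Poisson process in a unit window."""
    t, k = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t > 1.0:
            return k
        k += 1

def g2(counts):
    """Estimate g2(0) = <n(n-1)> / <n>^2 from a list of counts."""
    mean = sum(counts) / len(counts)
    return sum(c * (c - 1) for c in counts) / len(counts) / mean ** 2

mean_n = 5.0
coherent = [poisson_count(mean_n) for _ in range(20000)]
thermal = [poisson_count(random.expovariate(1 / mean_n)) for _ in range(20000)]
print(g2(coherent))  # close to 1
print(g2(thermal))   # close to 2
```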

It seems Lorentz's 1911 hypothesis was right: the boundaries of our perception
are set by the basic laws of physics, and we reach the limits of what is
possible.

===

[1] [pdf] [https://physics.aps.org/featured-article-pdf/10.1103/PhysRevLett.109.113601](https://physics.aps.org/featured-article-pdf/10.1103/PhysRevLett.109.113601)

~~~
theoh
I can't find what this Lorentz hypothesis of 1911 is, but surely you are
claiming too much. Physical entities can detect X-rays, but humans haven't
evolved that ability (except, recently, externally/collectively).

The idea that our perceptual apparatus is optimal, rather than a contingent
product of a particular evolutionary path, is easy to refute -- but maybe that
isn't what you mean by "reach[ing] the limits of what is possible."

~~~
mromanuk
We can reach maximum optimization of whatever physical property we use. For
example, our brain and its super efficiency vs. a supercomputer.

~~~
bhaak
But humans had millions of years to evolve.

The computer is only 70 years old. Give it 100 years and we will make it as
efficient as a human brain, possibly even better.

Evolution is a slow process: non-sentient trial and error. We beat that in
time easily.

------
looki
> However, neural filters only allow a signal to pass to the brain to trigger
> a conscious response when at least about five to nine arrive within less
> than 100 ms. If we could consciously see single photons we would experience
> too much visual "noise" in very low light, so this filter is a necessary
> adaptation, not a weakness.

Wonder if this is related to what I experience. Visual snow - basically seeing
static in your vision, especially at night. I sometimes don't notice it for
weeks, but then I just pick up on it again and can't stop noticing. It's still
hard for me to believe that it's not normal, given that it happens with any
camera ever built, but apparently not many people experience it.
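
The thresholding described in the quote above can be sketched with a toy simulation (my own, with a made-up noise rate): given random dark-noise events, a single-photon threshold would fire in most 100 ms windows, while a ~7-photon threshold stays almost silent.

```python
import random

# Toy simulation of the ~5-9-photons-per-100-ms filter quoted above
# (my own sketch; the noise rate is a hypothetical value for illustration).
random.seed(1)
NOISE_RATE = 10.0   # spurious rod events per second (hypothetical)
WINDOW = 0.1        # 100 ms integration window

def events_in_window(rate, window):
    """Sample a Poisson count of events in one window."""
    t, k = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t > window:
            return k
        k += 1

windows = [events_in_window(NOISE_RATE, WINDOW) for _ in range(10000)]
fires_at_1 = sum(c >= 1 for c in windows)  # single-photon threshold
fires_at_7 = sum(c >= 7 for c in windows)  # ~7-photon threshold
print(fires_at_1, fires_at_7)  # many false alarms vs. almost none
```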

~~~
blocked_again
Visual snow is a condition which very few people are aware of (including
doctors). I still remember the day I started experiencing the snow. It was a
sudden trigger. I couldn't stop noticing it for a year, mainly because I was
scared that I was going to go blind. I went to at least 4 or 5 doctors and no
one found any problem with my eyes, and I often got ridiculed by friends for
making up something which, according to the doctors, I apparently don't have.
I came to know about visual snow a year later from the Internet, and I
realised that many other people have this and had a similar experience to
mine.

[https://www.youtube.com/watch?v=f34R3GC5I5k](https://www.youtube.com/watch?v=f34R3GC5I5k)
[https://en.wikipedia.org/wiki/Visual_snow](https://en.wikipedia.org/wiki/Visual_snow)

If you are a billionaire you can support the fundraiser for finding a cure for
visual snow, which has only reached 1/5th of its goal even after 40 months :(

[https://www.gofundme.com/visual-snow](https://www.gofundme.com/visual-snow)

~~~
taneq
Two thoughts, in order:

1) This isn't normal? I get this plus an auditory analogue (imagine an
impossibly high pitched sound like crickets chirping) which I've always
figured was just what my 'noise floor' sounded like.

2) Why would you see this as something to cure? (Edit: Assuming it's something
you only see at very low light levels - if you get it all the time even in
bright light then that would be pretty bad.) It's just what you get when your
visual system's auto-gain tries to amplify darkness. I'm not sure what else
you'd expect.

~~~
blocked_again
I see it all the time! A couple of years ago I could look at the sky in the
morning or evening and enjoy the clouds. Going to the beach and watching the
sunset was one of my favourites. Now it's a pain to do both. I can see
star-like particles flickering in the sky, and floaters. It's also
uncomfortable to look at the monitor if the brightness is high. Because of all
this, visual snow patients also suffer from depression. My vision used to be
like watching a movie on a Full HD TV. Now it's like watching a movie on an
old crappy TV with bad signal reception.

~~~
smallnamespace
That's odd, in that I've probably noticed visual snow-like artifacts ever
since I was a child, but it never bothered me even a tiny bit.

Couple of personal observations:

\- Most of the time I only notice snow or artifacts if I'm consciously looking
for them or don't have anything else conscious occupying the brain (e.g.
staring at a wall out of boredom, or closing my eyes and still paying
attention to visual input)

\- Once you start looking for visual artifacts, you'll see them everywhere.
Right now if I stare at my ceiling in dim lighting, I see little multicolored
'heat wave' patterns roiling about as my visual sensory system goes into
overdrive to extract more signal than there actually is. I also notice a
slight 'ringing' halo around bright objects. But as soon as I try to do
anything at all, my brain apparently decides that other things are more
important and actively filters all this stuff out.

Don't discount the possibility that you're putting yourself in a vicious cycle
here:

perception of something wrong -> heightened subconscious threat processing
(your brain starts looking for a problem in your visual perception) -> more
conscious awareness of visual artifacts -> perception of something wrong

The way to break that cycle is to just worry about more important matters, and
it'll either go away by itself or you'll stop caring.

------
tristanj
Article is from 1996.

Another study was done last year, and it also found humans can probably see
single photons, though the study is limited: its sample size is n=3.

[https://www.nature.com/news/people-can-sense-single-photons-1.20282](https://www.nature.com/news/people-can-sense-single-photons-1.20282)

~~~
ekianjo
> can probably see single photons

Photons don't exist in the first place. Science has moved on and modern models
don't assume the existence of actual particles. It's only electromagnetic
waves.

~~~
colanderman
Ok, tell that to the Standard Model. [1]

Like it or not, in modern physics, "photon" refers to a particular type of
discrete lump of momentum and energy. It is a thing you can definitely have
one of.

[1]
[https://en.wikipedia.org/wiki/Standard_Model](https://en.wikipedia.org/wiki/Standard_Model)

~~~
ekianjo
Then it's not a particle with zero mass, as it is often defined.

------
jankotek
Under ideal conditions humans can see 9th-magnitude stars. We tried under a
real sky, and got 8.1-magnitude stars at 80% probability.

Not sure how that translates to photon counts.
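
A back-of-the-envelope conversion (both constants are rough assumptions of mine, not from the comment: a magnitude-0 star delivers very roughly 8.8e5 visible-band photons per second per cm², and a dark-adapted pupil is about 7 mm across):

```python
import math

# Rough magnitude-to-photon-rate conversion. Both constants are
# approximate assumptions for illustration:
#   - a magnitude-0 star: ~8.8e5 visible-band photons / s / cm^2
#   - dark-adapted pupil diameter: ~7 mm
MAG0_PHOTON_FLUX = 8.8e5
PUPIL_DIAMETER_CM = 0.7

def photons_per_second(magnitude):
    """Approximate photons/s entering a dark-adapted eye from a star."""
    flux = MAG0_PHOTON_FLUX * 10 ** (-0.4 * magnitude)
    pupil_area = math.pi * (PUPIL_DIAMETER_CM / 2) ** 2  # ~0.38 cm^2
    return flux * pupil_area

print(round(photons_per_second(8.1)))  # on the order of a couple hundred /s
```

This ignores losses in the cornea, lens, and retina, so the rate of photons actually detected would be lower still.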

------
dang
Discussed in 2008:
[https://news.ycombinator.com/item?id=299724](https://news.ycombinator.com/item?id=299724).

------
akashakya
If you like these kinds of writeups, you might like the parent site, which
has many such interesting articles from the Usenet Physics FAQ.

[http://math.ucr.edu/home/baez/physics/index.html](http://math.ucr.edu/home/baez/physics/index.html)

~~~
gciruelos
John Baez is a brilliant guy.

And he's also the cousin of Joan Baez, which makes me appreciate both of them
even more.

------
baking
There is a well-known problem with these types of experiments (I heard about
it in the late 70's). The assumption is that if the subjects' "guesses"
statistically match the assumed hit rate of photons then there must be cause
and effect. This experiment seems to make no attempt to eliminate the
possibility that the matches are completely random.

~~~
sgk284
All of human reasoning and understanding comes from matching our observations
statistically with potential causes.

You may heat up a tea kettle a thousand times with a flame and conclude that
the flame is heating the tea kettle, but perhaps every time the flame was too
small and too far and it was in fact another source of energy hitting it from
the environment warming it.

Our understanding of the world comes from substantial enough correlation that
it moves into the realm of cause-and-effect. Though admittedly we often assume
this too quickly.

~~~
baking
But you have to have a control. I heard Barbara Sakitt speak around 1976-77.
She was the author of the widely cited "Counting Every Quantum" (1972), and
she basically said that her original methodology was flawed. Unfortunately, no
one cites her later work, and I don't have access to the full text of
"Information received from unseen lights" (1976), which I think was the paper
she was presenting.

------
appleflaxen
I would never have guessed this result, because it seems like a vast
overoptimization for light detection: in what circumstances would a human
being need to see a single photon? And if those circumstances are rare, then
what is the cost (in fitness, energy, etc) that we pay to maintain that
ability?

~~~
baking
Rods can respond to individual photons; that is pretty much accepted. The
issue is what we can perceive. The first nerve synapse screens out about 50%
of isolated photons (ones not correlated with signals from other rods), as has
been estimated from modeling. That means some of those singular photons are
getting through (perceived) and can be isolated from the background noise
through careful experiments.

I don't think we evolved to see single photons, but that we evolved to be very
sensitive to light while reducing the noise, and that sweet spot means that we
can quite possibly, under ideal conditions, see individual photons slightly
more often than we think we see non-existent photons.

------
jankotek
Humans can see single gamma photons, even with closed eyes :-)

~~~
cocoablazing
[https://www.orau.org/ptp/articlesstories/invisiblelight.htm](https://www.orau.org/ptp/articlesstories/invisiblelight.htm)

------
z3t4
When we are allowed to guess, i.e. don't have to be 100% accurate, all our
senses become superhuman.

------
tudorw
An aside, but an interesting one: biophotons... were new to me.
[https://www.ncbi.nlm.nih.gov/pubmed/?term=biophoton](https://www.ncbi.nlm.nih.gov/pubmed/?term=biophoton)

------
m3kw9
Very nice write-up on the experiment.

