
Vision-Correcting Displays [video] - pioul
https://www.youtube.com/watch?v=SNdapCs6vR8
======
DenisM
If I understand this correctly, this is a light-field display.

The implications are bigger than vision correction - an LF display can
reconstruct actual 3D images, as opposed to the stereo images being marketed
as "3D" today. Stereo displays give two different pictures to two different
eyes, but they don't provide perspective shift (the picture doesn't change
when you move your head left and right), and they don't provide different
focal planes (your eyes focus on the screen plane regardless of how far away
the object is supposed to be, creating a dissonance between the distance
inferred from the angle between the eyes and the focusing distance).
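
As a rough numpy sketch of the difference (a toy model, not any particular
display's implementation): a light field can be treated as a 4D function
L(u, v, y, x), where (u, v) is the viewpoint and (y, x) the image position.
Stereo hands you two fixed (u, v) samples; a light-field display lets each
eye pick its own, which is what gives you perspective shift.

    import numpy as np

    # Toy 4D light field: (u, v) indexes the viewpoint, (y, x) the pixel.
    # All sizes are made up for illustration.
    U, V, H, W = 9, 9, 480, 640
    lf = np.zeros((U, V, H, W), dtype=np.float32)  # stand-in light field data

    def view_from(lf, u, v):
        """The image seen from viewpoint (u, v) is just a 2D slice."""
        return lf[u, v]

    left_eye = view_from(lf, 3, 4)   # stereo gives you two fixed slices...
    right_eye = view_from(lf, 5, 4)
    shifted = view_from(lf, 6, 4)    # ...but not this one, after a head shift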

~~~
bd
Yup, I wonder if this technology could be used to improve head-mounted
displays, so you could focus on different parts of the virtual scene instead
of always being focused at infinity (probably together with high-precision
eye tracking to figure out the depth at which you're trying to look).

~~~
mkeblx
It can and will in relatively short order:
[https://www.youtube.com/watch?v=deI1IzbveEQ](https://www.youtube.com/watch?v=deI1IzbveEQ)

Douglas Lanman, the researcher behind this work at Nvidia, was hired a few
months ago by Oculus VR.

------
thelad
I don't get this - what's the point of one device being corrected while the
rest of the real world is blurry?

One example use case was a guy in a car with GPS navigation. So he can see the
GPS nav, but how does he drive if he can't see properly!?

~~~
Cass
If someone's far-sighted, as many people above the age of forty are, they
might be able to see the street perfectly well but have trouble reading a GPS
screen that's only forty cm away from their face.

And speaking as one of the people whose vision defect can't be corrected by
glasses, but only by uncomfortable contact lenses, there'd be immense value to
me in a computer I can use without lenses - especially since reading on a
screen is basically the only thing I need the damn things for while I'm at
home. Someone whose work is mostly computer-based might be able to get away
with taking their lenses out for the entire day and only wearing them for the
drive to work and back.

~~~
danellis
> they might be able to see the street perfectly well but have trouble
> reading a GPS screen that's only forty cm away from their face

So they would also not be able to see their speedometer or their fuel
gauge...

This is a technological solution looking for problems that are far more
conveniently solved by conventional means.

~~~
lemonad
By conventional means, do you mean reading glasses? Watching people who need
and use them, it seems like a hassle to me, and definitely not a solution that
most are satisfied with.

The video mentions both the GPS and the speedometer as potential targets for
this technology. Having given it some thought, and being close to the age
where I might benefit from it, I don't think it's such a bad idea after all.

~~~
danellis
Obviously I can't speak for other bespectacled folk, but I wouldn't call them
a huge hassle. I put them on in the morning, and take them off at night. Once
in a while I clean them. That's about the extent of my interaction with them.

(Although I'm short-sighted, so the arguments above don't apply to me anyway.)

~~~
vizeroth
While I've seen my dad struggle with reading glasses (and now bifocals) after
not needing to correct his vision for the first 40+ years of his life, I've
had mostly the same experience you describe for the last 20 years (thanks to
inheriting my vision problems from my mom's side). However, in the last couple
of years I've started taking my glasses off if I am reading for long periods
of time, because my distance vision is getting bad enough that the correction
is making the closer text slightly more difficult to read than it is without
the lenses. Eventually, this might also lead to needing bifocals myself,
despite the fact that I have no problems reading very small text within arm's
length without glasses.

Essentially, I'm getting close to doing the opposite of what most people do
with reading glasses. For now, I do most minor reading tasks and my work with
my glasses on, but for lengthy reading I take them off. Over time, I'm sure,
I'll end up taking them off (or looking below my glasses) for almost every
reading task, and eventually for work as well.

In the end, though, I don't think the technology is likely to really solve
the issue for me, except to correct a few displays for my corrected vision. I
have to pin most of my hopes, at the moment, on improvements in surgery and
eventually being able to afford it.

------
agumonkey
Additional details here:
[http://newsoffice.mit.edu/2014/new-display-technology-automatically-corrects-for-vision-defects-0731](http://newsoffice.mit.edu/2014/new-display-technology-automatically-corrects-for-vision-defects-0731)

------
amichail
Check this out:
[https://news.ycombinator.com/item?id=180224](https://news.ycombinator.com/item?id=180224)

------
serf
One can imagine a pair of normal eyeglasses with an IMU or accelerometer of
some sort in them. When they sense they're being taken off, the user's phone
switches profiles, the screen blurs, and the user moves seamlessly from
glasses to phone without ever realizing what took place.

It would be a neat touch. I like connected appliances that don't feel
connected (if that makes sense).
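
As a toy sketch of how that hand-off might work (every name here is
hypothetical, and a real product would need a smarter heuristic): glasses
resting on a table show none of the micro-motion of glasses on a head, so
sustained stillness is a crude "taken off" signal.

    import time

    STILL_THRESHOLD = 0.05  # deviation from 1 g that still counts as at rest
    STILL_SECONDS = 3.0     # at rest this long => probably off the face

    def magnitude(sample):
        x, y, z = sample
        return (x * x + y * y + z * z) ** 0.5

    def watch_for_removal(read_accel, on_removed):
        """read_accel() -> (x, y, z) in g. Calls on_removed() once when the
        glasses sit still for STILL_SECONDS; a head always micro-moves, a
        tabletop doesn't."""
        still_since = None
        notified = False
        while True:
            at_rest = abs(magnitude(read_accel()) - 1.0) < STILL_THRESHOLD
            if at_rest:
                still_since = still_since or time.time()
                if not notified and time.time() - still_since > STILL_SECONDS:
                    on_removed()  # e.g. tell the phone to switch profiles
                    notified = True
            else:
                still_since, notified = None, False  # motion resets the timer
            time.sleep(0.1)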

------
mattangriffel
I've always thought it would be interesting to correct vision at the
brain/neural level rather than the physical level. Can anyone comment on
whether this would be possible?

~~~
kghose
This is indeed an interesting question. Why can't the brain develop a reverse
blur function? We can do this with algorithms
([http://en.wikipedia.org/wiki/Adaptive_optics](http://en.wikipedia.org/wiki/Adaptive_optics)).
There was a fad (I guess) once about correcting your vision with practice
([http://en.wikipedia.org/wiki/Bates_method](http://en.wikipedia.org/wiki/Bates_method)).

I guess this is a limitation of the plasticity of our brains. People who have
hearing or vision or other losses as young children adapt faster and better
than people who have these losses when older.

It would be interesting to understand if young children with vision deficits
can learn to see better with time.

My personal recollection is that I had NO idea things were blurry until my
first visit to the optometrist when they put glasses on me. Things were so
SHARP!

I think the core of this is that our adaptability depends on sensorimotor
loops. We can calibrate our responses to faulty sensors, but we don't correct
just for the sake of correctness.

~~~
tomr_stargazer
> Why can't the brain develop a reverse blur function? We can do this with
> algorithms

Adaptive optics requires more than just algorithms -- the algorithms' output
is fed into a rapidly moving reflective surface that de-blurs the incoming
light [1] -- so this is a bad analogy. You might be thinking of the
deconvolution algorithms [2] that the Hubble Space Telescope used to improve
its flawed images before its "eyeglasses" were installed in 1993.

[1] "Adaptive optics works by measuring the distortions in a wavefront and
compensating for them with a device that corrects those errors such as a
deformable mirror or a liquid crystal array."
[https://en.wikipedia.org/wiki/Adaptive_optics](https://en.wikipedia.org/wiki/Adaptive_optics)

[2] "The error was well characterized and stable, enabling astronomers to
optimize the results obtained using sophisticated image processing techniques
such as deconvolution."
[https://en.wikipedia.org/wiki/Hubble_Space_Telescope#Flawed_...](https://en.wikipedia.org/wiki/Hubble_Space_Telescope#Flawed_mirror)
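
For the curious, the standard frequency-domain form of that kind of
deconvolution is the Wiener filter (my summary, not from either article):

    \hat{F}(\omega) = \frac{H^*(\omega)}{|H(\omega)|^2 + K} \, G(\omega)

where G is the spectrum of the blurred image, H is the transfer function of
the blur, K is a constant reflecting the noise level, and \hat{F} is the
estimate of the unblurred image. With K = 0 this is naive inversion, which
blows up wherever H is close to zero.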

~~~
darkmighty
Excerpt from
[http://www.inference.phy.cam.ac.uk/itprnn/book.pdf](http://www.inference.phy.cam.ac.uk/itprnn/book.pdf)
(page 564):

"Deconvolution in humans

A huge fraction of our brain is devoted to vision. One of the neglected
features of our visual system is that the raw image falling on the retina is
severely blurred: while most people can see with a resolution of about 1
arcminute (one sixtieth of a degree) under any daylight conditions, bright or
dim, the image on our retina is blurred through a point spread function of
width as large as 5 arcminutes (Wald and Griffin, 1947; Howarth and Bradley,
1986). It is amazing that we are able to resolve pixels that are twenty-five
times smaller in area than the blob produced on our retina by any point
source. Isaac Newton was aware of this conundrum. It's hard to make a lens
that does not have chromatic aberration, and our cornea and lens, like a lens
made of ordinary glass, refract blue light more strongly than red. Typically
our eyes focus correctly for the middle of the visible spectrum (green), so
if..."

I recommend the read for those interested. There's even an experiment you can
do yourself to experience your own eyes' limitations!

------
gone35
Link to the paper and supplemental information:

[http://web.media.mit.edu/~gordonw/VisionCorrectingDisplay/](http://web.media.mit.edu/~gordonw/VisionCorrectingDisplay/)

------
asadjb
I'd love for this to become common, since I've worn glasses for
nearsightedness since childhood. However, the biggest problem I see with this
is that it's customized on a per-user basis.

So while I can use my phone just fine, you can't, because it's calibrated to
correct my vision deficiency. I guess it can be used on something extremely
personal, like a phone, but I don't see it becoming mainstream for most
displays, like a shared tablet or a computer or something.

------
restalis
It's an interesting idea, but even for the narrow use case of looking at a
display, it doesn't replace the need to wear glasses/lenses for everyone. This
kind of display can only "fix" something that affects both eyes in equal
measure. If the image distortion has to be different from one eye to the
other, then per-eye correction is needed.

------
zavi
Can this be done at the software level? I.e., a feature built into the OS
that modifies the displayed image in the same way this screen does.

~~~
sillysaurus3
_In the researchers’ prototype, however, display pixels do have to be masked
from the parts of the pupil for which they’re not intended. That requires that
a transparency patterned with an array of pinholes be laid over the screen,
blocking more than half the light it emits._

~~~
darkmighty
This is required to create the intended light field, but the light-field
approach isn't the only one; you could also do simple deconvolution. I wonder
how well that would work.
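
A minimal numpy sketch of that idea (assuming a Gaussian blur model for the
eye; this is not the paper's method): Wiener-deconvolve the image by the eye's
point spread function before display, so that the eye's own blur approximately
re-sharpens it. The clipping at the end hints at the catch: the prefiltered
image wants intensities below 0 and above 1, which no physical display can
emit, so contrast suffers and edges ring.

    import numpy as np

    def gaussian_psf(size, sigma):
        """Stand-in point spread function for the defocused eye."""
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
        return psf / psf.sum()

    def transfer_function(psf, shape):
        """Zero-pad the PSF to the image shape, centered at the origin so
        the filter doesn't translate the image."""
        padded = np.zeros(shape)
        padded[:psf.shape[0], :psf.shape[1]] = psf
        shift = (-(psf.shape[0] // 2), -(psf.shape[1] // 2))
        return np.fft.fft2(np.roll(padded, shift, axis=(0, 1)))

    def precorrect(image, psf, k=0.01):
        """Wiener prefilter: divide out the blur, with k damping the
        frequencies the blur nearly destroys."""
        H = transfer_function(psf, image.shape)
        F = np.conj(H) * np.fft.fft2(image) / (np.abs(H) ** 2 + k)
        out = np.real(np.fft.ifft2(F))
        # A display can only emit [0, 1]; this clip is where pure
        # deconvolution loses contrast.
        return np.clip(out, 0.0, 1.0)

    img = np.random.rand(256, 256)  # stand-in for what the screen should show
    shown = precorrect(img, gaussian_psf(31, sigma=4.0))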

------
gambler
Cool, but I think most people with glasses wouldn't want to rely on gadgets
even more. If you can see the display but can't see physical controls or
paper, that's an issue waiting to happen.

------
Hominem
I think this is a great idea, and wonder why it took so long. I'd love to be
able to take my glasses off and read in bed, or take my glasses off for a few
hours at work while in front of my monitor.

------
AlexMuir
[http://chronicle.nytlabs.com/?keyword=germany.vietnam.iraq.russia](http://chronicle.nytlabs.com/?keyword=germany.vietnam.iraq.russia)

------
nitrogen
I wonder if there's an analogy to be drawn between the pinhole mask they use
and the lithography masks used to etch ICs on silicon.

------
judk
Sounds like the old "Magic Eye" pictures.

------
kghose
I was told the following legend about how shoes were invented. A haughty
princess wanted to leave her pristine castle and explore the world. But she
found the world very dusty. She told her wise men to come up with a plan to
cover the world in a lush carpet. The wise men pondered and said that this
could not be done. The princess threw a fit. The wise men pondered some more.
Then they said, "Princess, we cannot cover the world in a large carpet, but we
can cover your feet in a small one." And the princess was pleased. And that,
kids, is how we got shoes.

~~~
eternauta3k
It's not crazy to go barefoot or wear slippers at home, though.

