
Perfect colors, captured with one ultra-thin lens - porsupah
http://www.seas.harvard.edu/news/2015/02/perfect-colors-captured-with-one-ultra-thin-lens
======
_paulc
This looks somewhat like the concept behind a Fresnel zone plate [1]. In
general these are highly dispersive (they only work at one frequency), so it
looks like they have done something to address this. Many (30+) years ago my
final-year project at university was an attempt to dynamically synthesise
lenses by electronically manipulating the phase response across a liquid-
crystal substrate. The technology wasn't really there at the time, but I
always thought it would have been an interesting avenue to follow up on.

[1]
[http://en.wikipedia.org/wiki/Zone_plate](http://en.wikipedia.org/wiki/Zone_plate)
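For anyone curious about the dispersion point: the zone boundary radii of a
plate designed for one wavelength follow a simple formula, and plugging a
different wavelength into the resulting geometry shows how badly the focus
shifts. A rough Python sketch (illustrative design values, nothing from the
article):

```python
import math

def zone_radii(wavelength, focal_length, n_zones):
    """Radii of the first n_zones Fresnel zone boundaries for a design wavelength:
    r_n = sqrt(n * lambda * f + (n * lambda / 2)**2)."""
    return [math.sqrt(n * wavelength * focal_length + (n * wavelength / 2) ** 2)
            for n in range(1, n_zones + 1)]

def approx_focal_length(r1, wavelength):
    """Approximate focal length of a fixed plate at another wavelength (valid for f >> lambda)."""
    return r1 ** 2 / wavelength

# A plate designed for green light (550 nm) with f = 10 cm...
r1 = zone_radii(550e-9, 0.10, 1)[0]

# ...focuses other colors at noticeably different distances:
f_red = approx_focal_length(r1, 650e-9)   # shorter than 10 cm
f_blue = approx_focal_length(r1, 450e-9)  # longer than 10 cm
print(f_red, f_blue)
```

The roughly 4 cm spread in focal length between red and blue is the
"highly dispersive" behavior in question.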

------
blt
If this actually works, it's got to be one of the biggest technology advances
of the past 20 years.

I'm guessing there's a catch, but I would love to be wrong.

~~~
simcop2387
My guess on a catch would be that it'll focus ONLY the wavelengths that you
select for. The others might be diffused or given wildly different focal
points. This might mean that for certain types of photography you're limited
in what you pick up (e.g. if you've lit a scene with monochromatic light
that's not one of the selected wavelengths).

~~~
acjohnson55
They did say in the article that they hope to produce devices that work over a
range of wavelengths. That would be great, because 3 wavelengths would not be
sufficient for capturing natural images. We can approximate the cone responses
to natural images with 3 primaries, which is why RGB encoding and display work
for showing images to humans. But in capture, you want to receive broad-
spectrum light at the detector, _and then_ encode it to RGB.
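The capture-vs-reproduction distinction can be made concrete with a toy
spectral model. The sensitivity curves and scene spectrum below are made-up
Gaussians (loosely cone-like), purely for illustration: a broad-band detector
still responds to a narrow-band 510 nm scene, while a lens passing only three
tuned wavelengths rejects nearly all of it.

```python
import numpy as np

wl = np.linspace(400e-9, 700e-9, 301)  # visible range, 1 nm steps
dw = wl[1] - wl[0]

def gaussian(center, width):
    return np.exp(-((wl - center) / width) ** 2)

# Hypothetical broad sensor sensitivities (made-up numbers, loosely cone-like)
sens = {"r": gaussian(600e-9, 50e-9),
        "g": gaussian(550e-9, 45e-9),
        "b": gaussian(450e-9, 40e-9)}

# A narrow-band scene spectrum at 510 nm (e.g. a green-laser-lit scene)
scene = gaussian(510e-9, 5e-9)

# Broad-spectrum capture: integrate the whole spectrum against each sensitivity
broad = {ch: float(np.sum(scene * s) * dw) for ch, s in sens.items()}

# A lens operating only at three tuned wavelengths acts like three narrow filters
lens_pass = sum(gaussian(w, 2e-9) for w in (450e-9, 550e-9, 650e-9))
filtered = {ch: float(np.sum(scene * lens_pass * s) * dw) for ch, s in sens.items()}

# Broad capture sees the 510 nm light; the 3-wavelength lens throws it away
print(broad["g"], filtered["g"])
```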

~~~
semi-extrinsic
Neither the sensor elements in a camera nor the pixels in a monitor are
selective for a single wavelength; each responds to a whole spectrum. So we
don't actually approximate cone responses that way. The fact that we store
the channel values as three integers does not change this.

~~~
acjohnson55
You're right that monitor pixels aren't individual wavelengths, but the
spectrum of a reproduced image can be very different from the spectrum of the
natural image. My point was just that you don't want your flat lens harshly
filtering the natural image by operating at only 3 highly tuned wavelengths
-- that's not going to be enough to capture natural images, despite the fact
that we can reproduce them with 3 components.

------
tomswartz07
There's a link to another article at the bottom of this one that hints at
other possible uses. [http://www.seas.harvard.edu/news/2014/05/collaborative-
metas...](http://www.seas.harvard.edu/news/2014/05/collaborative-metasurfaces-
grant-to-merge-classical-and-quantum-physics)

What's really interesting about this is that metasurface lenses might be
very useful in other high-precision optics: for example, densely packed
multicolor DVDs?

Perhaps even optical imaging of very, very small materials without the need
for electron microscopy. Who knows?

~~~
Dylan16807
Optical discs generally use the shortest wavelength of light they can manage;
can colors even help?

Plus they're focusing on a single point, so I don't think chromatic aberration
would be much of a problem in the first place.

If they can play fancy tricks to get a smaller focal point relative to
wavelength, _that_ would be a powerful improvement. But still probably
monochrome discs.

~~~
tomswartz07
It could be something akin to holography.

To put it simply, a disc could keep the same optical density but be written
over several times in other colors. The same lens can observe and read the
exact same point, interpreting the data in (for example) blue, green, and red.

~~~
Dylan16807
That is something that's possible, but I doubt its practicality. The longer
wavelengths can't store as much data even under ideal conditions, and
squeezing them in causes interference that hurts the density you have on the
best wavelength.

Plus you would need a much more elaborate disc-pressing mechanism.

Much simpler to add more monochrome layers.
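To put rough numbers on the wavelength argument: a diffraction-limited spot
scales with wavelength, so areal data density scales roughly as
1/wavelength^2. This is a simplification that ignores numerical aperture and
coding differences between real disc formats, but it shows the direction of
the trade-off:

```python
def relative_density(lam_short, lam_long):
    """Approximate areal-density ratio of a shorter vs. a longer recording
    wavelength, assuming spot size scales linearly with wavelength."""
    return (lam_long / lam_short) ** 2

# Blu-ray-class blue (405 nm) vs. DVD-class red (650 nm):
print(relative_density(405e-9, 650e-9))  # roughly 2.6x more data per layer in blue
```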

------
altcognito
Better images out of cell phones and SLRs, or is that not a logical
application of this technology?

~~~
striking
It's not just that the images will be better; it's that the sensors can get
smaller. The silly bump that houses the camera on the iPhone 6/6 Plus
wouldn't be there if this existed. Google Glass could be far cheaper. Smaller
SLRs, even! And everything would have far better quality than before. Truly
revolutionary.

~~~
olla
I'd rather say this means the sensors can get bigger in compact cameras and
phones. Currently the only thing that keeps us from making pocketable full-
frame cameras with zoomable optics is the size of the optics, not the size of
the sensor or other electrical components. Flat optics might even make full-
frame camera phones possible.

~~~
rndn
I doubt this is a big advantage because you would capture everything that is
close to the sensor (dust, fingers, scratches etc.). A lens effectively
applies an extreme low-pass filter to these things because they are massively
out of focus.

~~~
olla
From my own experience I can say it would be an enormous advantage when
shooting in low light, at least with current sensors. Where I live, four
months of winter offer only limited hours of dim daylight, and current phone
sensors are pretty much useless. You are right about dust, though, and extra
care would have to be taken, but that seems like a small inconvenience to me
compared to unacceptable noise levels.

------
chinpokomon
The key part is that it refracts differently at specific wavelengths. This
implies that you will still need filters to restrict the light to those
wavelengths, or you will still have chromatic aberration. Projecting light
through such a lens wouldn't have quite the same problem, as the source light
could already be tuned.

In short, good for projection, but maybe still bad for capture.

------
sp332
"Light shining on it bends instantaneously, rather than gradually, while
passing through." I don't think light bends gradually going through a normal
lens. It bends at the surface.

~~~
kenbellows
I don't think that makes sense; otherwise, why would the thickness of the lens
matter at all?

~~~
sp332
It doesn't. Light can bend when it encounters a change in the index of
refraction, e.g. air to glass or vice versa.
[https://en.wikipedia.org/wiki/Refraction#Explanation](https://en.wikipedia.org/wiki/Refraction#Explanation)
Light travels in a straight line after it has entered the glass, and again
after it exits. Only the angles and the difference in refractive index matter.
That's why you can build a Fresnel lens, which is, so to speak, all surface.
[https://en.wikipedia.org/wiki/Fresnel_lens#Description](https://en.wikipedia.org/wiki/Fresnel_lens#Description)
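Snell's law makes this concrete: the bend happens only where the refractive
index changes, and for a slab with parallel faces the exit ray is parallel to
the entry ray regardless of thickness (the slab only offsets it sideways). A
small sketch with textbook indices (nothing from the article):

```python
import math

def refract(theta_deg, n1, n2):
    """Snell's law: n1*sin(t1) = n2*sin(t2). Returns the refracted angle in
    degrees, or None on total internal reflection."""
    s = n1 * math.sin(math.radians(theta_deg)) / n2
    if abs(s) > 1:
        return None
    return math.degrees(math.asin(s))

# Air (n ~ 1.0) into glass (n ~ 1.5): the ray bends once at the entry surface...
inside = refract(30.0, 1.0, 1.5)
# ...travels straight inside, then bends back at the exit face.
outside = refract(inside, 1.5, 1.0)
print(inside, outside)  # inside ~ 19.47 deg, outside back to 30 deg
```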

~~~
kenbellows
That's really interesting. Apparently I need to read up.

------
djKianoosh
So, no chromatic aberration. No lens flare either?

------
argimenes
It sounds like an Apple marketer devised the article's title ... ;-)

