I'm guessing there's a catch, but I would love to be wrong.
What's really interesting about this is that metasurface lenses might be very useful in other high-precision optics: densely packed multicolor DVDs, for example?
Perhaps even optical imaging of very, very small materials without the need for electron microscopy. Who knows?
And of course the ever-present dream of "smart" contact lenses that can adjust how incoming light is magnified on an adaptive basis or offer AR functionality without any headgear at all.
Obviously that sort of application is years away regardless of how this particular project works out. Still, this stuff does lead to some fun brainstorms and geeky fantasies.
Plus they're focusing on a single point, so I don't think chromatic aberration would be much of a problem in the first place.
If they can play fancy tricks to get a smaller focal point relative to wavelength, that would be a powerful improvement. But still probably monochrome discs.
Imagine you're driving down the road and someone is shining two lights through the trees: one is white light that merely changes brightness, while the other rotates through three colors. Which light is easier for you to detect changing?
To put it simply, the disc can keep the same physical density of marks, but each spot can be written over several times in different colors. The same lens can then read the exact same point and interpret the data separately in (for example) blue, green, and red.
Plus you would need a much more elaborate disc-pressing mechanism.
Much simpler to add more monochrome layers.
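To make the multicolor idea concrete, here's a rough sketch (the encode/decode helpers and the three-channel scheme are my own illustration, not anything from the article): if each physical spot can be written and read independently in three colors, the same areal density carries three bits per spot instead of one.

```python
# Hypothetical sketch: 3 bits per physical spot by treating blue, green,
# and red as independent channels at the same areal density as a
# 1-bit-per-spot monochrome disc.

def encode(bits):
    """Pack a flat bit list into per-spot (blue, green, red) triples."""
    # Pad to a multiple of 3 so every spot gets a full triple.
    padded = bits + [0] * (-len(bits) % 3)
    return [tuple(padded[i:i + 3]) for i in range(0, len(padded), 3)]

def decode(spots):
    """Read the color channels back out into a flat bit list."""
    return [bit for spot in spots for bit in spot]

data = [1, 0, 1, 1, 0, 0]
spots = encode(data)  # 6 bits fit in 2 spots instead of 6
assert decode(spots)[:len(data)] == data
```

Of course, real discs would need error correction and per-channel calibration on top of this, which is part of why stacking more monochrome layers is the simpler engineering path.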
Watch Apple introduce a 13- or 16-megapixel camera this year that has no bump. Why? Because this time it will actually use a cutting-edge sensor, rather than a three-year-old one.
In short, good for projection, but maybe still bad for capture.
Being able to directly deflect light at a flat, thin plane eliminates the need to consider dispersion. Well, as long as you can deflect all wavelengths at the same angle. They talk about having demonstrated three wavelengths and wanting to extend this to more.
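For context (my own addition, standard diffraction physics rather than anything from the article), here's why deflecting all wavelengths at the same angle is the hard part: a simple periodic diffractive structure obeys the grating equation, so longer wavelengths leave at larger angles.

```latex
% Grating equation for a structure with period d, diffraction order m:
% the deflection angle depends on wavelength, so a naive diffractive
% lens is strongly dispersive.
\[
  d \sin\theta_m = m\,\lambda
  \quad\Longrightarrow\quad
  \theta_m = \arcsin\!\left(\frac{m\lambda}{d}\right)
\]
```

An achromatic metasurface has to engineer the local phase response so this wavelength dependence cancels out across the design band, which is exactly what demonstrating it at three wavelengths and extending to more amounts to.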