
Sources: Google is buying Lytro for about $40M - prostoalex
https://techcrunch.com/2018/03/20/sources-google-is-buying-lytro-for-about-40m/
======
bwang29
Lytro had really cool technology but never really found a good use case for
consumers. I owned both the 1st-generation and 2nd-generation cameras, and as
much as I hated the UI of the software and the usability of the hardware, I
can clearly see the team tried really hard to make it usable. They tried
building an editing app, a community, a community 2.0, and multiple hardware
revisions. I wonder why they didn't stop sooner, or at what point they felt
this direction was done. I felt they never really listened to the
photographer community during the process, or answered support requests
sufficiently; they did pretty much the exact opposite of user-first or
user-driven product development, in my opinion, and this still leaves me
confused today. Burning a few million dollars a month, it really took a
series of very bad decisions to lead to this financial outcome.

~~~
Stratoscope
I discovered Lytro at the Maker Faire when they had their "borrow a Lytro"
demo. It was a great bit of marketing: take a Lytro around the Faire for an
hour (where there are always plenty of interesting subjects) and afterward
they put all your shots in an online gallery for you. I got a fun shot or two
that day and promptly bought one of their first cameras.

But it wasn't a very good camera, and they completely missed the boat on
making it a storytelling tool.

I took my Lytro with me when I walked with Team Torani (the Italian syrup
people) in the Columbus Day parade that year, and did not get even one good
shot. I wished I'd brought my little point-and-shoot instead, or just about
any other camera, because I would have gotten dozens of good shots.

The pitch at the time was "you can put your photos on lytro.com, and people
who view your gallery can click anywhere to refocus!" But who wants to do
that? If you're looking at a gallery of photos, do you want to go clicking
around to bring different things in and out of focus? I don't.

But for a photographer, this had some potential as a storytelling tool.
Imagine having a photo that would do its own "pull focus" [1] when you view
it, like in the movies. The photographer could first focus on one subject and
then have it slowly change focus to something else to reveal something new to
the person viewing the photo. Kind of a Ken Burns effect [2] but in the focus
plane instead of - or perhaps in addition to - panning and zooming.

I remember one photo I got at the Menlo Park street fair that would have lent
itself to this. I had the camera down around knee height, and a dog came up
with his nose inches from the lens. In the background was his owner, an
attractive young lady. It was a nice shot, and I imagined having it start with
the dog's nose in focus and then gradually pulling the focus back to the
owner.

Granted, this wasn't _much_ of a story, but it was something. I couldn't even
do that. All I could do on lytro.com was set the initial focus, and then leave
it to any viewers to go clicking around to fiddle with the focus themselves.
But nobody wants to do that.

It seemed like an opportunity lost, because they were so focused (pun
intended) on the amazing technology and not on how to use it to tell stories.

[1] [https://vimeo.com/233143268](https://vimeo.com/233143268)

[2]
[https://en.wikipedia.org/wiki/Ken_Burns_effect](https://en.wikipedia.org/wiki/Ken_Burns_effect)

~~~
angrygoat
I've got a Lytro, and haven't used it for years, for many of the reasons you
articulate.

To my mind their mistake was not having an open source library that could work
with their capture format, including basic support for focusing and producing
an output image for display.

Pretty much every new photography tech has had a community of amateurs and
professionals spring up around it and (hopefully) innovate.

They missed out on that; all we got was the fun-for-a-few-minutes clicking
around thing, which gets old pretty fast.

~~~
Stratoscope
> To my mind their mistake was not having an open source library that could
> work with their capture format, including basic support for focusing and
> producing an output image for display.

Oh, now you are making me wistful for what might have been.

If they'd had any kind of library or API, I could have played around with my
Ken Burns idea by writing a few lines of code!

I've seen this same mistake over and over again going back to the earliest
days of GUI apps: designing the user experience with hardcoded interactions
instead of building the UI on top of a public API that enables others to
create and experiment with their own interactions.
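For what it's worth, the kind of API that would have enabled this isn't exotic. Here's a minimal Python sketch of the idea, assuming nothing about Lytro's actual format: the hypothetical `focal_stack` stands in for whatever a real API would render at a given focal depth, and `focus_pull` just eases between two depths over an animation.

```python
# Hypothetical sketch of an automated "pull focus" over a light-field photo.
# A pre-rendered focal stack (one frame per focal depth) stands in for a
# real refocusing API; none of these names come from Lytro's SDK.

def ease_in_out(t):
    """Smoothstep easing so the focus change starts and ends gently."""
    return t * t * (3 - 2 * t)

def focus_pull(focal_stack, start_depth, end_depth, num_frames):
    """Return the sequence of frames for an automatic focus pull.

    focal_stack: dict mapping focal depth -> rendered frame
    start_depth, end_depth: where the pull begins and ends
    num_frames: length of the animation
    """
    depths = sorted(focal_stack)
    frames = []
    for i in range(num_frames):
        t = ease_in_out(i / (num_frames - 1)) if num_frames > 1 else 1.0
        target = start_depth + t * (end_depth - start_depth)
        # Snap to the nearest depth we actually have a rendering for.
        nearest = min(depths, key=lambda d: abs(d - target))
        frames.append(focal_stack[nearest])
    return frames

# The dog-and-owner shot: focus starts on the nose, ends on the owner.
stack = {0.3: "nose_sharp", 1.0: "midground", 3.0: "owner_sharp"}
animation = focus_pull(stack, start_depth=0.3, end_depth=3.0, num_frames=5)
print(animation)  # first frame focused near the nose, last near the owner
```

A real version would re-render the light field per frame rather than snap to a stack, but even this much would have made the storytelling idea scriptable.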

~~~
nbush
Reading your parent comment and the replies has me kind of dumbfounded at the
wasted potential. Did they ever address these ideas directly? Would opening up
the format in even the most limited ways (like focus) have exposed the IP?

------
Deimorz
If you have a VR headset, I highly suggest trying out the "Welcome to Light
Fields" demo that Google released on Steam last week (it's free):
[http://store.steampowered.com/app/771310/Welcome_to_Light_Fields/](http://store.steampowered.com/app/771310/Welcome_to_Light_Fields/)

It's absolutely incredible and seems like really promising technology. Even
with the relatively low resolution of current headsets, the image quality is
amazing and you can see some slight lighting changes as you move your head,
which makes it feel far more immersive than a static photo. There should be a
lot of potential to use this for extremely realistic virtual tours.

Note: I had some slight issues with it "hitching" at the start of each scene
in the guided tour (which made me feel a bit of motion-sickness), but that was
probably because I didn't install it on an SSD (or my PC is just not quite
fast enough overall). There weren't any issues when viewing the scenes
individually afterwards.

~~~
rjth
As someone who has been working with stereo 180 videos in VR for the past few
months, this technology makes such a huge difference. While stereo photography
looks broken from most angles, Google's lightfields demo just felt natural.
Honestly, one of the best VR experiences I've tried in a while.

~~~
owenversteeg
Huh, that'd certainly explain why Google wanted to pay $40MM for it. Thanks!

------
nradov
Well that was the logical outcome. Their technology was interesting but it
clearly wasn't viable as a stand-alone compact camera. Everyone expected it
would eventually be integrated into smartphones.

~~~
guardian5x
It could be smartphones; I also think the technology would be interesting for
Street View.

------
rweba
This should not be viewed as a failure.

This is the nature of startups, even if you do every single thing right, work
hard, manage the cash flow, innovate, etc., there is a more than 90% chance
you won't get that billion dollar plus exit.

The only unique thing here is that Lytro got a lot of hype and raised a lot of
money, so people naturally expected a unicorn.

What I am saying is these guys don't necessarily have anything to be ashamed
of. In the worst case they gained a lot of experience that they can apply to
their next idea.

That said, I am curious: Will the founders get ANY money out of it? A comment
claimed that they raised $200 million at $360 million valuation, so
theoretically they may end up with nothing.

~~~
inuhj
My guess is that the founders took money off the table during those rounds.
It's doubtful they'll see anything from the sale. I wouldn't worry too much
about them; I'm sure they're sitting pretty.

------
neves
Can someone explain to me the reasoning behind the deal? It looks to me more
like some kind of Silicon Valley brotherhood deal. Sure, $40M is pocket change
to Google, but it is a failed company and they could duplicate the tech for
far less.

~~~
neves
Maybe it is the 59 patents...

~~~
kypro
The fact that many of the employees seem to have been given severance would
imply that to me.

~~~
jmalicki
Are they not even keeping the computational photography engineers? I can see
jettisoning the rest, but I presumed this might be an acquihire of some great
minds in computational photography to beef up the Pixel Camera team?

------
mseebach
I tried to click around on Lytro's website, but it left me pretty baffled.
What the f... is it?

~~~
Jtsummers
Lytro makes light field cameras. A single sensor captures enough information
to do a lot of computational imagery work, like the somewhat gimmicky example
of changing focus after the shot. But you also have enough data to do things
like accurately apply filters to parts of an image based on depth
information, produce a good simulation of viewing the image through caustics
or other distortion effects, and, most interesting to me, reconstruct 3D
information from a single sensor.

The downside: a 16-megapixel sensor only yielded something like 4 megapixels
of image resolution, because of how much data has to be captured for the
light field. So consumer-grade sensors resulted in (by contemporary
standards) very low resolution images. And all those neat effects weren't
really available, except maybe to their commercial partners (in later
products and services).

[https://en.wikipedia.org/wiki/Light-field_camera](https://en.wikipedia.org/wiki/Light-field_camera)
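To make the depth-based-filter point concrete, here's a toy Python sketch (not Lytro's actual pipeline, and with a 1-D "scanline" of pixel values standing in for a real image): once per-pixel depth is available, a blur can be applied only to pixels away from the chosen focal plane.

```python
# Toy illustration of depth-selective filtering: blur only the pixels whose
# recovered depth is far from the chosen focal plane. A real pipeline would
# work on 2-D images with a proper point-spread function.

def depth_selective_blur(pixels, depths, focal_depth, tolerance=0.5):
    """Blur pixels whose depth differs from focal_depth by more than tolerance."""
    out = []
    n = len(pixels)
    for i in range(n):
        if abs(depths[i] - focal_depth) <= tolerance:
            out.append(pixels[i])          # in focus: keep sharp
        else:
            lo, hi = max(0, i - 1), min(n - 1, i + 1)
            window = pixels[lo:hi + 1]     # out of focus: 3-tap box blur
            out.append(sum(window) / len(window))
    return out

pixels = [10, 200, 10, 200, 10, 200]
depths = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]   # left half near, right half far
# Focusing at depth 1.0 leaves the left half sharp and smooths the right.
print(depth_selective_blur(pixels, depths, focal_depth=1.0))
```

Swap the box blur for any other filter and you get the "apply effects by depth" idea described above.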

~~~
mseebach
Thanks!

------
bb88
I thought the future of light field cameras would be to replace bulky DSLRs
and their lenses.

If you can do that, then autofocus systems go away completely. What you're
left with is the shutter and zoom, and the computer's ability to make
beautiful bokeh from any image.

~~~
fudged71
The problem is the sensor cost. You are sacrificing pixels for depth
information, meaning you need more pixels for the same result. Most people
aren't willing to go back to a 2 MP camera when cameras with much better
image quality are on the market.
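The tradeoff described above can be put in rough numbers. This is a back-of-the-envelope sketch with illustrative figures, not Lytro's actual specs: each microlens spends a block of sensor pixels on angular (directional) samples, so spatial resolution drops by that factor.

```python
# Back-of-the-envelope for the plenoptic resolution tradeoff: the sensor is
# divided among angular samples, so spatial (image) resolution drops.

def effective_megapixels(sensor_mp, angular_samples_per_axis):
    """Spatial resolution left after each microlens spends
    angular_samples_per_axis^2 sensor pixels on direction information."""
    return sensor_mp / angular_samples_per_axis ** 2

# e.g. a 16 MP sensor with 2x2 angular samples per microlens:
print(effective_megapixels(16, 2))  # -> 4.0
```

That matches the rough 16 MP -> ~4 MP figure mentioned earlier in the thread, and shows why richer angular sampling makes the problem worse, not better.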

~~~
bb88
Maybe not the home user, but people are willing to pay lots of money for
photography gear if it gets the results they want.

Broncolor, Hasselblad, Leica, and Zeiss all come to mind as specialty
companies that make money selling expensive gear.

------
cmac2992
Awesome. After playing with Google's GoPro light field images, it's really
hard to go back and enjoy regular 360 photos. The quality and 6DoF sense of
presence were amazing. Hopefully the addition of Lytro's tech makes it even
better.

~~~
modeless
These are the light field images mentioned:
[https://www.blog.google/products/google-vr/experimenting-light-fields/](https://www.blog.google/products/google-vr/experimenting-light-fields/)

They can only be experienced properly with a positionally tracked VR headset,
but they are extremely cool if you have one.

~~~
owenversteeg
HTC Vive, Oculus Rift, and Windows Mixed Reality headsets... and it seems like
Windows Mixed Reality headsets are available for around $200 on Amazon. Wow,
we live in the future. I guess the main cost here is the PC attached to
them... probably about $500 minimum, if I had to guess, for something with
the specs required.

[edit] Wow, you can get an Asus pre-built desktop for $440! i5-7400, 12 GB
DDR4, 2 TB HDD, graphics good enough for VR... wow. Anyone know if it's
possible to build your own for a competitive price, or if that'd drive up the
cost? I remember the days when it was cheaper to build your own...

~~~
cmac2992
What GPU is in that prebuilt? I think you'd probably want a 1070, especially
for Windows MR headsets.

Right now, with GPU and memory prices inflated due to crypto mining,
prebuilts are very competitively priced. If you want to build your own,
definitely go with a motherboard/GPU bundle; otherwise you'll be paying $$$
for a GPU.

------
jonbarker
Light field cameras have always fascinated me because they are the literal
opposite of the evolutionary path of biological eyes (this is not an insult,
by the way). The first biological 'eyes' were probably light-sensitive spots
on skin with no lenses or shape; these became cavities for directional
sensing, and then lenses and retinas developed to enable image recognition.

~~~
electrograv
The compound eyes of some insects are very similar to how a lightfield camera
is optically structured.

------
m3kw9
This tells me the tech isn't too useful, judging from the price, the buyer,
and how long it took to get a sale at that price.

And that's besides the fact that it does this selective-focus thing, which
should be automatic anyway, or selected during the shot.

~~~
Jtsummers
Along with selective focus (which struck me as gimmicky), you get 3D images
from a single sensor with better detail than a stereoscopic setup (two
standard cameras). This is useful for computational work: you get better
spatial data from one source, and if you add multiple sensors you get even
more improvement.

I always figured this would be used in a fashion similar to synthetic aperture
radar, but using image data, and applied to autonomous vehicles and similar
applications. It's better than edge detection from a standard single sensor.
And you have more to train your algorithms on, since they can more easily
tell an actual object from an image of an object.

------
micheljansen
For AR & VR this makes perfect sense. In VR space you do not want to be stuck
with a single predetermined focus point, but have the freedom to move around
the light field. Lytro's patents alone are probably a steal at $40M.

------
Dirlewanger
Hasn't Photoshop had this tech for a couple of years now? I remember seeing a
video on HN of Adobe demonstrating it at a conference.

~~~
John_KZ
Lytro captures true depth information. Sure, you can blur things in
Photoshop, but it simply doesn't look like the real depth-of-field blur of
real lenses, or the calculated blur of light-field cameras. Refocusing in
Photoshop is just a gimmick that fakes the real thing (and if you're
experienced in photography you can always tell).

------
evilset
I'll be waiting for you, OASIS, but not so soon.

------
watersb
When I saw Google's spherical-photo capture bot -- basically an array of Lytro
cameras -- I figured this would happen soon.

~~~
ipsum2
Can you link me to this? I only know that Google is using GoPros to do light
field capture:
[https://petapixel.com/2018/03/15/google-built-arc-16-gopros-light-field-photography/](https://petapixel.com/2018/03/15/google-built-arc-16-gopros-light-field-photography/)

------
chirau
I remember Ben Horowitz singing this company's praises. Must be a good payday
for a16z.

~~~
nigelcleland
From the article, the sale price is estimated at $40 million, whilst the
company had raised $200 million, most recently at a $360 million valuation.

Most likely not a good payday; definitely a significant haircut for some of
the investors, to say the least.

~~~
vadimberman
Agreed, and given the standard "liquidity preference", the founders are
likely to net $0, except for Google bonuses for the engineers (the Jack
Barker-like Rosenthal is not likely to stick around).

It's sad, really. It's one of the really innovative companies, but it's not
easy to sell tech ahead of its time. Yet another incident that will encourage
investors to pick copycats over real innovation.

I guess Google is no longer as generous with the acquisitions.

~~~
fwgwgwgch
Can you explain what you mean by "liquidity preference"? My understanding is
that you get money in proportion to your vested shares at the time of sale.

~~~
nitrogen
Search for "liquidation preference" on your favorite search engine

~~~
sooheon
Sometimes you feel like asking trusted strangers online for a synthesized
summary as it pertains to the topic at hand.

