
Scientists use camera with human-like vision to capture 5,400 FPS video - bookofjoe
https://petapixel.com/2019/07/09/scientists-use-camera-with-human-like-vision-to-capture-5400-fps-video/
======
igorkraw
Haha, I worked with a relative of this camera during my bachelor's :-) The DVS,
eDVS, and DAVIS are some of the more useful results of neuromorphic
engineering research, and they really shine if you combine them with NNs or
other noise-tolerant signal processing methods

~~~
p1esk
Have they been combined with NNs? If so, how?

~~~
igorkraw
AFAIK, mainly with SNNs or recurrent networks, since those can learn to do
detection etc. on the stream of events and don't need "frames". I'd have to
google for publications as well, since this was a while ago
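For intuition on the "no frames" part: one common way to feed an event stream to a conventional (non-spiking) network is to bin the events into a voxel grid that a CNN or RNN can consume. A rough sketch, assuming events arrive as (x, y, t, polarity) tuples; the simple linear time-binning here is an illustrative simplification, not any particular paper's pipeline:

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate (x, y, t, polarity) events into a (num_bins, H, W) tensor.

    This turns an asynchronous event stream into a frame-like input
    representation for recurrent/convolutional networks, without ever
    integrating intensity the way a conventional camera does.
    """
    grid = np.zeros((num_bins, height, width))
    if not events:
        return grid
    t0, t1 = events[0][2], events[-1][2]
    span = max(t1 - t0, 1e-9)  # avoid division by zero for a single event
    for x, y, t, p in events:
        # Map the timestamp to a temporal bin, clamping the last event.
        b = min(int((t - t0) / span * num_bins), num_bins - 1)
        grid[b, y, x] += p  # signed accumulation: +1 brighter, -1 darker
    return grid

evts = [(0, 0, 0.0, 1), (1, 1, 0.5, -1), (2, 2, 1.0, 1)]
grid = events_to_voxel_grid(evts, num_bins=2, height=3, width=3)
```

The signed accumulation preserves polarity information, which is what lets a downstream network distinguish rising from falling edges in the scene.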

------
mzs
That's basically a copy/paste of
[https://www.engadget.com/2019/07/09/event-camera-slow-motion-hdr/](https://www.engadget.com/2019/07/09/event-camera-slow-motion-hdr/)
but without the GitHub link:
[https://github.com/uzh-rpg/rpg_e2vid](https://github.com/uzh-rpg/rpg_e2vid)

------
lota-putty
1. Why didn't they compare it with contemporary 5,000+ FPS cameras?

2. What's the size difference compared to contemporary 5,000+ FPS cameras?

With good dynamic range and affordability, could this be useful for independent
filmmakers?

~~~
Hnrobert42
Yeah. That comparison video is annoying. At one point they have three videos
side by side with their video in the middle. In the next comparisons, theirs
is on the right. Aaaaaggh. Little things like that drive me nuts.

I have no expertise in this realm, but some of their bullet videos seemed a
little “artificial” or CGI. Maybe it's that some parts of a frame had much
more detail/precision than other parts of the same frame.

------
dharma1
Some "normal" cameras for comparison:

[https://www.slomocamco.com/cameras/](https://www.slomocamco.com/cameras/)

3600 fps at 720p, raw 10-bit DNG, £5000.

Some Phantom cameras have faster frame rates
([https://www.phantomhighspeed.com/products/cameras/ultrahighspeed/v1212](https://www.phantomhighspeed.com/products/cameras/ultrahighspeed/v1212)),
but you'll pay over $100k

~~~
bookofjoe
[https://www.wired.com/2015/04/apple-watch-design/](https://www.wired.com/2015/04/apple-watch-design/)

>Yet what Dye seems most fascinated by is one of the Apple Watch's faces,
called Motion, which you can set to show a flower blooming. Each time you
raise your wrist, you'll see a different color, a different flower. This is
not CGI. It’s photography.

"We shot all this stuff," Dye says, "the butterflies and the jellyfish and the
flowers for the motion face, it's all in-camera. And so the flowers were shot
blooming over time. I think the longest one took us 285 hours, and over 24,000
shots."

He flips a few pages further into the making-of book, onto the first of
several full-page spreads with gorgeous photos of jellyfish. There's no
obvious reason to have a jellyfish watch face. Dye just loves the way they
look. "We thought that there was something beautiful about jellyfish, in this
sort of space-y, alien, abstract sort of way," he says. But they didn't just
visit the Monterey Bay Aquarium with an underwater camera. They built a tank
in their studio, and shot a variety of species at 300 frames-per-second on
incredibly high-end slow-motion Phantom cameras. Then they shrunk the
resulting 4096 x 2304 images to fit the Watch's screen, which is less than a
tenth the size. Now, "when you look at the Motion face of the jellyfish, no
reasonable person can see that level of detail," Dye says. "And yet to us it's
really important to get those details right."

~~~
dharma1
Yeah, the Phantoms get a lot of use in sports and nature documentaries. The
level of detail is really a sight to behold. These guys work magic with them
in food advertising:

[https://foodfilm.fr/](https://foodfilm.fr/)

------
pontifier
Several years ago I had an idea for a camera made from harmonically
oscillating pixels that shift frequency based on intensity. It would have many
of the advantages of event cameras, but with extremely accurate absolute
brightness measurement as well.

Does anyone know if this exists?

~~~
thatcherc
Sounds like MKIDs - microwave kinetic inductance detectors. They're arrays of
little LC oscillators where the inductance changes when a photon hits. They
can actually resolve frequency and time of arrival of a photon, which is super
useful, especially for photon-limited astrometry. [0] [1]

[0] -
[https://wikipedia.org/wiki/Kinetic_inductance_detector](https://wikipedia.org/wiki/Kinetic_inductance_detector)

[1] -
[https://web.physics.ucsb.edu/~bmazin/mkids.html](https://web.physics.ucsb.edu/~bmazin/mkids.html)

~~~
no_identd
[https://en.wikipedia.org/wiki/Kinetic_inductance_detector](https://en.wikipedia.org/wiki/Kinetic_inductance_detector)

FTFY

------
0xfaded
[https://inivation.com/buy/](https://inivation.com/buy/) ($3000-$6600 USD)

Unfortunately not quite at "buy to play around with" prices, but for the right
application it's within the realm of affordable.

I work on underwater computer vision and would love one of these. Limited
lighting = long exposure times. Motion blur can be mitigated by moving slowly,
but then you have all sorts of lighting phenomena that make not over/under
saturating the image a real problem.

------
bb101
This is revolutionary, even if only from a dynamic range standpoint. It's
somewhat disappointing to see the first comment under their article is "Looks
like VHS" ... really?

------
amelius
So the actual framerate depends on the number of events that take place in a
time interval?

It seems to me that an "explosion" would generate a lot of events.

Also, in what way is this different from a codec? I.e., isn't this just a much
simpler form of MPEG?

And will this produce artifacts, or drop events?
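One way to see the difference from a codec: there are no frames at all. Each pixel independently fires an (x, y, t, polarity) event when its log intensity changes past a threshold, so static regions produce no data, rather than near-zero residuals the way an inter-frame codec would. A hedged sketch of that behavior; the two-frame simulation and the threshold value are illustrative assumptions, not the actual sensor circuit:

```python
import numpy as np

def events_from_frames(prev, curr, t, threshold=0.2):
    """Simulate DVS-style events between two intensity snapshots.

    A pixel fires only if its log-intensity change exceeds the
    threshold -- unchanged pixels emit nothing, unlike a codec,
    which still encodes every block of every frame.
    """
    diff = np.log1p(curr.astype(float)) - np.log1p(prev.astype(float))
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    polarity = np.sign(diff[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return [(int(x), int(y), t, int(p)) for x, y, p in zip(xs, ys, polarity)]

prev = np.zeros((4, 4))
curr = np.zeros((4, 4))
curr[1, 2] = 10.0  # a single pixel brightens; everything else stays dark

events = events_from_frames(prev, curr, t=0.001)  # one event, at (x=2, y=1)
```

This also hints at the "explosion" concern above: the event rate scales with how much of the scene is changing, so a sensor's readout bus can indeed saturate and drop events under extreme motion.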

------
bookofjoe
full paper:
[http://rpg.ifi.uzh.ch/docs/arXiv19_Rebecq.pdf](http://rpg.ifi.uzh.ch/docs/arXiv19_Rebecq.pdf)

------
Sparkenstein
awesome, now just waiting for some manufacturer to embed this in a cellphone

