
Camera freezes time at 10 trillion frames per second - craftyguy
http://www.inrs.ca/english/actualites/worlds-fastest-camera-freezes-time-10-trillion-frames-second
======
ourmandave
_“It’s an achievement in itself,” says Jinyang Liang, the leading author of
this work, who was an engineer in COIL when the research was conducted, “but
we already see possibilities for increasing the speed to up to one quadrillion
(10^15) frames per second!”_

Just getting started... o_O

~~~
jastanton
Honest question, what can be seen at one quadrillion fps that 10 trillion
cannot already see? This question is out of pure ignorance and wonder, like
"Why would you ever want to move faster than a horse". But this is at a scale
I just cannot think in anymore.

~~~
TylerE
Imagine experimental physics. Being able to watch things like explosions in
super minute detail. Watching light propagate.

~~~
mediawatcher234
Umm, good luck getting enough photons to see anything. Shot noise.

~~~
TylerE
So shoot at 1 quad, and average 1000 frames to one output frame? Light will
still be moving less than 1mm per frame.

~~~
dTal
Pretty sure the light is integrated over a frame anyway, so that will get you
the same results as just shooting at the lower framerate.
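For what it's worth, this point checks out numerically: in the shot-noise-limited regime, summing N short exposures gives the same signal-to-noise as one exposure N times as long, because photon counts simply add. A toy sketch, using a Gaussian approximation to Poisson noise (all numbers illustrative):

```python
import math
import random

random.seed(1)

TOTAL = 10_000   # total mean photon count per output frame
N = 1_000        # short sub-frames summed into each output frame
TRIALS = 2_000   # repetitions used to estimate the noise

def noisy(mean):
    """Gaussian approximation to Poisson shot noise (sigma = sqrt(mean))."""
    return random.gauss(mean, math.sqrt(mean))

def std(xs):
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

# One long exposure per output frame:
long_frames = [noisy(TOTAL) for _ in range(TRIALS)]

# N short exposures summed into each output frame:
summed_frames = [sum(noisy(TOTAL / N) for _ in range(N)) for _ in range(TRIALS)]

# Both come out with sigma ~ sqrt(TOTAL) = 100: identical shot-noise SNR.
print(round(std(long_frames)), round(std(summed_frames)))
```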

------
ttty
Where is the video? That's what I'm looking for

~~~
thanatos_dem
Here’s a video from 2011, but it’s only at 1 trillion FPS using the same laser
pulse approach -
[https://m.youtube.com/watch?v=-fSqFWcb4rE](https://m.youtube.com/watch?v=-fSqFWcb4rE)

Also, the music is pretty cringey, so I’d just leave it muted...

~~~
SketchySeaBeast
Watching this confuses me as to the timing of it all... if the conceit is that
we're watching the bundle of photons move through the bottle, that's because
the photons from the source are hitting the camera, right? So is it the light
moving through the bottle plus the travel time to the sensor? Shouldn't
refraction and reflection across the surface cause a lot of weird visual
interference (since the bottle's size is no longer insignificant on the time
scale of light's velocity)? Is that what we're seeing there?

~~~
jcims
I had a similar realization when I watched that video from 2011 the first
time around (hard to believe it was 7 years ago).

It also gave me this odd sense of blindness, in that I cannot actually see
what's in front of me, only what literally interacts with my retina, almost
like it's made of taste buds for light. Still weirds me out when I think
about it.

~~~
plasticchris
Your eyes are terahertz radio receivers!

------
edh649
Reminded me of this TED talk on femto-photography:
[https://www.ted.com/talks/ramesh_raskar_a_camera_that_takes_...](https://www.ted.com/talks/ramesh_raskar_a_camera_that_takes_one_trillion_frames_per_second?language=en)

------
inventtheday
"This new camera literally makes it possible to freeze time"

It literally freezes time? Come on..

------
femtocycle
How close is this to Planck time?

~~~
dfee
Hijacking this as it’s getting intelligent responses: what’s the greatest
resolution we could measure today, and what’s physically possible?

~~~
InclinedPlane
Well, let's do some quick and dirty math.

Let's say you have a very bright light source, and that light source results
in shining a full 100 Watts of light into the sensor of your camera (I'm using
this not as a typical example but because it will make it easy to scale the
answer). Photons of visible light have an energy of at minimum 1.5 electron-
Volts (800nm red light), which means that 100 Watts of light represents 4.2e20
photons per second.

And that means that with only 100 Watts of light reaching your sensor you
cannot attain an fps higher than 4.2e20, because at that speed you'd only get
on average around one photon per frame. More realistically you need tens of
thousands to millions of photons per frame to have some meaningful level of
dynamic range and spatial resolution, which limits the fps to around a
quadrillion fps per 100 Watts of light falling on the sensor.
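A quick sketch of the arithmetic above (the ~1e5 photons-per-frame figure is the rough requirement stated in the comment, not a hard constant):

```python
E_PHOTON_J = 1.5 * 1.602e-19   # ~1.5 eV photon (deep red) in joules
POWER_W = 100.0                # light actually reaching the sensor

photons_per_second = POWER_W / E_PHOTON_J
print(f"{photons_per_second:.2e} photons/s")   # ~4.2e20

# ~1 photon per frame caps the rate at ~4.2e20 fps; demanding ~1e5
# photons per frame for usable dynamic range drops it to ~4e15 fps.
usable_fps = photons_per_second / 1e5
print(f"{usable_fps:.2e} fps")                 # ~4e15, i.e. quadrillions
```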

Though once you get into that range you also have problems of signalling: we
don't really have electronics that work at those speeds.

------
cronix
I imagine this would give discoveries in the LHC a boost, or at least some
amazing footage of atoms colliding and exploding :) That would be amazing!

~~~
gus_massa
No, for a few reasons:

First, they are producing too many collisions (probably many quadrillions of
collisions; I can't find the exact number). Most of them are boring, so they
have a lot of filters to select and save only the slightly interesting
collisions, because otherwise it would be impossible to save the data. IIRC
they only got a few thousand Higgs bosons; the signal-to-noise ratio is
almost 0.

Second, to film something using light, the object must be bigger than the
wavelength of the light. There are some tricks to reduce this a little, but
you can't film elementary particles directly. The old method was to let the
particles create small bubbles or drops of water inside a container, and then
photograph the chain of dots with visible light. In the new equipment the
process is more complicated, but the interesting part of the collision is too
fast and too small. They only measure the particles that are flying away after
the interesting part has happened; they measure the leftovers and try to
reconstruct the actual collision.

Third, the interesting collisions have strong quantum effects, and the
quantum effects "disappear" when you use light that is strong enough to get
an accurate position of the particles. You can imagine that if the light is
strong enough, then the light itself will bounce off the particles and modify
their direction and the collision. It sounds somewhat like magic, but it's
possible to state this correctly and precisely using a lot of math, and it is
one of the bases of quantum mechanics.
[https://en.wikipedia.org/wiki/Uncertainty_principle](https://en.wikipedia.org/wiki/Uncertainty_principle)
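A back-of-the-envelope illustration of that third point (my own numbers, using the rule of thumb that h*c is about 1240 MeV·fm):

```python
# To image an object, the light's wavelength must be no larger than the
# object; E = h*c / lambda then sets a floor on the photon energy.
H_C_MEV_FM = 1240.0        # h*c ~ 1240 MeV*fm (same number as 1240 eV*nm)
PROTON_SIZE_FM = 1.0       # a proton is roughly 1 femtometre across
PROTON_REST_MEV = 938.3    # proton rest energy

photon_energy_mev = H_C_MEV_FM / PROTON_SIZE_FM
# A photon sharp enough to "see" a proton carries more energy than the
# proton's own rest mass, so the measurement wrecks what it measures.
print(photon_energy_mev, ">", PROTON_REST_MEV)
```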

------
izzydata
How many frames per second would you need to film at for a photon of light to
be in an identical position on 2 frames?

~~~
emeraldd
For 635nm red laser light, you'd need to be sampling somewhere on the order
of 9.4 x 10^14 times a second to get two samples per cycle, based on roughly
(3 x 10^8 m/s / 635nm) x 2. But then the question comes down to how many
cycles make up a photon, and whether that question even makes sense in the
first place.

~~~
abhishekjha
>9.4 x 10^14

Can you explain how you got this number for the red wavelength?

Also can we fundamentally "see" a photon?

~~~
emeraldd
I was working from the idea of a nyquist or sampling rate needed to
"perfectly" reproduce a wave form. I know this works for lower frequencies but
I have no idea how well it translates to something like optical
frequencies/light. You can find frequency from speed / wavelength: f = v/l or
in our case f = c / l since we're dealing with light.

I was using 635nm as the wavelength (basically a red laser). That gives you:

3 x 10^8 / (635 x 10^-9) = 4.7 x 10^14

Which should be about 470 terahertz (4.72 x 10^14) give or take a bit. To
sample that "perfectly" you'd need to sample at twice the frequency or about
940 terahertz or (9.4 x 10^14).
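The calculation above, as a quick sketch:

```python
C = 3.0e8             # speed of light, m/s
WAVELENGTH = 635e-9   # 635 nm red laser, in metres

frequency = C / WAVELENGTH    # ~4.7e14 Hz, i.e. about 470 THz
nyquist_rate = 2 * frequency  # ~9.4e14 samples per second

print(f"{frequency:.2e} Hz, Nyquist rate {nyquist_rate:.2e} /s")
```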

As to your second question, I know that there are single photon detectors.
Past that, you've got me. I don't know if that can be classified as "seeing"
or not. As to size, there's
[https://briankoberlein.com/2015/04/14/thats-about-the-size-o...](https://briankoberlein.com/2015/04/14/thats-about-the-size-of-it/)
but that might or might not make sense.

That's about the limit of what I'm willing/able to say on the subject.

------
gjm11
This phys.org "article" is simply a copy of the INRS press release you can
find here:
[http://www.inrs.ca/english/actualites/worlds-fastest-camera-...](http://www.inrs.ca/english/actualites/worlds-fastest-camera-freezes-time-10-trillion-frames-second)

Actual paper:
[https://www.nature.com/articles/s41377-018-0044-7](https://www.nature.com/articles/s41377-018-0044-7)

Website for the INRS lab's project on this:
[http://coilab.caltech.edu/research_6.html](http://coilab.caltech.edu/research_6.html)
(includes PDF of the above paper and others)

As almost always, the phys.org "article" adds nothing of value to the press
release it reproduces.

~~~
givinguflac
Thanks for the links! I see your point, but lots of people wouldn’t know about
this if it was just a press release. I know I don’t have enough time to check
for them.

~~~
jiveturkey
Yes, of course. The person posting the link, however, should do even the
tiniest bit of legwork.

------
degenerate
It doesn't really capture 10 trillion frames per second. The laser pulses at a
regular interval, and the combination of the regular pulsing + multiple
pictures taken at different points along those pulses makes for 10 trillion
distinct frames captured within the timeframe of 1 second. There is no actual
video captured.

I get that being a tech journalist is difficult; you have to juggle between
the tech and the layman. But after writing a headline, read it back to
yourself and ask: _will this put the wrong idea in people's minds?_ If the
answer is yes, rewrite it... even if it sounds less cool.

~~~
endorphone
I don't quite understand how your original point refutes the headline. They
are saying that it captures frames at a rate that, measured in seconds, would
be ten trillion frames per second.

The headline seems entirely accurate.

~~~
quadcore
How long can they maintain that frame rate for?

~~~
batmansmk
From what I understand, it is a camera, so it can run as long as you can
record data fast enough. I didn't read about any limitation of the current
setup across time (there is one across space, though). So far only 25 frames
have been captured; it is fundamental research, not yet a product people can
get value out of.

Note that what we want to observe with those cams are very short, transient
phenomena. When I was doing my internship at a particle accelerator called
GANIL, we were _only_ recording 0.5 seconds, which already represented close
to 1TB worth of raw data. It takes months to interpret and analyze results.

EDIT: typo

~~~
aogl
That's incredible. How long did it take to write 0.5s of data to disk? I'm
guessing there's no way to sustain this as you'd be so far behind after only a
single second. I'm pretty sure we can still only store a few gigs per second.
Please correct me if I'm wrong. Very interesting though!

~~~
jboy55
The best way to think of this is it might take 100 seconds to 'record' those
10 trillion frames that occur in 1 second.

That doesn't seem to make sense, but imagine this. You want to shoot 100
frames of the first millisecond of an airsoft pellet leaving a gun, but you
have a camera that only shoots around 2 frames per second.

Your airsoft gun shoots 1 ball exactly (1ns accuracy) every second, exactly
the same velocity and direction.

You have your camera, which only captures 2 frames every second, but this
camera has an insane shutter speed, 1 microsecond, and a shutter that you can
time to the gun exactly.

You can also delay the release of the shutter by 1 microsecond increments.

So you start by taking 1 picture 10 microseconds after you shoot your
pellet. Then in the next second you take one at 20 microseconds, and so on,
100 times. You stitch this all together, and you have a super-slow-motion
video of an airsoft pellet leaving a gun. It just happens to be 100 different
airsoft pellets.
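The delayed-shutter scheme described above can be sketched in a few lines (all names and numbers here are illustrative, assuming a 100 m/s pellet and 10 µs delay steps):

```python
PELLET_SPEED = 100.0    # m/s, a typical airsoft muzzle velocity (assumed)
DELAY_STEP_US = 10      # shutter delay grows by 10 microseconds per shot
SHOTS = 100             # one frame per shot, one fresh pellet per shot

def pellet_position(t_us):
    """Pellet position in metres, t_us microseconds after firing."""
    return PELLET_SPEED * t_us * 1e-6

def capture(shot_index):
    """Fire once; open the shutter (shot_index + 1) delay steps after firing."""
    return pellet_position(DELAY_STEP_US * (shot_index + 1))

# 100 shots stitched together replay the first millisecond in slow motion,
# even though every frame shows a different pellet.
frames = [capture(i) for i in range(SHOTS)]
print(len(frames), frames[0], frames[-1])
```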

~~~
tomsmeding
I agree that the article isn't very clear on this, but I believe you're
describing the previous work.

> Using current imaging techniques, measurements taken with ultrashort laser
> pulses must be repeated many times, which is appropriate for some types of
> inert samples, but impossible for other more fragile ones.

The innovation here is that the new camera actually records the frames
back-to-back from one single event:

> The first time it was used, the ultrafast camera broke new ground by
> capturing the temporal focusing of a single femtosecond laser pulse in real
> time (Fig. 2). This process was recorded in 25 frames taken at an interval
> of 400 femtoseconds and detailed the light pulse’s shape, intensity, and
> angle of inclination.

------
p1esk
It would be cool to record single photon splittings along some path, as they
happen in real time [0].

[0]
[https://www.nature.com/news/2010/100728/full/news.2010.381.h...](https://www.nature.com/news/2010/100728/full/news.2010.381.html)

