Your camera synchronization signal in the wire is traveling at about 2/3 of that speed. Can you imagine getting all the cameras in your array to receive and interpret that signal at the same time?
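(Rough numbers, assuming a ~1 m cable-length mismatch between cameras and propagation at 2/3 c; purely illustrative:)

    # How badly does cable skew hurt at picosecond frame times?
    c = 2.998e8           # speed of light in vacuum, m/s
    v = (2 / 3) * c       # assumed signal speed in the cable
    mismatch = 1.0        # assumed cable-length difference, m

    skew = mismatch / v   # ~5 ns of sync error
    frame = 10e-12        # one 10-picosecond frame
    print(f"skew = {skew * 1e9:.0f} ns, or {skew / frame:.0f} frames")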
If you mean "how are they seeing the non-scattered light?", you are correct that they are not.
At certain points, the light pulse seems to "leap ahead" in fits and starts on its journey. Is that because of the way the medium scatters the light, or something?
I'm not sure it'd yield anything we don't already "know" - but it might be worth trying just because.
I like that they remind us that the video is slowed down and that time doesn't actually move at 10 picoseconds per second.
Whoa. Stares at hand
Whoa. Stares at the sun
> The Planck time is 5.39 × 10⁻⁴⁴ seconds. No measurable time can be shorter than that according to quantum physics.
Converting to FPS, that gives us:
> About 18.55 tredecillion frames per second, i.e. 1.855 × 10⁴³ FPS!
So if like me you wondered if a trillion FPS is close to the maximum possible frame rate in the universe, the answer is nope!
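(Sanity-checking that arithmetic in a couple of lines; the only input is the Planck-time value quoted above:)

    planck_time = 5.39e-44          # seconds, as quoted above
    fps = 1 / planck_time           # one frame per Planck time
    print(f"{fps:.3e} FPS")         # ~1.855e+43
    print(f"{fps / 1e12:.1e}x a trillion-FPS camera")  # ~1.9e+31x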
Planck time is the time at which, if you want to do physics, you have to take into account quantum mechanics and gravity (i.e. understand quantum gravity). Something we don't know how to do. But it doesn't necessarily mean time is discrete. Maybe it is. Maybe it isn't.
I'm no physicist, but the layman explanation I've heard is that the concepts of Planck length and Planck time do not imply that spacetime is discretized into little "voxels"; instead these quantities are just limits on the uncertainty of any possible measurement.
As I understand it there are two contributing facts:
a) the Heisenberg uncertainty principle states that there's a tradeoff between certainty in position vs momentum, so if you're more certain in a particle's position you're less certain of its momentum. (For photons, momentum is proportional to frequency, i.e. inversely proportional to wavelength, and energy is proportional to frequency.)
b) By mass-energy equivalence, anything with energy has mass, therefore higher frequency photons are more "massive". A single photon of sufficiently high frequency would form a black hole.
Putting those two together, to measure distances accurately you need higher and higher frequency photons with shorter and shorter wavelengths. For example, radar creates blurry images at ~5cm wavelengths, while ordinary photographs can be razor sharp with wavelengths of only ~500nm. The Planck length is just the wavelength at which the photon would have so much energy that it would collapse into a black hole and break our current mathematical models. That's why it's nonsensical--with current models--to talk about lengths smaller than a Planck length, but it doesn't mean that space itself is quantized. Similar argument for time.
(Also, the same logic applies to other particles like electrons, protons, and even up to macroscopic items like baseballs; everything has a wavelength...)
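(If you want to see these scales fall out of dimensional analysis on ħ, G, and c yourself, here's a quick sketch using CODATA values:)

    import math

    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
    c = 2.99792458e8         # speed of light, m/s

    planck_length = math.sqrt(hbar * G / c**3)   # ~1.616e-35 m
    planck_time = math.sqrt(hbar * G / c**5)     # ~5.391e-44 s
    print(f"Planck length: {planck_length:.3e} m")
    print(f"Planck time:   {planck_time:.3e} s")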
It might really be a boundary, but we don't know.
What are those predictions? Do you mean that
> we know that it must break down at Planck time
? If so, do you have a source for this prediction? (I'm not asking why present-day physics must break down at extremely small scales. That prediction is more or less well established. What I'm asking is: Why is it the Planck scale that's so significant?)
My impression was that most statements involving the Planck scale these days are more numerology than actual, verifiable science.
The phase shift is 1/(4 trillion) of a second, but the actual framerate is much, much slower.
I wonder if they also average captures at the same phase offset multiple times to reduce noise (improving the bit depth of a single frame).
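(That would be equivalent-time sampling, the same trick sampling oscilloscopes use; a toy sketch, assuming a perfectly repeatable pulse -- the variable names are mine, not the paper's:)

    import numpy as np

    # Rebuild a fast repetitive signal with a slow sensor by stepping
    # the capture phase each repetition, averaging many captures per
    # phase to beat down noise (i.e. improve effective bit depth).
    rng = np.random.default_rng(0)
    def pulse(t):                      # stand-in for the repeating scene
        return np.exp(-((t - 2e-12) / 0.5e-12) ** 2)

    phases = np.arange(0, 4e-12, 0.25e-12)   # 0.25 ps phase steps
    n_avg = 100                              # repetitions per phase
    frame = [(pulse(t) + rng.normal(0, 0.1, n_avg)).mean()
             for t in phases]
    print(np.round(frame, 2))          # noisy pulse, recovered in order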
Intel has some room to grow, I see.
Trying to do locales in Canada is always fun, since we can't make up our mind. "How far? 100km." "How tall? 5ft"
Not really. I think you'll struggle to find anyone who uses that now.
Love to see an iFixit teardown video of that Caltech camera. :-)
Also, not a physicist, but my understanding is that PRL is prestigious in its own right.
So in the double slit experiment you could see the path the light took only if you allowed for a portion of that light to be reflected towards the camera. But then if you did that you wouldn't end up with anything particularly new or interesting as far as I can imagine.
My intuition tells me that what we consider a "frame" in our daily experience (24 up through maybe 144) would be pretty different from what this "frame" would be, in terms of how it is captured and how it is subsequently rendered.
Frames this short mean that there is a sort of volumetric aspect to what is being captured. From the point of view of the camera, when light collection begins, an area sweeps out in front of the camera and back in time at the speed of light, and when light collection stops, the end of the area sweeps out from the camera lens in the same direction. The result is a spherical shell volume travelling backwards in time that represents the spacetime locations that can emit light and appear in the frame*. The resulting picture is an integration of the photons emitted in this 4D spacetime volume, in the direction of the camera, projected onto a 2D image.
(That's assuming a point-like camera and instantaneously turning the camera on and off; in reality it'll be fuzzier but the principle holds.)
The difference is that at these incredibly fast camera speeds, the resulting 4D spacetime volume is of human size in most directions in the places we care about, measured in human-sized units like "centimeters" rather than "light-seconds". Normally we can neglect thinking about this volume because we simply spray so "much" light out that we don't have to think much about capturing exactly what we want, in much the same way that in normal day-to-day life we tend to act as if lightspeed is simply infinite. In this case, we actually have to think about it to get what we want.
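(To put numbers on "human size": with an assumed 10 ps exposure and a scene a nanosecond away, the shell sits tens of centimeters out and is only millimeters thick; a rough sketch, not figures from the article:)

    c = 2.998e8                # m/s
    exposure = 10e-12          # assumed 10 ps "shutter" time
    delay = 1e-9               # assumed light travel time to the scene

    shell_radius = c * delay          # ~0.3 m from the camera
    shell_thickness = c * exposure    # ~3 mm thick
    print(f"radius ~{shell_radius:.2f} m, "
          f"thickness ~{shell_thickness * 1000:.0f} mm")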
A similar effect can be seen in network equipment, for what it's worth; at the highest network speeds, we've now significantly passed the point where a given bit being transmitted occupies a human-sized fraction of a wire. If you could "snapshot" a 100Gb network cable, you could see bits on the wire. If I'm getting my math right, each bit is on the order of 2-3 millimeters, give or take the medium not being fully light-speed (a quick check is sketched after the footnotes below).
*: It may be more intuitive for a moment to instead imagine that the camera starts emitting the shell at the time it is turned on and stops when turned off, forward in time, as that creates a more intuitive initial image of what-seems-to-be-cause preceding what-seems-to-be-effect. Imagine the camera emitting light instead of collecting it. It may be easier to then imagine this shell going out farther and farther, getting larger as it goes (and the light inside dimming as it has to fill that volume with the same amount of energy it started with), indefinitely out into the universe**. But since the camera is receiving instead of transmitting photons, the reality is the same image, just with time reversed; as the time goes farther back, the shell is farther away from the camera and larger.
**: Technically it extends, in the direction the camera is pointing, all the way out in space back to the Big Bang or the CMB or something, but unless you're deliberately photographing space (and why you'd do that with this camera I have no idea), you don't have to worry about that.
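Back to the bits-on-the-wire aside: checking that math for a hypothetical single-lane 100 Gb/s stream at 2/3 c (real 100G links usually split across 4 lanes, which makes each lane's bits proportionally longer):

    c = 2.998e8               # m/s
    v = (2 / 3) * c           # assumed signal speed in the medium
    bitrate = 100e9           # bits per second, single serial lane
    bit_length = v / bitrate  # meters of cable occupied by one bit
    print(f"{bit_length * 1000:.0f} mm per bit")   # ~2 mm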
Good to know that's not the actual speed of light!
Anyway, judging by their patterns of motion, it looks like the photons are "crawling" rather than traveling at a fixed speed.
My intuition is that the camera described in this article would be subject to ITAR had it been developed in a Western nation. But I doubt China gives any fucks about that.