
Why was it so hard to take a picture of a black hole? - ColinWright
https://www.askamathematician.com/2019/04/q-why-was-it-so-hard-to-take-a-picture-of-a-black-hole-what-are-we-even-looking-at/
======
wizardforhire
Uhhhh this statement in the article about halfway down is incredible!

“You need to be able to tell the difference between one wave front and the
next, and if the next wave front is 1.3 mm behind and traveling at the speed
of light, then you need to reliably distinguishing between events 4
picoseconds apart (4 trillionths of a second). So, every telescope needs a
shiny new atomic clock and a really fast camera. You begin to get a sense of
why the data consolidation looks more like a cargo shipment than an email
attachment; trillions of snapshots every second of not just the waves you’re
looking for, but also the waves that will cancel out once all the data is
processed. Add to that the challenge of figuring out (generally after the
fact) where every detector is to within a fraction of a millimeter even as
their orientation changes and the Earth rotates, and you’ve got a problem.”

Just stupefyingly complex and amazing!

~~~
RedOrGreen
Calling it a "really fast camera" elides much of the actual difficulty. We're
trying to tag individual wavefronts of light at different telescopes, record
them, and then play them back at a central "correlator" with the appropriate
delays so that the waves come to a focus.

For a wavelength of 1.3 mm, we'd want the time tagging to be better than a
quarter of the wavelength at least - say 0.3 mm. The speed of light is 300
mm/ns (a foot per nanosecond is the shorthand beloved of circuit and chip
designers). So, for 0.3 mm, we're going to have to get down to a wavefront
tagging accuracy of 0.001 ns.
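
As a quick sanity check on those numbers (a minimal sketch, taking c ≈ 300 mm/ns as in the shorthand above):

```python
# Timing accuracy needed to tag 1.3 mm wavefronts to a quarter wavelength.
wavelength_mm = 1.3
c_mm_per_ns = 299.792458            # speed of light, ~300 mm per nanosecond

tolerance_mm = wavelength_mm / 4    # ~0.33 mm position tolerance
accuracy_ns = tolerance_mm / c_mm_per_ns

print(f"{accuracy_ns * 1000:.2f} ps")   # roughly 1 picosecond
```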

No clock is going to get there, but if we can get ~close enough, we can use a
procedure called fringe fitting to determine the clock corrections by looking
at the wavefronts. (Does it line up this way? How about this way? How about
now? Yes, it's as laborious as it sounds, but computers, eh.)

This is all in the calibration of data, before we do the Fourier inversion to
create images - the magic of radio interferometry is that we can record the
signal to disk while preserving phase. Optical photons can not be recorded and
played back with phase preserved - optical interferometry has to split up the
photon streams and send different parts to be correlated against streams from
other telescopes, so you run out of signal quickly. Meanwhile, we can record
radio waves at the 27 VLA dishes, say, and play them back for correlation on
all 27*26/2 = 351 baselines, no problem. That's why radio VLBI is a thing, but
not optical VLBI.
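
The baseline count is just the number of unordered pairs of dishes; a one-liner makes the combinatorics explicit (note that 27·26/2 works out to 351):

```python
from math import comb

def baselines(n_dishes: int) -> int:
    """Every unordered pair of dishes forms one baseline."""
    return comb(n_dishes, 2)    # n * (n - 1) / 2

print(baselines(27))    # 351 baselines for the 27 VLA dishes
```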

Even to a professional radio astronomer, the underlying physics is deep and
almost magical.

~~~
hinkley
Shannon says you have to record at twice the frequency of the signal to see
the signal. What makes you guys double that again?

~~~
Balgair
Because you'd also like to know the phase of the waves. If you get unlucky and
sample only the zero crossings of the waves (think of the 0 crossings of a
sine wave), then you've no idea what the phase is: the peak could come before
the 0 crossing and the minimum after, or it could be the opposite (a pi offset
in phase). Granted, this exact scenario is unlikely. Ideally, you want 4
pieces of data per wave, the 0 crossings and the max/min, for each wave. From
that you can pin down both the amplitude and the phase (again, luck is
involved).
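
A toy version of this, assuming an idealized noiseless sine sampled at exactly four points per cycle, shows how mixing against quadrature references recovers both amplitude and phase (a digital I/Q detector sketch, not how a real correlator works):

```python
import math

f = 1.0
fs = 4 * f                      # four samples per cycle (2x Nyquist)
true_amp, true_phase = 2.0, 0.7 # the unknowns we want to recover
n = 64

samples = [true_amp * math.sin(2 * math.pi * f * k / fs + true_phase)
           for k in range(n)]

# Mix against quadrature references and average: a digital I/Q detector.
i = 2 * sum(x * math.cos(2 * math.pi * f * k / fs)
            for k, x in enumerate(samples)) / n
q = 2 * sum(x * math.sin(2 * math.pi * f * k / fs)
            for k, x in enumerate(samples)) / n

amp = math.hypot(i, q)          # recovered amplitude: 2.0
phase = math.atan2(i, q)        # recovered phase: 0.7
```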

~~~
apcragg
Can you not make a coherent quadrature detector? I was under the impression
that those were pretty common in optical communication systems. Edit: never
mind, it's 230 GHz, so millimeter wave. But mixers and oscillators exist at
that frequency; are the noise figures just too high to use?

------
enriquto
Bouman's 2014 PhD thesis is refreshingly readable:

[https://people.csail.mit.edu/klbouman/pw/papers_and_presenta...](https://people.csail.mit.edu/klbouman/pw/papers_and_presentations/thesis.pdf)

Anyone can understand at least the introduction. The author treats two
related problems. The first is the (now) well-known integration of data from
several separate antennae. The second is the problem of recovering an image of
an object from the light reflected off a white wall. They are formally quite
close, and the underlying math is, at many points, the same.

~~~
joshvm
Previous HN discussion about seeing round corners:
[https://news.ycombinator.com/item?id=17902177](https://news.ycombinator.com/item?id=17902177)

------
almostarockstar
> Once we set up an array of space telescopes throughout cislunar space (the
> volume inside the Moon’s orbit) we’ll get pictures of the SMBs in the cores
> of every nearby galaxy and that’s when the science really gets started.

This gave me goosebumps. That's the kind of stuff that justifies a permanently
inhabited moon-base.

~~~
posix_me_less
How? It justifies telescopes on Moon, not people on Moon.

~~~
vl
It justifies telescopes in a really high orbit, but not on the Moon: unless
the telescope is manufactured on the Moon, the logistics of flying a large
telescope there and safely landing it are so complex that it's far easier to
leave it in Earth orbit and get the same results from there.

~~~
hughes
Stationkeeping within a fraction of a millimeter is much easier when your
instrument is on solid ground. It might simplify the platform to the point
where descent into the lunar gravity well is worthwhile.

~~~
Armisael16
Building on the surface destroys the entire benefit of the project. The Moon
isn't large enough to give you the effective aperture size you need; if it
were, you'd be fine on Earth.

~~~
hughes
The point would be to combine this with the earth-based telescopes to create a
400,000km aperture.

------
PopePompus
If you took the straw from a Big Gulp (1/4 inch in diameter) and made the
straw 20,000 miles long, then looked at the sky through that straw, the patch
of sky you'd see would be the size of the shadow of the M87 black hole.

~~~
robin_reala
Which is 6.5mm / 32,000km approximately.
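
Those numbers line up nicely with the black hole shadow's published angular size of roughly 42 microarcseconds; a quick back-of-the-envelope conversion:

```python
import math

# Angular size of a 1/4-inch straw opening seen from 20,000 miles away.
straw_diameter_m = 6.35e-3            # 1/4 inch in metres
straw_length_m = 20000 * 1609.34      # 20,000 miles in metres

angle_rad = straw_diameter_m / straw_length_m
angle_uas = math.degrees(angle_rad) * 3600 * 1e6   # microarcseconds

print(f"{angle_uas:.0f} microarcseconds")   # ~41, close to M87's ~42 uas shadow
```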

------
AdmiralAsshat
The _real_ story is that my childhood was filled with lies, and every "photo"
of a black hole I saw in grade-school science books was really an artist's
rendering.

~~~
dennisgorelik
That story is still true today: we only have artist renderings of black
holes and do NOT have real pictures.

The data manipulations that this EHT team did to process their raw data are
NOT acceptable from the perspective of a correct scientific experiment.

They got their images only when they allowed themselves to creatively
interpret the data from their telescopes.

~~~~~~~

[https://youtu.be/UGL_OL3OrCE?t=1177](https://youtu.be/UGL_OL3OrCE?t=1177)

19:37

What you can do is to use methods where you [have] do not need any calibration
whatsoever and you can still can get pretty good results.

So here on the bottom at the top is the truth image, and this is simulated
data, as we are increasing the amount of amplitude error and you can see here
... it's hard to see ... but it breaks down once you add too much gain here.
But if we use just closure quantities - we are invariant to that. So that
really, actually, been a really huge step for the project, because we had such
bad gains.

~~~~~~~

They also deleted multiple critical comments from that video presentation.

E.g. "Pratik Maitra" posted multiple comments that later disappeared.

~~~
ims
Do you think the fact that the CT scanner at your local hospital needs to be
calibrated, and its output computationally reconstructed from X-ray
intensities, means it does not result in an "image"?

When we use side-scan sonar to create representations of the ocean floor (e.g.
[https://commons.wikimedia.org/wiki/File:Laevavrakk_"Aid".png](https://commons.wikimedia.org/wiki/File:Laevavrakk_"Aid".png)),
they are computationally reconstructed from the raw data which are not
intrinsically recognized as pixels without reconstruction. Are these not
"images"?

What is your actual contention here? Is it that any representation which is
not the result of a traditional visible-light camera doesn't count as an
"image"?

If so it's an irrelevant distinction to make. If not, you need to articulate
in a specific and informed way why the _way_ they reconstructed the image was
wrong or could be improved.

It seems from your blog that you don't really understand what a "prior" is and
why it might be useful for this kind of signal processing.

~~~
dennisgorelik
> CT scanner at your local hospital needs to be calibrated

Of course the scanner (and any other measurement tool) needs to be calibrated.
Specifically, the scanner (and the telescope) needs to be pre-calibrated
against already-known samples.

In the case of a telescope, that means precalibrating against known images of
remote stars.

Katie Bouman (the face of EHT imaging team), however, claims: "you [have] do
not need any calibration whatsoever and you can still can get pretty good
results"

Check it out, she actually said that:
[https://youtu.be/UGL_OL3OrCE?t=1180](https://youtu.be/UGL_OL3OrCE?t=1180)

I am surprised that only a few people caught that flaw.

------
oevi
A very good explanation on what we are looking at:
[https://www.youtube.com/watch?v=zUyH3XhpLTo](https://www.youtube.com/watch?v=zUyH3XhpLTo)

------
benj111
So it's kind of touched on in the lensing part, but gravity acts on light.

So the black hole is slowing down the light travelling towards us.... But the
speed of light is a constant so... 'time' is slowed down?

So does that mean the light is 'older' than the 54 million years it took
to reach us?

I'm over 50% sure there's a flaw in my reasoning somewhere here...

~~~
kl4m
The light is redshifted when it "climbs out" of a strong gravity field but
still moves through space at a constant speed.
[https://en.m.wikipedia.org/wiki/Gravitational_redshift](https://en.m.wikipedia.org/wiki/Gravitational_redshift)

~~~
benj111
Ok that makes sense.

2nd stupid question: I thought gravity acting on light was a 'light as
particle' thing, rather than a 'light as wave' thing. If that were the case,
gravity acting on light-as-particle manifesting in light-as-wave doesn't seem
consistent.

~~~
Iv
A photon does not switch from particle behavior to wave behavior. Physicists
do. These are two different models for the same thing.

In some ballistics problems, you'll consider a flat earth (baseball problems)
in some others a spherical one (spaceflight). It is about which model makes
the math simpler.

~~~
SiempreViernes
I'd say physicists are pretty constant in their coffee seeking behaviour (the
dominant operation mode).

------
lisper
Here is Katie Bouman, one of the EHT team members, explaining it in
excruciating detail:

[https://www.youtube.com/watch?v=UGL_OL3OrCE](https://www.youtube.com/watch?v=UGL_OL3OrCE)

And an interesting side note:

[https://www.vox.com/science-and-
health/2019/4/16/18311194/bl...](https://www.vox.com/science-and-
health/2019/4/16/18311194/black-hole-katie-bouman-trolls)

------
lordnacho
So if it's a matter of increasing the effective aperture, does that mean we
can launch some spaceships and do the same trick with an arbitrarily large
interferometer?

~~~
joshvm
Yes, in theory. The challenges are likely to be dish size and getting accurate
enough positioning. Not to mention the downlink capacity required (and also
onboard storage).

~~~
SiempreViernes
What I know of space VLBI is that it exists, and in the one first-hand account
I've heard, it didn't add extra detail.

Do you think star tracking doesn't give good enough pointing solutions?

~~~
joshvm
It's not just pointing (2D) but also getting the baseline between telescopes.
This is also being investigated for the LISA system
[https://en.m.wikipedia.org/wiki/Laser_Interferometer_Space_A...](https://en.m.wikipedia.org/wiki/Laser_Interferometer_Space_Antenna)

------
frogpelt
He makes an interesting statement: "Science advances when we’re wrong or
surprised or both."

~~~
dredmorbius
Theoretical advances require failures of existing worldviews.

They either don't make accurate predictions or explanations, or they fail to
account for phenomena.

Science is the internalisation of expecting, admitting, and embracing error.

------
nailer
If you'd like to skip the very lengthy introduction, search for "The reason
this hasn’t be done before".

------
dclowd9901
I’ll admit: I wasn’t a person who was wowed by the picture of a black hole.
For all intents and purposes, it isn’t a very good picture.

Reading this write-up gave me a much better appreciation for the difficulty of
actually capturing that image, which is, I'm sure, what people wanted me to
focus on when seeing the photo. But that sort of context requires a write-up
like this, and can't be conveyed by a small, blurry picture.

------
perfmode
Does anyone have a link to the original, full-size image of the black hole? I
can only find 800x600 versions lying around. I want to know how large the
original is. =)

~~~
swebs
[https://www.nsf.gov/news/mmg/media/images/A-Consensus.jpg](https://www.nsf.gov/news/mmg/media/images/A-Consensus.jpg)

------
mises
Honestly curious: why did Bouman get the credit she did? According to Github,
she didn't commit nearly as much code, 2.5k LOC vs 850k lines. I understand
she might have had a greater role in management etc.; my point is more that it
was a team effort (which she reiterated). Is this a case of the media trying
to "celebrate diversity" by unfairly cutting out the whole team which
accomplished something incredible? Or is she just the public face and
responsible for their media presence?

~~~
eloisant
Because that's how the world works, project leaders get fame and line workers
don't.

Somehow people are OK when Elon Musk gets credit for Tesla and SpaceX, or
Steve Jobs for the iPhone, but suddenly if it's a woman people will dig
through the GitHub accounts.

~~~
MarkMc
I don't think the Elon Musk or Steve Jobs analogy is accurate. Katie Bouman
was co-leader of one of four imaging teams on the project. It seems to me the
other person who co-led that team deserves equal praise and fame, and perhaps
also the leaders of the other 3 teams.

Having said that, the look of excitement and joy on Dr Bouman's face in that
photo is so lovely and relatable that the pic was always destined to go viral.
So in that sense you could say she was bound to become the human face of the
project.

------
tardq
Well, the thing about a black hole - its main distinguishing feature - is it's
black. And the thing about space, the colour of space, your basic space
colour, is black. So how are you supposed to see them?

~~~
dclowd9901
It seems like the main issue here was that it was very far away and
comparatively small to other intergalactic features like, say, galaxies
themselves.

~~~
mamon
Forgive my ignorance, but black holes are quite common in the Universe, right?
Why couldn't we simply take a photo of one that's closer to us?

~~~
hombre_fatal
One reason is that the closer black holes are so much less massive that they
were no easier to image.

Just compare it, Messier 87, to ours, Sagittarius A*:
[https://en.wikipedia.org/wiki/List_of_most_massive_black_hol...](https://en.wikipedia.org/wiki/List_of_most_massive_black_holes)
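
A rough back-of-the-envelope comparison (assuming published masses and distances, and the shadow-diameter approximation 2·√27·GM/c²) suggests the two shadows have comparable apparent sizes, which is why a closer hole was no easier a target:

```python
import math

G = 6.674e-11                   # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8                     # speed of light, m/s
M_SUN = 1.989e30                # solar mass, kg
PC = 3.086e16                   # metres per parsec
RAD_TO_UAS = math.degrees(1) * 3600 * 1e6   # radians -> microarcseconds

def shadow_uas(mass_suns: float, dist_pc: float) -> float:
    """Approximate shadow angular diameter, ~ 2*sqrt(27)*GM/c^2 across."""
    diameter_m = 2 * math.sqrt(27) * G * mass_suns * M_SUN / C**2
    return diameter_m / (dist_pc * PC) * RAD_TO_UAS

m87 = shadow_uas(6.5e9, 16.8e6)     # M87*: huge but ~53 Mly away, ~40 uas
sgr_a = shadow_uas(4.15e6, 8.2e3)   # Sgr A*: close but tiny, ~50 uas
```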

------
dennisgorelik
"The new black hole picture isn’t really a discovery, but it is a stunning
accomplishment."

That "picture" is a stunning accomplishment in deception -- following the
steps of Elizabeth Holmes and Bernie Madoff.

This EHT team took white noise from their telescopes, then creatively
converted that white noise into one of the theoretical pictures of a black
hole.

==============

[https://youtu.be/UGL_OL3OrCE?t=2242](https://youtu.be/UGL_OL3OrCE?t=2242)

37:22

And you can notice like at the bottom we get really terrible reconstruction,
just cause if it fits the data very well, because you know it maybe wants to
smooth out the flux as much as possible and we don't select things like that
in the true data.

==============

They simply discard image reconstructions because they do not fit the
theoretical image that they want to see. How convenient. They call it
"Calibration Free Imaging":

[https://youtu.be/UGL_OL3OrCE?t=1179](https://youtu.be/UGL_OL3OrCE?t=1179)

