At present there are two LIGO facilities - one in Hanford, Washington and another in Livingston, Louisiana. This is necessary both for denoising (it's unlikely that a seismic event or random perturbation would affect both simultaneously) and for triangulation via parallax.
Right now having just two facilities (that are relatively close together) limits localization to broad regions of the sky. Additional facilities are under way/in discussion for Europe, Japan, and India. This would significantly improve both the sensitivity of the array and its ability to localize events in a smaller region of the sky. Hopefully these projects get funded. LIGO stands to resolve some of the biggest open questions we have in cosmology.
A precursor concept of a zero-drag satellite can be seen in Stargate SG1: "The Serpent's Venom", where an armed mine from an orbital minefield is brought aboard a shuttle, and the pilot must match the mine's trajectory changes.
Also, your comment would be much nicer without the first sentence. And if you think about it, umbrellas are pretty neat. The folding mechanism is nifty, if you ask me.
I have a similar reaction to Egan's Permutation City when the couple living in the simulation adjust the clock rate so the computer running them only executes one clock tick every second, every day, every year, every million years, every billion years, every trillion years, etc. Puts me in a contemplative mood.
The assumption here is that no force has acted on the energy arriving enough to distort two arrays on earth.
No physicist here, but you don't seem to be one either. IIRC from what was said about the first event, gravitational waves pass through matter without being disturbed the way EM waves are; that's why they supposedly open new doors. Secondly, just take it at face value: if they say so, it is so. It might seem improbable and "horribly prone to manipulation", but they probably thought of that and have machines precise enough to compensate for whatever effects you or I imagined, e.g. "LIGO is designed to detect a change in distance between its mirrors 1/10,000th the width of a proton". It's like your grandma, never having touched a computer, expressing opinions on the next release of some developer tool.
Also they travel at the speed of light, the rotation of the earth is pretty much constant (if not negligible).
The Virgo detector, outside of Pisa, is in the final stages of installing its advanced instrumentation. With LIGO+Virgo a source like the first detection, from September (a very loud event), could be localized to a patch of sky about 10 square degrees in area.
Wow, that's much bigger than I expected. So basically we can only determine the general direction?
That's with current-generation detectors installed at three stations, which is all we can hope to achieve at the current level of funding in the very near future.
More money, more detections at lower energies, better idea where they come from. There are plenty of instruments in cosmology, astronomy, and physics that have already been designed and planned out but do not have the funding to be built; wait 20 years and five percent of them might come to fruition. You could pour trillions of dollars into these problems without running out of novel questions that we've already proposed ways of answering, and novel results from exploratory instruments we've already proposed building. We spend about 30 billion a year on basic science research according to the NSF, spread over all fields. For comparison, the military gets upwards of 600 billion.
We have good ideas about how to make gravitational wave detectors much, much better, but not how to make them much, much cheaper.
> So basically we can only determine the general direction?
10 square degrees is about 0.02% of the full spherical sky (which is roughly 41,000 square degrees).
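For anyone who wants to check the arithmetic, here's a quick sketch; nothing here beyond the steradian-to-square-degree conversion:

```python
import math

# The full sphere is 4*pi steradians; one steradian is (180/pi)^2 square degrees.
full_sky_sq_deg = 4 * math.pi * (180 / math.pi) ** 2   # ~41,253 deg^2
patch = 10.0                                           # the LIGO+Virgo localization area

print(f"full sky: {full_sky_sq_deg:.0f} deg^2")
print(f"fraction: {patch / full_sky_sq_deg:.4%}")
```

A 10 deg^2 patch works out to roughly 0.02% of the sphere: tiny as a fraction, but still about fifty times the area of the full Moon.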
I used the word "parallax" poorly to refer to the offset between the detectors - localization is most definitely not done via parallax. It's arrival time + some signal processing, so with two detectors, resolution is limited.
Additional details and an image depicting the improvements made by adding a third, properly positioned detector are available at: https://www.ligo.caltech.edu/news/ligo20160404
1) That $1.1 billion was for two facilities
2) LIGO underwent a huge upgrade over the last decade, making the instruments tremendously more sensitive, and the cost of this upgrade is included in there as well.
So I suspect that building a third facility with the sensitivity of the two existing facilities would be a much more affordable endeavor. The first time around is always the most expensive.
That's an impressive swimming pool ;-)
I don't think that's technically true. You could build a bunch of catapults on the edge of the sphere, and when they all launch rocks at the center of the sphere, they would eventually form a black hole. The catapults could be arbitrarily far from each other as the radius of the sphere increases, such that they would not really do much to each other gravitationally. You'd just have to wait a real long time for the rocks to hit the middle.
> Building black holes with anything other than stars or giant gas clouds (in the early universe) turns out to be hard;
Well yes, galaxy-sized intelligently designed structures don't really happen.
I take your point however, that you could coordinate in some way separated masses, but at some point you'd probably run into issues with the aforementioned galactic-scale of engineering.
I found it extremely amazing as well! :p
It's used in normal FM radio.
EDIT: And there would probably have to be some control of the relative phase of each component wave as well.
(Disclaimer: Not an actual physicist.)
If a black hole is always colder than the relic and cosmological horizon radiation even as we approach de Sitter vacuum, then it won't evaporate. No evaporation, no missing information -- it's just somewhere else in spacetime.
If a black hole only mostly evaporates, the remnant holds information, and so no information loss paradox. Just a problem about how to assess the entropy.
If a black hole generically isn't no hair (i.e., if no hair doesn't hold up at all, or only holds up in cases like exactly spherical, stationary black holes in true vacuum) then the hair holds information, and so no information loss paradox. Just a problem about the nature of the hair. Hawking, Perry and Strominger are leaning towards "soft hair" as extremely long wavelength particles (wavelengths on the order of the Hubble diameter), and while there are lots and lots of as-yet-unanswered questions in that approach, it is not ridiculous.
There are other possibilities too. This is an issue which has had forty years of study.
The problem is that when you take a _model_ black hole deliberately arranged so that it has the greatest chance of being wholly determined by the eleven free parameters of no hair, and put it in dS electrovac and run that to the far future, so it also has the greatest chance of evaporating, there are a couple of problems.
Firstly, fully classically, we cannot know (because of no hair) if we grew the black hole by throwing in one shell of matter or two shells of matter of half the mass each, or ten shells of matter whose differing masses sum up to the same as the previous two cases. If we don't evaporate, we don't care; if we have a remnant, we don't care; if we fully evaporate, we have lost the information about the number of shells and their individual masses. Worse, when we throw in the shells we have the Hawking radiation temperature rise from T_start to T_end. If we arrange it so that an observer can measure the Hawking radiation as we throw in each shell, that observer will see T_start, T_start+1, ... T_end-1, T_end, so that observer has information that is otherwise lost. So, hair?
Secondly, we can make this worse by turning classical matter shells into individual quantum particles. I need to simplify here. If we throw in two alphas and a handful of electrons, and have no hair, an outside observer will not be able to know if you threw those in, or threw in two 4-He atoms, or 8 2-H atoms, and so forth. The no hair black hole evaporates into photons, and thus you have a version of the Knapsack Problem, with the unfortunate side effect that by the AMPS argument, you must have violated at least one of unitarity for at least some observers, the equivalence principle (in particular the "no drama" condition at the horizon of a sufficiently massive black hole that spacetime curvature is effectively flat at the horizon), the holographic principle (in particular, AdS/CFT's view of it), or semiclassical gravity as an effective (in the Wilson sense) theory of gravitation outside the horizon. People argue about which the worst thing to give up would be, and many qc-gr papers have been written in the past couple of years where one or more is gleefully abandoned, with the consequences followed to logical extremes.
The Hawking-Perry-Strominger approach only deals with the classical problem, hoping to extend that into the quantum realm.
There are several other ideas theirs competes with, many of which start with some quantum view hoping to extend that to realistically massive black holes.
In any event, resolution has proven to be non-trivial.
However, the tl;dr version is that yes, adding in some hair MAY solve the information loss paradox, but there are other approaches, and no good way to choose among them.
Real black holes might indeed be very hairy at all times. They certainly don't live in de Sitter vacuum (or even dS electrovac) today or any time soon, and they even less sit in AdS (and our universe's quantum fields are not conformally invariant). However, these still seem like acceptable conditions for _models_ of black holes where one is probing the nature of the mathematical theories that describe them in detail, and in particular General Relativity.
The analogy I've come up with to illustrate my point is the 'thermodynamic information paradox':
An isolated system will tend towards a stationary equilibrium state, uniquely described by just a few macroscopic parameters. This is our version of the no-hair conjecture.
Now, instead of a completely isolated system, we allow interaction via absorption and emission of radiation. We assume that no matter the incoming radiation, outgoing radiation will obey totally probabilistic thermal laws, as there are no hairs. Now, if we were to radiate away all energy (eventually reaching zero temperature), information would have been irretrievably lost.
However, this is clearly nonsense as it tries to apply conclusions drawn under idealized conditions to non-ideal situations: Real thermodynamic systems fluctuate and have hairs. In fact, outgoing thermal radiation necessarily disrupts equilibrium, so the whole question is ill-posed.
Hair or not there is _enormous_ entropy in black holes that have not yet evaporated. Since we are fairly sure that no physical black holes in our universe have evaporated or are likely to evaporate in the next 10^65 years (for the smallest extremely isolated unmerged stellar black holes), and black holes deep inside galaxies will be around much much longer, it is perfectly reasonable to ignore the outgoing radiation and say that black holes are _effectively_ bald until our eventually much bigger Hubble volume is much closer to de Sitter vacuum. Indeed, mergers of black holes in nature are expected to undergo rapid "balding" as their complicated dynamical modes decay. So I can't agree with your last paragraph.
It's relevant to consider the origin of the Hawking radiation. Differently accelerated observers looking at a region of a QM field that's in thermal equilibrium will disagree on particle count. The formation of a black hole horizon creates a dynamical spacetime in which there is an acceleration between past observers (before the horizon forms) and future observers; the future observer sees particles within a few Schwarzschild radii of the black hole that the past observer does not. We usually consider a photon field for simplicity, but the particle count difference applies to all matter fields generically.
These particles are effectively created by fossil curvature which classically puts them on complex geodesics that go to infinity. In departing the region near the black hole their contribution to the local energy-density drops, which dynamically shrinks the horizon area. That in turn reveals previously hidden fossil curvature (which runs all the way to the singularity), which again is equivalent to an acceleration between past observers (before the "shrink") and future ones (after the "shrink"), with the latter seeing particles that the previous ones do not. Voilà: evaporation, provided that nothing replaces at least the energy density of the escaping Hawking radiation particles (since that would increase the area of the horizon, hiding behind it some of the previously observable-outside-the-horizon fossil curvature and thus some of the dynamic production of particles).
The quantum picture is slightly different because the particles are extremely low energy, with wavelengths proportional to the curvature radius at the horizon. "Soft hair" makes sense when one realizes that the curvature at the horizon of supermassive black holes can be arbitrarily flat (i.e., you can in principle have particles with wavelengths as long as or longer than the Hubble length). That's a very different picture from fully classical pointlike particles. However it's not clear that that's sufficient to dent the enormous difference between the number of microstates inside the horizon and the eleven variables of the macrostate of a no-hair black hole.
Additionally, even if it did wipe out much of the black hole's entropy, there does not yet seem to be a way to unitarily evolve from pre-horizon state to fully evaporated state; the details of what exactly was in the black hole as the horizon formed and what drifted in later are not obviously encoded in "soft hair", and the assumption is that the encoding would involve strong gravity somehow (and it must as the horizon area gets small).
So even taking your idea of adding a bunch of additional macroscopic parameters to no hair's eleven (at a particular time coordinate), you still have a non-negligible fraction of a galaxy worth of mass inside the horizon of an SMBH that fits comfortably within Jupiter's orbit, and absent a quantum gravity theory you'll have little idea about how to count all the compatible microstates. The information loss you are blithely accepting in your second last paragraph is difficult to conceive: for a spherically symmetric black hole of merely one solar mass the horizon area is about 10^77 in Planck areas, which gives us the holographic entropy bound. 10^77 is greater than the value of S for all the gas, dust and noncompact stars in the Milky Way.
(Finally, I want to note that there are of course many approaches to the post-AMPS black hole information problem very different from Hawking et al.'s "soft hair"; however super- and ultramassive Schwarzschild black holes in dS vacuum are good model probes of each of these possible theories).
P.S. How do you know all this? Are you a physicist?
IANA physicist, and have 0 training in physics, so someone please tell me if I'm wrong
Astronomers are looking for photon signatures simultaneous with the collision. With two LIGO detectors we can determine the collision time, but only localize the source to a long arc on the sky. A third detector will give more location precision. Several more are under construction.
Does the gravitational wave contain 0.9 solar masses of energy?
Or in less sensical units, the energy from detonating 1% of our galaxy's mass worth of TNT.
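A rough check of that comparison. The TNT energy density and the Milky Way's total mass (stars plus dark matter, ~1.5e12 solar masses) are my assumed figures, not from the article:

```python
M_SUN = 1.989e30       # kg
C = 2.998e8            # m/s
TNT_PER_KG = 4.184e6   # J/kg (the "ton of TNT" is defined as 4.184e9 J)

e_gw = 0.9 * M_SUN * C**2      # ~1.6e47 J radiated as gravitational waves
m_tnt = e_gw / TNT_PER_KG      # equivalent mass of TNT, ~3.8e40 kg
galaxy_mass = 1.5e12 * M_SUN   # assumed total Milky Way mass

print(f"{m_tnt / galaxy_mass:.1%} of the galaxy's mass in TNT")
```

This comes out around 1%, so the ballpark above holds up, assuming you count the dark matter as part of the galaxy's mass.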
"Which of the following would be brighter, in terms of the amount of energy delivered to your retina:
1. A supernova, seen from as far away as the Sun is from the Earth, or
2. The detonation of a hydrogen bomb pressed against your eyeball?
Answer: the supernova, by nine orders of magnitude."
Just as a nuclear bomb converts a small amount of mass to radiant energy (grams?), a merger can convert mass into other forms of energy, too.
Gravitational waves, of which there are probably many passing through us all the time from many sources, carry a lot of power. The only reason we don't see them every day is that spacetime itself is extremely stiff.
Far away in linear approximation (which is what we saw here), the lattice picture is quite appropriate as a first order analogy.
As energy radiates outwards, it becomes diluted on a sphere that increases its surface area as r^2. Thus the local energy density goes with inverse square law in the distance to the observer. But we don't measure the local energy density of gravitational waves, but the local amplitude. And in wave mechanics energy is the square of the amplitude. So the amplitude goes down with 1/r.
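A tiny sketch of that amplitude-vs-energy point (units are arbitrary; the only physics assumed is flux proportional to amplitude squared):

```python
import math

def flux(r):
    """Energy per unit area at distance r: dilutes as 1/r^2 over a sphere."""
    return 1.0 / r**2

def amplitude(r):
    """Strain amplitude, which is what LIGO measures: sqrt(flux) ~ 1/r."""
    return math.sqrt(flux(r))

# Doubling the distance quarters the energy flux but only halves the amplitude.
print(amplitude(1.0), amplitude(2.0), flux(2.0))   # 1.0 0.5 0.25
```

This 1/r (rather than 1/r^2) falloff is one reason gravitational-wave detectors gain reach so quickly: doubling the strain sensitivity doubles the distance you can see, which is 8x the volume.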
Needless to say, you would not survive if 1 solar mass of energy were released in any other form (even neutrinos). The key feature is that that tremendous amount of energy results in only a minuscule deformation of spacetime, i.e., spacetime is extremely stiff.
By the way, the distances are way smaller than merely subatomic. The strain sensitivity is 10^-21, which for LIGO's 4 km arms is a shift of just a few times 10^-8 the width of an atom.
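Strain times arm length gives the displacement directly; the atom and proton diameters below are my round-number assumptions:

```python
strain = 1e-21
arm_length = 4_000.0   # m; each LIGO arm is 4 km
atom = 1e-10           # m, typical atomic diameter (assumed round number)
proton = 1.7e-15       # m, approximate proton diameter (assumed)

dL = strain * arm_length   # ~4e-18 m
print(dL / atom)           # a few times 1e-8 atom widths
print(dL / proton)         # a few thousandths of a proton width
```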
But it is close!
A supernova releases "a few times 10^45 J of neutrino energy", so let's say 5. 5e45 J is about 6e28 kg, while a solar mass is 2e30 kg. And neutrino radiation from a supernova would become fatal closer than about 2.3 AU. So we have a factor of ~30 from the masses and a factor of ~5 (1^2 vs. 2.3^2 AU) from the distance.
So about 1/150 of a solar mass released as neutrino radiation would be survivable at a distance of 1 AU.
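Running the same numbers without the rounding (the inputs are the assumptions above: 5e45 J of neutrino energy, fatal inside 2.3 AU, inverse-square dose scaling):

```python
C = 2.998e8
M_SUN = 2e30

m_nu = 5e45 / C**2          # mass-equivalent of the neutrino burst, ~5.6e28 kg
mass_factor = M_SUN / m_nu  # ~36 (rounded to ~30 above)
distance_factor = 2.3**2    # ~5.3 (inverse-square, 1 AU vs 2.3 AU)

survivable = 1 / (mass_factor * distance_factor)
print(survivable)           # ~0.005 of a solar mass
```

With unrounded factors it's closer to 1/190 of a solar mass; the rounded factors give 1/150, same ballpark either way.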
The formula for gravitational potential energy near a surface is mgh (mass times gravitational acceleration times height), and g is just GM/r^2, so the mutual potential energy of two masses M and m separated by r is GMm/r. (That single term belongs to the pair; counting it once per body would double-count it.)
Also, the Schwarzschild radius of a black hole is about 3 km per solar mass (2e30 kg).
Which means that before their event horizons touch, the two black holes (14.2 and 7.5 solar masses) should be separated by a distance of at least (14.2 + 7.5) * 3 km ≈ 65.1 km.
So, G * 14.2 * 7.5 * (2e30 kg)^2 / 65.1 km is about 4.4e47 J...
divide by c^2:
4.9e30 kg, or about 2.4 solar masses
So the system had roughly 2.4 solar masses of gravitational potential energy in it, some of which was radiated away as gravitational waves.
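The same estimate, done numerically. Assumptions: Newtonian point masses, the pair's mutual potential energy GMm/r counted once (not once per body, which would double it), and a separation equal to the sum of the Schwarzschild radii:

```python
G = 6.674e-11
C = 2.998e8
M_SUN = 2e30

m1, m2 = 14.2 * M_SUN, 7.5 * M_SUN
r = 2 * G * (m1 + m2) / C**2   # sum of the Schwarzschild radii, ~64 km
U = G * m1 * m2 / r            # mutual Newtonian potential energy

print(r / 1e3)                 # separation in km
print(U / C**2 / M_SUN)        # ~2.4 solar masses of potential energy
```

Since the system radiated about one solar mass, roughly 2.4 solar masses of Newtonian potential energy at horizon contact is at least consistent in order of magnitude, though a proper accounting needs full GR, not this Newtonian sketch.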
As more detectors come online they will be able to triangulate more accurately. The next one is supposed to come online in 2018 iirc.
"The data is in fact completely open and you could analyse it yourself! In addition to the GW150914 event there are also two others that rise somewhat above the background ("GW151012" and "GW151226"). You can see them by eye in the above plot. They are clearly not statistically significant enough to announce a discovery alone, but still they are tantalising... with room for improvement to design sensitivity (by a factor of ~2 which increases the spatial reach by 2^3) and the construction of a third detector in India to triangulate the signal, the future of gravitational wave astronomy is exciting."
"Two matched-filter searches used coincident observations between the two LIGO detectors from September 12, 2015 to January 19, 2016 to estimate the significance of GW151226. One of these searches was the off-line version of the online search discussed previously. The off-line searches benefit from improved calibration and refined data quality information not available to online searches."
'The garden of forking paths: Why multiple comparisons can be a problem, even when there is no "fishing expedition" or "p-hacking" and the research hypothesis was posited ahead of time'
Black hole collisions were expected to be a fairly common occurrence, but we had no way of knowing for certain how frequent they are when only one had been detected. Now that another one has been detected a fairly short time afterwards, it supports the prediction that we should observe several of them every year.
If black hole collisions were very rare events, it would be harder to use them as a natural laboratory. It's much easier to demonstrate that some small deviation from theory is real when you have multiple observations of it.
Therefore, this second observation in a short time confirms that gravitational waves are likely to be a useful tool to explore fundamental physical principles.
No, it just means there's a high probability that these events occur on a regular basis. Of course, "regular basis" really means we might be able to approach Gaussian statistics within the lifetime of the experiment.
The more you read about how this particular instrument is constructed, the more incredulous you'll be that it could possibly work at all, or if it does, that it could pick up anything but noise. So it's very good news that these events are relatively common. (Well, good for the research teams at the LIGO installations, but bad for anyone within a few hundred light years of the events being detected.)
PS: Watch this brilliant video if you want to know what gravitational waves are.
Remember, this was imagined by some dude 100 years ago. That's nucking futs.
What are the implications of this for black hole entropy and temperature? Can black holes evaporate from gravitational radiation alone, without Hawking radiation?
What are the implications of this for the mass of any object in the universe, since all objects are related to each other gravitationally and the universe is expanding? Does it mean objects are constantly losing mass and the universe is filled with energy from this release? Can dark energy be related to this process? Can the universe be expanding because matter is constantly lost to gravitational waves?
What are the implications of gravitational waves on the fabric of spacetime, if objects are constantly leaking gravitational waves in a nonstatic universe?
At cosmological scales the "objects" are the various matter fields (in the most general sense, everything that is neither the cosmological constant nor the gravitational field itself) and we concern ourselves with energy-density, which drops with the metric expansion of space.
Sean Carroll goes into some detail here:
Back to GWs: these change the peculiar motions of matter field content, at cosmological scales giving tiny nudges to galaxy clusters compared to what is imparted by Dark Energy. So even though we lose the conservation of energy, we get back the conservation of energy-momentum.
A dumbbell-shaped mass is the sum of a gravitational monopole and a gravitational quadrupole. That gravitational radiation originates only from quadrupolar sources (which are far less-efficient radiators than an equal-sized dipole) is one reason that gravitational waves are hard to detect.
No, gravitational waves are only generated when you have large masses accelerating. When two black holes merge, they become a single spinning black hole which very quickly settles into a smooth, axisymmetric shape. Once the mass is axisymmetric, its rotation no longer generates gravitational waves.
I have yet to study cosmology (I did some research in astroseismology), so I can't answer general cosmology questions.
There are also Jupyter tutorials on processing GW signals at https://losc.ligo.org/tutorials/
A semi-hand-wavy way we know that the objects in this particular inspiral are black holes is that for them to be orbiting each other this fast, they must be very close. For them to be this close, they must be very small. Together, this implies the objects have a certain size which is smaller than their Schwarzschild radius, which means they are probably black holes.
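Here's a hedged sketch of that argument with this event's numbers. Assumptions: total mass ~21.7 solar masses, a gravitational-wave frequency of ~200 Hz near merger (so an orbital frequency of ~100 Hz), and plain Keplerian orbits, which is only a rough approximation this close to merger:

```python
import math

G = 6.674e-11
C = 2.998e8
M_SUN = 2e30

M = 21.7 * M_SUN             # total mass of the binary
omega = 2 * math.pi * 100.0  # orbital angular frequency (rad/s), assumed

a = (G * M / omega**2) ** (1 / 3)   # Keplerian separation
r_s = 2 * G * M / C**2              # Schwarzschild radius of the total mass

print(a / 1e3)    # separation in km, ~200 km
print(a / r_s)    # only a few horizon radii apart
```

Ordinary stars with these masses would be hundreds of thousands of kilometers across and would have merged long before reaching a ~200 km separation; only objects of roughly Schwarzschild size can orbit this tightly.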
To give perspective on where I was coming from: from the (human) perspective of looking at noise in 1/sqrt(Hz) units, the signals are actually less-than-averaged if they're less than a second in duration.
It's not like a continuous-wave source where one can integrate for months and hammer down the uncertainty to far lower strain values.
Yet the article makes no mention of this whatsoever.
The waves on the ocean are also turbulent distortions by the wind which pack together as waves (a group of 14 waves with the 7th as the highest and all waves after the 7th barely noticeable).
It seems that the 7th spike measured at LIGO is also the highest. Maybe the grouping of turbulent distortions as waves is something universal?
Is there any way thinking like that I can get an approximate idea of what the gravitational wave would be in that scenario?
Imagine you have a large mass (sun) moving in a straight line, and an object (sensor) 1 light minute from that mass. The gravity from the sun takes 1 light minute to reach the sensor - that would mean that the sensor is attracted to where the sun was 1 minute ago.
That can't work - it violates all sorts of conservation laws.
Instead what happens is that the gravitational force itself is ALSO moving in a straight line! So when the gravity from the sun reaches the sensor it attracts the sensor to where the sun is now because both the sun, and the gravitational force, are moving together.
This works out very nicely.
But what happens if the sun is moving in a circle? Otherwise known as accelerating?
The gravitational force can't know what the sun will do in the future (that it will move).
So as the sun moves it forces the gravitational force to change - before the force was moving in direction a, now it's moving in direction b. This change in the gravitational force is known as a gravitational wave.
This wave, because it is a changing field, has the ability to impart change to other objects, otherwise known as imparting energy. So gravitational waves can carry energy! Potentially huge amounts of it.
(And since they carry energy, they themselves have mass, and therefore gravity, but these second-order effects, as they are called, are too confusing, and weak, and everyone ignores them.)
Back to the black hole - as it orbits the other black hole (as they orbit each other), they change direction very very rapidly, causing huge gravitational waves - the waves steal energy from the orbit, causing the two black holes to fall into each other with smaller and smaller orbits, i.e. a spiral.
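The inspiral above can be quantified with the standard Peters (1964) circular-orbit decay time, t = (5/256) c^5 a^4 / (G^3 m1 m2 (m1 + m2)). The masses below are this event's; the 1000 km starting separation is an illustrative assumption:

```python
G = 6.674e-11
C = 2.998e8
M_SUN = 2e30

m1, m2 = 14.2 * M_SUN, 7.5 * M_SUN
a = 1.0e6   # m: start the binary 1000 km apart (assumed)

# Peters (1964): time to merge for a circular orbit shrinking via GW emission.
t = (5 / 256) * C**5 * a**4 / (G**3 * m1 * m2 * (m1 + m2))
print(t)    # seconds from 1000 km separation to merger
```

The strong a^4 dependence is why the signal is a "chirp": the binary spends ages at wide separation, then sweeps through the last thousand kilometers in seconds.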
My problem is this: Near a black hole time dilation is enormous, huge gravity, plus huge velocity. So to an outside observer the black holes appear basically frozen and don't move. If they don't move they don't make gravitational waves, so we should detect nothing.
I have no answer to this question.
What other experiments are running that would generate similar excitement from the science community that I probably haven't heard of yet?
Would love to hear an audio facsimile of what this might "sound" like.
"The amount of matter converted to energy in the atomic bomb dropped on Hiroshima was about 700 milligrams, less than one-third the mass of a U.S. dime."
From the first detection
That these waveforms (the first one, in particular) appear to follow the basic GR plan at the time of the merger is actually kind of a bummer. As a physicist who makes precision tests of gravity for a living, I'd really hoped that we'd see something unexpected. Instead, the new thing that we learned is that GR continues to be an accurate model, something that was perhaps unexpected on its own.
GW physicists are hard at work on the future generations of GW detectors... In 50 years, a signal like GW150914 might have an SNR of >1000, so we'll see all kinds of detail that may hold more information than what we've seen so far.
Clifford Will's "The Confrontation between General Relativity and Experiment" is an excellent introduction to the state of the art (this is the stock reference for everyone in the field).
That is how I read it, yes.
> The cataclysmic event saw the black holes, one eight times more massive than the sun, the other 14 times more massive, merge into one about 21 times heavier than the sun. In the process, energy equivalent to the mass of the sun radiated into space as gravitational waves.
So the final black hole had rather less mass than the sum of the original two.