> - Briefly 10^34 megatons of energy were released every second
That quote caught my eye too. What's the full unit on that? Is that literally the "m" you'd plug into E=mc^2, or was there an elided "...of TNT", like we'd use to describe nuclear weapons?
It appears it is "... of TNT". Wolfram Alpha converts "10^56 ergs to imperial megatons" to 1.095×10^23 long megatons of mass (using E = mc^2), while it converts "10^56 ergs to megatons" to 2.39×10^33 megatons of TNT, which is much closer to the quoted number.
It must be TNT equivalent. One solar mass is 1.99 × 10^30 kilograms, and we know that 2 solar masses were converted in total, so 4 x 10^30 kilograms, which is far less than the "megatons" mentioned, in terms of pure mass.
I wish folks would avoid mixing military units and general relativity units like this, it's confusing.
"Megaton" isn't really used anywhere except for explosive yields, where it always means TNT. As far as the 'native unit' astrophysicists will tend to use ergs for events like supernovae.
I thought it was ambiguous in the context of an article talking about converting mass to energy. Without running the numbers, I couldn't tell if that was a unit of the input mass or the output energy.
Or, at least give the SI unit first, then a day-to-day formulation in parentheses.
The SI units are not just more useful for scientifically minded readers, but also for international readers in general, who don't share the same cultural background and hence have no feeling for these "day-to-day formulations" anyway, because it isn't their day-to-day. For example, a German author might write:
| The area is as large as 7.2 km², which is 1000 football fields.
Without the SI unit (7.2 km²), this would be very confusing. Of course, the author meant association football (soccer) fields, not American football fields. But who in the international audience would have caught that, especially if I hadn't stated the author's national background upfront? Even more importantly, what percentage of the international readership has developed an intuition for the size of a football field?
One kiloton of TNT equivalent in energy release is 4.2 terajoules (a megaton is 4.2 petajoules). And yes, this is the unit most commonly used for the yield of nuclear weapons.
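If you want to check those conversions yourself, here's a minimal Python sketch (constants rounded, so treat the outputs as approximate):

    # Energy quoted in the article: 10^56 ergs
    ERG = 1e-7               # joules per erg
    MEGATON_TNT = 4.184e15   # joules per megaton of TNT (1 kt = 4.184 TJ)
    LONG_MEGATON = 1.016e9   # kilograms per million long tons
    C = 2.998e8              # speed of light, m/s

    E = 1e56 * ERG                     # 10^56 ergs in joules
    print(E / MEGATON_TNT)             # ~2.39e33 megatons of TNT
    print(E / C**2)                    # ~1.11e32 kg of mass via E = mc^2
    print(E / C**2 / LONG_MEGATON)     # ~1.09e23 long megatons of mass

Both Wolfram Alpha figures fall out directly, which confirms the article meant TNT equivalent.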
Can anyone familiar enough with this area help me answer these questions:
- gravitational waves also propagate at c, so how did they escape the event horizons from which they originated?
- what and where was/is the "more energy than all stars emit as light in the universe"--10^34 megatons--released from? From matter in the accretion disk orbiting the black holes?
- the article says these two were spinning non-uniformly. Can we know if the BHs are spinning or just the stuff around them?
1. The waves didn't originate from inside the event horizons, but from the area around the black holes, where space is still very heavily warped.
2. Imagine you have two serious dents in some stiff plastic sheet. By tapping on the plastic with a hammer, you can't get rid of them, but you can sort of move the dents around. Now imagine that you maneuver the dents towards each other, so they merge into one bigger dent, and as that happens the sheet makes a dull thumping sound as the rigid material snaps into a radically different shape. That's a little like what happened here.
3. Yes, black holes maintain the spin they had before they became black holes (and in fact their rotation rate is vastly increased in the process: angular momentum is conserved, so, like any spinning thing that shrinks its radius, they spin faster).
Since black holes are rotationally symmetric they cannot shed angular momentum via gravitational waves since the production of those requires some asymmetry[0]. But as a sibling post pointed out they can impart some of their angular momentum on objects within the ergosphere which then may escape and carry away the energy.
Before merging, yes, they do shed angular momentum. In fact, if you calculated the total angular momentum of two maximally spinning black holes orbiting each other, you'd realize that if you could combine them, it would be larger than the maximal possible spin of a single black hole with the combined mass. They must and will radiate this excess away as gravitational waves before merging.
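A back-of-the-envelope sketch of that argument in Python, using Newtonian orbital angular momentum and the Kerr spin limit J_max = G*M^2/c (the masses and separation here are illustrative assumptions, not measured values):

    # Work in units of G*Msun^2/c, so the Kerr limit for mass M is just M^2.
    m1 = m2 = 30.0                  # black hole masses, solar masses (assumed)
    M = m1 + m2                     # total mass
    mu = m1 * m2 / M                # reduced mass

    j_spins = m1**2 + m2**2         # two extremal spins: J = G*m^2/c each
    r = 20 * M                      # separation of 20 G*M/c^2, late inspiral
    j_orbit = mu * (M * r)**0.5     # Newtonian orbital angular momentum

    print(j_spins + j_orbit)        # ~5800: total J well before merger
    print(M**2)                     # 3600: Kerr limit for the merged hole

The excess (roughly 60% here) has to be carried off by gravitational waves during the inspiral.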
How would a non-rotating black hole be possible? Wouldn't it be impossible for any source material to lack spin entirely? Or is it a purely theoretical thing?
I like your analogy on #2, but to get a more precise answer to the question, can't we just say the energy that was released came from the gravitational potential energy between the two?
It's just awesome to see these notebooks released. I'm not going to play with them, but I love that people are able to. It makes me much more confident in the results that are announced, and I hope this approach to doing science becomes the norm.
I second the wider adoption of notebooks. I'd really love to see government budgeting offices and whatnot begin to make their spending and analyses transparent through these documents. We need government on github and jupyter notebooks :D
She gave a neat analogy between GWs, as sensed by LIGO, and an electric guitar. In the sense that a distant pluck on the string is transmitted as a wave down the string to the pickup, which senses a little wiggle in the string and amplifies it. I thought it was a poetic analogy that gives a second meaning to the word "instrument" in this context.
Except imagine you're embedded on the string itself and cannot actually sense the string "moving through space". The way you have to measure it is by sensing tiny changes in distance between the left and right side of the string as it wiggles around.
The actual detail of the experiment and the precision they reach is quite fascinating. Veritasium has a pretty good video explaining it in more layman's terms [0]
>"GW170104 was first identified by inspection of low-latency triggers from Livingston data [15–17]. An automated notification was not generated as the Hanford detector’s calibration state was temporarily set incorrectly in the low-latency system. After it was manually determined that the calibration of both detectors was in a nominal state, an alert with an initial source localization [18,19] was distributed to collaborating astronomers [20] for the purpose of searching for a transient counterpart. About 30 groups of observers covered the parts of the sky localization using ground- and space-based instruments, spanning from γ ray to radio frequencies as well as high energy neutrinos [21]."
https://dcc.ligo.org/LIGO-P170104/public
Regarding the earlier detection:
>"At 11:23:20 UTC, an analyst follow-up determined which auxiliary channels were associated with iDQ’s decision. It became clear that these were un-calibrated versions of h(t) which had not been flagged as “unsafe” and were only added to the set of available low latency channels after the start of ER8. Based on the safety of the channels, the Data Quality Veto label was removed within 2.5 hours and analyses proceeded after re- starting by hand."
http://ligo.elte.hu/magazine/LIGO-magazine-issue-8.pdf
So both times humans had to take special action for the detection to "count". I really wonder about whether the null model they are using is appropriate/relevant here.
Also, the other thing I have been concerned about is the lack of any corroborating evidence that these signals are truly generated by inspiraling black holes (gamma ray bursts, etc.). Apparently, in this case the above-mentioned miscalibration has impeded that effort:
>"The event candidate was not reported by the low-latency analysis pipelines because re-tuning the calibration of the LIGO Hanford detector is not yet complete after the holiday shutdown. This resulted in a delay of over 4 hours before the candidate could be fully examined. We are confident that this is a highly significant event candidate, but the calibration issue may be affecting the initial sky maps. We will provide an update in approximately 48 hours which may include an improved sky map."
https://gcn.gsfc.nasa.gov/other/G268556.gcn3
I can't tell from that text file whether they got corroborating evidence or not. IANAP though.
Hi Nonbel, I work within the LIGO Scientific Collaboration, and as another poster commented, manual intervention (in the case of GW170104, by me) was only necessary for the online analysis. The purpose of online analysis is fast coordination with EM partners so that potentially interesting opportunities are not missed. In the case of binary black holes, the expectation is that there will be no electromagnetic counterpart, as the region is expected to be cleared of matter well before we observe the black holes merging. If one were to be found, however, that would be exciting.
By design, detection statements and significance estimates come solely from the offline analysis, which is conducted separately from (i.e., not triggered by) the online analysis. No human intervention is required here, as the issue with the online status information was known about at the time and was not an issue with the data itself. Even if there had been no candidate events at the time, the period containing the event would still have been included in the offline analysis.
In regards to GW150914 and iDQ, you should know that iDQ has never been approved as a veto for CBC (compact binaries such as neutron stars and black holes) searches. Again, no intervention was required to "remove" the veto, as it was never used in the offline analysis, nor would it be in the first place. Its only use that I am aware of is as a veto against Burst triggers in online analysis. Those searches look for generic signals, but may also detect some of the louder CBC sources, such as GW150914. In case you were wondering, there were no dedicated online CBC searches at the time of GW150914, but there were offline analyses, and those produced the results reported in the original detection paper.
This is information that should be included in the papers because the current description is too terse. Basically you are saying that the filters used for online analysis have nothing to do with the background model, zero influence on what periods get included, etc. I'm still unclear on what exactly needed to be "restarted by hand" for the original GW150914 signal, but ok.
>"In the case of binary black holes, the expectation is that there will be no electromagnetic counterpart, as the region is expected to be cleared of matter well before we observed the black holes merging."
Is there any other type of event that is expected to be accompanied by some kind of corroborating evidence?
To get an electromagnetic counterpart, you need matter to be in the system. It may be possible for binary neutron star and some neutron star-black hole mergers. These types of mergers are one of the predicted sources of short gamma ray bursts, so if we get reasonably lucky it may be possible to find one in coincidence. Gamma ray bursts are beamed, however, so to detect one, it would have to be pointed towards the earth, and many of the gamma ray bursts that we have accurate distance measurements for are currently outside our sensitivity range. We only have distance estimates for a fraction of GRBs, though. Lower-energy EM radiation may be possible to see with these mergers as well.
There have been three signals witnessed in about 12 months of observations - of course the models need some tuning to correctly, automatically, trigger alerts. In any case, you are referring to the _online_ triggers which look very quickly at the data and try to guess if an apparent signal is real before informing electromagnetic observatories to follow up. The real analysis is conducted _offline_ in a much slower, careful way with lots of checks and balances on the state of the instruments to rule out artificial signals. That's one of the main reasons why it took 5 months between the first detection and the publication of the paper announcing it.
In terms of corroborating evidence, remember that the two independent LIGO detectors - 3000km apart - saw the event within 10ms of each other. That's enough corroborating evidence for a lot of people. The NASA text file you link shows no observed electromagnetic counterpart, but that's expected: unfortunately the best models so far for black hole coalescences predict very little or no electromagnetic emission - so although EM partners were informed, the chances of them seeing anything were slim. Other predicted sources of gravitational waves, like as-yet unseen binary neutron star coalescences, are more likely to emit EM radiation and stand a chance of being witnessed by conventional observatories as "corroboration".
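The ~10ms window is just the light travel time between the sites, which is a one-liner to check (using the approximate 3000km separation quoted above):

    c = 299_792_458      # speed of light, m/s
    d = 3.0e6            # approximate Hanford-Livingston separation, m
    print(d / c * 1e3)   # ~10 ms: the maximum physical delay between sites

A signal arriving at both detectors within that window, with a consistent waveform, is exactly what a wave propagating at c would produce.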
>"In terms of corroborating evidence, remember that the two independent LIGO detectors - 3000km apart - saw the event within 10ms of each other. That's enough corroborating evidence for a lot of people."
I don't see what the first part of the post has to do with the null ("background noise") model being inapplicable to situations where special human intervention comes into play. Do they include any events like that in the background timeseries or not? I am suspecting not (which renders the model false and hence false alarm rates/sigma values meaningless), but do not know for sure.
Second, that is just a detection. Corroboration occurs when your model predicts multiple types of observations related to a phenomenon (measured by different types of instruments). This weakening of definitions is concerning to me if it has infected physics. I have seen that trick used a lot by "softer" fields such as medicine/psych (e.g. their definition of a replication is just seeing "an effect" in the same direction).
Also, I read somewhere that they had 6 additional signals they haven't reported yet (can't find it at the moment).
I guess we will see once the count gets into the dozens. If it happens without any kind of outside way to verify these signals are inspiraling black hole events it will definitely be interesting to see how the physics community deals with it.
Anyone familiar with this branch of astronomy want to explain why one detection in a volume on the order of 27 billion cubic light years is reasonable? Are they still processing data and will find more events? Is the sensitivity highly anisotropic so the detection volume is significantly smaller? Or are events like this just so conveniently rare that we get about 1 every data-gathering interval?
Events big enough to be detected are quite rare, but the phenomenon you are asking about is more a financial reality than a "convenient coincidence" as you put it. There is a pretty steep curve connecting sensitivity and cost, so when the team that built LIGO was designing it, they used the best available models of colliding black hole event rates to estimate the sensitivity required to deliver a conclusive result in a reasonable amount of time.
If you're getting 10 events/second with a device like this, you probably overpaid for sensitivity, and if you're getting 1 event per century you're probably not going to be able to maintain the operating expenses to still be running when the detectable event occurs (and, as critically, none of the people involved will be able to get the data they need in time to get their PhDs, assistant professorships, or tenured positions, so you can't get the labor force you need for your experiment, which is really what sets the acceptable duration of most experiments in practice).
It looks like the original estimates were pretty good, so events are coming in at about the rate the experimenters hoped they would see them.
Hmm... are the rates of black holes per volume well constrained at all? I was under the impression that it's a possibility that dark matter consists predominantly of primordial BHs? The truth or falsity of this would seem to have a big effect on rates.
> are the rates of black holes per volume well-constrained at all?
There are estimates. But two of the three LIGO detections are of black holes that are more massive than we had expected to exist (~ a few tens of solar masses). Previously we had convincing examples of black holes with <~ 10 solar masses and others with >~ 1e6 solar masses. But since we didn't have any convincing observational detections of BHs with ~20-40 solar masses, it's safe to say that the volume density is poorly constrained for that mass range. At the high end we have a reasonable estimate of the volume density, because we think all galaxies with spheroid components have a black hole and that the black hole's mass is linked to the spheroid.
> I was under the impression that it's a possibility that dark matter consists prominently of primordial BHs?
It depends on what you mean by "primordial". Micro-lensing experiments (where a star is briefly made brighter by the gravitational focusing of light from an object passing between us and the star), mostly looking towards the LMC/SMC [e.g., 0], have tried to address this. My recollection is that there aren't enough stellar-mass black holes around to account for all of dark matter. Assuming Hawking radiation exists, low-mass primordial black holes should have evaporated by now, leaving only the more massive ones. There's a range in between the two, but I'm not sure if you can fit enough of them in a galaxy to account for dark matter while still being consistent with the sensitivity of the microlensing surveys.
> The truth or falsity of this would seem to have a big effect on rates.
Possibly. Though in order to emit GWs, pairs of black holes have to become bound to each other. If black holes make up the dark matter halos, they probably have large velocities relative to each other, which would limit their ability to form bound pairs (though it is possible with 3-body interactions). I am not aware of estimates of the BH pair-formation rate in halos _if_ DM haloes are in fact made of black holes. But the event rate probably can't be extraordinarily high, otherwise we might expect to see dark matter halos becoming less massive as the Universe ages. Though there are many confounding factors that might hide such a signal.
> If you're getting 10 events/second with a device like this, you probably overpaid for sensitivity
Aside from issues processing and disentangling the overlapping events in a situation with that high of an event rate, more events would not be bad, so I'm not sure I'd call it "overpaying". Imagine the kind of population demographics that could be built up if we were detecting that many events.
Keep in mind that when LIGO was built, at tremendous expense, gravitational waves had never been conclusively detected, and just getting to this sensitivity was a feat of engineering. It was unknown how long it would take, or whether they would ever detect a wave.
Just detecting the initial wave was one of the most important measurements in the history of physics. Now that we know that gravitational waves exist, and can give us great insight into the mysteries of the universe, there will be more efforts to detect waves at smaller amplitudes and different frequencies. That is one reason the eLISA project is going on [1].
I'm aware of the context for the GW detections and the expense, etc. I was making a general philosophical point about "overpaid", not commenting specifically on the LIGO cost-benefit analysis.
It's not that physicists wouldn't love to capture all those events, it's that the cost of building instruments like LIGO is nearly prohibitively high and the cost is a strong function of the sensitivity of the instrument. If you aim too high in your sensitivity aspirations, the cost hits a point where the experiment simply can't be funded.
> If you aim too high in your sensitivity aspirations, the cost hits a point where the experiment simply can't be funded.
Agreed. I misunderstood your meaning then; I'd interpreted your wording to mean that "overpaid" was still within the bounds of reasonable expectations for funding. "Overpaid" didn't imply "too expensive to build", to me.
It's a balance between cost and sensitivity, and remember that 1 discovery would prove the experiment a "success". I think the way it was planned and executed was great. Also, I think the proof that gravitational waves exist is far more exciting than comparing gravity wave signatures among a sample of celestial collisions.
This is sort of like saying that "proving that stars emit light" is more exciting than using better and better telescopes to compare electromagnetic wave signatures of different light emitting celestial objects.
We will likely learn a great many things over the coming decades with this new way of looking at the universe, many of which we couldn't have even guessed we would learn!
While there are many compact binary systems in the universe continuously emitting gravitational waves for a very long time, LIGO is only able to detect the most violent waves emitted by the final coalescence and merger. So, on the astrophysical side, there is some joint probability given by how common these systems are, and how likely they are to merge in a given time frame. (Space based observatories like LISA would be able to see the long-lived inspiral waves though.)
On the instrumental side, we've only just reached the sensitivity levels to make any detections in the first place, so it's not surprising that we're not getting a huge number of events (otherwise the previous generation of detectors would've seen something). In addition, each individual observatory has its own "antenna pattern", making us less sensitive to certain sky locations. This will improve as VIRGO, KAGRA, and LIGO-India come online in the future.
It's not 100% accurate that "LIGO is only able to detect the most violent waves emitted"; rather, LIGO can only detect waves emitted in a certain frequency range (total mass) of binary mergers. For instance, LIGO pretty much cannot detect the merger of supermassive black holes.
It's important to know that LISA and LIGO aren't really competing for sensitivity. Rather they complement each other by looking in different frequency ranges. The relationship between LISA and LIGO is analogous to a radio telescope and a gamma ray one. They observe different parts of the spectrum. At the frequencies that black holes merge for example, ground motion is not much of an issue, and other noise source dominate.
I am just waiting for the $10,000 shielding for high end speakers to keep gravity waves from interfering with the acoustic purity of the sound they produce. :-)
You absolutely must keep your record player suspended from glass fiber in a vacuum chamber to avoid any unwanted coloration. Just takes 30 minutes to pump down when you want to flip sides.
These events are actually not rare. There are roughly billions of them in that volume over the current lifetime of the Universe. They are comparatively rarer than many other kinds of events, like supernovae, because they involve the extremely massive stars that are themselves very rare.
I'm not a physicist/astronomer so someone could give you a more detailed explanation. From what I understand, LIGO has been designed to detect collisions of mid-sized black holes, and it looks like those aren't that common.
There are loads of possible ways to increase sensitivity, but none of them are easy or cheap given that the low hanging fruit was all picked off in previous generation detectors.
Increasing arm length is the "easiest" but definitely the most expensive option. Try finding a 40km L-shaped area that's seismically stable and free from significant anthropogenic activity. There may only be a handful of places in North America. However, 4km is already on the cusp of being long enough that gravity misaligns the two mirrors at each end of each arm due to the curvature of the Earth. Going to 40km would prompt the need for static corrections to mirror alignment, which will increase the amount of seismic noise that couples into the longitudinal direction in which gravitational waves are sensed. There are other problems such as the need to either refocus light at points along the arms (very susceptible to alignment and thermal noise) or use much, much bigger mirrors. The Advanced LIGO mirrors are already ~40kg, ~30 x 15cm cylinders of the purest fused silica known to man circa ~2012. There is talk of increasing the mirrors to 200kg and ~50 x 25 cm, and no facility is currently capable of producing pure enough fused silica at that size.
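To put rough numbers on that curvature issue, a quick Python sketch (simple spherical-Earth assumption: the local vertical at one end of an arm tilts by about L/R_earth relative to the other end):

    R_EARTH = 6.371e6             # mean Earth radius, m

    for L in (4e3, 40e3):         # current 4 km arms vs a hypothetical 40 km
        tilt = L / R_EARTH        # relative tilt of the arm ends, radians
        print(f"{L/1e3:.0f} km arm: {tilt*1e3:.2f} mrad")  # ~0.63 vs ~6.3

A tenfold longer arm means a tenfold larger built-in misalignment to correct for.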
An "easier" option is to increase the laser power. This gives diminishing returns, and leads to an increase in high frequency sensitivity at the expense of low frequency sensitivity (due to photon pressure pushing the mirrors around noisily). However, the challenges are to make stable lasers that are also powerful - very tricky - and to mitigate the effect that laser absorption has on the mirrors within the interferometer - as you increase laser power, things heat up. Hot mirrors can lens the light, misaligning it and creating extra loss (i.e. reducing sensitivity). It's trickly to mitigate. Another effect of higher laser power is the introduction of parametric instabilities, where the mechanical body modes of the mirrors are amplified by the high laser power, leading to huge spikes of noise at narrow frequencies which are difficult to damp out.
Another is to use a different interferometer topology: instead of an L-shaped Michelson interferometer, suggestions have been made for Sagnac interferometers which possess an interesting property called quantum non-demolition, which can potentially reduce the limiting noise source in Advanced LIGO which directly increases sensitivity. Research into this is at a very early stage and will not be seen in detector facilities for decades, if ever.
So, the short answer is: there are lots of potential methods to increase sensitivity, but all of them are challenging and require significant R&D and money.
Sensitivity is about picking out a signal from the noise. The way you increase sensitivity is to decrease the noise. Effectively this means isolating the environment of the LIGO experiment and a great deal has been done on this. If you search arXiv for LIGO over the past two decades, you'll find plenty of articles on this, but I warn you, you may be reading about injecting null energy modes into the system in one paper while reading about mirror design and reflection results in the next paper.
"If we improve our detector sensitivity, by say a factor of two or three, the rates will go up from, you know, seeing one every month or every two months, to seeing one every day or every week."
- David Reitze, Executive director of LIGO
I don't know what their uptime is, but it sounds like they probably have a number of as-yet unreported observed events.
It's like the difference between an explosion of TNT and an atomic bomb, but on a much larger scale.
Stars like our sun spend ~10 billion years turning a portion of their mass into energy. Most stars are like ours: small, dim, and weak in power output. Our sun will not go supernova and will not collapse into a black hole when it dies; it will simply shed its outer layers and end up as a white dwarf inside a nebula.
But, now imagine two black holes each a billion times as massive as the sun turning all their mass into energy in a couple of seconds.
10 billion years to convert 99% of the mass of the sun to energy versus 10 seconds to convert 2 billion times the mass of the sun to energy. Now it makes sense that the power output in one second is more than the whole universe put together.
Solar fusion is on the cosmic scale a very slow way to convert mass to energy. It's so slow that we humans have been 'on the brink' of harnessing it for power generation for decades.
Now imagine if we could build two nano-black-holes and let them collide....
> But, now imagine two black holes each a billion times as massive as the sun turning all their mass into energy in a couple of seconds.
Sure, lemme just take off my "good at socializing with apes and running for long periods of time after antelope" hat and put on my "Cosmological scale" hat.
Huh, I seem to have misplaced that one. And the one I'm currently wearing is oddly well affixed.
I think your numbers are a bit off. I am not sure about the exact numbers but I think the sun will only burn about half or so of its mass over its lifetime. Also the black holes we observed merging are stellar black holes with masses on the order of tens of solar masses, not galactic black holes with millions or billions of solar masses.
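A rough check of the actual fraction, as a Python sketch (round numbers throughout): the Sun's mass-to-energy conversion rate is just its luminosity divided by c², and even integrated over ~10 billion years that comes to well under a tenth of a percent of a solar mass.

    L_SUN = 3.8e26               # solar luminosity, W
    C = 3.0e8                    # speed of light, m/s
    M_SUN = 1.99e30              # solar mass, kg
    T = 1e10 * 3.15e7            # ~10 billion years, in seconds

    rate = L_SUN / C**2          # ~4.2e9 kg of mass converted per second
    print(rate * T / M_SUN)      # ~7e-4 of the Sun's mass over its lifetime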
The power (energy per unit time) emitted in the creation of gravitational waves by the source observed by LIGO was briefly greater than the light power emitted by all of the stars in the known universe. Basically, if the gravitational waves were in fact light (they're not), then they would have briefly outshone everything else in the universe put together.
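An order-of-magnitude sketch of that comparison (the peak luminosity is the published figure for the first detection; the star count and average stellar luminosity are rough assumptions for illustration):

    L_GW_PEAK = 3.6e49    # W, published peak GW luminosity of GW150914
    N_STARS = 1e21        # rough star count for the observable universe
    L_STAR = 3.8e26       # assume an average star emits ~1 solar luminosity

    starlight = N_STARS * L_STAR    # ~4e47 W of starlight
    print(L_GW_PEAK / starlight)    # ~100: briefly "outshines" all stars

The assumptions are crude, but the conclusion is robust to them: the merger's peak power exceeds the starlight of the observable universe by a comfortable margin.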
Somewhat naive questions, as I know very little about astronomy. Do black holes "move?" How is it that they could merge if they're stationary, unless they're pulling each other in I guess? If black holes are indeed pulling in everything, does that mean the whole universe would eventually be one giant black hole?
Yes, like any other massive objects, black holes can have velocity and momentum. Two black holes, or a black hole and another object like a star, can orbit each other in a way that almost follows Newton's laws.
> How is it that they could merge if they're stationary, unless they're pulling each other in I guess?
This gets at what makes LIGO's findings interesting. Two black holes merge if they fall into each other's event horizons. But Newtonian gravity predicts that, in isolation, this would never happen; two orbiting black holes would just maintain their elliptical orbit forever. (I'm hand-waving here, because Newtonian gravity can't properly model black holes at all.)
The theory of general relativity predicts that the gravitational curvature of the space around the black holes contains energy, similar to the energy in the electric field around a charged particle. And intense changes in curvature can create waves in the curvature of space, which carry away kinetic energy from the black holes and cause them to spiral into each other. Under normal conditions these waves are so unimaginably tiny that they're unmeasurable, but during a black hole merger, they become intense enough to be (barely) detected from billions of light-years away. This is what LIGO detected, confirming a long-known theoretical prediction of GR.
> If black holes are indeed pulling in everything, does that mean the whole universe would eventually be one giant black hole?
Not necessarily. Everything in the universe attracts everything else gravitationally, but that doesn't mean any two objects will inevitably collide. If they have enough energy to move apart faster than their common escape velocity, they are not gravitationally bound and will continue separating forever.
If we scale time to one-trillion-quadrillion years into the future, isn't it possible that all mass in the universe eventually coalesces into a massive universal super black hole? Or does the expansion of the universe outstrip that?
Black holes move in the same way as any other object of the same mass, for example a star.
When they merge they are typically already orbiting each other, in a similar way to binary stars. As they orbit, the system loses energy to gravitational radiation, causing the black holes to spiral inwards. (This happens for all orbits, including the Earth's.) Eventually they merge, sending out a tremendous amount of gravitational radiation.
The gravitational pull of a black hole is just as strong as that of any other object of the same mass. The difference is that the mass is concentrated at a point in the center, which lets you get so close that escape becomes impossible.
It was once believed that the whole universe could maybe collapse back into one point (like a black hole). Now it is believed that the universe will keep expanding forever. Actually, the universe seems to be expanding ever more quickly due to dark energy. But no one knows what dark energy is.
Black holes have mass, just like any other object in the universe - like a star, or a planet, or the sun.
Consequently, they follow orbits just as any other mass would. In some cases, they're the local most massive object and any other masses move more in response. Other times they are near other black holes, and they orbit one another until they collide and merge.
What makes black holes different is their density. The mass a black hole has is confined in a point of zero height, width and length called a singularity. The consequences of this are, as we know, not fully understood by current models of physics.
edit: spelling
They do indeed move. It's conceivable that a black hole of mass X could be observed orbiting a red super-giant star of mass 100X. I don't think this happens much though, but the universe is big so who knows.
I guess this was the biggest question answering piece to me. When I think of a black hole, I assume it has a gigantic mass, enough that it's always the most massive local object, and subsequently pulls in all other things. I didn't realize that might not be the case. As black holes "absorb" everything that "falls" into them, do they continue to build mass then?
Whether or not something is a black hole depends on its total mass and its radius. So anything can mathematically become a black hole if you compress it enough. The earth could be a black hole if its total mass were compressed to something like 'less than the diameter of a grapefruit'. At that point space itself cannot contain the mass, physics breaks down and you get a singularity.
The moon would continue to orbit as it always did, since the moon's centre of mass is still exactly the same distance from the earth's centre of mass as it was before you pressed the 'compress button' on the north pole.
It takes incredible amounts of energy to cause this compression, however. And this is why only the biggest stars become black holes. As the outward pressure of fusion diminishes because the hydrogen/helium/lithium/beryllium/etc. fuel runs out, the sheer gravitational pull of all that mass suddenly takes over, and that inward momentum from all directions is enough to cause a singularity.
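For the curious, the 'grapefruit' claim is easy to check with the Schwarzschild radius formula r_s = 2GM/c²:

    G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
    C = 2.998e8           # speed of light, m/s
    M_EARTH = 5.972e24    # kg

    r_s = 2 * G * M_EARTH / C**2
    print(r_s * 1e3)      # ~8.9 mm: an Earth-mass black hole is marble-sized,
                          # comfortably "less than the diameter of a grapefruit"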
> So anything can mathematically become a black hole if you compress it enough.
Not necessarily. If the Schwarzschild radius of the resulting black hole is on the order of a Planck Length you can't really say whether or not such an object is a black hole anymore (depending on your interpretation of the Planck Length).
On my calculator the lower bound is m = r·c²/(2G), which gives ~1.09e-8 kg (give or take a factor of two depending on whether you want to consider the diameter or the radius). Which is small, but not as small as you might think. I also believe there are some upper limits on black holes too (that come from the upper limits of stars).
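Checking that bound in Python (inverting r_s = 2Gm/c² for a Planck-length radius):

    G = 6.674e-11          # m^3 kg^-1 s^-2
    C = 2.998e8            # m/s
    L_PLANCK = 1.616e-35   # Planck length, m

    print(L_PLANCK * C**2 / (2 * G))   # ~1.09e-8 kg: half the Planck mass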
It depends less on your interpretation of the Planck length and more on how quantum gravity actually works. We have some guesses and semi-supported theories (like Hawking radiation) but the jury is still out as to how micro black holes work, if they do in fact exist.
The sun is the most massive local object in the solar system, by a very large margin, but does not pull in all other things. Or, well, it does pull, obviously, but it turns out that objects under gravity end up on Keplerian orbits if they have sideways velocity to begin with.
> When I think of a black hole, I assume it has a gigantic mass, enough that it's always the most massive local object, and subsequently pulls in all other things.
The sun is not pulling in the earth. Earth orbits a common barycenter with the sun. That barycenter happens to be inside the sun, but the sun is also orbiting around it.
So a small object orbiting a bigger object is an approximation. Orbiting their common center of mass is a better approximation. This is more easily visible in the Pluto-Charon system.[0]
The same of course applies to black holes. Being heavier than most other objects in their vicinity does not render them immobile, any more than the sun being the most massive object in the solar system renders it immobile.
In the first detection, they mentioned that two black holes merged, emitted gravitational waves, and the resulting combined mass was less than the sum of the two previous masses because energy was spent on gravitational wave generation. Hence it means that, due to gravitational interactions, objects leak mass. Now, we know that every object in the universe is gravitationally related to every other object, plus the universe is expanding, hence objects are constantly in flux with each other. The question is: where does all the leaked mass go? Can this leakage account for dark matter? What about space-time, does it function as a storage medium for this energy that now came from the leaked mass?
Objects do not leak mass due to gravitational interaction, but gravitational interaction always expends some energy. The earth orbiting the sun expends some energy in the form of gravitational waves, but it's minuscule; the resulting decay of the earth's orbit is far too small to measure.
This experiment is designed to detect the waves carrying away that lost energy, but you need a cataclysmic event like the collision of two black holes for the event to be energetic enough that you can measure it and measure it on astronomical scales.
This is the reason why LIGO must be so sensitive, we hope not to experience a nearby black hole collision for as long as we're alive, so we must measure a distant one.
Mass and energy are interchangeable, but their total must be conserved.
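To put a number on "minuscule", here's the standard quadrupole-formula estimate for a circular orbit, P = (32/5)·(G^4/c^5)·(m1·m2)²·(m1+m2)/r^5, applied to the Earth-Sun system:

    G = 6.674e-11        # m^3 kg^-1 s^-2
    C = 2.998e8          # m/s
    M_SUN = 1.989e30     # kg
    M_EARTH = 5.972e24   # kg
    R = 1.496e11         # Earth-Sun distance, m

    P = (32/5) * G**4 / C**5 * (M_SUN * M_EARTH)**2 * (M_SUN + M_EARTH) / R**5
    print(P)             # ~200 W: the whole orbit radiates like a few bulbs

Two hundred watts, for an entire planetary orbit, is why detecting gravitational waves requires black hole mergers rather than planets.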
They didn't leak mass; the missing mass became energy in the form of emission (photons and some kinetic energy) and gravitational waves, which until recently we could not detect. The equations suggested they were there, though, and that's why these experiments were funded: a way to find out whether the uncertain parts of the model were true or not. At this point, our model seems to predict what we track in reality with this experiment.
Of course, there are gaps in the standard model and they must all be tested. LHC is also looking at the gaps, and confirming/invalidating them.
It's important to note that gravitational waves themselves carry energy and angular momentum. Relativists often use the terms mass and energy interchangeably. When I worked in numerical relativity, we used units such that the speed of light was 1, so E=mc^2 (really E^2 = m^2c^4 + p^2c^2) simplifies to E=m.
I have no idea what these people are talking about but I always thought as a small child that c should = 1. That square really bugged me - why squared, why not cubed or halved or more realistically some bally awkward number... Sounds like fun this numerical relativity!
It's squared because energy is work, work is force times displacement (distance), force is mass times acceleration, acceleration is velocity per time, and velocity is distance per time. So you get: energy = mass × (distance/time²) × distance = mass × (distance/time)² = mass × velocity². The velocity comes in squared, and for the rest-mass energy of matter that velocity is c.
"Mass" in a blackhole is not the same as mass here on earth. The likely answer to your question is that the mass was converted to energy in the form of gravitational waves.
When a gravitational wave hits the earth, does the planet oscillate in place for the duration, or is our position in the cosmos displaced, or something else altogether?
The actual space in which the planet resides stretches and shrinks as the gravitational wave passes through it. The fabric of space itself is the medium that the wave travels through.
However, the effect is incredibly tiny, even though it was generated by two black holes colliding. The size of the distortion experienced here on Earth is 1000x smaller than the width of a proton! It's mind-boggling.
I think I remember hearing that there is immense distortion in the area immediately around the collision, but I'm not certain.
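The arithmetic behind figures like that is just strain times arm length; assuming a peak strain of ~1e-21 (the published order of magnitude for these events):

    H = 1e-21            # dimensionless strain, assumed peak value
    L_ARM = 4e3          # LIGO arm length, m
    PROTON = 1.7e-15     # approximate proton diameter, m

    dL = H * L_ARM       # ~4e-18 m of arm-length change
    print(PROTON / dL)   # ~400: hundreds of times smaller than a proton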
The resolution of the detector is 1/1000th the width of a proton. The signal they see is much bigger: the length of the 4km-long arms stretches and then shrinks by a fraction of the width of a proton. How do they know it is really space/time changing and not an earthquake? They have two detectors over 1000 miles apart from each other.
That is incorrect. What you describe is a dipolar oscillation. The simplest gravitational waves would have a quadrupole moment: stretch in one direction while contracting in the direction perpendicular to it. Have a look at https://en.wikipedia.org/wiki/Gravitational_wave where there is an animation illustrating the simplest case.
We do account for the expansion of the universe, in fact. We estimated that this source was at about z ~ 0.2 (see https://en.wikipedia.org/wiki/Redshift). Roughly speaking, this means there'll be only a ~20% effect, as the scale of the universe (a) has increased by (1+z) over the time the signal has traveled.
Essentially a combination of passive/active isolation that effectively completely decouples the system from the environment.
I went to a talk by one of the lead scientists in LIGO and they spent a lot of effort on this. I believe the sites are also fairly remote so they don't have trucks driving nearby.
Oh: "LIGO's mirrors must be so well shielded from vibration that the random motion of the atoms within the mirrors and their housings can be detected."
This is a complete guess, as I don't have any firsthand knowledge ... but I can only imagine that they have a whole array of seismographs on the premises, which they can use to clear noise from the main LIGO readings they are interested in.
Curious to know if this is the case from anyone who happens to know one way or the other :)
edit: the more I think about it, the more I think that random vibrations from passing trucks would be irrelevant ... it doesn't detect vibrations, it's measuring the speed of light between two points
Actually, I think it's measuring the intensity of light between two points. It has two perpendicular lasers of some wavelength which then interfere with one another. If they interfere perfectly you measure a zero; if the interference is off by some amount you measure a deviation in intensity in the î or ĵ direction.
You can eliminate passing trucks, earth tremors, mining, asteroid impacts etc simply by applying a band-pass filter that excludes measurement frequencies outside of the range predicted by the equations.
No filtering is applied after measurement to remove such anthropogenic/geological noise from the data. In fact, data containing it is usually junked and not used for analysis. The sites have thousands of witness channels which listen for things like trucks, ground motion, magnetic storms, etc. that could possibly influence the mirrors in the way a gravitational wave would. If the same signal appears in both the gravitational wave channel and some auxiliary sensor, it's thrown away.
Instead, the mirrors are highly isolated from the ground (suspended from pendulums, motion damped by actuators, that sort of thing) so that such effects do not have significant impact on the motion of the mirrors.
In any case, seismic noise can't be fully isolated and creates a sensitivity wall below around 10Hz. To get sensitivity much below 10Hz, you have to go to space (look up LISA).
The frequency range varies (it's kind of a chirp) as the black holes spiral in towards each other.
But the frequency at the end is in the auditory range. The instrument is literally measuring a displacement in the same range as many of the things you mention (traffic, mining, ...). A band pass filter will not do.
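A rough check that the end of the chirp really does land in that range: the gravitational-wave frequency at the innermost stable circular orbit is about f = c³/(6^(3/2)·π·G·M); for an illustrative ~60 solar-mass system (the masses are an assumption, not the measured values):

    import math

    G = 6.674e-11       # m^3 kg^-1 s^-2
    C = 2.998e8         # m/s
    M_SUN = 1.989e30    # kg

    M = 60 * M_SUN      # assumed total mass of the binary
    print(C**3 / (6**1.5 * math.pi * G * M))   # ~70 Hz: squarely audio-band

That's right in the middle of the band where trucks, machinery, and ground motion live, which is why the isolation has to be physical rather than a filter.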
Vibrations from traffic, seismic events and even thunderstorms can be detected. If the mirrors or lasers move, it makes a difference. The sensitivity of these instruments is such that it's akin to measuring a change the width of a human hair in the distance between here and Alpha Centauri. On the actual scale, they're measuring movements of a fraction of the width of a proton.
- Black hole merger occurred 3 billion light years away
- Two solar masses were converted to energy
- Briefly 10^34 megatons of energy were released every second
This is hard to intuitively wrap your head around because we think of space as constant. Something like this can distort space itself. Amazing stuff.