There's no difference between "change in speed of light", "change in distance", and "change in travel time for light". They're all the same thing.
More or less, in broad strokes.
Don't both instruments detect very small changes in round-trip travel time for light, comparing one direction to the other?
To oversimplify:
- The MM experiment was about measuring a clear and constant difference in the speed of light depending on the orientation of the two beams. (If there's any fluctuation, it should be treated as noise and discarded; it's the average speed in each direction of space that matters.)
- The LIGO experiment is about measuring a fluctuation of distance. The distance fluctuates around a set mean (because the speed of light is constant no matter the direction or travel speed), but as waves of space distortion pass through, tiny oscillations can happen. (In other words, it's *the noise* that is important. The mean speed should be constant no matter the direction.)
In very broad strokes (actual physicists and historians are both going to laugh at me):
Back in the 19th century, the ether theory held that light was a pure wave (like sound waves, etc.). Like any wave, it would need a medium to travel in (e.g., speech is a sound wave that travels through air, in the form of a compression wave of said air; ripples are a surface displacement of water; etc.).
Since light travels through space, that should mean space isn't empty. Instead, there must be a medium that can carry this wave (just like air carries sound and water carries ripples), except this medium should be much thinner and lighter than air, because everything behaves as if there were void between the planets.
(Note: under this hypothesis, ether is just a simple substance/medium like anything else, just extremely light/thin).
Now the question is: how would you test it?
Well, if stars roam around a universe filled with a medium called ether, that means they move relative to the ether fluid.
And depending on the relative motion of Earth in the ether, the speed at which light waves are carried should look different along the direction of the ether flux (relative to us) versus perpendicular to it.
Think "Doppler effect in water". Or think of the distorted ripples caused by something travelling on the surface of water: from the traveller's point of view, the waves in front seem to move away more slowly (as the traveller catches up to them), whereas sideways they move away at the usual speed.
So the ether could be proved by setting up an interferometer. As you rotate it (thus changing which beam lies along Earth's direction of travel through the ether and which beam is perpendicular), you should notice a shift in the interference pattern. You'd be detecting the speed of planet Earth's travel across the ether medium.
Result of the experiment: Nothing. Zilch. Nada. No matter how precisely you measure, no matter how you orient the setup, *the speed of light is the same in all directions*. It isn't direction-dependent, and it doesn't depend on the flux of some "ether medium".
None of the variation that was expected, given the speed at which planet Earth travels through the supposed ether.
A more precise instrument wouldn't have helped. Planet Earth travels rather fast around the Sun, so the difference in speed, though subtle, should definitely have been noticeable.
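To put numbers on "should definitely have been noticeable", here's a back-of-the-envelope sketch using the usual rounded textbook values (about 30 km/s orbital speed, about 11 m of effective arm length for the 1887 apparatus), not the historical lab notebook:

```python
# Sketch: the fringe shift the ether model predicted for Michelson-Morley.
# All values are rounded textbook figures, for order-of-magnitude only.
C = 3.0e8            # speed of light, m/s
V = 3.0e4            # Earth's orbital speed around the Sun, m/s (~30 km/s)
L = 11.0             # effective arm length of the 1887 apparatus, m
WAVELENGTH = 5.5e-7  # yellowish light, m

beta2 = (V / C) ** 2

# Round-trip time difference between the arm along the ether "wind" and
# the arm across it, to first order in (v/c)^2:
delta_t = (L / C) * beta2

# Rotating the apparatus 90 degrees swaps the two arms, so the observable
# shift is twice that, expressed in fringes:
expected_fringes = 2 * L * beta2 / WAVELENGTH

print(f"round-trip time difference: {delta_t:.2e} s")
print(f"expected fringe shift     : {expected_fringes:.2f} fringes")
# -> about 0.4 of a fringe, comfortably visible in 1887. None was seen.
```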
Now back to LIGO:
Since the 19th century, particle/wave duality has become established knowledge (demonstrated by slit experiments, etc.).
Special relativity states (among other things) that the laws of physics are invariant. The speed of light in vacuum is always 1*C, no matter the reference frame, no matter how you travel, etc. (Thus, although the result of MM was surprising given the ether model, it's absolutely what's expected under special relativity. Zero surprise. The speed of light should be 1*C in vacuum, no matter how you orient your experiment with respect to the orbit of our planet around the Sun. Or the travel of the Sun around our galaxy. Or the motion of the Milky Way itself in the expanding universe.)
General relativity states (among other things) that gravity is the effect of space itself getting curved by mass (and momentum, energy, etc.). There's no substance, no medium like ether: it's spacetime itself that bends.
One oversimplified metaphor is the popular 2D/3D model of balls placed atop a rubber sheet (2D) which gets deformed (in 3D). (Except that according to string theorists, in real life the rubber sheet could have an insane number of dimensions :-P )
Another way to think about it (pure 2D): a sheet of squared paper. You draw dots on it (objects), and around those objects the grid gets distorted; suddenly the grid isn't square anymore. Other objects travelling around don't actually curve *their trajectories*; they always keep moving straight ahead on the grid. But because the squares of the grid aren't square anymore but squished, this *straight-ahead travel* seen from the outside looks like a curve.
This gives an interesting prediction to test: light has no mass. According to Newtonian physics, it should keep travelling in a straight line, as it doesn't feel any gravity pulling on it (the force of gravity is proportional to the product of both masses, and light's is 0...).
But according to relativity, light *should bend*. It keeps travelling straight ahead (like anything else), but it does so across space that is itself bent (along a "grid" whose "squares" are "squished"). So light should bend around massive objects, even though its mass is exactly zero.
Fast forward, and gravitational lensing DOES work. (The predicted bending of light around very massive objects like black holes has been measured. The space around a black hole measurably behaves as if it were a huge optical lens.)
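For a feel of the magnitude, here's a sketch of the classic textbook case (starlight grazing the Sun's edge, the one checked during the 1919 eclipse), using the general-relativistic deflection formula theta = 4GM/(c^2 b) with rounded standard constants:

```python
import math

# Sketch: relativistic deflection of light grazing a mass M at impact
# parameter b: theta = 4 * G * M / (c^2 * b). Rounded standard constants.
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8       # speed of light, m/s
M_SUN = 1.989e30  # solar mass, kg
R_SUN = 6.96e8    # solar radius, m (light just grazing the Sun's edge)

theta_rad = 4 * G * M_SUN / (C ** 2 * R_SUN)
theta_arcsec = math.degrees(theta_rad) * 3600

print(f"deflection: {theta_arcsec:.2f} arcsec")
# -> about 1.75 arcsec of bending, despite light's mass of exactly zero.
```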
Now, these distortions themselves should have a speed limit. The effect of gravity isn't instant. It travels across space (like light) and has an upper speed limit (it should travel at the speed of light; 1*C is the top limit for anything that travels).
To go back to the 2D/3D metaphor: imagine filming the rubber sheet with a high-speed camera as you drop the ball on it. As soon as the ball touches the sheet, you'd see ripples forming and travelling outward until the ball eventually settles into a curved well.
The same is predicted to happen in real life. Massive objects create big distortions of space. If the object moves, the distortion doesn't instantly move with it. Instead, one "could see" ripples travelling outward from the object as the distortion settles into the new shape matching the new position of the massive object. That's gravitational waves: ripples as the distortion of space settles to adapt to the movements of ultra-massive objects.
If you have a massive enough event (like two very big black holes orbiting each other), the waves might get big enough to be measurable.
So to detect that, you *also* set up an interferometer. *BUT* this time, no matter which way you orient your experiment (or, given LIGO's huge size, no matter which way planet Earth's rotation orients it for you over the day/night cycle), you should always find that the speed of light is 1*C in vacuum, in all directions.
Instead you just sit and wait, measuring as precisely as you can. Eventually, a ripple from something massive enough will cause a big enough distortion to be noticeable in your experiment.
That's what has now been definitively measured at LIGO.
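To get a feel for how tiny "noticeable" is here: a sketch converting a gravitational-wave strain into the arm-length change the detector must resolve. The strain value below is roughly the peak order of magnitude reported for the first detection (GW150914); treat all the figures as illustrative:

```python
# Sketch: arm-length change for a given strain, delta_L = h * L.
# The strain is an order-of-magnitude figure, not an exact measurement.
ARM_LENGTH = 4.0e3         # LIGO arm length, m (4 km)
PEAK_STRAIN = 1.0e-21      # dimensionless strain h, roughly GW150914's peak
PROTON_DIAMETER = 1.7e-15  # m, for scale

delta_L = PEAK_STRAIN * ARM_LENGTH
print(f"arm-length change: {delta_L:.1e} m")
print(f"... about {delta_L / PROTON_DIAMETER:.1e} proton diameters")
# -> ~4e-18 m, a few thousandths of a proton's width. Precision is
#    the whole game for LIGO.
```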
Again, both experiments rely on interferometers:
- MM relies on there being a medium, called "ether".
- MM would have expected a constant N% difference in the speed of light along the direction of planet Earth's travel across the ether.
- MM doesn't need extra precision beyond some point. Either those N% are there or they aren't. Using LIGO-level precision would just add extra decimals at the end of this constant.
- LIGO doesn't rely on a medium. It's spacetime itself that is bent. Vacuum gets shaped funny around massive objects (as proven by gravitational lensing).
- LIGO doesn't find a difference in the speed of light in any direction. It's always 1*C in all directions.
- LIGO, on the other hand, might see noise. Tiny fluctuations. These fluctuations are predicted to happen due to extreme bending of spacetime caused by very big masses moving at very high speed.
- The precision of LIGO *does* matter. The higher the precision, the tinier the fluctuation that can be detected, and thus the less you rely on some batshit-extreme massive event like a pair of "galaxy core"-level black holes merging with each other at relativistic speed in the parking lot outside the experiment. (A toy sketch of this difference in philosophy follows right after this list.)
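Here's that toy sketch (invented numbers, purely illustrative): a constant bias survives averaging, while a brief ripple vanishes into the mean. So MM could average its way to an answer, whereas LIGO has to resolve the transient itself:

```python
import random

# Toy model of the two measurement philosophies. All numbers invented.
random.seed(42)
N = 10_000

# Hypothetical readouts in arbitrary units, with Gaussian instrument noise:
mm_style = [0.4 + random.gauss(0, 0.1) for _ in range(N)]  # constant bias
ligo_style = [random.gauss(0, 0.1) for _ in range(N)]      # zero mean...
for i in range(5_000, 5_050):
    ligo_style[i] += 0.3        # ...plus a short "ripple" passing through

print(f"MM-style mean  : {sum(mm_style) / N:.3f}  (averaging reveals the bias)")
print(f"LIGO-style mean: {sum(ligo_style) / N:.3f}  (averaging hides the ripple)")
# For LIGO you hunt the transient itself, so raw precision (a smaller
# noise sigma) directly sets the smallest ripple you can catch.
```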
I hope this makes the difference clear.
Now for the obligatory disclaimer (to paraphrase Dr. Leonard McCoy): I'm a (medical) doctor, not an astrophysicist.
My research happens at much tinier scales than that, so take my explanations as an oversimplification, with a grain of salt.
Sure, the 1880s apparatus wasn't going to detect gravitational waves, but that's just a matter of the instrument's sensitivity.
Whereas, as stated above, the reverse is not true.
LIGO-level precision isn't necessary to detect a constant bias in the speed of light along the direction of planet Earth's travel across the ether medium. It would just have added more digits of precision to the measured factor (or, as it turned out, more zeros after the actual "1.00000..." factor measured in real life).
We still call an electron microscope a microscope.
Well, actually there are *scanning* electron microscopes and *transmission* electron microscopes.
We actually DO make a distinction.
(And for that one, you can trust me: I *do* work in life science.)