Scientists Question Laws of Nature

mknewman writes "MSNBC is reporting that scientists are finding differences in many of the current scientific 'constants,' including the speed of light, alpha (the fine-structure constant of the electromagnetic force), the proton-to-electron mass ratio and several others. These findings were made by observing quasars and comparing the results to tests here on Earth." From the article: "Time-varying constants of nature violate Einstein's equivalence principle, which says that any experiment testing nuclear or electromagnetic forces should give the same result no matter where or when it is performed. If this principle is broken, then two objects dropped in a gravitational field should fall at slightly different rates. Moreover, Einstein's gravitational theory -- general relativity -- would no longer be completely correct, Martins says."
  • For example, Ohm's Law is much more interesting at sub-microscopic levels []
  • This is a good thing (Score:5, Interesting)

    by growse ( 928427 ) on Wednesday July 12, 2006 @01:12PM (#15706419) Homepage

    This is a good thing. One of two things will happen from this

    1. The scientists are right and Einstein wasn't 100% correct.
    2. The scientists are wrong and let dust onto the damn sensors again

    If option (1) is true, it means we're entering that sort of post-Einsteinian "What the hell's going on here" phase in science, where we have a theory that we thought was good, and we have some measurements which we also know are good but which conflict with the theory. This will lead to lots more experiments being done and allow us to invent hyperspace faster.

    If option (2) is true, it means that the scientists in question will be metaphorically shot by the scientific community for daring to question the great relativity laws, thus removing bad scientists from the community.

    It's a win-win!
  • Chaos Theory (Score:2, Interesting)

    by LiquidCoooled ( 634315 ) on Wednesday July 12, 2006 @01:14PM (#15706426) Homepage Journal
    The lorenz attractor is a mathematical example of how sensitivity to initial conditions can affect the results of any test.
    There is no way that ANY test can be reproduced perfectly multiple times; however, for a large percentage of things tested the differences are so small they are negligible.
    If you take a double pendulum and try (to scientific precision) to orient the beams to the exact same location, the results will be different every single time you do it (fluctuations in the universe's gravitational field caused by me farting or a butterfly flapping its wings, for instance).
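    The sensitivity the comment describes can be sketched numerically. Below is a minimal (and deliberately crude) Euler integration of the Lorenz system; the parameters are the classic textbook values, and the perturbation size and step count are arbitrary choices for illustration, not anything from the discussion:

```python
# Crude Euler integration of the Lorenz system, illustrating sensitive
# dependence on initial conditions (classic parameters sigma=10, rho=28,
# beta=8/3; step size and step count chosen only for this demo).

def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def trajectory(x0, steps=6000):
    # Integrate forward from (x0, 1, 1) and return the final state.
    state = (x0, 1.0, 1.0)
    for _ in range(steps):
        state = lorenz_step(*state)
    return state

a = trajectory(1.0)
b = trajectory(1.0 + 1e-10)  # perturb by one part in ten billion

# After a few thousand steps the two trajectories are macroscopically apart.
print(max(abs(p - q) for p, q in zip(a, b)))
```

    The same tiny-perturbation experiment on a double pendulum behaves the same way, which is the poster's point: "to scientific precision" is never precise enough.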
  • by helioquake ( 841463 ) * on Wednesday July 12, 2006 @01:19PM (#15706466) Journal
    Sometimes in astronomy, the handling of errors (both random and systematic) is sloppily done. The random errors are probably handled OK; but how about the systematic ones?

    In an attempt to publish hastily, scientists often willfully ignore some shortcomings in instrumental calibration, etc., and may not take into account all the uncertainties that should be propagated through their calculations. I hope that those astronomers are not embarrassing themselves by making an error like that.
  • by gilroy ( 155262 ) on Wednesday July 12, 2006 @01:21PM (#15706479) Homepage Journal
    Blockquoth the poster:

    Is it also possible that the quasars we are observing are differing light years away and thus we are making observations based on data from several billion years ago (as the article states)?

    Oh, it's worse than that. The quasars are different distances away. How do we figure out how far away they are? By measuring the redshift in the frequencies of their spectra. What do we use for that? The relativistic Doppler formula. What is the key constant in the Doppler formula? The speed of light. Actually, it's even worse, because it's not the naive Doppler formula but one that includes cosmological effects which are not independently observable.

    In other words, the distance of the quasars -- and the frequency their light "should" be -- are highly model-dependent.

    There's less to this story than meets the eye.
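    A sketch of the circularity being described: the relativistic Doppler relation converts an observed redshift into a recession velocity only through an assumed value of c, so any distance estimate built on that velocity inherits the assumption. (Python; this is the textbook special-relativistic formula, ignoring the cosmological corrections the poster mentions.)

```python
C = 299_792_458.0  # assumed speed of light, m/s

def redshift_to_velocity(z, c=C):
    """Invert the relativistic Doppler relation 1+z = sqrt((1+b)/(1-b)),
    where b = v/c, to recover the recession velocity v."""
    r = (1.0 + z) ** 2
    beta = (r - 1.0) / (r + 1.0)
    return beta * c

# A quasar at z = 2: beta = ((1+z)^2 - 1)/((1+z)^2 + 1) = 8/10 = 0.8,
# so the inferred recession velocity is 0.8 times whatever c we assumed.
v = redshift_to_velocity(2.0)
print(v / C)
```

    If c were different at the source, or varied en route, the inferred beta (and every distance hung on it) changes with it, which is the model dependence being pointed out.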
  • General Relativity (Score:3, Interesting)

    by duplicate-nickname ( 87112 ) on Wednesday July 12, 2006 @01:21PM (#15706483) Homepage
    Isn't general relativity incorrect for subatomic particles anyway? It's been like 10 years since my last quantum physics class.
  • by Weaselmancer ( 533834 ) on Wednesday July 12, 2006 @01:24PM (#15706500)

    From the blurb:

    Time-varying constants of nature violate Einstein's equivalence principle, which says that any experiment testing nuclear or electromagnetic forces should give the same result no matter where or when it is performed.

    Maybe there is a hidden assumption in there. Maybe space itself isn't constant.

    We're already thinking that space may have an energy to it. [] If it has energy, then space would have an equivalent mass. Possibly you could describe that as a density of sorts.

    So if space itself has a sort of density, then maybe the slight differences you see in the constants are caused by the varying density of different regions of space they are traveling through to be measured.

    IANAP, YMMV, etc. But I think it might be at least possible. Einstein's principle above would have to be edited to say "in equivalent spaces".

    That always seems to be the way of scientific progress. You create a set of equations describing what you see, like Newton did. Then someone can see a little farther, and amend them like Einstein did. Another amendment wouldn't be "questioning the laws of nature", it would just simply be understanding them a little better.

  • Grain of salt time (Score:2, Interesting)

    by Anonymous Coward on Wednesday July 12, 2006 @01:26PM (#15706513)
    It's worth noting that none of the results described in TFA have actually been confirmed, that they are in fact recent and highly contested, and that many such claims in the past were subsequently retracted or refuted. There is a minor bandwagon on "variable constants", actually; everybody and their brother is measuring physical constants, and pointing at any minor statistical fluctuation way out at the edges of detectability as "evidence of variation".

    The implications would be very interesting if any of these claims panned out (which is why it's so popular to make claims like this in the literature), and there are theories in which some of these "constants" are indeed allowed to vary, but we'll need to wait years to see if followup experiments determine that any of these effects are real. Personally, I'm skeptical that any of the specific constants discussed have been proven variable by any of the experiments mentioned in the TFA. I'm not saying the experimentalists are incompetent, but the reported effects are so hard to measure that the effect may just go away after a few more independent checks; this has happened a lot in the literature.
  • Rupert Sheldrake ... (Score:1, Interesting)

    by Anonymous Coward on Wednesday July 12, 2006 @01:29PM (#15706542)
    ... thinks most of the so-called "laws of nature" are more like habits. Here's his essay on The Variability of Fundamental Constants [].
  • by swschrad ( 312009 ) on Wednesday July 12, 2006 @01:34PM (#15706575) Homepage Journal
    the closer you get to measuring a small event, the more the attempt to measure it gets in the way.

    also called the "uncertainty principle."

    there is a good chance that all these differing microerrors in all sorts of differing directions are different diffractions through interference in what we can observe, thus proving the heisenberg principle has raised its ugly head again.

    aka don't sweat it until you get a couple thousand indicators in the same direction. just like this week's surprise medical discovery that pesticides cure cancer, or coffee cures cancer, or coffee cures pesticides, or whatever bogus wrong-way publication made it into print on one limited study. the last line of those articles always reads, "The findings suggest that further studies in the field should be undertaken," which is code for "The previous article was written to get more grant money, send to PO Box 666, Unterderlinden, NJ."
  • by jeblucas ( 560748 ) <jeblucas&gmail,com> on Wednesday July 12, 2006 @01:36PM (#15706588) Homepage Journal
    I've been stewing about this for a long time, I've called into NPR talk shows about it, etc. I feel like the Standard Model [] is irrevocably broken. There's a generation of physicists that really loves the hell out of this thing, but it's got so many problems. I was tangentially involved with "proton sigma-r" cross-section experiments [] at the University of Redlands that violated the Standard Model. A lot of the SM's important values are empirical and "bolted on". A number of its predictions are not yet found (Higgs boson, anyone? Bueller?)

    Yes, it predicted a number of cool particles, and sure enough, there they are. It also craps out more and more lately. Neutrinos oscillate, huh? Uh, well, we'll fix that later. Gravity... yeah. That's a bitch. I know! More free variables! We're at 19 now, what's 10 more?

    This whole thing smacks of turn-of-the-20th-century Newtonians trying to cobble together a decent explanation for black-body radiators []. They tried all kinds of tricks--turns out they didn't work, because the system is not Newtonian. Newtonian physics was awesome for predicting meso-scale behavior, but it's a dog at small and large scales. Similarly, I think, the Standard Model was super-dynamite for a good number of years, but to hang on to it through all these issues should be a red flag that something else might be a better explanation. Kuhn, here we come. []

  • by Jhan ( 542783 ) on Wednesday July 12, 2006 @01:39PM (#15706617) Homepage
    This is a good thing. One of two things will happen from this :
    1. The scientists are right and Einstein wasn't 100% correct.
    2. The scientists are wrong and let dust onto the damn sensors again
    I'd say 1. It's not just the "variable constants", it's the way the galaxy rotates, it's the anisotropy measurements of the cosmic background, etc. You know, all the evidence piling up over the last few decades that led cosmologists to pull first dark matter, then dark energy out of their hats.

    Apparently 96% of our entire universe is now believed to be made up of these two substances, neither of which has been explained. I suggest that one of the following options is true:

    1. With many "patches" the existing theories can be contorted enough to explain the new data (see also epicycles, phlogiston)
    2. A new theory will explain these anomalies in a simple and obvious way.

    My bet is 2, and string theory is not it... Interesting times ahead, mark my words.

  • by jfengel ( 409917 ) on Wednesday July 12, 2006 @01:41PM (#15706640) Homepage Journal
    Yeah, I noticed the same thing. In one sense it's kind of irritating to have the headline perpetuate the myth that scientists hold a non-rational belief equivalent to a religious belief, and that these scientists are some kind of heretics. We know what they meant, but still...

    A more precise headline is somewhat harder to write: "Scientists find evidence that they may have to refine or even refactor some really, really well-demonstrated theories" isn't nearly as punchy.

    (Scientists do, in fact, have non-rational fundamentally held beliefs, but they're nothing so simple as "Einstein was right, Darwin was right". Trying to convince somebody that a scientist's real religious belief is "The universe has some sort of fundamental, objective, and probably comparatively simple law, one that we can understand or at least approximate with successively more accurate models, one that can be modeled mathematically and is true over all space and time, one that makes predictions that can be tested and will stand up to all such tests all the time" is rather more complicated and less fun. And yes, I recognize that my approximation of that belief above is both more complicated and less accurate than some other formulations, but I'm already drifting dangerously off-topic.)
  • sample too small (Score:1, Interesting)

    by CJSlim2001 ( 988471 ) on Wednesday July 12, 2006 @01:42PM (#15706644) Homepage
    One of my hypotheses from high school was that all of the "laws" we've found to be true for our planet may not hold true when applied across the universe. The problem is that we're observing too small of a sample size. Our planet is a mere speck when compared to the total of all masses in existence.

    Chances are, the laws we now know are correct... but only when applied to our planet. The displacement caused by the earth is what gives us gravity. Should the displacement of the Earth be altered by either adding or subtracting large amounts of high-density molecules, then the gravity would also shift. The laws of science will only hold true when the variables being measured are the same. I.e., the speed that light travels given our displacement will yield different results than the speed light travels given a different displacement (namely, a quasar).

    Is the sky blue?



    (source []) [quote] The blue color of the sky is due to Rayleigh scattering. As light moves through the atmosphere, most of the longer wavelengths pass straight through. Little of the red, orange and yellow light is affected by the air. However, much of the shorter wavelength light is absorbed by the gas molecules. The absorbed blue light is then radiated in different directions. It gets scattered all around the sky. Whichever direction you look, some of this scattered blue light reaches you. Since you see the blue light from everywhere overhead, the sky looks blue.[/quote]

    Yet if we were to observe the same sky from outer space, the same principle does not apply. Now the sky is blue because you are looking down on many large bodies of water.

    Perception is 9/10 of reality.
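    The wavelength dependence behind the quoted Rayleigh-scattering explanation can be made concrete: scattered intensity scales as 1/lambda^4, so shorter wavelengths scatter far more strongly. A quick check (Python; the 450 nm blue and 650 nm red wavelengths are typical textbook values, not figures from the quoted source):

```python
def rayleigh_ratio(lambda_a_nm, lambda_b_nm):
    """Relative Rayleigh scattering strength of wavelength a vs. b
    (scattered intensity scales as 1/lambda**4)."""
    return (lambda_b_nm / lambda_a_nm) ** 4

# Blue (~450 nm) vs red (~650 nm): blue scatters roughly 4.4x more
# strongly, which is why the scattered skylight we see is blue.
print(rayleigh_ratio(450.0, 650.0))
```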
  • Re:Err.... (Score:3, Interesting)

    by Billosaur ( 927319 ) * <wgrother@oEINSTE ... minus physicist> on Wednesday July 12, 2006 @01:55PM (#15706761) Journal
    Look out at the stars. You're seeing them as they appeared several million or billion years ago. The light that you now see from the sun is 8 minutes old, for comparison. All the data we collect from outer space is historical information--how the universe was in the past.

    However, if physical constants such as the speed of light are variable, based on the expansion of the universe and the distance from the initial point of expansion, then the light from those quasars has perhaps sped up or slowed down since being released. While we may be looking into the past, a variable speed of light would mean we don't know how far into the past. This brings up the question of relativity, since not only would an observer at point A see something different from what another would see at point B, but now neither observer could be sure whether what the other is seeing is invariant compared to what they have seen. Both might use the same formula to calculate mass increase as a function of velocity, but inherent to that equation is "c," and if both observers have different local values for "c," then their answers will not be the same and they will not be seeing exactly the same thing. It makes for interesting nightmares.
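    The disagreement described above can be sketched numerically: the relativistic factor gamma = 1/sqrt(1 - v^2/c^2) depends on the local value of c, so two observers applying the identical formula with different local c get different mass increases for the same motion. (Python; the 1% variation in c is purely illustrative, far larger than anything claimed in the article.)

```python
import math

def gamma(v, c):
    """Lorentz factor 1/sqrt(1 - v^2/c^2) for speed v, given a local c."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

C = 299_792_458.0   # observer A's local speed of light, m/s
C_PRIME = 1.01 * C  # hypothetical 1% larger local value for observer B

v = 0.9 * C  # the same object, moving at 0.9c by A's measure

print(gamma(v, C))        # A's mass-increase factor (about 2.294)
print(gamma(v, C_PRIME))  # B's smaller factor for the identical motion
```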

  • by ScentCone ( 795499 ) on Wednesday July 12, 2006 @02:20PM (#15706989)
    With this planet's increasing inhospitability

    I always find this perspective to be sort of a head-scratcher. What time-frame are you using? Is it less hospitable than, say, during the ice age? Or, while the plague was slaughtering half the population of Europe? Or while the Soviets and their puppets were within inches of launching nukes from Cuba? Or, while we were paying more (in real dollars) for oil a couple decades back... or suffering horrible inflation and much higher unemployment in the 1970s? Or while millions were dying in the great world wars? Or while slavery was a key part of the colonial economy?

    Personally I like antibiotics, refrigeration, satellite communication, computer networks with millions of nodes including something smaller than a bar of soap that lets me write and send things like this while sitting in the woods listening to birds chirp. We've never had a higher standard of living, longer life expectancy, or more ways to communicate with one another. That we're having cultural friction with some groups that don't want things to play out quite that way, and have to sort out amongst ourselves the best way to deal with that (while not getting blown up on a train, etc.), is unfortunate... but still nothing compared to the growing pains of the past.

    That being said, I also want to zoom around the universe. A lot.
  • by invader_allan ( 583758 ) on Wednesday July 12, 2006 @02:23PM (#15707015)
    This is an old problem with science, put forth by David Hume. In order for science to work, the future must be like the past, and the past must be like present observations. Any "constants" found by observing a finite part of the universe and applying them to the whole may be problematic, yet we are willing to jump into the metaphysics of "and yet it MUST be so!" from our observations and ingenious models that seem to work so very well. Now, it does work very, very well, because you can build a remarkably functional rocket based on our laws of science, so on a pragmatic level science is an exceptionally solid epistemology. But the metaphysics are the problem, if you care to take metaphysics into the equation. The engineers designing a functional rocket don't. And I consider myself a pragmatist, so let them build a better mousetrap even if they mistakenly call them "laws". }8^)>
  • Re:Err.... (Score:3, Interesting)

    by mattkinabrewmindspri ( 538862 ) on Wednesday July 12, 2006 @02:27PM (#15707052)
    Yes, but what the article is saying is that if things like the speed of light aren't constants, then the light from those stars may have been traveling here at differing speeds.

    All of a sudden our yardstick is broken, because if the speed of light isn't really constant, then two stars which seem to be the same distance away might actually be at two very different distances from us.

    If light from a closer star came at a slower speed compared to light from a farther star, then they may seem to be the same distance away from the earth.

    Or if the speed of light changes over time, then light from one star may have traveled quite a distance longer than we thought to get here while light from another, newer star may have traveled less distance at a slower speed. The light from the two stars may lead us to believe that the two stars were similar distances away, when one was drastically older and drastically farther away.
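    A toy calculation of that ambiguity (Python; the distances and speeds are invented round numbers): if light-travel time is effectively all we can infer, a nearer star whose light moved slower is indistinguishable from a farther star whose light moved at today's c.

```python
def travel_time(distance_ly, speed_in_c):
    """Light-travel time in years for light covering distance_ly
    light-years at speed_in_c times today's speed of light."""
    return distance_ly / speed_in_c

# Two hypothetical stars producing the same observed travel time:
near = travel_time(50.0, 0.5)   # 50 ly away, light at half of today's c
far = travel_time(100.0, 1.0)   # 100 ly away, light at today's c

# Both come out to 100 years: the "yardstick" cannot tell them apart.
print(near, far)
```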
  • by LordVorp ( 988488 ) on Wednesday July 12, 2006 @02:54PM (#15707295)
    Oh, but it's even worse than THAT... recent observations that the vacuum is *not* purely empty, but apparently seething with energy [], give rise to a modern, quantum mechanical confirmation of the 19th century concept of that sacrilegious word: the (a)ether. But, modelled as a matrix of quantum particles [] (muons, in this case), it is possibly palatable to modern science. How can this be relevant, you ask? When one models physics BASED on this matrix of quanta, all kinds of things that are currently mysteries become clear. Like, for example, the observation that redshift is quantized []. That, along with other observations [], belies the assumption that Doppler redshift of star spectra is *ONLY* due to distance and speed. Which means that all astronomical distances recorded and marked based on redshift alone, vs. parallax measurements, fall under new scrutiny. And which allows for areas of the universe (like the high-energy surrounds of quasars) that have a higher energy density than our local galactic neighborhood. And these higher-energy domains have "ether" concentrations that will affect what? You guessed it: the speed of light, the fine structure constant, the cosmological constant, and the value of G, the gravitational constant.
  • by Anonymous Coward on Wednesday July 12, 2006 @03:37PM (#15707661)
    Actually, it's his Theory of General Relativity. Even Einstein said it was imperfect and incorrect for all things. He was looking for a Theory of Specific Relativity, but that eluded him as it has all others since then. String theory is the latest, greatest attempt at specific relativity, but it doesn't hold water every time either, nor does quantum mechanics, although both are pretty close.
  • by Artifakt ( 700173 ) on Wednesday July 12, 2006 @04:24PM (#15708063)
    There are some other things that can be used to guesstimate quasar distances -- for just one, gravitational lensing effects accumulate if there are more galaxies between us and the observed quasar, and so the quasars with the most complex total lensing are likely to also be exceptionally far away. (The comparison would be a statistical-average methodology for a large sample of quasars, rather than serving to predict distances for any individual quasar.) There are probably enough observations already on record to compare total lensing complexity against the Doppler-formula predictions with pre-existing data, and it shouldn't be too calculation-intensive. (Just imagine a Beowulf cluster of old cheap boxes, six months of actual processing, and a grad student looking for a good doctoral thesis.) I wouldn't be at all surprised if this has already been done.
              Offhand, there are probably also different ratios for the really high-energy cosmic rays emitted (particularly cosmic rays over the theoretical maximum predicted for an extragalactic source). These have been observed coming from extragalactic sources such as quasars. The theoretical maximum is known as the Greisen-Zatsepin-Kuzmin limit, derived from the interaction of high-energy cosmic rays with the cosmic microwave background. It's a puzzle for cosmologists that the GZK limit doesn't match real-world observations, but I don't know if anyone has actually matched sources with other distance-prediction methods on a large scale. Cosmics over the GZK limit are rare, but not ultra-rare, events, and it may take a decade or so to amass enough data to be able to draw significant conclusions, but more data gathering here would probably give us some distance checks on the relativistic Doppler method faster than it will explain the failure of GZK itself.
  • by tgrigsby ( 164308 ) on Wednesday July 12, 2006 @06:44PM (#15709073) Homepage Journal
    If this principle is broken, then two objects dropped in a gravitational field should fall at slightly different rates.

    Only if the physical constants are different for the two objects. If, within the context in which they fall, the constants are the same, the objects will drop at the same rate. The experiments show that these constants vary over extreme amounts of time, with no proof as of yet that they vary over distance.

  • by sickofthisshit ( 881043 ) on Wednesday July 12, 2006 @08:12PM (#15709491) Journal
    You seriously underrate Einstein if you believe he was limited to a quasi-Newtonian world view. He wrote something like 3 out of the first 10 papers on quantum mechanics, becoming the first to (quite boldly) apply it beyond black-body radiation (everyone knew something novel was needed to explain black-body radiation, but not for Einstein's choices of the photo-electric effect, optical coefficients, and specific heat of solids) and was quite possibly the first (or second, after Poincare) to realize that something fundamentally non-classical was going on in Planck's calculation. (Read Kuhn's book on Planck and the "Quantum Discontinuity.") He also wrote a paper on the chaotic motion of the helium atom defeating a semi-classical approach which was something like *50* years ahead of its time.

    I believe essentially the opposite: that Einstein was greatly influenced by statistical mechanics; he knew that atomic spectra were always measured using quite large numbers of atoms/molecules to create the line spectra, and that attributing the emission of spectra to isolated atoms was logically unjustified.

    Nowadays, we can *do* experiments on isolated atoms and verify that, individually, they do obey quantum principles. In the 1920s when Bohr was handwaving through concepts like complementarity, there was no concrete basis for such a belief.
  • I for one (Score:3, Interesting)

    by suitepotato ( 863945 ) on Wednesday July 12, 2006 @08:26PM (#15709554)
    welcome our new (in)constant overlords, or would if quantum mechanics allowed me to state what they were and when and where at the same time.

    What I took away from the field of physics so far was that constant values are bunk and largely a matter of fudging. The important constants are actually the formulaic, and thus geometric, relationships between the variables, such as E=mc^2. If c is variable, say it becomes nc for some factor n, then keeping E fixed requires the mass to scale down by n^2, since

    E = (m/(n^2))((nc)^2) = (m/(n^2))((n^2)(c^2)) = mc^2

    So for energy to remain the same without violations, as the local speed of light increases, mass must decrease.

    I don't believe and never have that the individual value constants are constant but subject to the spacetime fabric and its conditions.
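    The rescaling described above can be checked with a trivial numeric sketch (Python; the mass, c, and scale factor are arbitrary illustrative numbers, not physical values):

```python
def energy(mass, c):
    """Rest energy E = m * c^2 (arbitrary units for this sketch)."""
    return mass * c * c

m, c, n = 2.0, 3.0, 5.0  # illustrative numbers only

baseline = energy(m, c)
rescaled = energy(m / n**2, n * c)  # c grows by n, m shrinks by n^2

# The two energies agree (up to floating-point rounding): the product
# m*c^2 is invariant under this compensating rescaling.
print(baseline, rescaled)
```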
