Scientists Question Laws of Nature

mknewman writes "MSNBC is reporting that scientists are finding differences in many of the current scientific 'constants', including the speed of light, alpha (the fine-structure constant of the electromagnetic force), the ratio of proton to electron mass, and several others. These findings were made by observing quasars and comparing the results to tests here on Earth." From the article: "Time-varying constants of nature violate Einstein's equivalence principle, which says that any experiment testing nuclear or electromagnetic forces should give the same result no matter where or when it is performed. If this principle is broken, then two objects dropped in a gravitational field should fall at slightly different rates. Moreover, Einstein's gravitational theory -- general relativity -- would no longer be completely correct, Martins says."
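For reference, the "alpha" mentioned above is the fine-structure constant, a dimensionless combination of more familiar constants (the standard definition, quoted here for context rather than taken from the article):

    \alpha = \frac{e^2}{4\pi\varepsilon_0 \hbar c} \approx \frac{1}{137.036}

Because alpha is a pure number, a measured change in it cannot be explained away as a change of units, which is why it is the favourite target of these quasar comparisons.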
  • For example, Ohm's Law is much more interesting at sub-microscopic levels [gsu.edu]
  • "There is absolutely no reason these constants should be constant," says astronomer Michael Murphy of the University of Cambridge. "These are famous numbers in physics, but we have no real reason for why they are what they are."
    Well, I'm a computer scientist, not a physicist, but I thought these values are treated as constants because all observations so far have verified that. We aren't able to make observations from several million or billion years ago so we cannot tell whether or not these constants change or at what rate. Our instruments are not precise enough to do that, nor have they been around long enough.

    I recall reading that as a universe expands or contracts, the constants would theoretically change to adjust to the expansion or contraction of the basic building blocks of matter.

    Not all quasar data is consistent with variations. In 2004, a group of astronomers -- including Patrick Petitjean of the Astrophysical Institute of Paris -- found no change in the fine structure constant using quasar spectra from the Very Large Telescope in Chile. No one has yet explained the discrepancy with the Keck telescope results. "These measurements are so difficult and at the extreme end of what can be achieved by the telescopes that it is very difficult to answer this question," Petitjean says.
    Is it possible that the measuring instruments failed here? I thought that was always a possibility in observations. Is it also possible that the quasars we are observing are at differing distances in light-years, and thus we are making observations based on data from several billion years ago (as the article states)?

    "We have an incomplete theory, so you look for holes that will point to a new theory," Murphy says. Varying constants may be just such a hole.
    Yes, I think that there is call for speculation on the constants varying over billions of years since the light we are observing is roughly 12 billion years old and all our observations here on earth remain static.
    • by gilroy ( 155262 ) on Wednesday July 12, 2006 @01:21PM (#15706479) Homepage Journal
      Blockquoth the poster:

      Is it also possible that the quasars we are observing are at differing distances in light-years, and thus we are making observations based on data from several billion years ago (as the article states)?

      Oh, it's worse than that. The quasars are different distances away. How do we figure out how far away they are? By measuring the redshift in the frequencies of their spectra. What do we use for that? The relativistic Doppler formula. What is the key constant in the Doppler formula? The speed of light. Actually, it's even worse, because it's not the naive Doppler formula but one that includes cosmological effects which are not independently observable.

      In other words, the distance of the quasars -- and the frequency their light "should" be -- are highly model-dependent.

      There's less to this story than meets the eye.
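      For concreteness, the naive special-relativistic Doppler relation alluded to above has the textbook form (with \beta = v/c, so the speed of light enters explicitly)

          1 + z = \sqrt{\frac{1+\beta}{1-\beta}}

      while at cosmological distances the redshift is instead tied to the expansion factor, 1 + z = a(t_{\mathrm{obs}})/a(t_{\mathrm{emit}}), which is exactly where the model dependence creeps in.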
      • by wanerious ( 712877 ) on Wednesday July 12, 2006 @01:53PM (#15706746) Homepage
        How do we figure out how far away they are? By measuring the redshift in the frequencies of their spectra. What do we use for that? The relativistic Doppler formula.

        Only at pretty low redshift, though. At any redshift appreciably close to or greater than 1, there really isn't much meaning to "distance" --- would you interpret that distance to be at the time of emission, the time of detection, or somewhere in between? We basically just use the cosmological redshift, which says that the redshift z represents how much the universe has expanded since the radiation was emitted. That's it. Any "distance" or lookback time is model-dependent. Instead of measuring slight deviations in universal constants, they are perhaps measuring perturbations in a particular cosmological model.

        In other words, the distance of the quasars -- and the frequency their light "should" be -- are highly model-dependent.

        Right --- I'm just picking nits, since I've seen lots of confusion by others in similar reports.
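        To make the model dependence concrete, here is a minimal numerical sketch of the lookback time in a flat Lambda-CDM model (the H0 and Omega_m values below are illustrative choices, not numbers from the article):

            import math

            def lookback_time_gyr(z, H0=70.0, omega_m=0.3):
                """Lookback time in Gyr for flat Lambda-CDM (Omega_Lambda = 1 - Omega_m)."""
                omega_l = 1.0 - omega_m
                h0_per_gyr = H0 * 1.022e-3   # km/s/Mpc converted to 1/Gyr
                n = 10000                    # trapezoid rule over redshift
                total = 0.0
                for i in range(n):
                    zl, zr = z * i / n, z * (i + 1) / n
                    fl = 1.0 / ((1 + zl) * math.sqrt(omega_m * (1 + zl) ** 3 + omega_l))
                    fr = 1.0 / ((1 + zr) * math.sqrt(omega_m * (1 + zr) ** 3 + omega_l))
                    total += 0.5 * (fl + fr) * (zr - zl)
                return total / h0_per_gyr

            # the same quasar redshift yields different lookback times under different models
            for om in (0.25, 0.30, 0.35):
                print(om, round(lookback_time_gyr(3.0, omega_m=om), 2))

        Nudging Omega_m by a few hundredths moves the inferred lookback time for the same z = 3 quasar by hundreds of millions of years, which is the sense in which any quoted "age" of the light is model-dependent.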

      • by jma34 ( 591871 ) on Wednesday July 12, 2006 @02:17PM (#15706957)
        ...highly model-dependent.


        This is really the crux of a measurement. How many assumptions from the model are used to make the measurement? In an ideal experiment, the measurement itself is what verifies or falsifies the model, but in reality there are usually other parameters that are needed as inputs to the experiment that are computed using the model, thus the model dependence. I'm in experimental high energy particle physics and we worry about this every day, and try to reduce the number of theoretical inputs needed to make sense of our data. I'm sure the astronomers do likewise, but sometimes inputs are unavoidable. This doesn't make the measurement invalid because a model should be self-consistent as well. So if you correctly compute the inputs using the model, and your results still differ from the model, then some double-checking of everything needs to be done because the model is showing a flaw. The true size of the flaw is the really hard thing to quantify because all of the quantities are model-dependent. In the end this could turn out to be nothing or the start of something.

        I welcome all chinks in scientific theories because it generally leads to new scientific understanding and a new round of theories and models. Really that's what science is all about. In my field, we all hope that the LHC finds the Higgs, that will solidify the Standard Model, but we also hope that it finds lots of things that don't fit the Standard Model, that would point the direction for future discovery. If we didn't find anything unusual at the LHC it might put a huge damper on particle physics, and I'd have to switch areas of research.
      • Oh, but it's even worse than THAT... recent observations that the vacuum is *not* purely empty, but apparently seething with energy [slashdot.org], give rise to a modern, quantum mechanical confirmation of the 19th century concept of that sacrilegious word: the (a)ether. But, modelled as a matrix of quantum particles [aspden.org] (muons, in this case), it is possibly palatable to modern science. How can this be relevant, you ask? When one models physics BASED on this matrix of quanta, all kinds of things that are currently mysterie
      • The article doesn't go into detail but I suspect the changes they're observing are a bit more subtle than the redshift not being exactly what they thought it might be. Note also that they're not talking about the speed of light or the strength of the electromagnetic force, but rather the fine structure constant, which is a unitless RATIO of two constants.

        I expect what they're observing is not all of the spectral lines being in the wrong place (as you'd get with different redshifts) but rather SOME of them
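        The way this is usually quantified (in the many-multiplet analyses these groups use, as I understand it) is that each atomic transition carries its own sensitivity coefficient q, so a drifting alpha shifts different lines by different amounts on top of the common redshift:

            \omega_z = \omega_0 + q\left[\left(\frac{\alpha_z}{\alpha_0}\right)^2 - 1\right]

        A wrong assumed redshift moves every line together; a nonzero change in alpha produces a characteristic pattern of relative shifts, which is the "SOME of them" the parent is getting at.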
      • There are some other things that can be used to guesstimate quasar distances - for just one, gravitational lensing effects accumulate if there are more galaxies between us and the observed quasar, and so the quasars with the most complex total lensing are likely to also be exceptionally far. (The comparison would be a statistical average methodology for a large sample of quasars, rather than serving to predict distances for any individual quasar). There are probably enough observations already on record to
    • Err.... (Score:3, Insightful)

      by brian0918 ( 638904 )
      "We aren't able to make observations from several million or billion years ago so we cannot tell whether or not these constants change or at what rate."

      Look out at the stars. You're seeing them as they appeared several million or billion years ago. The light that you now see from the sun is 8 minutes old, for comparison. All the data we collect from outer space is historical information--how the universe was in the past.
      • I think they mean we can't look several billion years further into the past to identify if the rate was constant and changed once, decaying, changing linearly etc. Say the quasar's light was now 10 billion years old, they mean we would like to see light from 11 billion years ago to see what the observation would have been 11 billion years ago, etc.
      • Re:Err.... (Score:3, Interesting)

        by Billosaur ( 927319 ) *

        Look out at the stars. You're seeing them as they appeared several million or billion years ago. The light that you now see from the sun is 8 minutes old, for comparison. All the data we collect from outer space is historical information--how the universe was in the past.

        However, if physical constants such as the speed of light are variable, based on the expansion of the universe and the distance from the initial point of expansion, then the light from those quasars has perhaps sped up or slowed down sin


        • While we may be looking into the past, a variable speed of light would mean we don't know how far into the past.


          While what you're saying is technically true, the errors introduced by the variance of a few parts in a million of the speed of light are WAY smaller than the uncertainty of the distance that a quasar is from us (a few parts in a hundred I'd guess). In other words we already don't know exactly how far into the past we're looking to a MUCH larger degree than this potential variability of c.
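          Putting rough, purely illustrative numbers on that comparison:

              \frac{\text{a few parts per million}}{\text{a few parts per hundred}} \sim \frac{5\times10^{-6}}{5\times10^{-2}} = 10^{-4}

          so the distance uncertainty dominates any putative variation in c by roughly four orders of magnitude.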
      • Yes, but what the article is saying is that if things like the speed of light aren't constants, then the light from those stars may have been traveling here at differing speeds.

        All of the sudden our yardstick is broken, because if the speed of light isn't really constant, then two stars which seem to be the same distance away might actually be two very different distances away from us.

        If light from a closer star came at a slower speed compared to light from a far star, then they may seem to be the same dist
    • > Yes, I think that there is call for speculation on the constants varying over billions of years ...

      Yet more evidence that the universe is just a gigantic computer simulation.

      Old programmer's adage: Variables won't. Constants aren't.

  • by MECC ( 8478 ) * on Wednesday July 12, 2006 @01:08PM (#15706387)
    FTA: the quasar observations are sometimes interpreted as indicating that light was faster in the past,

    They just don't make photons like they used to...
  • by supersnail ( 106701 ) on Wednesday July 12, 2006 @01:08PM (#15706390)
    filthy law breaking unearthly quasars should be hunted down and expelled from the galaxy.

  • by OctoberSky ( 888619 ) on Wednesday July 12, 2006 @01:10PM (#15706402)
    For those wondering who "scientists" are, it's the Dharma crew.

    I would recommend not flying/sailing for the next few months.
  • honestly... (Score:5, Funny)

    by Digitus1337 ( 671442 ) <lk_digitus@h[ ]ail.com ['otm' in gap]> on Wednesday July 12, 2006 @01:12PM (#15706415) Homepage
    It doesn't take an Einstein to... aww crap.
  • This is a good thing (Score:5, Interesting)

    by growse ( 928427 ) on Wednesday July 12, 2006 @01:12PM (#15706419) Homepage

    This is a good thing. One of two things will happen from this:
    1. The scientists are right and Einstein wasn't 100% correct.
    2. The scientists are wrong and let dust onto the damn sensors again

    If option (1) is true, it means we're entering that sort of post-Einsteinian "What the hell's going on here" phase in science, where we have a theory that we thought was good and we have some measurements which we also know are good but which conflict with the theory. This will lead to lots more experiments being done and allow us to invent hyperspace faster.

    If option (2) is true, it means that the scientists in question will be metaphorically shot by the scientific community for daring to question the great relativity laws, thereby removing bad scientists from the community.

    It's a win-win!
    • If option (1) is true, it means we're entering that sort of post-Einsteinian "What the hell's going on here" phase in science

      Been there, done that, got the quantum mechanics.
      • Quantum physics was evolving as Einstein was doing his work, but it left Einstein feeling uneasy. Given that Einstein grew up learning a fairly Newtonian view of the world, it's understandable that he was hesitant to leave all of it behind even as he was redefining much of it. Although perhaps he didn't view it as redefining, but rather (consciously or unconsciously) refining, whereas quantum mechanics really are a redefinition of the laws of physics.
        • Let's also not forget that Einstein was one of the founders of quantum mechanics! He won his Nobel Prize for work on the photoelectric effect, which helped prove that light was quantized, not for anything he did with Relativity. Sources: http://en.wikipedia.org/wiki/Photoelectric [wikipedia.org], http://en.wikipedia.org/wiki/Albert_Einstein [wikipedia.org]

        • You seriously underrate Einstein if you believe he was limited to a quasi-Newtonian world view. He wrote something like 3 out of the first 10 papers on quantum mechanics, becoming the first to (quite boldly) apply it beyond black-body radiation (everyone knew something novel was needed to explain black-body radiation, but not for Einstein's choices of the photo-electric effect, optical coefficients, and specific heat of solids) and was quite possibly one of the first (or second, after Poincare) to realize th
    • The scientists are right and Einstein wasn't 100% correct.

      If option (1) is true, it means we're entering that sort of post-Einsteinian "What the hell's going on here" phase in science, where we have a theory that we thought was good and we have some measurements which we also know are good but which conflict with the theory. This will lead to lots more experiments being done and allow us to invent hyperspace faster.

      Totally. Get me off this crazy planet! Seriously. I've been paying attention to various thi
      • With this planet's increasing inhospitability, I'd like to at least check out Mars in my lifetime.

        Yes, because Mars is sooo much more hospitable than the earth.
      • by ScentCone ( 795499 ) on Wednesday July 12, 2006 @02:20PM (#15706989)
        With this planet's increasing inhospitability

        I always find this perspective to be sort of a head-scratcher. What time-frame are you using? Is it less hospitable than, say, during the ice age? Or, while the plague was slaughtering half the population of Europe? Or while the Soviets and their puppets were within inches of launching nukes from Cuba? Or, while we were paying more (in real dollars) for oil a couple decades back... or suffering horrible inflation and much higher unemployment in the 1970s? Or while millions were dying in the great world wars? Or while slavery was a key part of the colonial economy?

        Personally I like antibiotics, refrigeration, satellite communication, computer networks with millions of nodes including something smaller than a bar of soap that lets me write and send things like this while sitting in the woods listening to birds chirp. We've never had a higher standard of living, longer life expectancy, or more ways to communicate with one another. That we're having cultural friction with some groups that don't want things to play out quite that way, and have to sort out amongst ourselves the best way to deal with that (while not getting blown up on a train, etc), is unfortunate... but still nothing compared to the growing pains of the past.

        That being said, I also want to zoom around the universe. A lot.
    • by Mac Degger ( 576336 ) on Wednesday July 12, 2006 @01:38PM (#15706601) Journal
      Option 1 has always been true. Not since the quantum crisis have scientists been that arrogant to assume that their theories are set in stone; we're constantly refining the models to fit reality better and better. Hell, even if we finally accommodate all the forces into one model, we'll assume that that model will eventually be surpassed by one which is better and more precise. Modern science is based on the fact that we realise we're pretty much never 100% correct.
      • "Not since the quantum crisis have scientists been that arrogant to assume that their theories are set in stone"

        My own observation is that ever since relativity and quantum mechanics, scientists are happy to make up wild-ass theories to explain stuff that doesn't quite fit. Witness Dark matter, Dark energy, "unseen" dimensions, time varying fundamental constants... No sir, I think physics has more arrogance now than before - people are willing to assert strange or non-intuitive explanations for everything

    • by Jhan ( 542783 ) on Wednesday July 12, 2006 @01:39PM (#15706617) Homepage
      This is a good thing. One of two things will happen from this:
      1. The scientists are right and Einstein wasn't 100% correct.
      2. The scientists are wrong and let dust onto the damn sensors again
      I'd say 1. It's not just the "variable constants", it's the way the galaxy rotates, it's the anisotropy measurements of the cosmic background etc. You know, all the evidence piling up over the last few decades that led cosmologists to pull first dark matter, then dark energy out of their hats.

      Apparently 96% of our entire universe is now believed to be made up of these two substances, neither of which has been explained. I suggest that one of the following options is true:

      1. With many "patches" the existing theories can be contorted enough to explain the new data (see also epicycles, phlogiston)
      2. A new theory will explain these anomalies in a simple and obvious way.

      My bet is 2, and string theory is not it... Interesting times ahead, mark my words.

    • by electrosoccertux ( 874415 ) on Wednesday July 12, 2006 @01:40PM (#15706627)
      Even the ones you think lead to a gaping abyss. You never know when there'll be an ore field on the way.

      I'm tired of hearing people tell my friend from Georgia Tech that he can't develop a free energy device. The quantum model is far from perfect. It is entirely possible we could extract the (theoretical, for now) ZPE (the gravity-like force experienced in the Casimir effect) from empty space. Who are these people to condemn him? How many of them went to Georgia Tech? Do they have the schematics and plans for a device for free energy? No. How would they know anything about it? Are they willing to fund him so he can build his? Even though that might prove them right, they're too busy running after their quantum smoke. They're no better than the Catholic Church railing on Galileo.
    • Or, we're saying "Einstein is right, but a quasar pulsing at one time will have a noticeably different set of characteristics from a similar quasar several billion years later." If the constants vary over time, it has no effect on the validity of Einstein's observations (AFAIK). The light they're measuring comes from different eras in the universe's history. However, knowing how the constants vary over time is much more helpful, because then we could make better guesses as to why they would do so.

    • by Thangodin ( 177516 ) <elentar AT sympatico DOT ca> on Wednesday July 12, 2006 @01:43PM (#15706653) Homepage
      If option (2) is true, it means that the scientists in question will be metaphorically shot by the scientific community for daring to question the great relativity laws, thereby removing bad scientists from the community.

      No, they won't be shot. Stephen Hawking has challenged Einstein's theories and been wrong about nearly everything he's ever proposed, and he's still considered a good physicist. It's okay to challenge the dominant theory, just as long as you have good evidence to back it up, and your theory explains something that nothing else does. Bad science is done with poor or no evidence, explains even less than the current theory, and is usually presented to the general public without peer review. When confronted with evidence that proves their theory false, good scientists concede, while bad scientists wail on about scientific orthodoxy and appeal to popular opinion.
  • Chaos Theory (Score:2, Interesting)

    The Lorenz attractor is a mathematical example of how sensitivity to initial conditions can affect the results of any test.
    There is no way that ANY test can be reproduced perfectly multiple times; however, for a large percentage of things tested the differences are so small they are negligible.
    If you take a double pendulum and try (to scientific precision) to orient the beams to the exact same location, the results will be different every single time you do it (fluctuations in the universe's gravitational field ca
    • It'll always be cool to watch
    • Depends on what you're measuring. If you measure the energy of that double pendulum you'll find it's more reproducible than the exact x,y,z positions of the bobs. And of course if you measure averages over many runs you're set; the averages of one set of 1000 runs and the averages of a separate set of 1000 runs will be decently similar, and you can predict how different they should be by looking at the statistics you got from those different runs. Chaos doesn't mean 'give up', it means 'measure the things wh
    • Not all calculations are inherently chaotic. The Lorenz attractor is a great example of a calculation that is, but there are plenty that aren't.
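      A minimal sketch of the sensitivity being described, using the Lorenz system (the parameter values are the classic textbook ones; the tiny initial offset is an arbitrary illustration):

          # Two Lorenz trajectories started 1e-9 apart end up wildly separated.
          def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
              x, y, z = state
              dx = sigma * (y - x)
              dy = x * (rho - z) - y
              dz = x * y - beta * z
              return (x + dx * dt, y + dy * dt, z + dz * dt)

          a = (1.0, 1.0, 20.0)
          b = (1.0 + 1e-9, 1.0, 20.0)  # perturb x by one part in a billion
          for step in range(40001):
              if step % 10000 == 0:
                  sep = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
                  print(f"t = {step * 0.001:5.1f}   separation = {sep:.3e}")
              a, b = lorenz_step(a), lorenz_step(b)

      Exact coordinates diverge to order one; averages over many runs, as noted above, remain far better behaved.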
  • by MrNougat ( 927651 ) <ckratsch@noSPAm.gmail.com> on Wednesday July 12, 2006 @01:16PM (#15706439)
    Scientists Question Laws of Nature

    Isn't "questioning laws of nature" by definition what scientists do? Question, hypothesis, experiment, theory, law, lather, rinse, repeat - right?
    • I believe that mostly, they use the existing knowledge of the laws of nature to understand how those laws shape nature. You can't form a proper hypothesis if you don't know enough to ask the right questions.
    • That's kind of what I was thinking.

      Next they'll be telling us the theory of relativity is just a theory!
    • by jfengel ( 409917 ) on Wednesday July 12, 2006 @01:41PM (#15706640) Homepage Journal
      Yeah, I noticed the same thing. In one sense it's kind of irritating to have the insinuation perpetuate the myth that scientists have a non-rational belief equivalent to a religious belief, and that these scientists are some kind of heretics. We know what they meant, but still...

      A more precise headline is somewhat harder to write: "Scientists find evidence that they may have to refine or even refactor some really, really well-demonstrated theories" isn't nearly as punchy.

      (Scientists do, in fact, have non-rational fundamentally held beliefs, but they're nothing so simple as "Einstein was right, Darwin was right". Trying to convince somebody that a scientist's real religious belief is "The universe has some sort of fundamental, objective, and probably comparatively simple law, one that we can understand or at least produce successively more accurate approximations of, one that can be modeled mathematically and is true over all space and time, one that makes predictions that can be tested and will stand up to all such tests all the time" is rather more complicated and less fun. And yes, I recognize that my approximation of that belief above is both more complicated and less accurate than some other formulations, but I'm already drifting dangerously off-topic.)
      • I agree, but to clarify, even though scientists hold that as a basic belief science is still not a religion. Scientists believe there are principles that govern the universe and that we might be able to figure some of them out. Religions believe they already have a good handle on what those principles are.
  • scientific method (Score:3, Insightful)

    by lazarusdishwasher ( 968525 ) on Wednesday July 12, 2006 @01:18PM (#15706460)
    Doesn't the scientific method say that when the answers don't fit you need to ask why and go through the steps again? I remember learning in my high school chemistry class that PV=nRT and my teacher said that higher levels of chemistry don't use that formula because it is just sort of a rough guide to gases. If my chemistry teacher was right, I would guess that scientists figured out the easy formula once and fine-tuned it as they gained knowledge and better instruments.
    • I remember learning in my high school chemistry class that PV=nRT and my teacher said that higher levels of chemistry don't use that formula because it is just sort of a rough guide to gases.

      It's because the fundamental assumptions for that equation are not true (e.g. the particles do not interact). It is used at higher levels for theory development, but for useful applications a measured constant is often included to make up for the discrepancy between theoretical models and actual observation.
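      For a concrete example of the kind of measured correction being described, the van der Waals form bolts two empirical parameters onto the ideal gas law (a for the attraction between particles, b for their finite volume):

          P V = n R T \quad\longrightarrow\quad \left(P + \frac{a n^2}{V^2}\right)\left(V - n b\right) = n R T

      Setting a and b to zero recovers the ideal form, which mirrors the way a "law" only holds as far as its underlying assumptions do.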
    • that's why it's called the Ideal Gas Law.
    • I don't know much about the evolution of the formulas behind chemistry, but in high school physics when you study simple Newtonian mechanics the teacher will give you lots of algebraic formulas to memorize.

      If you take a college physics course (and you're still doing simple Newtonian mechanics), you'll find that you're using calculus instead, because it works a lot better. You might conclude that physicists used to use algebra and refined it until they got calculus. You'd have it completely backwards. Hig

  • by helioquake ( 841463 ) * on Wednesday July 12, 2006 @01:19PM (#15706466) Journal
    Sometimes in astronomy, the handling of errors (both random and systematic) is sloppily done. The random errors are probably handled OK; but how about the systematic ones?

    In an attempt to publish hastily, scientists often willfully ignore some shortcomings in instrumental calibration, etc., and may not take into account all the uncertainties that should be propagated through their calculations. I hope that those astronomers are not embarrassing themselves by making an error like that.
  • This isn't new (Score:5, Informative)

    by whitehatlurker ( 867714 ) on Wednesday July 12, 2006 @01:19PM (#15706469) Journal
    Apart from the time scale involved, this isn't all that new. Scientific American had an article [sciam.com] on this over a year ago.
  • General Relativity (Score:3, Interesting)

    by duplicate-nickname ( 87112 ) on Wednesday July 12, 2006 @01:21PM (#15706483) Homepage
    Isn't general relativity incorrect for sub atomic particles anyway? ....it's been like 10 years since my last quantum physics class.
  • by KIFulgore ( 972701 ) on Wednesday July 12, 2006 @01:23PM (#15706493)
    Well.... yeah. That's their jeorb.
  • by Weaselmancer ( 533834 ) on Wednesday July 12, 2006 @01:24PM (#15706500)

    From the blurb:

    Time-varying constants of nature violate Einstein's equivalence principle, which says that any experiment testing nuclear or electromagnetic forces should give the same result no matter where or when it is performed.

    Maybe there is a hidden assumption in there. Maybe space itself isn't constant.

    We're already thinking that space may have an energy to it. [slashdot.org] If it has energy, then space would have an equivalent mass. Possibly you could describe that as a density of sorts.

    So if space itself has a sort of density, then maybe the slight differences you see in the constants are caused by the varying density of the different regions of space the light travels through on its way to being measured.

    IANAP, YMMV, etc. But I think it might be at least possible. Einstein's principle above would have to be edited to say "in equivalent spaces".

    That always seems to be the way of scientific progress. You create a set of equations describing what you see, like Newton did. Then someone can see a little farther, and amend them like Einstein did. Another amendment wouldn't be "questioning the laws of nature", it would just simply be understanding them a little better.

  • Grain of salt time (Score:2, Interesting)

    by Anonymous Coward
    It's worth noting that none of the results described in TFA have actually been confirmed, that they are in fact recent and highly contested, and that many such claims in the past were subsequently retracted or refuted. There is a minor bandwagon on "variable constants", actually; everybody and their brother is measuring physical constants, and pointing at any minor statistical fluctuation way out at the edges of detectability as "evidence of variation".

    The implications would be very interesting if any of t
  • by swschrad ( 312009 ) on Wednesday July 12, 2006 @01:34PM (#15706575) Homepage Journal
    the closer you get to measuring a small event, the more the attempt to measure it gets in the way.

    also called the "uncertainty principle."

    there is a good chance that all these differing microerrors in all sorts of differing directions are different diffractions through interference in what we can observe, thus proving the Heisenberg principle has raised its ugly head again.

    aka don't sweat it until you get a couple thousand indicators in the same direction. just like this week's surprise medical discovery that pesticides cure cancer, or coffee cures cancer, or coffee cures pesticides, or whatever bogus wrong-way publication made it into print on one limited study. the last line of those articles always reads, "The findings suggest that further studies in the field should be undertaken," which is code for "The previous article was written to get more grant money, send to PO Box 666, Unterderlinden, NJ."
  • by Darren Hiebert ( 626456 ) on Wednesday July 12, 2006 @01:35PM (#15706578) Homepage

    Einstein's gravitational theory -- general relativity -- would no longer be completely correct, Martins says.

    First of all, let me preface this by saying IAAP (I am a physicist):

    All this talk of laws being "wrong" or no longer "correct" is just popular fluff the press either hypes or makes up.

    No physical law is ever completely correct. A physical law is simply a description of reality to the degree to which we understand it, and is "correct" (i.e. produces predictions which fit our measurements) within the realm of our present experience of the phenomenon it describes. As our understanding and experience of a phenomenon grows to encompass a wider range of circumstances (e.g. scale, velocity), the law needs to be either refined or replaced with new law, possibly based upon a new paradigm.

    Newton's laws of motion are no less "correct" now than they ever were. Einstein determined that the realm in which they accurately described reality did not include large velocities near the speed of light (i.e. >0.1c). Quantum mechanics explained how at small scales these same rules no longer applied. Even today, no one yet knows how to reconcile the theories of relativity and quantum mechanics when their realms overlap--this is still pioneering work.

    Yet Newton's laws are still taught as the foundation of physics to all new students because they are still valid within the realm of experience in which all of our normal lives are conducted. Models, and the laws derived with them, are valid only within the realm of experience within which they were formed (and, if the inventor is lucky, they hold even beyond that). And they remain valid within that realm even when we find later that they don't hold outside that realm. Even Aristotle's belief that heavier objects fall faster than light objects is valid to a point (within a realm where air friction is a significant contributor), even though Galileo later "proved" this was wrong (i.e. it is not a general law).
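    To put a number on where that Newtonian realm ends, here is a minimal arithmetic sketch comparing relativistic and Newtonian kinetic energy around the 0.1c figure quoted above (no data from the article, just the textbook formulas):

        import math

        def ke_ratio(beta):
            """Relativistic over Newtonian kinetic energy at speed v = beta * c."""
            gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
            ke_rel = gamma - 1.0          # (gamma - 1) m c^2, in units of m c^2
            ke_newton = 0.5 * beta ** 2   # (1/2) m v^2, in the same units
            return ke_rel / ke_newton

        for beta in (0.01, 0.1, 0.5):
            print(f"v = {beta:4.2f} c  ->  relativistic/Newtonian KE = {ke_ratio(beta):.4f}")

    Below 0.1c the two differ by well under one percent, which is exactly why Newton's laws remain perfectly serviceable in the realm where our normal lives are conducted.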

    • No physical law is ever completely correct. A physical law is simply a description of reality to the degree to which we understand it, and is "correct" (i.e. produces predictions which fit our measurements) within the realm of our present experience of the phenomenon it describes. As our understanding and experience of a phenomenon grows to encompass a wider range of circumstances (e.g. scale, velocity), the law needs to be either refined or replaced with new law, possibly based upon a new paradigm.

      Wow-

    • I'm sacrificing modding you up for an attaboy. One of my areas of interest is the philosophy of science, especially the epistemology of science (how can we know empirical fact x). I find physicists who are willing to admit that law does not equal fact, and that math does not equal universe, refreshing. It seems many of the physicists I know don't want to question the fundamentals of their discipline (they are so busy doing physics, that they never question what that means). One of my best friends is finish
  • by jeblucas ( 560748 ) <[jeblucas] [at] [gmail.com]> on Wednesday July 12, 2006 @01:36PM (#15706588) Homepage Journal
    I've been stewing about this for a long time; I've called into NPR talk shows about it, etc. I feel like the Standard Model [wikipedia.org] is irrevocably broken. There's a generation of physicists that really loves the hell out of this thing, but it's got so many problems. I was tangentially involved with "proton sigma-r" cross-section experiments [osti.gov] at the University of Redlands that violated the Standard Model. A lot of the SM's important values are empirical and "bolted on". A number of its predictions are not yet found (Higgs boson, anyone? Bueller?)

    Yes, it predicted a number of cool particles, and sure enough, there they are. It also craps out more and more lately. Neutrinos oscillate, huh? Uh, well, we'll fix that later. Gravity... yeah. That's a bitch. I know! More free variables! We're at 19 now, what's 10 more?

    This whole thing smacks of turn-of-the-20th-century Newtonians trying to cobble together a decent explanation for black-body radiators [egglescliffe.org.uk]. They tried all kinds of tricks--turns out they didn't work, because the system is not Newtonian. Newtonian physics was awesome for predicting meso-scale behavior, but it's a dog at small and large scales. Similarly, I think, the Standard Model was super-dynamite for a good number of years, but to hang on to it through all these issues should be a red flag that something else might be a better explanation. Kuhn, here we come. [wikipedia.org]

    • by Anonymous Coward
      I feel like the Standard Model is irrevocably broken.

      You don't get a Nobel for pointing out it's broken. You get the Nobel for pointing out the replacement.

      Physicists like the SM because it's the best we've got. They'll dump it like a ton of bricks when something demonstrably better* comes along.

      * and string theory isn't there yet.
  • On the subject of string theory and the possibility of other universes/dimensions with differing laws of nature, I've often wondered about whether constants change with time or the growth of a universe; if the spatial complexities or aging or changes in dimensions we don't perceive directly affect constants and laws... while we can perceive light that originated billions of years ago, that light may be subject to different laws as it reaches us now. We'd have no real way to test it, either, as our measureme
  • by imaginaryelf ( 862886 ) on Wednesday July 12, 2006 @01:38PM (#15706604)
    Q: "Easy: Change the gravitational constant of the universe."

    Geordi: "What?"

    Q: "Change the gravitational constant of the universe, thereby altering the asteroid's orbit."

    Geordi: "How do you do that?"

    Q: "You just DO it, that's all..."

    Data: "What Geordi is saying is that we do not have the ability to change the gravitational constant of the universe."

    Q: "Well, then, you obviously never read slashdot."
  • by Pinkybum ( 960069 ) on Wednesday July 12, 2006 @01:39PM (#15706609)
    Scientific theories serve two main purposes: (1) they are useful for predicting how things will behave (important for NASA, for example), and (2) they provide a framework to show the way for future work. Einstein's axioms of constancy were constructs built from empirical evidence which yielded some interesting and very useful insights into the way things worked. They also showed potential paths forward, which Einstein himself pursued until his death. Einstein himself knew his theories were not the last word, and any scientist knows this is a fundamental philosophy of the scientific method. The rest of the world can pretend there is something else sensational going on if they want to, but it isn't science.
  • Sod's Law? (Score:3, Funny)

    by owlnation ( 858981 ) on Wednesday July 12, 2006 @01:45PM (#15706677)
    I'm guessing that we can still count on Murphy's Law?
  • by dpaton.net ( 199423 ) on Wednesday July 12, 2006 @01:56PM (#15706772) Homepage Journal
    Osborn's Law:
                    Variables won't; constants aren't.

    Thank the BSD fortune file on my machine at home.
  • Why can't it just be that the gas clouds between here and there have a different makeup than assumed?
  • Did you or did you not cause one Peter Miller to fall when he accidentally stepped off the edge of that cliff on November 19th?

    Did you or did you not fatally electrocute one Robert Schindler when he mishandled a household 220V line on January the 7th?

    Are you or are you not responsible for mangling one Sally Parks when her car decelerated from 65mph to 0 in the course of striking a tree on March 8th?

    Your honor, we scientists request that the Laws of Nature be jailed without bail as they have been
  • The Laws of Nature should be happy that it is scientists who are questioning them and not the US Department of Homeland Security. When those DHS guys decide to question those laws, then nature should really start getting worried; until then it should just be sitting there, happy that the weakest (physically) segment of the population is asking the questions.
  • Remember: (Score:3, Insightful)

    by pingveno ( 708857 ) on Wednesday July 12, 2006 @02:19PM (#15706977)
    What these scientists have found isn't necessarily correct. There has to be a lot more evidence before this rises to the level of established theory.
  • This is an old problem with science put forth by David Hume. In order for science to work, the future must be like the past and the past must be like the present observations. Any "constants" found by observing a finite part of the universe and applying them to the whole may be problematic, yet we are willing to jump into the metaphysics of "and yet it MUST be so!" from our observations and ingenious models that seem to work so very well. Now, it does work very very well because you can build a remarkably f
  • Lisa:...and here is my perpetual motion machine.

    Homer: Lisa, in this house we obey the laws of thermodynamics!

  • No it Doesn't!!! (Score:3, Informative)

    by Tired and Emotional ( 750842 ) on Wednesday July 12, 2006 @04:16PM (#15707996)
    The blurb opined:

    Time-varying constants of nature violate Einstein's equivalence principle, which says that any experiment testing nuclear or electromagnetic forces should give the same result no matter where or when it is performed.

    No it doesn't

    The principle of equivalence, more properly called the principle of covariance, says that the laws of physics can be expressed covariantly. This means that your co-ordinate system does not matter. Actually you have to make sure you take derivatives in a physically meaningful way rather than just relative to your arbitrary co-ordinates.

    But this is entirely a local principle. It does not mean that an experiment performed in one place will give the same results as the same experiment performed elsewhere.

    For example, observe cepheid variables from down a gravity well!

    The principle of equivalence in its limited form (that leads on to the principle of covariance) says you can't tell the difference between acceleration and gravity. Once again this is a local phenomenon, because in an elevator (or other closed box) of non-trivial size you can distinguish them by observing the curvature associated with gravity.
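    For the record, "taking derivatives in a physically meaningful way" means using the covariant derivative, which corrects the ordinary partial derivative with the connection coefficients of whatever coordinates you happen to have picked:

        \nabla_\mu V^\nu = \partial_\mu V^\nu + \Gamma^\nu_{\mu\lambda} V^\lambda

    The \Gamma terms absorb all of the coordinate arbitrariness, so equations written with \nabla instead of \partial take the same form in every coordinate system, which is the covariance described above.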

  • by tgrigsby ( 164308 ) on Wednesday July 12, 2006 @06:44PM (#15709073) Homepage Journal
    If this principle is broken, then two objects dropped in a gravitational field should fall at slightly different rates.

    Only if the physical constants are different for the two objects. If, within the context in which they fall, the constants are the same, the objects will drop at the same rate. The experiments show that these constants vary over extreme amounts of time, with no proof as of yet that they vary over distance.

  • I for one (Score:3, Interesting)

    by suitepotato ( 863945 ) on Wednesday July 12, 2006 @08:26PM (#15709554)
    welcome our new (in)constant overlords or would if quantum mechanics allowed me to state what they were and when and where at the same time.

    What I took away from the field of physics so far is that constant variables are bunk and largely a matter of fudging. The important constants are actually the formulaic, and thus geometric, relationships between the variables, such as E=mc^2. If c varies by a factor n, then

    E = m(nc)^2 = (n^2)(mc^2), which stays equal to the original E only if the mass becomes m/(n^2), since (m/(n^2))((nc)^2) = mc^2.

    So for energy to remain the same without violations, as the local speed of light increases, mass must decrease.

    I don't believe, and never have, that the individual value constants are truly constant; rather, they are subject to the spacetime fabric and its conditions.
