Comment Re:Other explanations: (Score 1) 94
maybe it's winter there
p-values are not probabilities that one hypothesis is correct compared to another, which is what people would like them to be. But that is not what they measure, and because people ignore that gap and misinterpret them, they have become such a problem that they are being banned. Many experiments with acceptable p-values (p < 0.05) are not reproducible.
Actually, the inventor of p-values never intended them as a test, only as an indicator that something is perhaps worth further investigation.
p-values tell you how frequently you would obtain data more extreme than the data at hand, if you collected new data under the current model. p < 0.01 means that only in 1% of cases would you get such an "outlier". But it assumes that the model itself is correct. It varies the data, not the model!
Instead, what should be done is to compare one model against another, given the data we have. Bayes factors do that, and they should be used and taught.
The problem arose because the social sciences do not have proper, meaningful models that can be compared. So they have resorted to techniques that do not require specifying models (or alternatives) rigorously. In the physical sciences, you can precisely write down a model for a planetary system with 2 planets and one with 3 planets, and the Bayes factor between them will be meaningful.
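A toy sketch of the gap between the two quantities (the coin-flip data of 60 heads in 100 flips is made up for the example, and the uniform prior for the alternative model is an assumption): the two-sided p-value lands near the conventional 0.05 threshold, while the Bayes factor says the fair-coin model and the "any bias" model explain the data about equally well.

```python
from math import comb

n, k = 100, 60  # toy data: 60 heads in 100 coin flips

def binom_pmf(n, k, p):
    """Probability of exactly k successes in n trials with success rate p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# p-value under H0 (fair coin): probability of data at least as far
# from the expected 50 heads as the observed count (two-sided).
p_value = sum(binom_pmf(n, i, 0.5) for i in range(n + 1)
              if abs(i - n / 2) >= abs(k - n / 2))

# Bayes factor: H0 (p = 0.5) versus H1 (p uniform on [0, 1]).
# Under a uniform Beta(1,1) prior, the marginal likelihood of k heads
# integrates to exactly 1 / (n + 1).
bf_01 = binom_pmf(n, k, 0.5) / (1 / (n + 1))

print(f"p-value        = {p_value:.3f}")  # ~0.057, borderline "significant"
print(f"BF(H0 vs H1)   = {bf_01:.2f}")    # ~1.1, i.e. nearly even odds
```

So the same data that almost "rejects" the fair coin at p < 0.05 barely favors either model once an explicit alternative is written down, which is the point about comparing models rather than varying the data.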
Try these?
and
Wikipedia also lists a few books if you are interested.
ad 3: Plenty of people are working on modified models, such as alternatives to general relativity. There are papers coming out every week. https://en.wikipedia.org/wiki/...
ad 2: Errors in measurements can be somewhat excluded as a possibility, because many different measurements looking at different aspects and scales find the same result. Wikipedia lists 3.1 Galaxy rotation curves, 3.2 Velocity dispersions of galaxies, 3.3 Galaxy clusters and gravitational lensing, 3.4 Cosmic microwave background, 3.5 Sky surveys and baryon acoustic oscillations, 3.6 Type Ia supernovae distance measurements, 3.7 Lyman-alpha forest and 3.8 Structure formation. See also my other post.
Then wouldn't the dark matter clouds just collapse in on themselves and form singularities as there would be no counterforce to gravitational attraction?
Gravity is the attraction of masses. The reason that things don't pass through each other is something else. It involves the electric repulsion of electrons and protons, but a more detailed answer is here
In a rush to tailor the evidence to a flawed theory, dark matter was invented by human minds in an attempt to save a beloved theory. We need to cast off the shackles of what we want to be true, and look at the evidence in a cold, analytical light. When this is done, I'm quite certain that there will be no need for the magical fairy dust matter that is there but isn't there.
The term dark matter is just a name for a discrepancy. For example, the galaxy rotation speed is 220 km/s at our position in the galaxy (8 kpc), and stays the same out to 30 kpc. But the number of stars, which are the mass we can see, declines exponentially. So some mass (10x more than what we see) must be there to keep the rotation fast (otherwise it would be like the solar system, with Pluto orbiting the sun much more slowly than Mars).
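The rotation-curve argument can be turned into a quick estimate (a sketch using the round numbers from the comment, with the 220 km/s speed assumed flat between 8 and 30 kpc): for a circular orbit, v^2 = G M(<r) / r, so a flat curve means the enclosed mass grows linearly with radius.

```python
# Enclosed mass implied by a circular speed v at radius r: M(<r) = v^2 r / G.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30   # solar mass, kg
kpc = 3.086e19     # kiloparsec, m
v = 220e3          # m/s, taken as flat from 8 to 30 kpc

def enclosed_mass(r_kpc):
    """Mass inside radius r (in kpc) required for circular speed v, in Msun."""
    return v**2 * (r_kpc * kpc) / G / M_sun

m8, m30 = enclosed_mass(8), enclosed_mass(30)
print(f"M(<8 kpc)  ~ {m8:.1e} Msun")   # ~9e10 solar masses
print(f"M(<30 kpc) ~ {m30:.1e} Msun")  # ~3.4e11 solar masses

# A flat curve means M(<r) keeps growing linearly with r, while the
# starlight (visible mass) falls off exponentially; the growing gap
# is what gets labelled "dark matter".
```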
Then in clusters we see that gravity acts as a lens, and we can infer the mass from how it bends the light of objects behind it. And it is much more than what we see in stars and gas.
In the cosmic microwave background, which is a relic from the last time electrons and photons interacted very strongly, 380,000 years after the "Big Bang", we can estimate the density of the universe at that time. There, too, the matter interacting with photons is only a fraction of the total matter.
All of these *different, independent* probes, and several others, point to the same ratio of total matter to electromagnetically-interacting matter.
Now you can take the state of the Universe at an age of 380,000 years, with its total matter, electromagnetically-interacting matter and photon budget, and evolve it following general relativity. And people find that the clustering of galaxies, their total number and sizes can be reproduced quite well. This is not possible without putting that additional, non-electromagnetically-interacting matter there. And in this experiment you can learn something about how weak the electromagnetic interaction must be (for example, a large population of neutrinos can be excluded, because they interact too strongly, smoothing out the structures).
As you say, another path is to modify the theory of GR, and every week there are papers explaining Dark Matter with alternative theories, sometimes in combination with Dark Energy. This is a path that many people are working on. If you see the term "Dark Matter" as the *name of the problem*, namely the discrepancy between observations and normal matter + GR, then there is no conflict; it does not say how to solve it. Dark matter is real, because the discrepancy exists. And the search for particles is also not concluded yet: larger, cold objects have been proposed (e.g. brown dwarfs, Jupiter-size planets), as well as new fundamental particles (neutrinos, as-of-yet unobserved particles like the sterile neutrino, or totally new particles from some theories of supersymmetry). Some of them have been excluded -- for example it cannot be stellar-mass black holes, because of the number of binary star systems we observe in the outer parts of the Milky Way; those would be destroyed by frequent interactions with a large population of such masses. The upgraded LHC will try to produce more particles, and there is a real chance it will produce (or exclude) a specific candidate dark matter particle predicted (proposed) by supersymmetry.
Believe me, Astronomers really do not like the idea of Dark Matter, and have been fighting it for decades. But the evidence from many different experiments is there. We still don't know what it is, whether the laws have to be changed or additional particles have to be put there (and which ones). But the range of possibilities is getting smaller and smaller. And putting particles there that do not interact except for gravity has been very successful in explaining various observations. I used to be cautious because in principle you could just arbitrarily put mass where you need it -- but if you start from the Big Bang and only use general relativity, then the created galaxies with dark matter in/around them, or galaxy collisions like the one in this article just come out -- there is no choice involved here except for the density of dark matter in the early universe.
Many different observations, proposed resolutions in new theories, proposed particles and detection experiments are listed on the Wikipedia page https://en.wikipedia.org/wiki/...
For this particular observation, you should note that they observed 60 collisions, which act like we think dark matter acts (no collisions, only gravity), and one is odd. That should tell you to be cautious, perhaps something is peculiar about this system or the observation.
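To see why one oddball among 60 systems calls for caution, here is a toy calculation (the per-system fluke rate q is an assumed number for the sketch, not from the paper): even a small chance that any single system looks anomalous for mundane reasons (measurement quirks, projection effects) makes at least one oddball in 60 quite likely.

```python
# Toy check: if each of n observed collisions independently has a small
# chance q of looking anomalous for mundane reasons, how likely is it
# that at least one oddball shows up? q = 0.02 is an assumed value.
q = 0.02
n = 60
p_at_least_one = 1 - (1 - q)**n
print(f"P(>=1 odd system out of {n}) = {p_at_least_one:.2f}")  # ~0.70
```

So a single discrepant system out of 60 is weak evidence on its own; it takes follow-up observations to rule out something peculiar about that one system.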
I don't get why this article is on slashdot. Is it because the average reader might think ESA has not contributed to NASA missions before? Or because it is the first time a lander may be the contribution? I don't think either is the case; NASA rarely does any mission without collaborations.
Dark energy can also be measured from the CMB radiation, through the angular size of anisotropies and through baryonic acoustic oscillations in the large scale structure.
And the constraints from these *independent* probes are consistent with the results from supernovae, all pointing to an acceleration of the expansion of the universe at late times. It is not the case that we rely on a single tool here!
Also, TFA states that their finding that a different class of supernova is dominant at high redshift does not attack the presence of dark energy, only its exact value (of energy density).
If people want to fight a war, they need to do it with a gun in hand on the battlefield.
How about without a gun? Then you really have to be determined.
"How does this happen, if a black hole exerts so much gravitational force that not even light can escape?"
The stars don't form inside the black hole, so I don't see how that is related. Instead they form at a distance of several light years, where some of the gas falling towards the center stops (because of angular momentum, and because at that distance the gravitational pull of the black hole and of the galaxy's stars are comparable). The gas can then collapse and form stars. These are called "nuclear star clusters".
If you want to go for avoiding ambiguity, go for https://en.wikipedia.org/wiki/Lojban
AGPLv3 solves exactly this problem.
The question of open source is really: do you have a secure upgrade path? If Windows goes away, and software you use depends on their software, you do not. If you use software based on a BSD/Apache2 license, and someone extends it and makes the result non-open-source, and the software you use begins to require these extensions, you don't have a secure upgrade path anymore. The GPL solves this problem and guarantees that you will always have an upgrade path, because derivatives need to be open source.
I think this is really the key point, and why purism in software licensing should not be laughed at by "pragmatists". Like for example distributions that do not include closed source software (Flash) or drivers (nvidia), because "pragmatists" want it to "just work". If you go down that path, you are making yourself dependent on a company going the path you want. You get into a situation 5 years down the line where even more software depends on closed source (e.g. mono/.NET), and it is out of your control. That's why I think purism in open source software is still relevant.
Pragmatically speaking, the upgrade path of open source software packages is not in your hands, but in those who are experts in that package's code, and those who invest time in it. The point is rather that if you get annoyed enough to pay someone, you would be able to get control back, while with closed-source extended BSD/Apache2 packages, you would not. You would need to re-invent that software.
For Web services, I think it depends. If the company provides proprietary data, then it doesn't really matter whether the software to access it, or the API is open source. You will have that dependency, until you have open data.
In summary, I think one should ask oneself: in 5 years, when this platform is outdated and the company goes away or refocuses, what will I do, and am I prepared for that? Who am I dependent on? Having a community of millions of programmers who are in the same situation helps, because only one of them has to solve the problem and open-source it for an upgrade path.
Wasn't the NSA accused of suggesting/modifying various encryption standards in order to weaken them? In which case they don't need back doors into the software as they can already unlock the data.
Yes, and the authors of said algorithms (CS researchers) agreed that it was OK (a security vs. speed/implementation trade-off).
The revelations did not change the way *I* looked at the Internet and privacy. It merely confirmed my well-justified suspicions. I think the same statement can be made by most people on slashdot, and by most technicians in general. The only people who were surprised were the technically ignorant.
There is a difference between suspecting and being looked at as paranoid, and everyone knowing something as a fact.
It is a Unity plug-in that is legit. It basically caches the data and compiles the C++ with Emscripten to asm.js (a subset of JavaScript).
You can download the source and compile it yourself as a native executable if you do not want to use the browser version.
And why can't they compile the C++ with Emscripten to asm.js before they put it on the website?
If you have a procedure with 10 parameters, you probably missed some.