Comment Re:What's the problem? (Score 1) 208

p-values are not the probabilities people want. What people would like them to be is the probability that one hypothesis is correct compared to another. But that is not what a p-value measures, and because people ignore that gap and misinterpret them, they have become such a problem; that's why they are being banned. Many experiments with acceptable p-values (p < 0.05) are not reproducible.

Actually, the inventor of p-values never intended them as a decision test, only as a hint that something is perhaps worth further investigation.

p-values tell you how frequently, if you repeatedly collected data under the current model, you would get data more extreme than the data at hand. p < 0.01 means that only in 1% of cases would you get such an "outlier". But this assumes that the model itself is correct. It varies the data, not the model!
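
A minimal Monte Carlo sketch of that definition (illustrative only: the null model, test statistic and observed value below are my hypothetical choices, not anything from the discussion):

    import numpy as np

    rng = np.random.default_rng(42)

    # Null model (assumed correct!): n measurements drawn from Normal(0, 1).
    # Test statistic: the sample mean.
    n = 30
    t_obs = 0.45  # hypothetical observed sample mean

    # Simulate many datasets *under the null model* and count how often
    # the statistic comes out at least as extreme as the observed one.
    sims = rng.normal(0.0, 1.0, size=(100_000, n)).mean(axis=1)
    p_value = np.mean(np.abs(sims) >= abs(t_obs))

    print(f"p = {p_value:.3f}")  # frequency of more extreme data, given the model

Note that the code varies the simulated data while keeping the model fixed, which is exactly the asymmetry described above.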

Instead, what should be done is to compare one model against another, given the data we have. Bayes factors do that, and they should be used and taught.
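
To make that concrete, here is a minimal sketch of a Bayes factor for the simplest pair of models I can think of, a fair coin versus a coin of unknown bias (the numbers are hypothetical, not from the discussion):

    from math import comb

    # Data: k heads in N flips (hypothetical).
    N, k = 100, 62

    # Model 1: fair coin, p = 0.5 exactly.
    # Marginal likelihood = binomial probability of the data.
    m1 = comb(N, k) * 0.5**N

    # Model 2: unknown bias, uniform prior on p over [0, 1].
    # Integrating the binomial likelihood over this prior gives 1 / (N + 1).
    m2 = 1.0 / (N + 1)

    print(f"Bayes factor (biased vs. fair): {m2 / m1:.1f}")

Unlike a p-value, this compares two fully specified models given the data at hand, rather than varying the data under a single model.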

The problem came to be because the social sciences do not have precise, meaningful models that can be compared. So they have resorted to techniques that do not require specifying models (or alternatives) rigorously. In the physical sciences, you can precisely write down a model for a planetary system with 2 planets and one with 3 planets, and the Bayes factor between them will be meaningful.

Comment Re:The real extinction (Score 5, Informative) 93

Try these?

  • Firestone RB, West A, Kennett JP et al. (October 2007). "Evidence for an extraterrestrial impact 12,900 years ago that contributed to the megafaunal extinctions and the Younger Dryas cooling". Proceedings of the National Academy of Sciences of the United States of America 104 (41): 16016–21. Bibcode:2007PNAS..10416016F. doi:10.1073/pnas.0706977104. PMC 1994902. PMID 17901202.
  • Loarie, Scott R.; Duffy, Philip B.; Hamilton, Healy; Asner, Gregory P.; Field, Christopher B.; Ackerly, David D. (2009). "The velocity of climate change". Nature 462 (7276): 1052–1055. Bibcode:2009Natur.462.1052L. doi:10.1038/nature08649. PMID 20033047.
  • Steadman, D. W. (1995). "Prehistoric extinctions of Pacific island birds: biodiversity meets zooarchaeology". Science 267 (5201): 1123–1131. Bibcode:1995Sci...267.1123S. doi:10.1126/science.267.5201.1123.
  • Steadman, D. W.; Martin, P. S. (2003). "The late Quaternary extinction and future resurrection of birds on Pacific islands". Earth Science Reviews 61 (1–2): 133–147. Bibcode:2003ESRv...61..133S. doi:10.1016/S0012-8252(02)00116-2.

and

  • Pimm, S. L.; Russell, G. J.; Gittleman, J. L.; Brooks, T. M. (1995). "The Future of Biodiversity". Science 269: 347–350.
  • Doughty, C. E.; Wolf, A.; Field, C. B. (2010). "Biophysical feedbacks between the Pleistocene megafauna extinction and climate: The first human-induced global warming?". Geophys. Res. Lett. 37: L15703. doi:10.1029/2010GL043985.
  • Pitulko, V. V., P. A. Nikolsky, E. Y. Girya, A. E. Basilyan, V. E. Tumskoy, S. A. Koulakov, S. N. Astakhov, E. Y. Pavlova, and M. A. Anisimov (2004), The Yana RHS site: Humans in the Arctic before the Last Glacial Maximum, Science, 303(5654), 52–56, doi:10.1126/science.1085219
  • Barnosky, Anthony D.; Matzke, Nicholas; Tomiya, Susumu; Wogan, Guinevere O. U.; Swartz, Brian; Quental, Tiago B.; Marshall, Charles; McGuire, Jenny L.; Lindsey, Emily L.; Maguire, Kaitlin C.; Mersey, Ben; Ferrer, Elizabeth A. (3 March 2011). "Has the Earth’s sixth mass extinction already arrived?". Nature 471 (7336): 51–57. Bibcode:2011Natur.471...51B. doi:10.1038/nature09678.
  • Zalasiewicz, Jan; Williams, Mark; Smith, Alan; Barry, Tiffany L.; Coe, Angela L.; Bown, Paul R.; Brenchley, Patrick; Cantrill, David; Gale, Andrew; Gibbard, Philip; Gregory, F. John; Hounslow, Mark W.; Kerr, Andrew C.; Pearson, Paul; Knox, Robert; Powell, John; Waters, Colin; Marshall, John; Oates, Michael; Rawson, Peter; Stone, Philip (2008). "Are we now living in the Anthropocene?". GSA Today 18 (2): 4. doi:10.1130/GSAT01802A.1.
  • Vitousek, P. M.; Mooney, H. A.; Lubchenco, J.; Melillo, J. M. (1997). "Human Domination of Earth's Ecosystems". Science 277 (5325): 494–499. doi:10.1126/science.277.5325.494.
  • Wooldridge, S. A. (9 June 2008). "Mass extinctions past and present: a unifying hypothesis". Biogeosciences Discuss (Copernicus) 5 (3): 2401–2423. doi:10.5194/bgd-5-2401-2008.
  • Jackson, J. B. C. (Aug 2008). "Colloquium paper: ecological extinction and evolution in the brave new ocean" (Free full text). Proceedings of the National Academy of Sciences of the United States of America 105 (Suppl 1): 11458–11465. Bibcode:2008PNAS..10511458J. doi:10.1073/pnas.0802812105. ISSN 0027-8424. PMC 2556419. PMID 18695220.
  • Elewa, Ashraf M. T. "14. Current mass extinction". In Elewa, Ashraf M. T. Mass Extinction. pp. 191–194. doi:10.1007/978-3-540-75916-4_14.
  • Mason, Betsy (10 December 2003). "Man has been changing climate for 8,000 years". Nature. doi:10.1038/news031208-7.
  • MacPhee, R. D. E.; Marx, P. A. (1997). "The 40,000-year plague: Humans, hyperdisease, and first-contact extinctions". In S. M. Goodman and B. D. Patterson (eds), Natural Change and Human Impact in Madagascar, pp. 169–217. Washington DC: Smithsonian Institution Press. (This is where they published their hyperdisease hypothesis.)
  • Lyons, S. Kathleen; Smith, Felisa A.; Wagner, Peter J.; White, Ethan P.; Brown, James H. (2004). "Was a ‘hyperdisease’ responsible for the late Pleistocene megafaunal extinction?". Ecology Letters 7 (9): 859–868. doi:10.1111/j.1461-0248.2004.00643.x.
  • Graham, R. W. and Mead, J. I. (1987). "Environmental fluctuations and evolution of mammalian faunas during the last deglaciation in North America". In: Ruddiman, W. F. and Wright, H. E., Jr., editors. North America and Adjacent Oceans During the Last Deglaciation. Volume K-3, The Geology of North America. Geological Society of America.
  • Martin P. S. (1967). Prehistoric overkill. In Pleistocene extinctions: The search for a cause (ed. P.S. Martin and H.E. Wright). New Haven: Yale University Press. ISBN 0-300-00755-8.
  • Lyons, S.K., Smith, F.A., and Brown, J.H. (2004). "Of mice, mastodons and men: human-mediated extinctions on four continents". Evolutionary Ecology Research 6: 339–358. Retrieved 18 October 2012.

Wikipedia also lists a few books if you are interested.

Comment Re:How have we ruled out measurement or model erro (Score 2) 117

ad 3: Plenty of people are working on modified models, such as alternatives to general relativity. There are papers coming out every week. https://en.wikipedia.org/wiki/...
ad 2: Errors in measurement can be largely excluded as a possibility because many different measurements, probing different aspects and scales, find the same result. Wikipedia lists galaxy rotation curves, velocity dispersions of galaxies, galaxy clusters and gravitational lensing, the cosmic microwave background, sky surveys and baryon acoustic oscillations, Type Ia supernova distance measurements, the Lyman-alpha forest, and structure formation. See also my other post.

Comment Re:If the only interaction was gravity (Score 1) 117

Then wouldn't the dark matter clouds just collapse in on themselves and form singularities as there would be no counterforce to gravitational attraction?

Gravity is the attraction of masses. The reason that things don't pass through each other is something else: it involves the electric repulsion between electrons (and ultimately the Pauli exclusion principle). Matter that interacts only through gravity cannot collide or radiate away its energy, so instead of collapsing to a point it keeps orbiting in the shared gravitational potential; a more detailed answer is here

Comment Re:Dark matter doesn't exist. (Score 5, Interesting) 117

In a rush to tailor the evidence to a flawed theory, dark matter was invented by human minds in an attempt to save a beloved theory. We need to cast off the shackles of what we want to be true, and look at the evidence in a cold, analytical light. When this is done, I'm quite certain that there will be no need for the magical fairy-dust matter that is there but isn't there.

The term dark matter is just the name for a discrepancy. For example, the galaxy's rotation speed is 220 km/s at our position in the Milky Way (8 kpc from the center), and it stays the same out to 30 kpc. But the number of stars, which are the mass we can see, declines exponentially with radius. So some unseen mass (about 10x more than what we see) must be there to keep the rotation fast (otherwise it would be like the solar system, where Pluto orbits the sun much more slowly than Mars).
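
As a back-of-the-envelope sketch of that argument (the 220 km/s and the 8/30 kpc radii are from above; the circular-orbit formula is textbook Newtonian dynamics, and the perfectly flat rotation curve is an idealization):

    G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
    KPC = 3.086e19    # one kiloparsec in meters
    MSUN = 1.989e30   # solar mass in kg

    v = 220e3         # flat rotation speed, m/s

    # Circular orbit: v^2 / r = G M(<r) / r^2, so M(<r) = v^2 r / G.
    # A flat rotation curve means the enclosed mass grows linearly with
    # radius, even though the starlight falls off exponentially.
    for r_kpc in (8, 30):
        m_enclosed = v**2 * (r_kpc * KPC) / G
        print(f"M(<{r_kpc} kpc) ~ {m_enclosed / MSUN:.1e} solar masses")

    # Keplerian contrast: around a central point mass v ~ 1/sqrt(r),
    # which is why Pluto orbits the sun much more slowly than Mars.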
Then, in galaxy clusters, we see the cluster mass act as a gravitational lens, and from how it bends the light of objects behind it we can infer that mass. And it is much more than we see in stars and gas.
In the cosmic microwave background, which is a relic from the last time electrons and photons interacted strongly, 380,000 years after the "Big Bang", we can estimate the density of the universe at that epoch. There, too, the matter interacting with photons is only a fraction of the total matter.
All of these *different, independent* probes, and several others, point to the same ratio of total matter to electromagnetically-interacting matter.
Now you can take the state of the Universe at an age of 380,000 years, with its total matter, electromagnetically-interacting matter and photon budget, and evolve it forward following general relativity. People find that the clustering of galaxies, and their total number and sizes, can be reproduced quite well. This is not possible without putting that additional, non-electromagnetically-interacting matter there. In this experiment you also learn what properties the extra matter must have (for example, a universe dominated by light, fast-moving neutrinos can be excluded, because they would smooth out the structures).

As you say, another path is to modify the theory of GR, and every week there are papers explaining dark matter with alternative theories, sometimes in combination with dark energy. This is a path that many people are working on. If you see the term "dark matter" as the *name of the problem*, namely the discrepancy between observations and normal matter + GR, then there is no conflict; the name does not say how to solve it. Dark matter is real, because the discrepancy exists. And the search for particles is also not concluded yet: large, cold objects have been proposed (e.g. brown dwarfs, Jupiter-size planets), as well as new fundamental particles (neutrinos, as-yet-unobserved particles like the sterile neutrino, or totally new particles from some theories of supersymmetry). Some of them have been excluded -- for example, it cannot be stellar-mass black holes, because of the number of binary star systems we observe in the outer parts of the Milky Way; those would be destroyed by frequent encounters with a large population of compact masses. The upgraded LHC will try to produce more particles, and there is a real chance it will produce (or exclude) a specific candidate dark matter particle predicted by supersymmetry.

Believe me, astronomers really do not like the idea of dark matter, and have been fighting it for decades. But the evidence from many different experiments is there. We still don't know what it is, whether the laws have to be changed or additional particles have to be put there (and which ones). But the range of possibilities is getting smaller and smaller. And putting particles there that do not interact except through gravity has been very successful in explaining various observations. I used to be cautious, because in principle you could just arbitrarily put mass where you need it -- but if you start from the Big Bang and only use general relativity, then the galaxies with dark matter in and around them, or galaxy collisions like the one in this article, just come out; there is no choice involved except the density of dark matter in the early universe.

Many different observations, proposed resolutions in new theories, proposed particles and detection experiments are listed on the Wikipedia page https://en.wikipedia.org/wiki/...

For this particular observation, you should note that they observed 60 collisions which act the way we think dark matter acts (no collisions, only gravity), and one is odd. That should tell you to be cautious: perhaps something is peculiar about this system or about the observation.
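
To put a rough number on that caution (a hypothetical calculation; the 2-sigma threshold is my assumption, not anything from the article): even if all 60 systems behaved identically, ordinary measurement scatter makes at least one apparent outlier quite likely.

    # Chance that at least one of 60 well-behaved systems fluctuates
    # past a 2-sigma threshold purely through measurement noise.
    p_single = 0.0455  # two-sided 2-sigma tail probability
    n = 60
    p_at_least_one = 1 - (1 - p_single) ** n
    print(f"P(at least one apparent outlier) = {p_at_least_one:.2f}")  # ~0.94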

Comment Re:Not surprising (Score 2) 33

I don't get why this article is on Slashdot. Is it because the average reader might think ESA has not contributed to NASA missions before? Or because it is the first time a lander may be the contribution? I don't think either is the case; NASA rarely does any mission without collaborations.

Comment Re:Dark Energy (Score 3, Informative) 199

Dark energy can also be measured from the CMB radiation, through the angular size of the anisotropies, and through baryon acoustic oscillations in the large-scale structure.
And the constraints from these *independent* probes are consistent with the results from supernovae, all pointing to an acceleration of the universe at late times. So we do not rely on a single tool here!
Also, TFA states that their finding, that a different class of supernova dominates at high redshift, does not challenge the existence of dark energy, only the exact value of its energy density.

Comment clarification (Score 4, Informative) 32

"How does this happen, if a black hole exerts so much gravitational force that not even light can escape?"
The stars don't form inside the black hole, so I don't see how that is related. Instead they form at a distance of several light years, where some of the gas falling towards the center stops (it still carries angular momentum, and out there the gravitational attraction of the black hole and of the galaxy's stars are comparable). The gas can then collapse and form stars. The results are called "nuclear star clusters".
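
As a rough sketch of the "comparable attraction" point, with round numbers I am assuming purely for illustration (a ~4e6 solar-mass central black hole, roughly Sgr A*, and a toy stellar mass profile with ~1e6 solar masses inside 1 pc):

    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    MSUN = 1.989e30    # solar mass in kg
    PC = 3.086e16      # one parsec in meters

    M_BH = 4e6 * MSUN  # assumed central black hole mass (~Sgr A*)

    def m_stars(r_pc):
        # Toy profile: ~1e6 Msun of stars within 1 pc, enclosed mass
        # growing as r^2 farther out (an assumption, not a measurement).
        return 1e6 * MSUN * r_pc**2

    for r_pc in (0.1, 1.0, 3.0):
        r = r_pc * PC
        a_bh = G * M_BH / r**2           # pull of the black hole
        a_st = G * m_stars(r_pc) / r**2  # pull of the enclosed stars
        print(f"r = {r_pc:4.1f} pc: a_BH / a_stars = {a_bh / a_st:6.1f}")

Close to the hole its gravity dominates; at a few parsecs (several light years) the stars' collective pull takes over, which is roughly where infalling gas can settle and form these clusters.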

Comment (A)GPL solves this (Score 1) 146

AGPLv3 solves exactly this problem.

The question of open source is really: do you have a secure upgrade path? If Windows goes away and software you use depends on it, you do not. If you use software under a BSD/Apache2 license, and someone extends it, makes the result closed source, and the software you use begins to require those extensions, you no longer have a secure upgrade path. The GPL solves this problem and guarantees that you will always have an upgrade path, because derivatives must be open source.

I think this is really the key point, and why purism in software licensing should not be laughed at by "pragmatists", for example at distributions that refuse to include closed-source software (Flash) or drivers (nvidia) while "pragmatists" want things to "just work". If you go down that path, you are making yourself dependent on a company going the path you want. Five years down the line you end up in a situation where even more software depends on closed source (e.g. mono/.NET), and it is out of your control. That's why I think purism in open source software is still relevant.

Pragmatically speaking, the upgrade path of an open source package is not in your own hands, but in those of the experts in that package's code and of the people who invest time in it. The point is rather that if you get annoyed enough to pay someone, you can take control back; with closed-source extensions of BSD/Apache2 packages, you cannot. You would need to re-invent that software.

For Web services, I think it depends. If the company provides proprietary data, then it doesn't really matter whether the software or the API used to access it is open source. You will have that dependency until you have open data.

In summary, I think one should ask oneself: in 5 years, when this platform is outdated and the company goes away or refocuses, what will I do, and am I prepared for that? Whom am I dependent on? Having a community of millions of programmers who are in the same situation helps, because only one of them has to solve the problem and open-source the fix to restore everyone's upgrade path.

Comment Re:Tin foil hat time (Score 1, Interesting) 142

Wasn't the NSA accused of suggesting/modifying various encryption standards in order to weaken them? In which case they don't need back doors into the software as they can already unlock the data.

Yes, and the authors of the affected algorithms (CS researchers) agree that this was acceptable (a security vs. speed/implementation trade-off).

Comment Re:Not everyone (Score 4, Insightful) 140

The revelations did not change the way *I* looked at the Internet and privacy. They merely confirmed my well-justified suspicions. I think the same statement can be made by most people on Slashdot, and by most technical people in general. The only people who were surprised were the technically ignorant.

There is a difference between merely suspecting something, and being looked at as paranoid for it, and everyone knowing it as a fact.

Comment Re:Plug-in still required (Score 1, Insightful) 97

It is a Unity plug-in that is legit. It basically caches the data and compiles the C++ with Emscripten to asm.js.

You can download the source and compile it yourself as an executable if you do not want the browser version.

And why can't they compile the C++ with Emscripten to asm.js before they put it on the website?
