You make it sound like the temperature of the (empty) region averages down the background, making it colder. But something way more awesome actually happens: photons enter one side of the void (empty region) at an early time and travel through it. During that time, the void expands, so to escape it a photon has to lose more energy climbing out than it gained falling in. It is the finite speed of light, relative to these enormous, evolving structures, that causes this effect!
On Gentoo, I can switch between systemd and OpenRC at boot time. And on Ubuntu 15.04, you can switch between systemd and upstart. So I don't think the argument that the user has no choice holds; this is just about which package is installed by default.
You can't just leave things alone, because computers have also changed. Today we do not work on mainframes or desktop computers, but increasingly on laptops and mobile phones, which constantly change state, in terms of network connections, devices plugged in, location, hibernation.
I think there is consensus that these things did not work well under the old init system, even though band-aids were found. I remember that changing the hostname could stop X from working, which can happen when DHCP hands you a new hostname; that is '80s design for you. Or that changing the time messed up the logfiles.
Now you can choose which modern init system you want, and there are a couple out there: OpenRC, upstart and systemd are the most well-known ones.
OpenRC is the familiar runlevel-based approach, which runs scripts that may or may not succeed.
Upstart is a triggering framework that takes pre-defined actions (but does not work toward goals). That means you have to write tasks describing how to get your system from state A to state B.
systemd is a dependency resolution program, that knows what to activate next to get to a certain state (goal). It handles services, mount points and network connections in the same framework. It is essentially an overseer of a services tree.
There are some upsides to systemd, besides parallelizing the tasks of a dependency tree to reach a goal. One is that for every process, it is known which service launched it (Linux cgroups allow marking those processes). Also, each service can be assigned resource limits (memory, number of processes) that it cannot exceed (again via cgroups in modern Linux). And, obviously, you are not limited to a fixed number of runlevels.
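As a hedged sketch of the resource-limit point, a systemd unit file can cap a service with cgroup directives. The service name and path here are made up for illustration, but `MemoryMax=` and `TasksMax=` are real systemd directives:

```ini
# /etc/systemd/system/example-daemon.service (hypothetical service)
[Unit]
Description=Example daemon with resource caps
After=network.target

[Service]
ExecStart=/usr/bin/example-daemon
# cgroup-enforced limits: the service cannot exceed these
MemoryMax=512M
TasksMax=64

[Install]
WantedBy=multi-user.target
```

Every process the daemon forks stays inside the same cgroup, so the caps apply to the whole service, not just the first process.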
Yes, systemd is annoying, because it is a new thing to learn. And it is annoying because the maintainers are inconsiderate. But in the end, it is just a program to start other programs, with one particular way of doing it. I don't get what the big deal is. As for feature bloat: Linux also has a lot of features, and so does VLC; there we consider them a good thing. Technically, the dependency-resolution approach of systemd seems like a good thing (as in, progress for Linux) to me.
You need a stellar-mass black hole for that, not a 4 million solar mass black hole.
I think what he/she was referring to is
See also the Narrabri Stellar Intensity Interferometer.
I think you cannot make maps this way; I think you can only measure the spatial extent. Not sure, though.
the duck farts
maybe it's winter there
p-values are not the probabilities people think they are. What people would like them to be is the probability that one hypothesis is correct compared to another. But that is not what a p-value measures, and because people ignore that gap and misinterpret them, p-values have become such a problem that they are being banned. Many experiments with acceptable p-values (p < 0.05) are not reproducible.
Actually, the inventor of p-values never intended them as a test, only as a hint that something is perhaps worth further investigation.
p-values tell you how frequently, if you collected data under the current model, you would get data more extreme than the data at hand. p < 0.01 means that only in 1% of cases would you get such an "outlier". But this assumes the model itself is correct. It varies the data, not the model!
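A minimal simulation of that definition (the coin example and numbers are my own illustration, not from any study): under a fair-coin null model, how often does resampled data look at least as extreme as an observed 16 heads in 20 flips?

```python
import random

random.seed(1)

observed_heads = 16   # the data at hand
n_flips = 20

# The p-value varies the DATA under one fixed model: simulate the
# fair coin many times and count outcomes at least as extreme.
trials = 100_000
extreme = sum(
    sum(random.random() < 0.5 for _ in range(n_flips)) >= observed_heads
    for _ in range(trials)
)
p_value = extreme / trials
print(f"p ~ {p_value:.4f}")  # close to the exact binomial value of ~0.006
```

Note that this number says nothing about how probable a fair coin is versus a biased one; it only describes how surprising the data is under one fixed model.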
Instead, what should be done is to compare one model against another, given the data we have. Bayes factors do exactly that, and should be used and taught.
The problem arose because the social sciences do not have precise, meaningful models that can be compared, so they resorted to techniques that do not require specifying models (or alternatives) rigorously. In the physical sciences, you can write down an exact model for a planetary system with two planets and one with three, and the Bayes factor between them will be meaningful.
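To make the contrast concrete, here is a hedged toy example (the coin data and the uniform prior are my own choices): a Bayes factor compares the marginal likelihood of the same data under two fully specified models.

```python
from math import comb

k, n = 16, 20   # 16 heads in 20 flips

# Model 1: fair coin (p = 0.5). Its marginal likelihood is simply
# the binomial probability of the observed data.
evidence_fair = comb(n, k) * 0.5 ** n

# Model 2: unknown bias with a uniform prior on [0, 1]. Integrating
# the binomial likelihood over the prior gives exactly 1 / (n + 1).
evidence_biased = 1 / (n + 1)

bayes_factor = evidence_biased / evidence_fair
print(f"Bayes factor (biased vs fair) ~ {bayes_factor:.1f}")  # ~10.3
```

Unlike a p-value, this number directly weighs one model against the other using the data we actually have, rather than hypothetical repetitions of the experiment.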
- Firestone RB, West A, Kennett JP et al. (October 2007). "Evidence for an extraterrestrial impact 12,900 years ago that contributed to the megafaunal extinctions and the Younger Dryas cooling". Proceedings of the National Academy of Sciences of the United States of America 104 (41): 16016–21. Bibcode:2007PNAS..10416016F. doi:10.1073/pnas.0706977104. PMC 1994902. PMID 17901202.
- Loarie, Scott R.; Duffy, Philip B.; Hamilton, Healy; Asner, Gregory P.; Field, Christopher B.; Ackerly, David D. (2009). "The velocity of climate change". Nature 462 (7276): 1052–1055. Bibcode:2009Natur.462.1052L. doi:10.1038/nature08649. PMID 20033047.
- Steadman, D. W. (1995). "Prehistoric extinctions of Pacific island birds: biodiversity meets zooarchaeology". Science 267 (5201): 1123–1131. Bibcode:1995Sci...267.1123S. doi:10.1126/science.267.5201.1123.
- Steadman, D. W.; Martin, P. S. (2003). "The late Quaternary extinction and future resurrection of birds on Pacific islands". Earth Science Reviews 61 (1–2): 133–147. Bibcode:2003ESRv...61..133S. doi:10.1016/S0012-8252(02)00116-2.
- S.L. Pimm, G.J. Russell, J.L. Gittleman and T.M. Brooks, The Future of Biodiversity, Science 269: 347–350 (1995)
- Doughty, C. E.; Wolf, A.; Field, C. B. (2010). "Biophysical feedbacks between the Pleistocene megafauna extinction and climate: The first human-induced global warming?". Geophys. Res. Lett. 37: L15703. doi:10.1029/2010GL043985.
- Pitulko, V. V., P. A. Nikolsky, E. Y. Girya, A. E. Basilyan, V. E. Tumskoy, S. A. Koulakov, S. N. Astakhov, E. Y. Pavlova, and M. A. Anisimov (2004), The Yana RHS site: Humans in the Arctic before the Last Glacial Maximum, Science, 303(5654), 52–56, doi:10.1126/science.1085219
- Barnosky, Anthony D.; Matzke, Nicholas; Tomiya, Susumu; Wogan, Guinevere O. U.; Swartz, Brian; Quental, Tiago B.; Marshall, Charles; McGuire, Jenny L.; Lindsey, Emily L.; Maguire, Kaitlin C.; Mersey, Ben; Ferrer, Elizabeth A. (3 March 2011). "Has the Earth’s sixth mass extinction already arrived?". Nature 471 (7336): 51–57. Bibcode:2011Natur.471...51B. doi:10.1038/nature09678.
- Zalasiewicz, Jan; Williams, Mark; Smith, Alan; Barry, Tiffany L.; Coe, Angela L.; Bown, Paul R.; Brenchley, Patrick; Cantrill, David; Gale, Andrew; Gibbard, Philip; Gregory, F. John; Hounslow, Mark W.; Kerr, Andrew C.; Pearson, Paul; Knox, Robert; Powell, John; Waters, Colin; Marshall, John; Oates, Michael; Rawson, Peter; Stone, Philip (2008). "Are we now living in the Anthropocene". GSA Today 18 (2): 4. doi:10.1130/GSAT01802A.1.
- Vitousek, P. M.; Mooney, H. A.; Lubchenco, J.; Melillo, J. M. (1997). "Human Domination of Earth's Ecosystems". Science 277 (5325): 494–499. doi:10.1126/science.277.5325.494.
- Wooldridge, S. A. (9 June 2008). "Mass extinctions past and present: a unifying hypothesis". Biogeosciences Discuss (Copernicus) 5 (3): 2401–2423. doi:10.5194/bgd-5-2401-2008.
- Jackson, J. B. C. (Aug 2008). "Colloquium paper: ecological extinction and evolution in the brave new ocean" (Free full text). Proceedings of the National Academy of Sciences of the United States of America 105 (Suppl 1): 11458–11465. Bibcode:2008PNAS..10511458J. doi:10.1073/pnas.0802812105. ISSN 0027-8424. PMC 2556419. PMID 18695220. edit
- Elewa, Ashraf M. T. "14. Current mass extinction". In Elewa, Ashraf M. T. Mass Extinction. pp. 191–194. doi:10.1007/978-3-540-75916-4_14.
- Mason, Betsy (10 December 2003). "Man has been changing climate for 8,000 years". Nature. doi:10.1038/news031208-7.
- MacPhee, R. D. E.; Marx, P. A. (1997). "The 40,000-year plague: Humans, hyperdisease, and first-contact extinctions" (their hyperdisease hypothesis). In S. M. Goodman and B. D. Patterson (eds), Natural Change and Human Impact in Madagascar, pp. 169–217. Washington DC: Smithsonian Institution Press.
- Lyons, S. Kathleen; Smith, Felisa A.; Wagner, Peter J.; White, Ethan P.; Brown, James H. (2004). "Was a ‘hyperdisease’ responsible for the late Pleistocene megafaunal extinction?". Ecology Letters 7 (9): 859–868. doi:10.1111/j.1461-0248.2004.00643.x.
- Graham, R. W. and Mead, J. I. 1987. Environmental fluctuations and evolution of mammalian faunas during the last deglaciation in North America. In: Ruddiman, W. F. and H.E. Wright, J., editors. North America and Adjacent Oceans During the Last Deglaciation. Volume K-3. The Geology of North America, Geological Society of America
- Martin P. S. (1967). Prehistoric overkill. In Pleistocene extinctions: The search for a cause (ed. P.S. Martin and H.E. Wright). New Haven: Yale University Press. ISBN 0-300-00755-8.
- Lyons, S.K., Smith, F.A., and Brown, J.H. (2004). "Of mice, mastodons and men: human-mediated extinctions on four continents". Evolutionary Ecology Research 6: 339–358. Retrieved 18 October 2012.
Wikipedia also lists a few books, if you are interested.
ad 3: Plenty of people are working on modified models, such as alternatives to general relativity. There are papers coming out every week. https://en.wikipedia.org/wiki/...
ad 2: Errors in measurement can largely be excluded as a possibility, because many different measurements, looking at different aspects and scales, find the same result. Wikipedia lists galaxy rotation curves, velocity dispersions of galaxies, galaxy clusters and gravitational lensing, the cosmic microwave background, sky surveys and baryon acoustic oscillations, Type Ia supernova distance measurements, the Lyman-alpha forest, and structure formation. See also my other post.
Then wouldn't the dark matter clouds just collapse in on themselves and form singularities as there would be no counterforce to gravitational attraction?
Gravity is the attraction of masses. The reason that things don't pass through each other is something else. It involves the electric repulsion of electrons and protons, but a more detailed answer is here
In a rush to tailor the evidence to a flawed theory, dark matter was invented by human minds in an attempt to save a beloved theory. We need to cast off the shackles of what we want to be true, and look at the evidence in a cold, analytical light. When this is done, I'm quite certain that there will be no need for the magical fairy-dust matter that is there but isn't there.
The term "dark matter" is just the name for a discrepancy. For example, the galaxy rotation speed is 220 km/s at our position in the Galaxy (8 kpc) and stays the same out to 30 kpc. But the number of stars, which are the mass we can see, declines exponentially with radius. So some mass (about 10x more than what we see) must be there to keep the rotation fast; otherwise it would be like the solar system, where Pluto orbits the Sun much more slowly than Mars.
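A quick back-of-the-envelope check of that argument (a sketch with rounded constants; the mass values are order-of-magnitude only): a flat 220 km/s curve forces the enclosed mass to grow linearly with radius, instead of leveling off where the visible stars run out.

```python
# Circular orbit: v^2 = G * M(<r) / r, so M(<r) = v^2 * r / G.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
KPC = 3.086e19     # kiloparsec, m

v = 220e3          # m/s, the (roughly) flat rotation speed

def enclosed_mass(r_kpc):
    """Mass in solar masses needed inside radius r for circular speed v."""
    return v**2 * (r_kpc * KPC) / G / M_SUN

m8, m30 = enclosed_mass(8), enclosed_mass(30)
print(f"M(<8 kpc)  ~ {m8:.1e} M_sun")   # ~9e10 solar masses
print(f"M(<30 kpc) ~ {m30:.1e} M_sun")  # 30/8 = 3.75x more
```

If the visible stars (which thin out exponentially) held all the mass, v would instead fall off roughly as 1/sqrt(r) beyond them, exactly the Pluto-versus-Mars behavior.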
Then, in galaxy clusters, we see that gravity acts as a lens, and we can infer the mass from how it bends the light passing it from behind. And it is much more than we see in stars and gas.
In the cosmic microwave background, a relic from the last time electrons and photons interacted very strongly, 380,000 years after the "Big Bang", we can estimate the density of the universe at that epoch. And the matter that interacts with photons turns out to be only a fraction of the total matter there.
All of these *different, independent* probes, and several others, point to the same ratio of total matter to electromagnetically-interacting matter.
Now you can take the state of the universe at an age of 380,000 years, with its total matter, electromagnetically interacting matter and photon budget, and evolve it following general relativity. People find that the clustering of galaxies, their total number and their sizes can be reproduced quite well, and this is not possible without putting that additional, non-electromagnetically-interacting matter there. In this experiment you can also learn how weakly interacting the dark matter must be: for example, a large population of light neutrinos can be excluded, because they would smooth out the structures.
As you say, another path is to modify the theory of GR, and every week there are papers explaining dark matter with alternative theories, sometimes in combination with dark energy. This is a path many people are working on. If you see the term "dark matter" as the *name of the problem*, namely the discrepancy between observations and normal matter + GR, then there is no conflict: the name does not say how to solve it. Dark matter is real, because the discrepancy exists. And the search for particles is not concluded either: large, cold objects have been proposed (e.g. brown dwarfs, Jupiter-sized planets), as well as fundamental particles (neutrinos, as-of-yet unobserved particles like the sterile neutrino, or totally new particles from some theories of supersymmetry). Some of these have been excluded. For example, it cannot be stellar-mass black holes, because of the number of binary star systems we observe in the outer parts of the Milky Way; those would be disrupted by frequent encounters with a large population of such masses. The upgraded LHC will try to produce more particles, and there is a real chance it will produce (or exclude) a specific candidate dark matter particle predicted by supersymmetry.
Believe me, astronomers really do not like the idea of dark matter, and fought it for decades. But the evidence from many different experiments is there. We still don't know what it is, whether the laws have to be changed or additional particles have to be put there (and which ones). But the range of possibilities keeps getting smaller. And putting particles there that do not interact except through gravity has been very successful in explaining various observations. I used to be cautious because in principle you could just arbitrarily put mass wherever you need it. But if you start from the Big Bang and only use general relativity, then galaxies with dark matter in and around them, or galaxy collisions like the one in this article, just come out; there is no freedom involved except for the density of dark matter in the early universe.
Many different observations, proposed resolutions in new theories, proposed particles and detection experiments are listed on the Wikipedia page https://en.wikipedia.org/wiki/...
For this particular observation, you should note that they observed 60 collisions which act as we think dark matter acts (no collisions, only gravity), and one that is odd. That should tell you to be cautious; perhaps something is peculiar about this system or the observation.
I don't get why this article is on Slashdot. Is it because the average reader might think ESA has never contributed to a NASA mission before? Or because it is the first time a lander may be the contribution? I don't think either is the case; NASA rarely flies any mission without collaborations.
Dark energy can also be measured from the CMB radiation, through the angular size of its anisotropies, and through baryon acoustic oscillations in the large-scale structure.
And the constraints from these *independent* probes are consistent with the results from supernovae, all pointing to an acceleration of the universe's expansion at late times. It is not as though we rely on a single tool here!
Also, TFA states that their finding that a different class of supernova dominates at high redshift does not challenge the existence of dark energy, only the exact value of its energy density.
If people want to fight a war, they need to do it with a gun in hand on the battlefield.
How about without a gun? Then you really have to be determined.