Comment Re: How about basic security? (Score 5, Informative) 390

IPSec is perfectly usable.

Telebit demonstrated transparent routing (i.e., total invisibility of internal networks without loss of connectivity) in 1996.

IPv6 has a vastly simpler header, which means a vastly simpler stack. This means fewer defects, greater robustness and easier testing. It also means a much smaller stack, lower latency and fewer corner cases.
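
To illustrate just how simple the fixed header is (a sketch of my own, not lifted from any real stack): it is always exactly 40 bytes with eight fields, no options and no checksum, so a single unpack recovers everything.

import socket
import struct

def parse_ipv6_header(packet: bytes):
    # The fixed IPv6 header: 4-byte version/traffic class/flow label word,
    # payload length, next header, hop limit, then two 16-byte addresses.
    ver_tc_flow, payload_len, next_header, hop_limit, src, dst = struct.unpack(
        "!IHBB16s16s", packet[:40]
    )
    return {
        "version": ver_tc_flow >> 28,
        "traffic_class": (ver_tc_flow >> 20) & 0xFF,
        "flow_label": ver_tc_flow & 0xFFFFF,
        "payload_length": payload_len,
        "next_header": next_header,   # e.g. 6 = TCP, 17 = UDP, 59 = no next header
        "hop_limit": hop_limit,
        "src": socket.inet_ntop(socket.AF_INET6, src),
        "dst": socket.inet_ntop(socket.AF_INET6, dst),
    }

# Example: a hand-built header with no payload, from ::1 to ::1.
hdr = struct.pack("!IHBB16s16s", 6 << 28, 0, 59, 64,
                  socket.inet_pton(socket.AF_INET6, "::1"),
                  socket.inet_pton(socket.AF_INET6, "::1"))
print(parse_ipv6_header(hdr))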

IPv6 is secure by design. IPv4 isn't secure and there is nothing you can design to make it so.

Comment Re: Waiting for the killer app ... (Score 3, Informative) 390

IPv6 would help both enormously. Lower latency on routing means faster responses.

IP Mobility means users can move between ISPs without posts breaking, losing responses to queries, losing Hangouts or other chat-service connections, or having to continually re-authenticate.

Autoconfiguration means both can add servers just by switching the new machines on.

Because IPv4 has no native security, it's vulnerable to a much wider range of attacks and there's nothing the vendors can do about them.

Comment Re: DNS without DHCP (Score 4, Informative) 390

Anycast tells you what services are on what IP. There are other service discovery protocols, but anycast was designed specifically for IPv6 bootstrapping. It's very simple: multicast out a request asking who runs a service, and the machine running that service unicasts back that it does.
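
Purely as an illustration of that request/reply pattern (my own toy protocol, not a real discovery standard; the group, port, interface name and message format are all made up), here is the querying side in Python:

import socket

GROUP = "ff02::1"   # link-local all-nodes multicast group, used here for the demo
PORT = 5355         # arbitrary port chosen for this toy example

def discover(service, iface="eth0", timeout=2.0):
    with socket.socket(socket.AF_INET6, socket.SOCK_DGRAM) as sock:
        # Send the query out of one specific interface, link-local scope only.
        sock.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_MULTICAST_IF,
                        socket.if_nametoindex(iface))
        sock.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_MULTICAST_HOPS, 1)
        sock.settimeout(timeout)
        sock.sendto(("WHO-HAS " + service).encode(), (GROUP, PORT))
        try:
            reply, sender = sock.recvfrom(1024)  # unicast answer from whoever runs it
            return sender[0], reply.decode()
        except socket.timeout:
            return None

print(discover("printer"))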

Dynamic DNS lets you tell the DNS server who lives at what IP.
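
As a rough sketch of what that looks like in practice, using the dnspython library (the zone, key, server address and hostname are all made up, and the server has to be configured to accept TSIG-signed dynamic updates):

import dns.query
import dns.tsigkeyring
import dns.update

# Hypothetical TSIG key shared with the DNS server.
keyring = dns.tsigkeyring.from_text({"ddns-key.": "aGVsbG8gd29ybGQhCg=="})

update = dns.update.Update("example.org", keyring=keyring)
# Tell the server that host1.example.org now lives at this IPv6 address.
update.replace("host1", 300, "AAAA", "2001:db8::42")

response = dns.query.tcp(update, "192.0.2.53", timeout=5)
print(response.rcode())   # NOERROR (0) means the record was updated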

IPv6 used to have other features: being able to move from one network to another without dropping a connection (and sometimes without dropping a packet), for example. Extension headers were actually used to add features to the protocol on the fly. Packet fragmentation was eliminated by having per-connection MTUs. All routing was hierarchical, requiring routers to examine at most three bytes. Encryption was mandated, ad hoc unless otherwise specified. Between the ISPs, the NAT-is-all-you-need lobbyists and the NSA, most of the neat stuff got ripped out.

IPv6 still does far, far more than just add addresses and simplify routing (reducing latency and reducing the memory requirements of routers), but it has been watered down repeatedly by people with an active interest in everyone else being able to do less than them.

I say roll back the protocol definition to where the neat stuff existed and let the security agencies stew.

Comment What is wrong with SCTP and DCCP? (Score 4, Interesting) 84

These are well-established, well-tested, well-designed protocols with no suspect commercial interests involved. QUIC solves nothing that hasn't already been solved.
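
For what it's worth, SCTP is already reachable from stock Python on a Linux box with kernel SCTP support; a minimal sketch (the address and port are placeholders, so the connect is only illustrative):

import socket

# One-to-one style SCTP socket, opened the same way you would open a TCP one.
# Needs a kernel with SCTP support and a Python build that exposes IPPROTO_SCTP.
with socket.socket(socket.AF_INET6, socket.SOCK_STREAM, socket.IPPROTO_SCTP) as s:
    s.connect(("2001:db8::1", 7))    # placeholder address and port
    s.sendall(b"hello over SCTP\n")
    print(s.recv(1024))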

If pseudo-open proprietary standards are de rigueur, then adopt the Scheduled Transfer Protocol and the Delay Tolerant Protocol. Hell, bring back TUBA, SKIP and any other obscure protocol nobody is likely to use. It's not like anyone cares any more.

Comment Re: Must hackers be such dicks about this? (Score 1) 270

He claimed he could hack the plane. This was bad and the FBI had every right to determine his motives, his actual capabilities and his actions.

The FBI fraudulently claimed they had evidence a crime had already taken place. We know it's fraudulent because if they did have evidence, the guy would be questioned whilst swinging upside down over a snake pit. Hey, the CIA and Chicago have black sites; the FBI is unlikely to want to miss out. Anyway, they took his laptop, not him, which means they lied and attempted to pervert the course of justice. That's bad, unprofessional and far, far more dangerous. The researcher could have killed himself and everyone else on his plane. The FBI, by using corrupt practices, endangers every aircraft.

Comment Re: Must hackers be such dicks about this? (Score 1) 270

Did the FBI have the evidence that he had actually hacked a previous leg of the flight, or did they not?

If they did not, if they knowingly programmed a suspect with false information, they are guilty of attempted witness tampering through false memory syndrome. There is a lot of work on this: you can program anyone to believe they've done anything, even when the evidence that nothing was done at all is right in front of them. Strong minds make no difference; in fact, they're apparently easier to break.

Falsifying the record is self-evidently a failure of restraint.

I have little sympathy for the researcher. This kind of response has been commonplace since 2001, and it wasn't exactly infrequent before then; slow learners have no business doing science or engineering.

Nor have I any sympathy for the airlines. It isn't hard to build a secure network where the security augments function rather than simply taking up overhead. The same is true of insecure car networks. The manufacturers of computerized vehicles should be given a sensible deadline (say, next week Tuesday) to have fully tested and certified patches installed on all vulnerable vehicles.

Failure should result in fines of ((10 x vehicle worth) + (average number of occupants x average fine for wrongful death)) x number of vehicles in service, compounding at 15% annual interest for every year the manufacturer delays.
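
A back-of-the-envelope sketch of that formula, with obviously made-up inputs:

def proposed_fine(vehicle_worth, avg_occupants, avg_wrongful_death_fine,
                  vehicles_in_service, years_delayed, annual_rate=0.15):
    # ((10 x vehicle worth) + (avg occupants x avg wrongful-death fine))
    # x vehicles in service, compounding at 15% per year of delay.
    per_vehicle = 10 * vehicle_worth + avg_occupants * avg_wrongful_death_fine
    base = per_vehicle * vehicles_in_service
    return base * (1 + annual_rate) ** years_delayed

# Placeholder numbers: $30k vehicles, 1.5 occupants, $1M per death,
# 250k vehicles on the road, two years of foot-dragging.
print(f"{proposed_fine(30_000, 1.5, 1_000_000, 250_000, 2):,.0f}")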

Comment Re:What's the problem? (Score 1) 208

p-values are not the probabilities people want. What people would like is the probability that one hypothesis is correct compared to another, but that is not what a p-value gives. Because people ignore that gap and misinterpret p-values, they have become such a problem; that's why they are being banned. Many experiments with acceptable p-values (p < 0.05) are not reproducible.

Actually, the inventor of p-values never intended them as a test, only as an indication that something is perhaps worth further investigation.

p-values tell you how frequently, if you collected data under the current model, you would get data more extreme than the data at hand. p < 0.01 means you would get such an "outlier" in only 1% of cases. But it assumes that the model itself is correct: it varies the data, not the model!
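
A toy illustration of exactly that definition (my own example, nothing more): simulate datasets under the assumed model and count how often the statistic comes out at least as extreme as the observed one.

import numpy as np

rng = np.random.default_rng(0)

observed_mean = 0.5   # made-up observed sample mean
n = 25                # made-up sample size

# Assumed ("null") model: the data are N(0, 1); the statistic is the mean.
sims = rng.normal(loc=0.0, scale=1.0, size=(100_000, n)).mean(axis=1)

# Fraction of model-generated datasets at least as extreme as what we observed.
p_value = np.mean(np.abs(sims) >= abs(observed_mean))
print(f"p = {p_value:.4f}")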

Instead, what should be done is to compare one model against another, given the data we have. Bayes factors do that, and they should be used and taught.

The problem arose because the social sciences do not have proper, meaningful models that can be compared. So they have resorted to techniques that do not require specifying models (or alternatives) rigorously. In the physical sciences, you can precisely write down a model for a planetary system with two planets and one with three planets, and the Bayes factor between them will be meaningful.
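
As a toy illustration of a Bayes factor (again my own example, not from any particular paper): two fully specified models for the same data, compared by the ratio of their marginal likelihoods. Model A fixes the mean at zero; model B gives the mean a prior and integrates it out numerically.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.normal(loc=0.5, scale=1.0, size=25)      # made-up data

# Marginal likelihood of model A: data ~ N(0, 1), no free parameters.
ml_a = np.prod(stats.norm.pdf(data, loc=0.0, scale=1.0))

# Marginal likelihood of model B: data ~ N(mu, 1) with a N(0, 1) prior on mu,
# integrated over a grid (a crude but transparent quadrature).
mu_grid = np.linspace(-5, 5, 2001)
dmu = mu_grid[1] - mu_grid[0]
prior = stats.norm.pdf(mu_grid, loc=0.0, scale=1.0)
like = np.array([np.prod(stats.norm.pdf(data, loc=m, scale=1.0)) for m in mu_grid])
ml_b = np.sum(like * prior) * dmu

print(f"Bayes factor B/A = {ml_b / ml_a:.2f}")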

Comment Re:Because K9 sucks like most (Score 1) 179

Exchange is a given in many a corporate environment.
Not for technical reasons, but for legal and business reasons.
Choices like Microsoft and Oracle are almost a default for two reasons:
- You can have a nice Platinum contract for same-business-day support.
- You have a target for your lawyers if things go wrong.* **

Neither of these is provided by the kind of project you mention.

*) I'm talking from Europe, it's probably worse in the USA.
**) I'm talking something that can absorb a €100+ million damage claim.

Comment Re:The real extinction (Score 5, Informative) 93

Try these?

  • Firestone RB, West A, Kennett JP et al. (October 2007). "Evidence for an extraterrestrial impact 12,900 years ago that contributed to the megafaunal extinctions and the Younger Dryas cooling". Proceedings of the National Academy of Sciences of the United States of America 104 (41): 16016–21. Bibcode:2007PNAS..10416016F. doi:10.1073/pnas.0706977104. PMC 1994902. PMID 17901202.
  • Loarie, Scott R.; Duffy, Philip B.; Hamilton, Healy; Asner, Gregory P.; Field, Christopher B.; Ackerly, David D. (2009). "The velocity of climate change". Nature 462 (7276): 1052–1055. Bibcode:2009Natur.462.1052L. doi:10.1038/nature08649. PMID 20033047.
  • Steadman, D. W. (1995). "Prehistoric extinctions of Pacific island birds: biodiversity meets zooarchaeology". Science 267 (5201): 1123–1131. Bibcode:1995Sci...267.1123S. doi:10.1126/science.267.5201.1123.
  • Steadman, D. W.; Martin, P. S. (2003). "The late Quaternary extinction and future resurrection of birds on Pacific islands". Earth Science Reviews 61 (1–2): 133–147. Bibcode:2003ESRv...61..133S. doi:10.1016/S0012-8252(02)00116-2.

and

  • S.L. Pimm, G.J. Russell, J.L. Gittleman and T.M. Brooks, The Future of Biodiversity, Science 269: 347–350 (1995)
  • Doughty, C. E., A. Wolf, and C. B. Field (2010). "Biophysical feedbacks between the Pleistocene megafauna extinction and climate: The first human-induced global warming?". Geophys. Res. Lett. 37: L15703. doi:10.1029/2010GL043985.
  • Pitulko, V. V., P. A. Nikolsky, E. Y. Girya, A. E. Basilyan, V. E. Tumskoy, S. A. Koulakov, S. N. Astakhov, E. Y. Pavlova, and M. A. Anisimov (2004), The Yana RHS site: Humans in the Arctic before the Last Glacial Maximum, Science, 303(5654), 52–56, doi:10.1126/science.1085219
  • Barnosky, Anthony D.; Matzke, Nicholas; Tomiya, Susumu; Wogan, Guinevere O. U.; Swartz, Brian; Quental, Tiago B.; Marshall, Charles; McGuire, Jenny L.; Lindsey, Emily L.; Maguire, Kaitlin C.; Mersey, Ben; Ferrer, Elizabeth A. (3 March 2011). "Has the Earth’s sixth mass extinction already arrived?". Nature 471 (7336): 51–57. Bibcode:2011Natur.471...51B. doi:10.1038/nature09678.
  • Zalasiewicz, Jan; Williams, Mark; Smith, Alan; Barry, Tiffany L.; Coe, Angela L.; Bown, Paul R.; Brenchley, Patrick; Cantrill, David; Gale, Andrew; Gibbard, Philip; Gregory, F. John; Hounslow, Mark W.; Kerr, Andrew C.; Pearson, Paul; Knox, Robert; Powell, John; Waters, Colin; Marshall, John; Oates, Michael; Rawson, Peter; Stone, Philip (2008). "Are we now living in the Anthropocene". GSA Today 18 (2): 4. doi:10.1130/GSAT01802A.1.
  • Vitousek, P. M.; Mooney, H. A.; Lubchenco, J.; Melillo, J. M. (1997). "Human Domination of Earth's Ecosystems". Science 277 (5325): 494–499. doi:10.1126/science.277.5325.494.
  • Wooldridge, S. A. (9 June 2008). "Mass extinctions past and present: a unifying hypothesis". Biogeosciences Discuss (Copernicus) 5 (3): 2401–2423. doi:10.5194/bgd-5-2401-2008.
  • Jackson, J. B. C. (Aug 2008). "Colloquium paper: ecological extinction and evolution in the brave new ocean" (Free full text). Proceedings of the National Academy of Sciences of the United States of America 105 (Suppl 1): 11458–11465. Bibcode:2008PNAS..10511458J. doi:10.1073/pnas.0802812105. ISSN 0027-8424. PMC 2556419. PMID 18695220.
  • Elewa, Ashraf M. T. "14. Current mass extinction". In Elewa, Ashraf M. T. Mass Extinction. pp. 191–194. doi:10.1007/978-3-540-75916-4_14.
  • Mason, Betsy (10 December 2003). "Man has been changing climate for 8,000 years". Nature. doi:10.1038/news031208-7.
  • MacPhee and Marx published their hyperdisease hypothesis in 1997. "The 40,000-year plague: Humans, hyperdisease, and first-contact extinctions." In S. M. Goodman and B. D. Patterson (eds), Natural Change and Human Impact in Madagascar, pp. 169–217, Smithsonian Institution Press: Washington DC.
  • Lyons, S. Kathleen; Smith, Felisa A.; Wagner, Peter J.; White, Ethan P.; Brown, James H. (2004). "Was a ‘hyperdisease’ responsible for the late Pleistocene megafaunal extinction?". Ecology Letters 7 (9): 859–868. doi:10.1111/j.1461-0248.2004.00643.x.
  • Graham, R. W. and Mead, J. I. 1987. Environmental fluctuations and evolution of mammalian faunas during the last deglaciation in North America. In: Ruddiman, W. F. and H.E. Wright, J., editors. North America and Adjacent Oceans During the Last Deglaciation. Volume K-3. The Geology of North America, Geological Society of America
  • Martin P. S. (1967). Prehistoric overkill. In Pleistocene extinctions: The search for a cause (ed. P.S. Martin and H.E. Wright). New Haven: Yale University Press. ISBN 0-300-00755-8.
  • Lyons, S.K., Smith, F.A., and Brown, J.H. (2004). "Of mice, mastodons and men: human-mediated extinctions on four continents". Evolutionary Ecology Research 6: 339–358. Retrieved 18 October 2012.

Wikipedia also lists a few books if you are interested.

Comment Re:How have we ruled out measurement or model erro (Score 2) 117

ad 3: Plenty of people are working on modified models, such as alternatives to general relativity. There are papers coming out every week. https://en.wikipedia.org/wiki/...
ad 2: Errors in measurement can be somewhat excluded as a possibility because many different measurements, looking at different aspects and scales, find the same result. Wikipedia lists galaxy rotation curves, velocity dispersions of galaxies, galaxy clusters and gravitational lensing, the cosmic microwave background, sky surveys and baryon acoustic oscillations, Type Ia supernova distance measurements, the Lyman-alpha forest, and structure formation. See also my other post.

Comment Re:If the only interaction was gravity (Score 1) 117

Then wouldn't the dark matter clouds just collapse in on themselves and form singularities as there would be no counterforce to gravitational attraction?

Gravity is the attraction of masses. The reason that things don't pass through each other is something else: it involves the electric repulsion of electrons and protons, but a more detailed answer is here. Dark matter, lacking that electromagnetic interaction, cannot collide or radiate away energy, so instead of collapsing to a point it keeps orbiting in an extended halo.

Comment Re:Dark matter doesn't exist. (Score 5, Interesting) 117

In a rush to tailor the evidence to a flawed theory, dark matter was invented by human minds in an attempt to save a beloved theory. We need to cast off the shackles of what we want to be true, and look at the evidence in a cold, analytical light. When this is done, I'm quite certain that there will be no need for the magical fairy dust matter that is there but isn't there.

The term dark matter is just the name for a discrepancy. For example, the galaxy rotation speed is 220 km/s at our position in the galaxy (8 kpc), and stays the same out to 30 kpc. But the number of stars, which are the mass we can see, declines exponentially. So some mass (10x more than what we see) must be there to keep the rotation fast, otherwise it would be like the solar system, with Pluto orbiting the sun much more slowly than Mars (a toy rotation-curve comparison is sketched below).
Then in clusters we see gravitational lensing: the cluster's mass bends the light of objects behind it, and we can infer that mass. It is much more than we see in stars and gas.
In the cosmic microwave background, which is a relic from the last time electrons and photons interacted very strongly, 380,000 years after the "Big Bang", we can estimate the density of the universe at that time. There, too, the matter that interacts with photons is only a fraction of the total matter.
All of these *different, independent* probes, and several others, point to the same ratio of total matter to electromagnetically-interacting matter.
Now you can take the state of the Universe at 380,000 years of age, with its total matter, electromagnetically-interacting matter and photon budget, and evolve it following general relativity. People find that the clustering of galaxies, their total number and their sizes can be reproduced quite well. This is not possible without putting that additional, non-electromagnetically-interacting matter there. In this experiment you can also learn something about how weak the electromagnetic interaction must be (for example, a large population of neutrinos can be excluded, because they interact too strongly, smoothing out the structures).
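
Here is the toy rotation-curve comparison mentioned above (my own illustrative value for the visible mass; only the shape of the argument matters):

import numpy as np

G = 4.30e-6  # gravitational constant in kpc * (km/s)^2 / Msun

def v_keplerian(r_kpc, m_enclosed_msun):
    # Circular speed if essentially all the mass sits inside radius r.
    return np.sqrt(G * m_enclosed_msun / r_kpc)

def v_flat(r_kpc, v0_kms=220.0):
    # Observed behaviour: roughly constant speed out to large radii.
    return np.full_like(r_kpc, v0_kms)

r = np.linspace(8.0, 30.0, 12)   # from the Sun's radius out to 30 kpc
m_visible = 9.0e10               # rough visible mass in Msun (assumed for the demo)
for ri, vk, vf in zip(r, v_keplerian(r, m_visible), v_flat(r)):
    # Enclosed mass implied by the flat curve: M(<r) = v^2 r / G
    m_implied = vf**2 * ri / G
    print(f"r={ri:5.1f} kpc  Keplerian={vk:6.1f} km/s  observed~{vf:.0f} km/s  "
          f"implied M(<r)={m_implied:.2e} Msun")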

As you say, another path is to modify the theories of GR, and every week there are papers explaining dark matter with alternative theories, sometimes in combination with dark energy. This is a path that many people are working on. If you see the term "Dark Matter" as the *name of the problem*, namely the discrepancy between observations and normal matter + GR, then there is no conflict; the name does not say how to solve it. Dark matter is real, because the discrepancy exists. And the search for particles is also not concluded yet: large, cold objects have been proposed (e.g. brown dwarfs, Jupiter-size planets), as well as new fundamental particles (neutrinos, as well as as-yet-unobserved particles like the sterile neutrino, or totally new particles from some theories of supersymmetry). Some of them have been excluded -- for example, it cannot be stellar-mass black holes, because of the number of binary star systems we observe in the outer parts of the Milky Way; those would be destroyed by frequent interactions with such a large population of masses. The upgraded LHC will try to produce more particles, and there is a real chance it will produce (or exclude) a specific candidate dark matter particle predicted (proposed) by supersymmetry.

Believe me, astronomers really do not like the idea of dark matter, and have been fighting it for decades. But the evidence from many different experiments is there. We still don't know what it is, whether the laws have to be changed or additional particles have to be put there (and which ones). But the range of possibilities is getting smaller and smaller. And putting particles there that do not interact except through gravity has been very successful in explaining various observations. I used to be cautious because in principle you could just arbitrarily put mass wherever you need it -- but if you start from the Big Bang and only use general relativity, then the created galaxies with dark matter in/around them, or galaxy collisions like the one in this article, just come out -- there is no choice involved here except for the density of dark matter in the early universe.

Many different observations, proposed resolutions in new theories, proposed particles and detection experiments are listed on the Wikipedia page https://en.wikipedia.org/wiki/...

For this particular observation, you should note that they observed 60 collisions that behave the way we think dark matter behaves (no collisions, only gravity), and one is odd. That should tell you to be cautious: perhaps something is peculiar about this system or about the observation.
