Regarding the Wikipedia article claiming a historical 0.0-0.2 mm/year range over the last 2000 years: that probably needs to be updated with more recent research.
The well-preserved biological remains on the fish tank wall allow us to estimate an RSL rise of 40 ± 10 cm at Frejus since Roman times
400 mm / 2000 years = 0.2 mm per year on average over the last 2000 years. (And, as documented in this paper, there are other papers that claim higher numbers.)
(Slashdot seems to make a mess out of the hyphen in the link - the paper can be found as doi 10.1002/gea.21444 )
It's trivial to secretly record someone using a mobile phone - I can hold it in my hand, down by my waist, at an angle.
It's extremely difficult to record someone secretly using a head-mounted camera - I must look directly at them the whole time.
The warming data clearly indicates that the rate of temperature rise over the last 50 years is far higher than in any other period in history
Why do you believe that? It's not even true for the last 150 years - even less so if we include the rest of the Holocene.
Q: Do you agree that according to the global temperature record used by the IPCC, the rates of global warming from 1860-1880, 1910-1940 and 1975-1998 were identical?
A: So, in answer to the question, the warming rates for all 4 periods are similar and not statistically significantly different from each other.
- Phil Jones, director of the Climatic Research Unit (CRU)
Until a few decades ago it was generally thought that all large-scale global and regional climate changes occurred gradually over a timescale of many centuries or millennia, scarcely perceptible during a human lifetime. The tendency of climate to change relatively suddenly has been one of the most surprising outcomes of the study of earth history, specifically the last 150,000 years (e.g., Taylor et al., 1993). Some and possibly most large climate changes (involving, for example, a regional change in mean annual temperature of several degrees Celsius) occurred at most on a timescale of a few centuries, sometimes decades, and perhaps even just a few years. The decadal-timescale transitions would presumably have been quite noticeable to humans living at such times, and may have created difficulties or opportunities (e.g., the possibility of crossing exposed land bridges, before sea level could rise).
(This post does not question AGW. It does question strange statements regarding our current climate that have no scientific basis)
This is nowhere near the worst drought in California's recorded history.
Through studies of tree rings, sediment and other natural evidence, researchers have documented multiple droughts in California that lasted 10 or 20 years in a row during the past 1,000 years -- compared to the mere three-year duration of the current dry spell. The two most severe megadroughts make the Dust Bowl of the 1930s look tame: a 240-year-long drought that started in 850 and, 50 years after the conclusion of that one, another that stretched at least 180 years.
Unless, of course, those proxies are unreliable.
I guess most of us (yes, I'm Swedish) find it much more plausible that the decision came after the court had told the prosecutor that she did not fulfil the proportionality requirement in handling the investigation.
That happened this Tuesday.
On Tuesday the Supreme Court (Högsta domstolen) decided that the Prosecutor-General (Riksåklagaren) must submit a written response in the case. In it, the Prosecutor-General is to explain how the investigation will proceed - especially regarding the principle of proportionality.
According to that principle, the various interests must be weighed against each other, and measures must not go beyond what is necessary for the purpose. For example, society's demand for security may be weighed against the individual's right to privacy.
Marianne Ny had no choice after this but to finally do what Swedish prosecutors do all the time - question people abroad.
(Looking at your post history on the subject of Assange your bias is extremely visible)
I suspect that we could persuade those caches to flush to RAM simply by exhausting the ways available for that address's set - if the cache is set-associative. Of course modern processors have multiple levels of cache, which makes it harder.
This is sort of self-contradictory, so I don't really need to respond to it directly. I just want to point one thing out: I can't afford to work for any company as anything less than a C-level employee. It would be a salary cut from my current business.
Not to mention that I'd not like it.
An AC talking about balls. Pathetic.
Right. I didn't even bother responding to the taunts.
Coward really means coward. I am sorry for the folks who are afraid that their employer will take a dislike to what they post, but for them we have handles.
I can't say I'm happy about what's happened to Debian. Having Ubuntu as a commercial derivative really has been the kiss of death for it, not that there were not other problems. It strikes me that the kernel team has done better for its lack of a constitution and elections, and Linus' ability to tell someone to screw off. I even got to tell him to screw off when he was dumping on 'Tridge over Bitkeeper. Somehow, that stuff works.
IMO, don't create a happy inclusive project team full of respect for each other. Hand-pick the geniuses and let them fight. You get better code in the end.
This actually has something to do with why so many people hate Systemd. It turns out that Systemd is professional-quality work done by competent salaried engineers. Our problem with it is that we're used to beautiful code made by geniuses, going all the way back to DMR.
It really does look like Jomo did post this article, and it refers to another article of his.
What isn't to like about Ubuntu is that it's a commercial project with a significant unpaid staff. Once in a while I make a point of telling the unpaid staff that there really are better ways that they could be helping Free Software.
It's just that I object to folks who would be good community contributors being lured into becoming unpaid employees instead.
Say, how do you feel about idiots working for corporations contractually enmeshed with the US military-industrial-surveillance complex? Why no spittle-laced hate for them?
The GNU Radio project was funded in part by a United States intelligence agency. They paid good money and the result is under GPL. What's not to like?
Keep all of the idiots that want to work for a millionaire for nothing. Fire the others.
Anyone with sense has by now joined a non-profit project.
Compare-and-exchange and MFENCE would force a cache flush all the way to RAM and a global cache-line invalidation, wouldn't they? So they could potentially be used to hammer too.
Multi-threaded programs really do need those cache flushes to implement their interprocessor communications, don't they? It seems to me that they would be the ones most likely to hit this problem.