Comment So a 1st world programmer costs even more (Score 1) 130

So a 1st world programmer has to pay $1200 for their development kit while it's sold to 3rd world developers for $70 or even given away free.

Then the 3rd world developer is allowed to compete directly for work with the 1st world developer, as if they both lived in the 1st world country.

Products should be sold for the same price in both locations, since labor in the two places is forced to compete for work while living expenses are higher in the higher-GDP country.

Unless you just want a lot of unemployed people in the high-GDP country.

This is literally extracting wealth from wealthy countries leaving less to circulate in the economy.

Comment Re:Is it true? (Score 3, Informative) 86

I never saw that in the many years I was working primarily with C++ and a regular reader of the related newsgroups. When Bjarne did contribute in any forums I followed, he generally seemed direct and reasonable, and it was usually in the more advanced discussions about tricky areas or the future of the language.

Comment Re:Leave. (Score 1) 433

There is no way to trace what they did, no way to confirm their methods. Sadly the masses are not equipped to scrutinize the nonsense. [Steve A Morris, 2017-01-11]

You can trace what Hausfather et al. 2017 did by downloading the code they made freely available. You can confirm their methods by reading the full paper and following the links at the end, which lead to all the data they used. Interested members of the public can read or watch the background material they shared.

... they simply don't use 1/3 of the ARGO datasets because its data is "more ambiguous". Translation: "It doesn't fit our needs." [Lonny Eachus, 2017-01-11]

Read the paper to see if Lonny's "translation" is reasonable: "... Two of the three Argo near-SST records assessed, APDRC and H2008, agree well with the buoy-only and satellite-based records and suggest a cool bias in ERSSTv3b during the 2005-2015 period, when sufficient Argo data are available (Fig. 3). The RG2009 series is more ambiguous, with trends that are not significantly different (P > 0.05) from either ERSSTv3b or ERSSTv4. ..."

Lonny Eachus is wrong to claim that Hausfather et al. "simply don't use 1/3 of the ARGO datasets" (presumably a reference to RG2009). They used 3 independent Argo near-SST (near sea surface temperature) datasets, and reported the results from all 3 datasets. Anyone who reads the full paper will see that they mention RG2009 a total of 17 times while reporting the results of using that dataset.

... the study's argument is rather weak. ARGO data has best coverage, best instruments. Yet they arbitrarily throw out 1/3 of the ARGO data sets because they don't agree with their preconceptions. ... In sum, it appears that this paper committed the same likely error as Karl et al. That is to say: ignoring arguably better data because it doesn't fit their preconceptions. [Lonny Eachus, 2017-01-11]

Wrong. Hausfather et al. didn't "throw out" or "ignore" 1/3 of the Argo datasets. Look at figure 3 (backup). They show the results of all three Argo datasets, including four instances using the RG2009 dataset which Lonny baselessly accuses them of "arbitrarily throwing out" and "ignoring".

Paper: (1) "We constructed our own data set from other data sets." (2) "Oops. But we left some out." (3) "We find MOST of the data we used does not match our new contrived data set. So we will ignore it." [Lonny Eachus, 2017-01-11]

Again, Hausfather et al. didn't "leave out" or "ignore" the RG2009 dataset. Look at figure 4 (backup). They show the results of all 3 Argo datasets, including the RG2009 dataset which Lonny baselessly accuses them of "ignoring".

Figure 4 examines four composite SST records: ERSSTv4, ERSSTv3b, HadSST3, and COBE-SST. These composite SST records are compared to instrumentally homogenous datasets (which just means "from a single type of instrument"): buoys, CCI (satellite), and all three Argo near-SST datasets. Figure 4 subtracts all those datasets from each composite SST record, then calculates the trend. If a differenced trend includes "zero" inside its 95% confidence interval, scientists say that particular instrument's trend agrees with that particular composite SST record's trend at the 95% confidence level.
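The agreement test described above can be sketched in a few lines of Python. This is not the paper's actual code, just a minimal illustration with made-up data: fit an ordinary least squares trend to a differenced series and check whether zero falls inside the ~95% confidence interval on the slope.

```python
import math

def trend_with_ci(series, z=1.96):
    """OLS trend per time step, with an approximate 95% confidence interval."""
    n = len(series)
    tbar = (n - 1) / 2
    ybar = sum(series) / n
    sxx = sum((i - tbar) ** 2 for i in range(n))
    sxy = sum((i - tbar) * (y - ybar) for i, y in enumerate(series))
    slope = sxy / sxx
    resid = [y - (ybar + slope * (i - tbar)) for i, y in enumerate(series)]
    se = math.sqrt(sum(r * r for r in resid) / ((n - 2) * sxx))
    return slope, slope - z * se, slope + z * se

# A "differenced" record (instrument minus composite) that is pure noise:
# the confidence interval on its trend spans zero, so the two underlying
# trends agree at roughly the 95% confidence level.
noise = [0.05 * (-1) ** i for i in range(100)]
slope, lo, hi = trend_with_ci(noise)
print(lo <= 0.0 <= hi)    # True: trends agree

# The same noise plus a residual drift: the interval excludes zero,
# so the trends disagree at that confidence level.
drift = [0.001 * i + 0.05 * (-1) ** i for i in range(100)]
slope2, lo2, hi2 = trend_with_ci(drift)
print(lo2 <= 0.0 <= hi2)  # False: trends differ
```

A dataset like RG2009 being "more ambiguous" corresponds to the first case holding against several different composite records at once: zero stays inside the interval no matter which composite you subtract.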

For both examined timespans, buoys and CCI agree with ERSSTv4 at the 95% confidence level, and disagree with all the other composite SST records. The H2008 Argo dataset disagrees with all composite SST records because it shows more warming than all composite SST records, although ERSSTv4 is the closest match. The APDRC Argo dataset agrees with ERSSTv4 and COBE-SST. The RG2009 Argo dataset (which Lonny wrongly claims they "ignored") is in fact the last dataset shown in figure 4. RG2009 agrees with all four composite SST records at the 95% confidence level. That's what Hausfather et al. meant when they said RG2009 is "more ambiguous".

Paper: (1) "We constructed our own data set from other data sets." (2) "Oops. But we left some out." (3) "We find MOST of the data we used does not match our new contrived data set. So we will ignore it." [Lonny Eachus, 2017-01-11]

Presumably Lonny's "new contrived data set" is ERSSTv4, which Jane/Lonny has complained about ad nauseam. Look at figure 4 again. Buoys and CCI satellite datasets agree with ERSSTv4. The Argo APDRC and RG2009 datasets agree with ERSSTv4, but they also agree with other composite SST records so those results are more ambiguous. The Argo H2008 dataset disagrees with all composite SST records because H2008 shows more warming than all of them including ERSSTv4, though ERSSTv4 is the best match.

In other words, they didn't ignore any data, and most of the data matches ERSSTv4. In fact, the RG2009 dataset which Lonny wrongly claims they "simply don't use" is "more ambiguous" precisely because it does match ERSSTv4 (and all the other tested composite SST records). The Argo H2008 dataset is the only one which doesn't have a trend matching ERSSTv4 at the 95% confidence level (because H2008 shows more warming than ERSSTv4) and it also shows that ERSSTv4 is a closer match than any other tested composite SST record.

That really is what they did, though. As I described. They transformed data in ways that are not 100% clear. [Lonny Eachus, 2017-01-11]

No, what Lonny described really isn't what Hausfather et al. 2017 did. In fact, it's hard to imagine how Lonny's description could have been more wrong. See above. Or just read the paper to see that they didn't ignore data and made their methodology 100% clear by making their code freely available.

I find it amusing that climate scientists are prone to argue "surface temp. records are better than satellite". But then claim that the satellite record is "better" than ARGO floats. Just as they did with satellites, climate scientists crowed about their new ARGO floats. "Best thing ever." Then, just like satellites, when the new data does not fit their preconceptions, they just omit it. That has been a very noticeable pattern in the field of climate science. ... a pattern of behavior is a pattern of behavior. Always excuses to omit inconvenient data. [Lonny Eachus, 2017-01-11]

Again, Lonny's delusional narrative where "scientists crowed" about satellite data before "omitting" them is completely baseless. And even though he probably won't ever admit it, deep down Lonny should realize that his new delusional narrative about Hausfather et al. "omitting" data was also just shown to be completely baseless.

Again, Lonny's just projecting. Jane/Lonny previously cited ocean heat content (OHC) measurements based on Argo and satellite data, until I showed him that those data don't support his incorrect claim that there hasn't been any global warming for 18 years. For years, Lonny has shown a pattern of behavior where he ignores the "best measure" of global warming: OHC data from Argo, which reveal ~90% of Earth's added heat. Maybe Lonny omits those data because they don't fit his preconceptions?

Comment YES. (Score 1) 105

Because until we get unmolested, pure Android OS installs that let us remove all the baked-in crap the carriers and phone makers try to sneak in there, Android users will need a way to get a smooth and clean Android experience.

Comment Re:Wind and Solar are Environmental Disasters (Score 1) 409

"The most reliable estimate of the cost of decommissioning [a nuclear power plant] is 10-15 percent of the construction cost, contrary to some highly inflated estimates ... Modern serious studies of the disposal problem indicate that satisfactory isolation is technologically feasible, even for the long term." So wrote MIT nuclear engineering professor David Rose in the November 1985 issue of The Bulletin of the Atomic Scientists.

How misguided that view seems now, with the advantage of decades of experience. The Yankee Nuclear Power Station in Rowe, Massachusetts, took 15 years to decommission, or five times longer than was needed to build it. And decommissioning the plant, constructed early in the 1960s for $39 million, cost $608 million. The plant's spent fuel rods are still stored in a facility on-site, because there is no permanent disposal repository to put them in.


Look, it's late and I'm tired, but I've had this exact conversation many, many times. I'm not just "spewing" out random crap.

That plant was supposed to cost $6 million to decommission. Adjusting for inflation, that estimate would have been about $39 million (the same as it cost to build, but in inflated dollars, so really just a nice coincidence). The actual $608 million is roughly $570 million more than estimated, paid for in utility bills along the way.
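That overrun arithmetic can be checked in a few lines (the figures are the comment's own; the $6M-to-$39M inflation adjustment is taken as given rather than recomputed from CPI tables):

```python
# Decommissioning cost overrun for the Yankee Rowe plant, per the comment.
estimate_1960s = 6_000_000       # original decommissioning estimate
estimate_adjusted = 39_000_000   # that estimate in inflation-adjusted dollars
actual_cost = 608_000_000        # actual decommissioning cost

overrun = actual_cost - estimate_adjusted
ratio = actual_cost / estimate_adjusted
print(overrun)           # 569000000: roughly $570 million over the estimate
print(round(ratio, 1))   # 15.6: about 15x the inflation-adjusted estimate
```

That ratio is where the "over an order of magnitude" claim later in the thread comes from.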


The company wants to try out the idea for the first time on the northwest coast of England, at the notorious nuclear dumping ground at Sellafield, which holds the world's largest stock of civilian plutonium. At close to 120 tons, it stores more plutonium from reactors than the U.S. and Russia combined.

While most of the world's civilian plutonium waste is still trapped inside highly radioactive spent fuel, much of that British plutonium is in the form of plutonium dioxide powder. It has been extracted from spent fuel with the intention of using it to power an earlier generation of fast reactors that were never built. This makes it much more vulnerable to theft and use in nuclear weapons than plutonium still held inside spent fuel, as most of the U.S. stockpile is.


By 2025, Germany aims to get as much as 45 percent of its power from renewables. The U.S. should too.
Germany has about a quarter of our population, but total US GDP is 16.77 trillion dollars while Germany's is only 3.77 trillion dollars.
We can do this and almost permanently cap the price of coal and oil.


Really we are quibbling.

I think we both agree a smart mix of alternative energy, nuclear energy, and even coal makes sense for the near future (say, through 2045), and that increasing the percentage of alternative energy will reduce consumption and prices of fossil fuels.

I showed that breeder reactors produce plutonium dioxide which must be secured against terrorists and backed that up with the actual experience of a breeder reactor in England.
I also showed that decommissioning costs for nuclear plants are underestimated by over an order of magnitude.

Comment Re:Wind and Solar are Environmental Disasters (Score 2) 409

The problem is that decommissioning nuclear power plants is coming in at 10 times more expensive than estimated 30 years ago. And since private companies can't afford those costs, you end up paying them in higher rates or higher taxes.

We also need at least one breeder reactor, which would reduce nuclear waste to 1% of the volume and simultaneously reduce the lifespan of the radioactive waste significantly:
"removing the transuranics from the waste eliminates much of the long-term radioactivity of spent nuclear fuel."

Such a reactor would need very high security (perhaps to the extent of being run by the government on a large military base), because plutonium is one output. You can make nuclear weapons from that. BUT, you could also shuttle it off the planet to fuel long-range space exploration as fast as we make it, to reduce that risk.

On your other point...

Solar is now cheaper than wind.

Solar is closing in on price parity with the likes of coal: full-cycle, unsubsidized costs of about 13 cents per kilowatt-hour, versus 12 cents for advanced coal plants.

But there will be cases where we need coal (with proper scrubbing, which didn't start for many plants until 2015 and which may be backed out now) until we get very good batteries. And lots of them. If every consumer has a "power wall" of some kind with 8 hours of electrical storage, and power companies have lots of molten salt (or whatever) to store power for nighttime and cloudy days, then we'll need no coal. But until then, we'll need some coal.

But less.

And the price for coal (and oil) is set by the most expensive coal to mine (or oil to pump).
Say you can mine 90% of coal for $36 a ton and the last 10% for $46 a ton. Then the price of coal will be $46 a ton. So if you can eliminate just 10% of the demand for coal, the price of coal (and your electric cost per kWh) will drop about 22%.
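That marginal-pricing argument can be sketched as a toy model (the tier numbers are the comment's illustration, not real mining costs):

```python
# Toy marginal-pricing model: the market price is set by the cost of the
# most expensive supply tier still needed to meet demand.
# Each tier is (cost per ton, share of total supply).
tiers = [(36.0, 0.90), (46.0, 0.10)]

def clearing_price(demand_fraction, tiers):
    """Cost of the marginal ton once `demand_fraction` of supply is needed."""
    supplied = 0.0
    for cost, share in sorted(tiers):
        supplied += share
        if supplied >= demand_fraction:
            return cost
    raise ValueError("demand exceeds supply")

before = clearing_price(1.00, tiers)  # all supply needed -> $46/ton
after = clearing_price(0.90, tiers)   # 10% of demand removed -> $36/ton
drop = (before - after) / before
print(before, after, round(drop * 100, 1))  # 46.0 36.0 21.7
```

So trimming the last 10% of demand knocks out the expensive tier, and the price falls by 10/46, about 22%, matching the figure above.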

Comment Free software assistant... already exists (Score 3, Informative) 90


They've got an RPi image you can download, slap on a card, and be up and running with a USB mic and something to handle the audio out.

Seems to me like the FSF should pay more attention to what is already going on.

Submission + - France to review food whitener additive, titanium dioxide, for health risks

Eloking writes: The French government has ordered a review of the safety of titanium dioxide as a food additive after a scientific study released on Friday found health effects in animals that consumed the substance.

Titanium dioxide is widely used in industry as a whitener, notably for paint. It is an ingredient in some foods such as sweets and known as additive E171.

France's National Institute for Agricultural Research (INRA) and partners in a study on oral exposure to titanium dioxide had shown for the first time that E171 crosses the intestine wall in animals to reach other parts of the body, INRA said.

Submission + - The 32-Bit Dog Ate 16 Million Kids' CS Homework

theodp writes: Code.org explains in a blog post that it encountered technical difficulties Friday that temporarily made the work of 16 million K-12 students who have used the nonprofit's Code Studio offering disappear. CTO Jeremy Stone gave the kids an impromptu lesson on the powers of two with his explanation of why The Cloud ate their homework: "This morning, at 9:19 am PST, coding progress by students stopped saving on Code Studio, and the issue briefly brought the Code Studio site down. We brought the site back up shortly thereafter but student progress was still not being saved, and instead students saw an outdated message about the Hour of Code from December. [...] The way we store student coding activity is in a table that until today had a 32-bit index. What this means is that the database table could only store 4 billion rows of coding activity information. We didn’t realize we were running up to the limit, and the table got full. We have now made a new student activity table that is storing progress by students. With the new table, we are switching to a 64-bit index which will hold up to 18 quintillion rows of information. On the plus side, this new table will be able to store student coding information for millions of years. On the down side, until we’ve moved everything over to the new table, some students’ code from before today may temporarily not appear, so please be patient with us as we fix it."
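The powers of two in that explanation are easy to check. In the sketch below, the per-student write rate is an invented illustration, not Code.org's actual number:

```python
# Capacity of the index sizes mentioned in the post (unsigned interpretation).
rows_32 = 2 ** 32   # ~4.29 billion rows: the limit the old table hit
rows_64 = 2 ** 64   # ~18.4 quintillion rows: the new table's limit

print(rows_32)  # 4294967296
print(rows_64)  # 18446744073709551616

# At the stated scale (16 million students), how long would 64 bits last if
# each student saved, say, 100 rows of progress per day?
rows_per_day = 16_000_000 * 100
years = rows_64 / rows_per_day / 365
print(years > 1_000_000)  # True: "millions of years", as the CTO said
```

Even at that generous write rate, a 64-bit index lasts tens of millions of years, while a 32-bit one fills in under a decade.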
