
Comment Re:Related? (Score 2) 138

That is a gross oversimplification. Receiving a dose of 200 µSv via exposure to something like X-rays is very different from being exposed to 200 µSv that includes particulate matter which will accumulate inside the body. The former is a one-time "hit"; the latter is much more likely to lead to cancer because the material can sit inside the body, slowly damaging DNA.

If you believe in the linear no-threshold model then it makes no difference whether the dose is received in a single hit or an extended time period.

Those who doubt LNT usually suspect a dose-response curve that goes in the opposite direction to what you are suggesting.

Particulate exposure could conceivably be worse for you due to the exposure being localised to one part of the body, but that has nothing to do with the timescale over which the dose is spread.
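To make the LNT point concrete, here is a toy sketch. The risk coefficient is the commonly cited ICRP estimate of roughly 5% excess lifetime cancer risk per sievert; treat both the figure and the calculation as illustrative assumptions, not a dosimetry tool. Under a linear model, splitting the same total dose over many months changes nothing:

```python
# Linear no-threshold (LNT) sketch: excess risk is assumed proportional to
# total dose, so the timescale over which the dose arrives does not appear
# anywhere in the calculation.

RISK_PER_SIEVERT = 0.05  # assumed coefficient (~ICRP estimate), illustrative only

def lnt_excess_risk(dose_sv: float) -> float:
    """Excess lifetime risk under LNT: linear in total dose, no time term."""
    return dose_sv * RISK_PER_SIEVERT

single_hit = lnt_excess_risk(200e-6)             # 200 uSv received all at once
spread_out = sum(lnt_excess_risk(200e-6 / 12)    # same total dose over 12 months
                 for _ in range(12))

# Identical (up to floating-point rounding) under LNT
assert abs(single_hit - spread_out) < 1e-12
```

Anyone arguing that a protracted dose is worse than an acute one is therefore implicitly rejecting LNT, not applying it.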

Sadly that XKCD chart and nonsense like the "banana equivalent dose" have spread a lot of misinformation about this.

The main issue with the concept of a "banana equivalent dose" is homeostasis of potassium levels, which again has nothing to do with any of the points above.

Comment Re:headline is misleading (Score 1) 528

The headline is sufficient for those who do not understand how the power grid works, and anyone who knows how the power grid works would not be misled by the headline.

I disagree.

Even though my bill says "100% wind" on it, and somewhere out there are windmill(s) generating as much electricity as my home consumes, the actual power consumed in my house might just as easily come from the coal plants up the highway. It's all on the same grid.

Fair enough: electricity is fungible, and it doesn't matter what powers what (if it is even possible to tell).

If you understand that, then it's obvious that "Power Every US Home With Renewables" means "Generate As Much Renewable Energy As All Homes Consume". What appears on the bills of those homeowners is irrelevant.

If you had said enough /power/ for all homes then I'd agree there too, but that is much more difficult than generating enough energy, because you have to deliver it reliably and match the demand curve. By only counting the energy you are saying two things:

1. That the non-domestic part of the grid will reserve enough spare capacity to cover any shortfall from renewables.

2. That you can dump an unlimited amount of energy onto the non-domestic part of the grid and still count it towards your target, even if it isn't actually needed at the time.
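The energy-versus-power distinction can be shown with a toy model (the numbers are invented for illustration): a flat renewable output sized to match total daily energy still falls short at the demand peak, so some other capacity has to cover the gap.

```python
# Stylised day split into six 4-hour blocks. Renewable output is flat and
# sized so that its *energy* over the day exactly equals total demand.

demand = [2, 2, 3, 5, 8, 4]        # GW in each 4-hour block (invented numbers)
renewable = [4, 4, 4, 4, 4, 4]     # GW, flat output

energy_demand = sum(d * 4 for d in demand)        # GWh over the day
energy_supplied = sum(r * 4 for r in renewable)   # GWh over the day
assert energy_supplied == energy_demand           # "100% renewable" by energy

# But power must balance in every block, not just on average:
shortfall = [max(d - r, 0) for d, r in zip(demand, renewable)]
print(shortfall)  # [0, 0, 0, 1, 4, 0] - a 4 GW gap at the evening peak
```

Matching the energy total is the easy half of the problem; the shortfall list is what dispatchable plants, storage, or imports have to absorb.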

I'm not saying that increasing renewable capacity in this way is a bad thing - it depends on the details, and in any event I'm not from the US so won't have to pay for it. The claimed outcome sounds overstated though.

Comment Re: Sure you can. (Score 3, Insightful) 492

Why can I say you're wrong? Because people have been saying that for 20 years, it hasn't happened, it won't happen, it isn't even remotely close to happening.

I can remember much the same being said about Internet Explorer, which went from well over 90% usage share to more like 20% over the last 10-15 years (with much of the decline happening before mobile became an important factor).

An entrenched monopoly can be difficult to dislodge, but that doesn't mean it will last forever. Microsoft has also lost ground that would have protected Windows had it been held - control of the web browser and the word processor being the two main examples.

(Imagine if every website used ActiveX - that would be a problem for competitors. There are plenty of market niches where similar problems still exist, but for mainstream users I don't see any insurmountable barriers to migration now.)

Now it may very well be that what replaces the Windows desktop isn't called Linux. It might not even be Linux-based, or run on what we would currently recognise as a desktop PC. (The most effective challengers so far have been Android and iOS, which satisfy two and three of these conditions respectively.) Microsoft could also stay there longer by upping their game. Nothing lasts forever, though.

Comment Re:bad statistics (Score 1) 240

And yet if I look at StatCounter's map function, showing the leading browser in each country Chrome leads in most of the world. IE only leads in Japan, South Korea, Swaziland (pop. 1.1mio), Greenland (pop. 55000) and Antarctica (5000 visitors). Firefox has a few strongholds like Germany, Indonesia, Myanmar, Bangladesh, Iran and a bunch of countries in Africa, but the only place IE is ahead of Chrome in second place is Iran (pop. 78mio). With Chrome winning on walkover in Europe, South America, North America, Africa and Oceania and taking massive wins in China, India and Russia I don't see how any possible weighting of StatCounter's numbers would put IE on top.

You're right that the country weightings don't account for the difference by themselves, but there is also the difference between counting users versus pageviews, and it would be unsurprising if there were differences between the types of websites sampled by the two companies.

Comment Re:bad statistics (Score 1) 240

Correction: it seems that Net Applications do count unique users per site, and it is per day not per month, so most of the discrepancy must be due to a different mechanism from the one I described above. Apologies for the belated fact checking.

The figures do count users rather than traffic, and while they claim to weight by traffic, the data source they appear to be referring to is stated in terms of users. If that is so then it would remain the case that they are counting traffic which is not real: users presumed to be online for more days per month than they are, and to visit more websites than they do. That is less likely to result in a very large discrepancy, but could very well be enough to account for the difference between Net Applications and other published figures.

Comment Re:bad statistics (Score 1) 240

It isn't just their correction algorithms, it is the whole basis of what they are trying to measure. Consider this.

I probably use IE once or twice a month, but Firefox and Chrome several thousand times in the same period. So far as Net Applications are concerned that counts as one user for each of the three browsers. Meanwhile, over in the Duchy of Grand Fenwick you might have a user who doesn't bother installing Firefox or Chrome because he uses the Internet so little, but who probably counts as several users for IE once the statistics are corrected[1].

The result is that IE could dwindle to a negligible fraction of total web traffic and Net Applications might still show them ahead in terms of users - even if their correction factors were spot on (which I doubt). I'm sure they're doing their best in their own terms, but it's difficult to see what the figures they are producing are useful for. The StatCounter sample may be biased, but at least their results bear some resemblance to the traffic that a web site is actually likely to receive.

[1] No offence intended to readers from the Duchy of Grand Fenwick.
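The counting problem above can be made concrete with a toy sketch (the session counts are invented): one person who keeps IE around but rarely opens it counts as a full "user" for all three browsers under per-user counting, while contributing almost nothing to actual traffic.

```python
# Sessions per browser for one hypothetical person over a month.
sessions = {"IE": 2, "Firefox": 3000, "Chrome": 3000}

# User-based counting (the Net Applications approach, as I understand it):
# any use at all counts as one user for that browser.
users_counted = {browser: 1 for browser, n in sessions.items() if n > 0}

# Traffic-based counting (the StatCounter approach): share of pageviews.
total = sum(sessions.values())
traffic_share = {browser: n / total for browser, n in sessions.items()}

print(users_counted)                  # {'IE': 1, 'Firefox': 1, 'Chrome': 1}
print(round(traffic_share["IE"], 5))  # 0.00033 - negligible real traffic
```

Aggregate enough people like this and the two methodologies can legitimately report very different market shares from the same underlying behaviour.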

Comment Re:Bad idea (Score 1) 626

Now that happened, and we add the previous corpus of English-speaking people, I think its reached a critical mass to make it a de-facto standard (like how Windows and not anything really good is our most common OS

Er, you do realise that it has been several years since Windows was the most common OS (longer if you include embedded systems)? It's a great example of the network effect at work, but it also shows how that effect can both give and take away.

Comment Re:A Language With No Rules... (Score 2) 667

Yes, but the ultimate goal is communication, and to that end some change is useful, some is harmful - and almost any change will have the effect of making older texts less readable.

Think of descriptivists as scientists and prescriptivists as engineers (albeit, it must be said, not always very good ones). I think there is a role for both.

Comment Re:So.. what? (Score 1) 255

Any difference looks a lot smaller than the markup I've ended up paying for things like going through an energy co-op instead of straight from the generating company.

[...] We do need to talk about cost but we need to talk about ALL the costs not just the operating costs but all the externalized costs as well.

Not just the costs, but also whether the energy is dispatchable.

Power sources which can be turned on and off at short notice - such as gas and hydro - are economically more valuable than ones which can't - such as coal and nuclear. (Some nuclear plants can be ramped up and down, but the capital costs are so high and the fuel costs so low that it doesn't win you much.)

Any of the above are considerably more valuable than sources which are both non-dispatchable and intermittent, such as wind and solar. (How much more valuable depends on factors such as the shape of the demand curve, and how much of the rest of your capacity is gas and/or hydro. Intermittent sources can work quite well in some locations, others not so much.)

Comment Re:headed in the wrong direction (Score 2) 230

Background levels are around 1 mSv/year. So why advocate thresholds more than two orders of magnitude lower than what people normally get in a year? I just don't think science has much to do with your choice of thresholds.

This is a fallacy. The threshold should be set on the estimated benefits of a higher threshold vs the estimated harm from the additional radiation. The background radiation has nothing to with it.

It would be a fallacy if background levels were fixed and unavoidable. They're not. So long as people are allowed to and choose to travel by air, and live in areas with above-average background radiation, it is reasonable to argue that nuclear power should be held to a similar standard.

(Granted that medical imaging is different because you would normally be doing it for a good medical reason.)

Comment Re:About time (Score 2) 230

Nuclear plants don't emit an even level of radiation in all directions. They emit radioactive particles that then move around on the wind, in the soil and in the water. These particles can accumulate, so the level needs to be kept very low so that they can keep dispersing.

0.25 mSv is a measure of the dose received, not the radioactivity emitted. A given amount of radioactivity inside your body will result in a larger dose than the same amount outside, so the effects you describe should already have been allowed for.

Besides, if you believe in the LNT model (which current standards are based on) then it makes little difference whether you give 0.25 mSv/yr to ten people or 2.5 mSv/yr to one person (both being well below the level at which acute effects become significant). Bioaccumulation is an issue, but merely having an uneven distribution should not be.

Relaxing the rules may in theory be safe. The problem is that if you give people an inch they will take a mile. We knew that in the 1970s, but despite Fukushima the EPA seems to have forgotten it now.

Bear in mind that the safety precautions needed to prevent very low level emissions are different to those needed to prevent catastrophic meltdowns. Focussing attention and resources on the former rather than the latter isn't necessarily in the best interests of safety.

Comment Re:n/t (Score 1) 278

I make the difference to be about 80cm. That's small enough that it might not affect the design (given that you already have to allow for factors like thermal expansion), but even so it's surely worth doing the sums to make sure.
