
Comment: Re:bad statistics (Score 1) 240

by gdshaw (#49606589) Attached to: Chrome Passes 25% Market Share, IE and Firefox Slip

And yet if I look at StatCounter's map function, showing the leading browser in each country Chrome leads in most of the world. IE only leads in Japan, South Korea, Swaziland (pop. 1.1mio), Greenland (pop. 55000) and Antarctica (5000 visitors). Firefox has a few strongholds like Germany, Indonesia, Myanmar, Bangladesh, Iran and a bunch of countries in Africa, but the only place IE is ahead of Chrome in second place is Iran (pop. 78mio). With Chrome winning on walkover in Europe, South America, North America, Africa and Oceania and taking massive wins in China, India and Russia I don't see how any possible weighting of StatCounter's numbers would put IE on top.

You're right that the country weightings don't account for the difference by themselves, but there is also the difference between counting users versus pageviews, and it would be unsurprising if there were differences between the types of websites sampled by the two companies.

Comment: Re:bad statistics (Score 1) 240

by gdshaw (#49606447) Attached to: Chrome Passes 25% Market Share, IE and Firefox Slip

Correction: it seems that Net Applications do count unique users per site, and it is per day not per month, so most of the discrepancy must be due to a different mechanism from the one I described above. Apologies for the belated fact checking.

The figures do count users rather than traffic, and while they claim to weight by traffic, the data source they appear to be referring to is stated in terms of users. If that is so then it would remain the case that they are counting traffic which is not real: users presumed to be online for more days per month than they are, and to visit more websites than they do. That is less likely to result in a very large discrepancy, but could very well be enough to account for the difference between Net Applications and other published figures.

Comment: Re:bad statistics (Score 1) 240

by gdshaw (#49606091) Attached to: Chrome Passes 25% Market Share, IE and Firefox Slip

It isn't just their correction algorithms, it is the whole basis of what they are trying to measure. Consider this.

I probably use IE once or twice a month, but Firefox and Chrome several thousand times in the same period. So far as Net Applications are concerned that counts as one user for each of the three browsers. Meanwhile, over in the Duchy of Grand Fenwick you might have a user who doesn't bother installing Firefox or Chrome because he uses the Internet so little, but who probably counts as several users for IE once the statistics are corrected[1].

The result is that IE could dwindle to a negligible fraction of total web traffic and Net Applications might still show them ahead in terms of users - even if their correction factors were spot on (which I doubt). I'm sure they're doing their best in their own terms, but it's difficult to see what the figures they are producing are useful for. The StatCounter sample may be biased, but at least their results bear some resemblance to the traffic that a web site is actually likely to receive.

[1] No offence intended to readers from the Duchy of Grand Fenwick.
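The user-counting distortion described above is easy to demonstrate with a toy model (all numbers below are hypothetical, purely for illustration): one heavy user who opens IE a couple of times a month alongside thousands of Chrome sessions, and one light user who only ever uses IE.

```python
# Toy model (hypothetical numbers): two users with very different browsing
# habits, showing how "unique users" and "pageviews" can disagree.

# Pageviews per browser for each user over a month.
users = [
    {"IE": 2, "Chrome": 3000},  # heavy user who rarely opens IE
    {"IE": 40},                 # light user who only has IE
]

# Per-user counting: each user counts once for every browser they touched.
user_counts = {}
for u in users:
    for browser in u:
        user_counts[browser] = user_counts.get(browser, 0) + 1

# Per-pageview counting: weight by actual traffic.
view_counts = {}
for u in users:
    for browser, views in u.items():
        view_counts[browser] = view_counts.get(browser, 0) + views

total_views = sum(view_counts.values())
print(user_counts)   # IE "leads" 2 users to 1
print({b: round(v / total_views, 3) for b, v in view_counts.items()})
```

By the user metric IE leads two to one; by the traffic metric Chrome carries over 98% of the pageviews. Neither count is wrong as such, but they answer different questions, which is the point being made above.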

Comment: Re:Bad idea (Score 1) 626

Now that that's happened, and we add the previous corpus of English-speaking people, I think it's reached a critical mass to make it a de facto standard (like how Windows, and not anything really good, is our most common OS)

Er, you do realise that it is several years since Windows was the most common OS (longer if you include embedded systems). It's a great example of the network effect at work, but it also shows how that effect can both give and take away.

Comment: Re:A Language With No Rules... (Score 2) 667

by gdshaw (#49265113) Attached to: Why There Is No Such Thing as 'Proper English'

Yes, but the ultimate goal is communication, and to that end some change is useful, some is harmful - and almost any change will have the effect of making older texts less readable.

Think of descriptivists as scientists and prescriptivists as engineers (albeit, it must be said, not always very good ones). I think there is a role for both.

Comment: Re:So.. what? (Score 1) 255

by gdshaw (#47628581) Attached to: TEPCO: Nearly All Nuclear Fuel Melted At Fukushima No. 3 Reactor


Any difference looks a lot smaller than the markup I've ended up paying for things like going through an energy co-op instead of straight from the generating company.

[...] We do need to talk about cost but we need to talk about ALL the costs not just the operating costs but all the externalized costs as well.

Not just the costs, but also whether the energy is dispatchable.

Power sources which can be turned on and off at short notice - such as gas and hydro - are economically more valuable than ones which can't - such as coal and nuclear. (Some nuclear plants can be ramped up and down, but the capital costs are so high and the fuel costs so low that it doesn't win you much.)

Any of the above are considerably more valuable than sources which are both non-dispatchable and intermittent, such as wind and solar. (How much more valuable depends on factors such as the shape of the demand curve, and how much of the rest of your capacity is gas and/or hydro. Intermittent sources can work quite well in some locations, others not so much.)
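A toy dispatch calculation makes the point concrete (every number below is invented for illustration): given a daily demand curve and a solar output profile, the dispatchable capacity you still need is set by the worst residual hour, not by solar's average contribution.

```python
# Toy dispatch sketch (all numbers hypothetical): demand and solar output
# in MW, one value per 3-hour block of the day.  A dispatchable plant
# (e.g. gas or hydro) must cover whatever demand solar leaves unmet.

demand = [50, 45, 40, 60, 80, 90, 85, 70]
solar  = [ 0,  0,  5, 30, 45, 25,  5,  0]   # zero at night, peaks at midday

# Residual demand the dispatchable plant must serve in each block.
residual = [max(d - s, 0) for d, s in zip(demand, solar)]

avg_solar = sum(solar) / len(solar)
peak_residual = max(residual)

print(f"average solar output: {avg_solar:.2f} MW")
print(f"dispatchable capacity still required: {peak_residual} MW")
```

In this sketch solar averages about 14 MW, yet you still need 80 MW of dispatchable capacity (against a 90 MW peak) because solar contributes little during the evening peak. That gap between average output and capacity displaced is one way of quantifying the value difference described above.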

Comment: Re:headed in the wrong direction (Score 2) 230

by gdshaw (#47493465) Attached to: EPA Mulling Relaxed Radiation Protections For Nuclear Power

Background levels are around 1 mSv/year. So why advocate thresholds more than two orders of magnitude lower than what people normally get in a year? I just don't think science has much to do with your choice of thresholds.

This is a fallacy. The threshold should be set by weighing the estimated benefits of a higher threshold against the estimated harm from the additional radiation. The background radiation has nothing to do with it.

It would be a fallacy if background levels were fixed and unavoidable. They're not. So long as people are allowed to and choose to travel by air, and live in areas with above-average background radiation, it is reasonable to argue that nuclear power should be held to a similar standard.

(Granted that medical imaging is different because you would normally be doing it for a good medical reason.)

Comment: Re:About time (Score 2) 230

by gdshaw (#47493411) Attached to: EPA Mulling Relaxed Radiation Protections For Nuclear Power

Nuclear plants don't emit an even level of radiation in all directions. They emit radioactive particles that then move around on the wind, in the soil and in the water. These particles can accumulate, so the level needs to be kept very low so that they can keep dispersing.

0.25 mSv is a measure of the dose received, not the radioactivity emitted. A given amount of radioactivity inside your body will result in a larger dose than the same amount outside, so the effects you describe should already have been allowed for.

Besides, if you believe in the LNT model (which current standards are based on) then it makes little difference whether you give 0.25 mSv/yr to ten people or 2.5 mSv/yr to one person (both being well below the level at which acute effects become significant). Bioaccumulation is an issue, but merely having an uneven distribution should not be.
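Under LNT the arithmetic is simple enough to show directly. The risk coefficient below is only the commonly cited order of magnitude (roughly 5% per sievert); treat it as an illustrative assumption, not an authoritative figure.

```python
# Under the linear no-threshold (LNT) model, expected harm scales with
# collective dose (person-sieverts), so spreading a dose thinly over many
# people is no better than concentrating it on one person.

RISK_PER_SV = 0.05  # hypothetical lifetime risk per sievert, LNT assumption

def expected_cases(dose_msv: float, people: int) -> float:
    """Expected number of cases if each of `people` receives `dose_msv`."""
    return (dose_msv / 1000.0) * people * RISK_PER_SV

spread = expected_cases(0.25, 10)      # 0.25 mSv/yr to ten people
concentrated = expected_cases(2.5, 1)  # 2.5 mSv/yr to one person
print(spread, concentrated)  # same collective dose, same expected harm
```

Both scenarios give a collective dose of 2.5 person-mSv and therefore the same expected harm under LNT, which is why an uneven distribution alone (absent acute effects or bioaccumulation) shouldn't change the assessment.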

Relaxing the rules may in theory be safe. The problem is that if you give people an inch they will take a mile. We knew that in the 1970s, but despite Fukushima the EPA seems to have forgotten it now.

Bear in mind that the safety precautions needed to prevent very low level emissions are different to those needed to prevent catastrophic meltdowns. Focussing attention and resources on the former rather than the latter isn't necessarily in the best interests of safety.

Comment: Re:Spent fuel (Score 1) 152

I've always been curious about this. Why can't we put all the waste on a rocket and send it to the Moon? It shouldn't be that hard and would be cheaper than leaving it on Earth to cause future issues.

The main reason is that burial is fairly safe whereas rockets are not.

Comment: Re:Why? Is it really necessary? (Score 1) 187

by gdshaw (#46675919) Attached to: Ask Slashdot: User-Friendly Firewall For a Brand-New Linux User?

In any event, this only protects against internal incompetence rather than external malice, so is not a necessary part of running a secure system.

You forgot to mention internal malice.

Let's put my comment back into context. I was talking about forgetting to bind a private network service to the loopback interface. That would normally be done by an administrator. If an administrator is acting maliciously then you have fairly serious problems with or without a local firewall. In fact, this is a pretty good demonstration of my point that if you are going to use a firewall to protect against that kind of threat then the firewall wants to be on a different box (eg. a router or dedicated firewall), not the one that you are expecting to be compromised.

To be clear: I'm not saying that firewalls should never be used on Linux-based hosts (that would be ridiculous), only that they are not a necessary part of running Linux securely in the way that they are for Windows.

Comment: Re:Why? Is it really necessary? (Score 1) 187

by gdshaw (#46674709) Attached to: Ask Slashdot: User-Friendly Firewall For a Brand-New Linux User?

That's fine as long as you are sure there are no bugs in the services you run and the TCP/IP stack, and you keep them all up to date, and you don't mind kiddies hammering on your door 24/7 trying to guess your passwords.

If you need a service to be publicly accessible then you will need to configure the firewall accordingly, in which case it typically provides no protection if the service is exploitable.

If the service doesn't need to be publicly accessible then either turn it off or bind it to the loopback interface. Why add extra software to protect against a vulnerability that you could have avoided creating in the first place? Note that operating systems that take security seriously do not install public-facing network services unless you ask them to.
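The loopback advice is a one-line change in most server code. A minimal sketch with Python's standard socket API (port 0 here just asks the OS for a free port, so the example runs anywhere):

```python
# A service bound to 127.0.0.1 is reachable only from the local machine,
# so there is no exposed port for a firewall to block in the first place.
# Binding to 0.0.0.0 listens on every interface and is the case that
# actually needs a firewall decision.
import socket

# Private service: only local processes can connect.
private = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
private.bind(("127.0.0.1", 0))  # port 0 = let the OS pick a free port
private.listen(1)
priv_addr = private.getsockname()

# Public service: reachable from the network.
public = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
public.bind(("0.0.0.0", 0))
public.listen(1)
pub_addr = public.getsockname()

print("private service:", priv_addr)
print("public service:", pub_addr)
private.close()
public.close()
```

In real daemons this is usually a config option rather than code (e.g. a "listen" or "bind address" directive), but the effect is the same: the private case never becomes network-reachable at all.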

Firewalls certainly have their uses, but they aren't a necessity on non-Windows machines in the way that they are for Windows.

Comment: Re:Why? Is it really necessary? (Score 1) 187

by gdshaw (#46674663) Attached to: Ask Slashdot: User-Friendly Firewall For a Brand-New Linux User?

Firewalls are about keeping things in as well as out. One of the reasons that there are so many problems on corporate networks is that there are oftentimes no firewalls once you get to the LAN. I remember when I was in college the setup in the dorms was dire. People would be sharing things read and write and you'd wind up with all sorts of nasty things on the network, and then there was the malware.

Yes, but I presume you are talking about Windows machines which run an SMB/CIFS server out of the box. Most GNU/Linux distributions rightly don't do that. Typically if you want to run Samba, or an FTP server, or an HTTP server on the default port then you need to be root to do that. Once you are root then you can also poke a hole in the firewall.
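The privileged-port point can be checked directly: on a conventional Unix configuration, binding a port below 1024 requires root (or, on Linux, the CAP_NET_BIND_SERVICE capability), while any user can take a high-numbered port. A small sketch:

```python
# Probe whether the current user can bind a given TCP port.  Any failure
# (insufficient privileges, or the port already in use) is reported as
# "cannot bind"; we only need the distinction for illustration.
import socket

def can_bind(port: int) -> bool:
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind(("0.0.0.0", port))
        return True
    except OSError:
        return False
    finally:
        s.close()

# Port 0 asks the OS for a free ephemeral (high-numbered) port; this
# needs no special privileges and succeeds for any user.
print("high port:", can_bind(0))
# Port 80 will normally fail unless running as root.
print("port 80:  ", can_bind(80))
```

This is why "just run your server on a high port" doesn't defeat the argument above: the well-known service ports are already gated on root, and root can reconfigure a host firewall anyway.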

Granted you can run servers on high-numbered ports, but within a LAN all that does is allow two machines that had already been compromised to communicate with each other. For communication with the outside world I prefer to detect and/or block that at the boundary router (otherwise all it takes is a local root exploit to disable the firewall).

The same applies to outbound connections, although in a world where so many programs need network access that is arguably a lost cause for general-purpose workstations. In any event, a firewall isn't the right tool for controlling the capabilities of individual programs: you really need something like SELinux or AppArmor to do that effectively.
