No, that's not what they're saying:
On Wednesday, Tokyo Electric Power Company presented the Nuclear Regulation Authority with an estimate that the removal work discharged 280 billion becquerels per hour of radioactive substances, or a total of 1.1 trillion becquerels.
They're treating Bq as if it's a quantity of radiation. They don't know what they're talking about. They multiplied 280 billion by 4 and ended up with 1.12 trillion-- which isn't how rates work.
There are two types of comments in this thread.
* Comments by people providing definitions for what a Bq is, talking about equivalent measures, giving conversion formulas, and providing hard facts; generally these are saying that the number is either irrelevant or not really that big.
* Comments by people who are being quite vague, and warning of various undefined threats to various undefined organs because of how big the number is.
Which type of comment do you find more credible?
The material from Fukushima bioaccumulates inside the body.
Nothing about being "from Fukushima" has any effect on what the material does. Its makeup does. Is it radioactive iodine? Potassium? Thorium? Uranium? Cobalt-60?
How contaminated is contaminated?
As everyone else has said a million times already, Bq is not a quantifiable "amount" of radiation, it's a rate. You cannot release x Bq in y period of time, any more than you can travel 50mph in 2 hours. You could say "I travelled 100 miles", or "I am currently travelling at 50mph", or "over 2 hours I averaged 50mph", but mph is not, itself, a quantity. Same with Bq.
One Bq is defined as the activity of a quantity of radioactive material in which one nucleus decays per second.
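The units arithmetic above can be sketched in a few lines. This is just an illustration of rate vs. total using the figures from the quote; a rate in decays per second times a duration gives a plain count of decays, not more becquerels:

```python
# Bq is a rate: one decay per second. Multiplying a rate by a duration
# yields a total COUNT of decays, not a number of becquerels.
rate_bq = 280e9              # 280 billion decays per second (figure from the quote)
hours = 4
total_decays = rate_bq * hours * 3600   # dimensionless count of decay events

# Simply multiplying "280 billion" by 4 and calling the result Bq
# mixes up units -- like driving 100 miles and calling it "50 mph".
print(f"{total_decays:.3e} decays over {hours} hours")
```

The point is that the two numbers have different units entirely, so adding or scaling them as if both were "amounts of radiation" is meaningless.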
How are they going to make the strongest monopoly ever? More stores than ever before are online now. I can literally order everything I need and have it shipped to me without ever touching Amazon. Lowe's, Giant Foods, clothing stores, Alibaba, eBay-- all have online stores.
The barrier to entry is so absurdly low that I don't think anyone needs to worry about Amazon's monopoly, at least in the shopping sector.
And the barrier to entry for cloud services is pretty low too-- all you need is space at a datacenter (which can be had for relatively cheap) and you can offer a cloud platform.
NAT provides "security" because it is practically impossible to hack a computer behind a NATing router without A) hacking into the router (in which case a firewall doesn't matter either), or B) having the end user poke a hole / port forward through the NAT (which they could just as easily do through a firewall).
I suppose if you were MITMing the connection and could see what ports got opened for outbound connections, and you could spoof inbound traffic, you could perhaps exploit something-- but this will not affect the majority of users. In that sense it certainly DOES provide security, unless your ISP or someone similarly equipped is out to get you.
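A toy model of why unsolicited inbound traffic dies at a NAT, stripped down to just the translation table (the addresses and ports here are made up for illustration):

```python
# Minimal sketch of a NAT translation table. Outbound connections create
# mappings; inbound packets with no matching mapping simply have nowhere
# to be delivered, so the router drops them.
nat_table = {}      # external_port -> (internal_ip, internal_port)
next_port = 40000   # next external port to hand out

def outbound(internal_ip, internal_port):
    """An outbound connection allocates an external port and records the mapping."""
    global next_port
    ext_port = next_port
    next_port += 1
    nat_table[ext_port] = (internal_ip, internal_port)
    return ext_port

def inbound(ext_port):
    """Inbound traffic is forwarded only if a mapping (or manual port-forward) exists."""
    return nat_table.get(ext_port)   # None means dropped

ext = outbound("192.168.1.10", 55123)            # internal host connects out
assert inbound(ext) == ("192.168.1.10", 55123)   # replies flow back in
assert inbound(12345) is None                    # unsolicited probe: dropped
```

This is exactly the attacker's problem described above: without knowing (or spoofing traffic into) an existing mapping, there is no internal host to reach.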
Except that you can't predict the future, so you don't know how many will be reported by the end of 2014. Extrapolation only works when you have a reason to justify it; neither you nor the article has one, and the original paper does not make that (dumb) extrapolation.
It's perhaps misleading to say that NAT is security, but it undoubtedly provides security.
I can think of no technical reason that someone with access to dump the RAM would not get those registers; the RAM used in the CPU is much less volatile than normal DRAM (it's called "static RAM" for a reason).
For example, let's say you manage to catch a VMware vMotion. You have A) the RAM, B) the current CPU instructions, C) the CPU registers. Ditto with Fault Tolerance.
Let's say you ice the RAM and dump it. If you have access to do that, you could in theory do the same for the CPU; and since CPU memory decays something like 1000x slower than DRAM, it would almost certainly be less corrupted than the RAM.
There WAS no 100% increase. The article misinterprets the graph, and the report it references contradicts its analysis. IE rose from roughly 130 vulns to roughly 140; that's not 100%, it's under 10%.
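Using the report's approximate counts (so the exact percentage is approximate too), the arithmetic is trivial to check:

```python
# Percent increase from ~130 vulnerabilities to ~140 (round numbers
# taken from the comment above, not exact report figures).
last_year, this_year = 130, 140
increase = (this_year - last_year) / last_year * 100
print(f"{increase:.1f}% increase")   # prints "7.7% increase" -- nowhere near 100%
```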
Like Mugatu, I feel like I'm taking crazy pills here. Almost no one bothered to fact-check the original report, but everyone has an opinion on it. Keep doing what you do, Slashdot.
IE had fewer vulnerabilities last year than Chrome or Firefox. This year it has more. That's not a slam dunk, or an indication that IE is a dog's breakfast.
IE has been substantially rewritten since the IE6 days, and is a sort-of-decent browser now. These days it's Firefox that's the dog's breakfast; its only saving graces are its low userbase and its strong extension support, which can plug some of the glaring holes (like its crappy one-process architecture, its lack of sandboxing for anything, etc.).
Have you considered reading the article before criticizing someone else's analysis of it?
Firefox was "more vulnerable" in 2013, and actually for several years post-IE9 I believe it was generally considered LESS secure than MSIE due to its lack of common protections (like reduced privilege, sandboxing, etc.).
The real surprise here is that people on a tech site continue to use awful metrics for judging things ("works for me", "everyone else hates it, must be bad").
Neither can IE. It has a ~5-10% increase.
The summary is absolute garbage; it implies that the number of vulnerabilities has doubled (it hasn't), that IE security is worse (but public exploits are down from last year, and mean time to patch is vastly reduced), and that it's always been worse (last year, Chrome and Firefox had more exploits than IE).
Unsurprisingly, everyone here took the bait.