However, the 'authorities' in question should be capable of responding to those reports in a sensible fashion.
Picture this scenario. Ten guys and ten girls live together. All ten of the guys have slept with five of the girls in the house within the first ten days. That makes them promiscuous. However, five of the girls engaged in no sexual activity whatsoever. That gives us a 100% male promiscuity rate and a 50% female promiscuity rate.
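Just to make the counting explicit, here's a throwaway sketch of the scenario. The specific pairings are assumed; any arrangement where all ten guys and only five of the girls appear produces the same rates:

```python
# Hypothetical pairings: guys 0-9 each slept with girls 0-4.
# Girls 5-9 had no activity, per the scenario.
encounters = [(guy, girl) for guy in range(10) for girl in range(5)]

active_guys = {guy for guy, _ in encounters}
active_girls = {girl for _, girl in encounters}

male_rate = len(active_guys) / 10     # 10/10 -> 1.0 (100%)
female_rate = len(active_girls) / 10  # 5/10  -> 0.5 (50%)
print(male_rate, female_rate)         # 1.0 0.5
```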
If we're going to discuss this properly then I think we need more info on any possible threesomes.
There's no need to ever actually connect to any network to map them, just slurp up SSID broadcasts, maybe channel and signal strength.
You don't need to 'connect' to them, but IIRC there is some benefit to looking at the traffic beyond mere broadcasts. I.e. if you can see device X sending traffic to Y, you can begin to infer the position of Y even if you can't see that device yourself because it's too far away from you.
A <------ X <-------> Y
Moz might not be doing that, and perhaps it isn't a "need", but if the goal is to get the best data then it's not correct to say that analysis deeper than mere SSID broadcasts has no benefits. Of course, if you are looking deeper then you should be paying attention to any possible privacy implications and avoiding recording anything that could be considered 'content'.
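To sketch that inference with a toy example (the frame data is made up; a real scanner would be parsing 802.11 headers in monitor mode):

```python
# Toy model: each overheard frame is (transmitter, receiver).
# Our scanner can hear X and Z directly, but Y is out of range --
# we only learn of Y's existence because X addresses frames to it.
observed_frames = [
    ("X", "Y"),  # X -> Y: we hear X's side of the exchange
    ("X", "Z"),
    ("Z", "X"),
]

heard_directly = {src for src, _ in observed_frames}
addressed = {dst for _, dst in observed_frames}

# Devices known only because others talk to them:
inferred_only = addressed - heard_directly
print(inferred_only)  # {'Y'} -- Y's rough position implied near X
```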
Right. With credit cards, you're basically getting free insurance paid for by people who keep loads of interest-bearing debt.
Don't be silly. That money stays with the financial institutions involved. Any money that needs to be refunded due to fraud comes from the merchants who accepted the card (with a hefty fee attached too).
It's not a fallacy; you didn't understand. People are saying that limited computation facilities will encourage innovation. If limited computation facilities encourage, nay, force innovation, then limiting them even more should force even more innovation.
Why should it? That seems like an obvious fallacy. Limiting us to abacuses would just mean people would be looking to improve on the abacus. Limitations in silicon mean people look for alternatives to silicon (just as limitations with vacuum tubes meant people looked for better alternatives and found the transistor).
Perhaps in 60 years the basic components of today's technology will look as quaint and old-timey as vacuum tubes do today. If we are just using iteratively improved versions of today's integrated circuits, that might be somewhat disappointing.
That's basically how you sound.
How I sound? You really should work on your reading comprehension; I was merely explaining what the article was saying.
The intent behind that sentence seems fairly clear: that the end of predictable speed increases may lead to greater focus on entirely different avenues of development, with unpredictable and exciting ideas popping up.
Centralizing control over analysis of student performance data --- taking the capability away from teachers to evaluate how a program is really working, and placing it in the hands of
You seem to view this as a zero sum game.
Additional central analytics doesn't necessarily take capabilities away from teachers. It could inform and help them.
Anything can be used badly; it seems to me the fight should be to use analytics well, not to stop it being used.
For improving quality of the educational materials, all Code.org needs is aggregate summary data
I don't think that's necessarily true, or at least it's not true that detail more specific than aggregate data can't lend itself to additional useful insights.
For instance, it's reasonable to imagine that different people learn better in different ways, and that by accumulating data on individuals one might be able to identify different groups among them, which might in turn lead to more tailored materials for different types of learners.
If you aggregate the data early around one factor (e.g. a class or school), that will vastly reduce your ability to come up with other ways to view the data, or to have things emerge from the data that you didn't already anticipate.
It would be absurd to suggest that more fine-grained data wouldn't allow for more detailed analysis. The only question is where the line needs to be drawn for privacy or other reasons.
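A toy illustration with entirely made-up numbers: if ten students happen to fall into two learner types, the per-student data shows two clear clusters, while an early class-level aggregate shows only a single unremarkable average.

```python
# Hypothetical per-student scores for a class of ten.
# Five students thrive with one teaching style, five with another.
scores = [92, 95, 90, 93, 94,   # group A (hypothetical)
          61, 58, 63, 60, 59]   # group B (hypothetical)

# The early-aggregated view: one number, no structure.
class_average = sum(scores) / len(scores)
print(class_average)  # 76.5 -- "the class averages ~76", end of story

# The individual data reveals two distinct clusters, e.g. split
# around that same average:
high = [s for s in scores if s >= class_average]
low = [s for s in scores if s < class_average]
print(len(high), len(low))  # 5 5 -- two groups the aggregate hid
```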