Just because they're trendy to talk about, you don't need to use some new-age, look-at-me, "I know about Neanderthals and Denisovans" way of spelling their name. They're Neanderthals, even though the modern spelling of the German word it's based on is "Tal".
Interesting. Your solution seems relatively fair and reasonable. Odds it will ever be implemented: 0.05%.
Or HCF, if you wanted to be really sure.
We would also appreciate a few ova from a selection of healthy, attractive ladies aged 18-25. For research purposes, of course!
Blue-eyed blondes with no Jewish contamination, er, ancestry, please.
And because if you set up a random roadblock type scenario, you probably need traffic control and a police presence in case of accidents. It doesn't mean that they actually cared or were part of it.
That people immediately assumed it was a Gestapo maneuver and then complied is more an indictment of them, and of the degree of freedom people have already willingly given up for the veneer of security.
Interesting (and well-stated) points.
However, I don't think allowing users to control individual permissions will fix it. Users will just continue authorizing the kitchen sink. If some of them start exercising more control over specific permissions, app developers will simply respond by refusing to show the dancing pigs if SMS isn't actually working.
And I don't think shutting off the APIs entirely is an acceptable solution, even if it arguably works for Apple.
I don't think that analogy is useful. If you leave your door open, you're the one that stands to lose, but if vulnerabilities exist the software company (generally) isn't the loser, which is why it makes sense to impose some method of bringing the societal costs to bear on the company. In economic terms, vulnerability costs are largely a negative externality, while security costs are internalized. That's a recipe for incenting people to ignore security, and the general solution is to internalize the externality.
I think a better analogy would be leaving something dangerous to others unsecured. Say, explosives. If you have a license to handle explosives and you don't follow the rules for securing them appropriately, you will get fined (if you're caught). The other twist with software vulnerabilities is that the risk associated with a given bug is much harder to pin down, whereas it's pretty easy to quantify for a given type and quantity of explosives. This proposal attempts to use market forces to quantify the risk and determine the dollar amount of the "fine". It further tries to use the fine to actually motivate, and therefore fund, the security research. In the case of explosives, the government pays people to audit licensees and the value of the fines goes to the government. I suspect if we looked a bit we could find some existing situations like this proposal, where the government essentially outsources the auditing operation to a third party who is compensated by collecting the fines.
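The "internalize the externality" argument can be made concrete with a toy calculation. All the numbers here are hypothetical, chosen only to show the mechanism: without a fine, skimping on security is privately free; with a fine equal to expected societal damage, securing becomes the cheaper choice.

```python
# Toy model: a vendor chooses whether to invest in security.
# Without a fine, vulnerability costs fall on society (a negative
# externality), so the vendor's private cost ignores them. A fine
# scaled to expected societal damage internalizes that cost.

def private_cost(security_spend, breach_prob, societal_damage, fine_rate):
    """Vendor's expected private cost: security spending plus the
    expected fine paid if a breach occurs."""
    return security_spend + breach_prob * societal_damage * fine_rate

# Hypothetical numbers: spending $50k on security cuts breach
# probability from 20% to 2%; a breach does $1M of societal damage.
no_fine_skimp  = private_cost(0,      0.20, 1_000_000, 0.0)  # skimping is "free"
no_fine_secure = private_cost(50_000, 0.02, 1_000_000, 0.0)
fine_skimp     = private_cost(0,      0.20, 1_000_000, 1.0)
fine_secure    = private_cost(50_000, 0.02, 1_000_000, 1.0)

# Without the fine, skimping looks cheaper; with it, securing wins.
assert no_fine_skimp < no_fine_secure
assert fine_secure < fine_skimp
```

The interesting design question, as the comment notes, is who sets `breach_prob` and `societal_damage`; the proposal's answer is to let a market in vulnerabilities discover them.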
No, very clearly no.
This "arms race" would never occur. Apple and MS are considerably more hostile toward developers, and developers just accept it. Making the OS, hardware, and store owner mad at you is not a recipe for success if you want to be an app developer.
I suppose Google could institute a policy of banning apps that try to circumvent ad-hoc user permission restrictions. Yeah, that would cut the arms race off at the knees. Good point.
Engelbart lived at a time when bureaucracy and inflexible institutions ruled. To get anywhere one had to jump through hoops constantly and appeal to those few authorities that controlled the purse strings.
Today there are many points of accumulated capital that one can appeal to for assistance and funding. Forty years ago there was just the government or a few old giant corporations.
Some did raise hell. They're prohibited from talking about it because of gag orders and "national security".
Prohibited? What exactly does that mean?
In this case it means that company officials who decided to violate the gag orders could be held personally and criminally liable for their decisions. By some readings of the law, they could even be charged with treason, which is a capital offense. It's unlikely that it would go that far, but these aren't laws that can be violated with impunity.
When was the last "Massive iOS Mobile Botnet Hijacking SMS Data" headline?
When was the last maximum security prisoner getting run over by a bus headline? Sometimes freedom has its own risks, which includes idiots making poor decisions over where to get their software from. Does that mean everyone should be locked up in a cage to prevent that from happening?
No, not at all, but there are parts of this story that expose one of the weaknesses of the Android permissions model; namely that an app requests a set of permissions (that are overly broad to cut down on the number of permissions groups) and you have to either accept or deny those permissions wholesale.
Because the people who download dodgy apps and sideload them, then click past the permissions list without even looking at it would selectively disable the permissions they didn't really want to grant?
The permissions problem you refer to is a really difficult one to solve. Oh, it could be solved for you, by giving you the ability to selectively disable permissions (which, BTW, you can actually do with a small amount of one-time effort), but face it, less than 1% of Android users would carefully vet and individually select the permissions. Probably much less than 1%.
Then there's also the problem that individual permission selection would just cause app developers to test to see if they got all the permissions they wanted, and refuse to function at all if they didn't. Google could respond by trying to make it appear that the apps did get permission, perhaps by serving up fake data, but that would just create an arms race between app developers and Google, and apps have a much shorter release cycle. In fact, for power users the status quo is probably better, because they can root their phones and use an app to selectively disable permissions, but there aren't enough of them (far less than 1%) to motivate app developers to try to work around it.
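The arms race described above can be sketched as a toy simulation. Nothing here is a real Android API; it's a minimal model of the dynamic: the platform serves canned fake data instead of a hard permission failure, and the app developer's countermove is to probe for the decoy and refuse to run.

```python
# Toy sketch of the permissions arms race (hypothetical, not real
# Android APIs): the platform substitutes fake data when a permission
# is revoked, and the app probes the data to detect the substitution.

FAKE_INBOX = ["Welcome to your new phone!"]  # platform's canned decoy data

class Platform:
    def __init__(self, sms_granted):
        self.sms_granted = sms_granted

    def read_sms(self):
        # Instead of raising a permission error (which apps can detect
        # trivially), return plausible fake data when access is revoked.
        if self.sms_granted:
            return ["msg from +15551234: hi", "msg from +15555678: ok"]
        return FAKE_INBOX

class App:
    def run(self, platform):
        inbox = platform.read_sms()
        # Developer's countermove: recognize the decoy and bail out.
        if inbox == FAKE_INBOX:
            return "refuse to run: permissions appear revoked"
        return "show the dancing pigs"

assert App().run(Platform(sms_granted=True)) == "show the dancing pigs"
assert App().run(Platform(sms_granted=False)).startswith("refuse")
```

Each side of this loop is a one-line change, which is why the comment argues the side with the shorter release cycle (the apps) wins.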
I don't know what the solution is, but I don't think that's it. I lean more towards finding ways, at least in the official app store, to shame apps that request broader permissions than they should. Maybe Google should develop some sort of a "risk rating", based on the permissions requested and the trustworthiness of the publisher and tag every app in the store with it, perhaps even adding an additional warning dialog if the risk is over some threshold, and probably artificially lowering "risky" apps in the search results. Of course, the really problematic apps aren't on the Play store, and adding an additional warning on an app that a user has already chosen to get from some dodgy site is unlikely to help. But Google might be able to dissuade publishers of apps on Play from requesting more permissions than absolutely required.
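The "risk rating" floated above could be as simple as a weighted score over requested permissions, discounted by publisher trustworthiness. The permission names, weights, and trust scale below are all made up for illustration:

```python
# Hypothetical per-permission risk weights (higher = more dangerous).
PERMISSION_RISK = {
    "INTERNET": 1,
    "READ_CONTACTS": 3,
    "SEND_SMS": 5,
    "RECEIVE_SMS": 4,
}

def risk_score(permissions, publisher_trust):
    """Sum per-permission risk, scaled down for trusted publishers.
    publisher_trust is in [0, 1], where 1 means fully trusted."""
    raw = sum(PERMISSION_RISK.get(p, 2) for p in permissions)  # unknown perms get a default weight
    return raw * (1.0 - 0.5 * publisher_trust)

# Same permissions, different publishers: the unknown flashlight app
# scores riskier than the established vendor, so it could be tagged
# with a warning dialog or ranked lower in search.
flashlight = risk_score({"INTERNET", "SEND_SMS"}, publisher_trust=0.1)
big_vendor = risk_score({"INTERNET", "SEND_SMS"}, publisher_trust=0.9)
assert flashlight > big_vendor
```

The hard part, of course, is the trust input; a store operator would presumably derive it from publisher history rather than let it be self-reported.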
(Disclaimer: I work for Google, but not on Android. My relationship with Android is that of a user.)
Just because they found it in one place they looked doesn't mean it's not in other places too.
Uneducated consumers + modern denial of causality + business interests = fail. You have ignorant parents who buy antibacterial everything because it's "for the children" and because, after all, they need to protect theirs, and it probably won't turn into MRSA _for them_, so shouldn't they do everything they can, etc., etc. Ditto with food: people buying shit at Walmart because they need to save money, meanwhile their neighbors lose their jobs and their kids end up playing with cadmium-laced toys, but hey, they need to save 3 dollars on that gizmo.... Add the business interests capitalizing on this ignorance and philosophical gap (A !is !A) and you end up with the shitstorm we're in.
Moral: know what you're buying, know why you're buying it.