Customers don't give a damn if there is an API. Just a tiny tiny % of geeks care. But that tiny tiny % are developers. And customers like what developers create.
Customers don't know how the magic black boxes work. But they sure benefit from the magic created when those who do know can do their thing.
Also - for a company that "doesn't give a damn about open source", they sure do a lot of it.
It should be noted that this particular attack (base station impersonation) was actually demoed and performed last year at Black Hat and DEF CON.
I highlighted the important part that you should have been paying attention to.
Either way, it's impossible to argue the data collection was accidental. You don't send a van out running software without having RTFM'd and tested it in some trial runs.
Not impossible at all. Kismet provides data in various formats. And even then, if what you're doing is extracting particular pieces of data from the traffic capture while not paying much attention to everything else, it isn't unreasonable to miss what else you've captured.
I used to occasionally run Kismet during my commute. I was curious about what access points I could see along my route and what state of configuration they were in (expecting to scoff at all the unsecured defaults - I was actually surprised at how low those numbers turned out to be in the real world). After doing this for a few months, I went back through my directory to clean up. Just for giggles I decided to actually look at the caps I had collected and see if there was anything interesting in the packet payloads. Most of it was junk; driving around isn't a particularly good way to snoop on a network. But I did find one email password in a slice of captured POP traffic. So I ended up with someone's sensitive data sitting on my drive for possibly several months, despite not being particularly interested in it or even aware of it.
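To make the "you only look at what you asked for" point concrete, here's a minimal sketch. The frame records and field names are hypothetical (this is not Kismet's actual format): a wardriving pass that only inspects AP beacon metadata, while payload-bearing data frames are dumped wholesale and never looked at again.

```python
# Hypothetical capture records - illustrative only, not Kismet's schema.
frames = [
    {"type": "beacon", "bssid": "aa:bb:cc:dd:ee:01", "ssid": "linksys", "encrypted": False},
    {"type": "data",   "bssid": "aa:bb:cc:dd:ee:01", "payload": b"USER alice\r\nPASS hunter2\r\n"},
    {"type": "beacon", "bssid": "aa:bb:cc:dd:ee:02", "ssid": "home-wpa", "encrypted": True},
]

access_points = {}  # what the surveyor actually cares about
raw_dump = []       # everything else, silently accumulating on disk

for frame in frames:
    if frame["type"] == "beacon":
        # Extract only the AP metadata we set out to collect.
        access_points[frame["bssid"]] = {
            "ssid": frame["ssid"],
            "encrypted": frame["encrypted"],
        }
    else:
        # Sensitive payloads (like that POP password) land here unnoticed.
        raw_dump.append(frame)

print(access_points)
print(f"{len(raw_dump)} uninspected frame(s) sitting in the dump")
```

The survey answers the question it was built for, and the sensitive payload never crosses anyone's screen unless they go digging later.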
I suspect this is more or less what happened with Google. Scanning through the Google van captures might have turned up nothing. But Google was doing this on a larger scale, so the odds were in favor of something turning up, given the sheer amount of unsecured traffic out there.
Not if your discussion is being done via bullhorn.
Bullhorns imply you want your words heard by many people. The WiFi equivalent of a bullhorn would be either a signal booster or a publicly advertised network (like at a coffee shop).
It's possible to eavesdrop on conversations in your house from miles away, no bullhorn required. But people reasonably don't expect this to happen. The same is true for their WiFi signals. People reasonably don't expect a company going around and logging their information like this.
The problem is that we have people using bullhorns to communicate and don't realize the implications of doing so. Then they're all shocked when people can hear what they're saying just by listening.
I'm not terribly outraged by this, although I do think Google knowingly went well beyond what is reasonable. I mostly find the nerd hypocrisy here to be ridiculous.
Apple gets called "evil" and thoroughly trashed here for *not* recording people's, or even any particular device's, locations, but Google gets a pass for *actually* treading on this territory (definitely logging the location of devices), and even logging actual network traffic!
I expect I'd be upset if I thought Google was actually logging the data in the sense of trying to catalog and use it. The fault that I lay at Google's feet is failing to realize the potential sensitivity of what they were collecting and to do proper cleanup afterwards. As for Apple... unless I'm missing something, Apple was not doing the exact same thing Google was. The method and intent are likely as important as the resulting data. And so to decode the "nerd hypocrisy", you probably have to go into the details.
OK, let me rephrase. If this tool does something you want, but also does things you don't want, then it may not be the right tool for the job. (A hammer will kill pesky houseflies, but it will also leave holes in your walls.) Try it like this:
The tool is perfectly suitable for what they need. The problem is that they didn't scrub the data they collected, keeping what they needed and destroying everything else.
The TSA wants to collect information about each passenger (whether or not they are carrying prohibited items). They have a tool that collects that information, but also collects information that the TSA doesn't need, but that has potential to upset people (images of their privates). If the TSA goes forward with using that tool, they can expect blowback. It might be a great tool for collecting the desired information, but that by-product causes problems - perhaps enough problems that it's worth finding a different tool.
If I'm walking past a security camera in a public location and it gets pictures of me naked because I'm wearing no clothes, I have little reason to be upset about my nudity being captured. What the TSA is currently doing is taking steps to expose me beyond what I've chosen to expose in public. The problem here is that there's a large population who think they're wearing the finest new Emperor's fashion and don't like the idea that they've been naked all along.
This isn't so much a technical problem as a management problem. I don't think it's intentional or malicious, but it might qualify as dumb. The snark comes in when you've got an ex-CIO pooh-poohing project management at the same time that Google is having a really hard time putting this one to bed.
I don't have much to say on the management issue, but I'd imagine that if I were a big believer in PM processes, this would irk me. As I noted, I think the real problem here is that Google didn't properly handle the data. Either the people running the project or some layer of management should have realized the potential sensitivity of the data they were collecting and ensured it was handled more appropriately.
So, if you go out and shoot a rabbit and eat it for dinner, you have done nothing wrong. If Hasenpfeffer Incorporated sends trucks around the nation to systematically shoot every single rabbit in the country so that they can sell the meat, then we have a problem.
But the analogy only works insofar as there are a limited number of rabbits to be had and hunting on a systematically large scale depletes the population. Meanwhile, systematic capturing of broadcast, unencrypted network traffic does not decrease the availability of that traffic (although if it did - it'd probably be a Good Thing... security awareness).
The analogy would be different if having a large amount of rabbit from various locations easily accessible were itself the issue.
Given Google's history, and the fact that no one has tried to do what they are doing before, I would be likely to give them the benefit of the doubt that they did not intend to be evil by collecting more data than they should have. The ignorance excuse does not extend forward, though. If in six months it comes out that they are still gathering that kind of data, they don't get to claim ignorance.
I think the real issue here isn't that Google was able to record this information (any wifi device does this at the most basic level). The problem is that Google didn't realize the significance of the junk traffic and systematically scrub / destroy it (where wifi devices differ is in committing data to long-term storage). It appears that Google won't continue that particular behavior.
C'mon, how do you write a program to log all MAC addresses, and not realize that it's going to collect all MAC addresses? Do you think they just talk to their vans and there was some sort of ambiguity? Like they said, "Google Van, please record MAC addresses and GPS coordinates", and it just interpreted it wrong because they were unclear?
You don't write your own software. You use a common off-the-shelf app that provides a data dump with everything you need. It's called Kismet. You should take a look at it.
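And that's the point: an off-the-shelf tool hands you a dump containing everything, and you pull out the fields you wanted. A sketch of what that extraction might look like, using a hypothetical CSV-style dump (the field names are illustrative, not Kismet's actual log schema):

```python
import csv
import io

# Hypothetical CSV-style survey dump. In reality the tool also writes
# full packet captures alongside this - data nobody asked for.
dump = """bssid,lat,lon
aa:bb:cc:dd:ee:01,37.4219,-122.0841
aa:bb:cc:dd:ee:02,37.4220,-122.0838
"""

# Pull out only the fields the mapping project wants: MAC + GPS fix.
locations = [
    (row["bssid"], float(row["lat"]), float(row["lon"]))
    for row in csv.DictReader(io.StringIO(dump))
]
print(locations)
```

Nobody "asked the van" to record anything in particular; the tool logs what it logs, and the project consumes the slice it cares about.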
Plus, the notion that a company can collect data "accidentally" is laughable, especially considering the process by which it was acquired.
So what you're saying is that you've never used off-the-shelf software to do something and you have absolutely no experience using Kismet.
Real Programs don't use shared text. Otherwise, how can they use functions for scratch space after they are finished calling them?