Still seems more "cute" than "cool".
It's actually really convenient. You'd have to try it to see just how smooth it is. Much easier than the "share" button.
I know everybody talks about encryption, but the word itself is just the tip of the iceberg. What's the key size? What's the algorithm?
It uses Linux dm_crypt. Here's the source code that configures it, and protects the dm_crypt master key: https://android.googlesource.c...
What data is encrypted?
Most of the rest of your post is speculation assuming that Google is intensively mining everything backed up. I'm quite certain that's not true, but I probably shouldn't comment in more detail.
The only thing it will do is keep your private information out of the hands of someone who picked up your lost phone and decided to keep it (or sell it).
Yes, that's what device encryption is for.
(Disclaimer: I'm an Android security engineer. I'm speaking for myself, not for Google.)
You need your head read. Google has shown time and again that it does not care about your security. There is no need to trade off convenience for security in cloud backup. Encrypt locally and send the data to backup already encrypted. That would be great, but I bet that Google also holds the keys and decrypts on their end. Google wouldn't be able to use your data for their massive data mining and information theft machine if it were properly encrypted. This is why the data sits on their servers unprotected by encryption; they are the antithesis of your guardians of security. If you value your data, turn off all Google services and manage your own backups.
There are two different threat models to consider. Device encryption protects against one, but not the other.
The purpose of device encryption is to protect your data from someone who obtains physical possession of your device because it was lost, stolen, confiscated, etc. The goal really isn't so much to protect it from law enforcement or the NSA -- if the NSA is interested in your data, they'll get it, period -- but against people who might want to, for example, steal your bank account information.
Device encryption obviously does nothing to keep your data secret from someone you actively send the data to. If you have Google's backup services enabled on your phone, then it will back up a bunch of stuff. I don't know everything that's backed up, but I think it includes your Wi-Fi configuration, the list of apps you've installed, the list of accounts on your phone, your contacts, and similar. Separately from device backup, you can also have the Google+ app upload your photos and videos automatically, and you can also configure the device to report your location, in various ways and for various services (there are several controls). Whatever you have backed up is (a) not protected by device encryption and (b) cannot be secure from whoever you backed it up to unless you have some sort of encryption key which the holder does not.
It's also clear that anything stored by Google that isn't encrypted with some key unavailable to Google is also accessible to the US government and local law enforcement, assuming they have the legal right to demand it from Google. Device encryption does not do anything to defend against that. This is all obvious and not in dispute. It also doesn't make device encryption worthless; it just means that it defends against a different threat.
Also, I have to say that from my perspective as a security engineer at Google, you couldn't be more wrong about Google's concern for user security. If you look at the company's track record on security technology creation and deployment, I think that point is unarguable. Perhaps what you really meant to say is that Google doesn't care about your privacy, which is different from (but connected to) security. From my perspective, I think that's also wrong. It seems to me that what Google wants is to get your permission to make a trade -- your data, used for targeted advertising, in exchange for Google's services -- and if you don't want that trade, Google wants to enable you to opt out of it (hence all of the opt-out tools, privacy dashboard, etc.). Obviously, if Google is not careful to protect users' privacy, no one will be willing to make that trade, so Google is very, very careful.
(Disclaimer: I'm a Google engineer, but I'm speaking for myself, not in an official capacity.)
Google has pointed out that Android already offers the same feature as a user option and that the next version will enable it by default.
Why isn't it already the default setting?
(Android Security Team member here... though these are my own perceptions and opinions, not an official statement.)
First, because it's not completely trivial to make it work correctly, all the time, every time, on hundreds of different devices. Android uses dm_crypt, so the foundation is solid, well-proven code, but that doesn't mean there aren't tricky corner cases. With the huge number and variety of Android devices out there, you can be certain that if there's a way it can go wrong, it will. So, conservatism suggests it's a good idea to make it optional for a while and shake out any issues. It's been optional for three years now, and is in use on many devices (I don't know how many; I'd guess tens of millions, though), so it's time to take the next step.
Second, performance was a problem. Not run-time performance -- AES is really fast -- but the initial encryption required reading and writing many gigabytes so it took a long time just to do that much I/O. Encrypting by default means that either the device has to be encrypted in the factory, which would be a major production bottleneck, or else users would have to wait 20 minutes for their phone/tablet to start up just after they unbox it. That's a bad user experience. For L this was optimized so it only encrypts blocks that are in use. Since on a new device very little of the data partition is in use, very little has to be encrypted. That makes the initial encryption very fast (a few seconds).
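The in-use-blocks idea is simple enough to sketch. This is only an illustration of the concept (the real implementation is in Android's native encryption code; the function and bitmap here are made up for this example):

```python
# Sketch of the "encrypt only in-use blocks" optimization: consult the
# filesystem's allocation bitmap and skip everything unallocated.
def blocks_to_encrypt(allocation_bitmap):
    return [i for i, in_use in enumerate(allocation_bitmap) if in_use]

# A freshly set-up 1000-block data partition with only 3 blocks in use
bitmap = [False] * 1000
for i in (0, 1, 7):
    bitmap[i] = True

work = blocks_to_encrypt(bitmap)
assert work == [0, 1, 7]  # 3 blocks of I/O instead of 1000
```

On a new device the allocation bitmap is almost entirely empty, which is why the initial encryption drops from many minutes of I/O to a few seconds.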
There's actually another device encryption-related improvement coming in L. I'd love to describe it in detail since I worked on parts of it, but the article doesn't mention it so I'll hold off.
If you encrypt your Android phone, neither Google nor anyone else has any special access to its contents. However, there is a caveat.
In the current (KitKat) implementation of device encryption, the actual data encryption is done by standard Linux dm_crypt, which is very strong assuming the master encryption key is well-protected. The master encryption key is in turn encrypted by a key derived from your password. The derivation algorithm is good (scrypt), but it's still possible to brute-force the password space. How difficult that is depends on how long your password is, and unfortunately there's a clear conflict between security and convenience here. You can choose a very long password and have high confidence that it's infeasible for anyone to break it, but then you have to type that long password on your phone all the time.
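The key-protection scheme described above can be sketched in a few lines of Python. The scrypt parameters and the XOR "wrapping" here are illustrative stand-ins, not Android's actual values (the real implementation wraps the dm_crypt master key with AES):

```python
import hashlib, os

def derive_kek(password: bytes, salt: bytes) -> bytes:
    # scrypt parameters are illustrative, not Android's actual values
    return hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)

# Random master key that dm_crypt actually uses for the disk encryption
master_key = os.urandom(32)
salt = os.urandom(16)

kek = derive_kek(b"1234", salt)
# XOR stands in for real AES key-wrapping, purely for illustration
wrapped = bytes(a ^ b for a, b in zip(master_key, kek))

# Unlocking: the same password and salt reproduce the KEK and unwrap the key
recovered = bytes(a ^ b for a, b in zip(wrapped, derive_kek(b"1234", salt)))
assert recovered == master_key
```

The point of the sketch: the master key itself is random and strong, so the only viable attack is guessing the password that unwraps it, and scrypt makes each guess expensive. A 4-digit PIN still has only 10,000 candidates, though, which is why password length matters so much.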
Apple has undoubtedly made use of the Secure Enclave chip they have in their devices to store a significant portion of the material needed to derive decryption keys in secure hardware, which is almost certainly configured to rate-limit brute-force attempts, and eventually just to lock the device up forever. Given that the obvious and straightforward implementation of such a system would never have given Apple the ability to unlock phones, they must have decided to add a sort of "back door" for themselves, probably to rescue customers who'd locked themselves out. Now they're removing that back door. Good for them.
Apple Pay is basically a contactless EMV wrapper for iPhones. SoftCard complies with EMV too, but I've seen nothing indicating that Apple Pay will work with SoftCard processors. This is purely a contractual thing though; there's nothing technical to stop it from working.
There aren't even any contractual issues, because there is no such thing as a "SoftCard processor". SoftCard transactions are processed by normal merchant acquirers, through normal clearinghouses, and back to their issuing banks. Nothing in between even knows it's not a plastic contactless smart card chip. The same is true of Apple Pay, with the exception that at some point in the network the network token gets translated into issuer-specific data (SoftCard gets issuer-specific tokens delivered to the device).
Google Wallet is different because Google is acting as the issuing bank for the Wallet proxy card, so Google does have to process payments, charging them back to whatever backing instrument you have selected (which needn't be a credit card).
Basically, you can't roll out something like Google Wallet for an iPhone, but you can support all sorts of NFC payment types with it.
Well, on any network that supports network tokenization, which, so far, is only AMEX, MC and Visa, and only in the US. Discover supports contactless smart card payment, but doesn't support Apple Pay (yet; I'm sure they will) because they have to implement the necessary pieces.
Can you put the above-mentioned smart card reader apps into a kind of promiscuous mode? It would be interesting to have your Android device in your pocket in a crowded bus or elevator to see what info it could capture.
Sure you can. You won't get much, though, for three reasons.
First, NFC is really short range. Like, less than a centimeter in practice. So you'd pretty much have to actually bump your pocket into another pocket containing a phone or contactless smart card (in a very thin wallet).
Second, at least with Android phones that you might bump into, if the screen is turned off, NFC is turned off, so most of them won't respond.
Third, and most important, the whole point of smart cards is that they're smart. They're microprocessors which implement secure challenge-response protocols, and are picky about what information they'll share with anyone who doesn't authenticate properly. NFC is just smart cards in a different form factor.
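A toy version of challenge-response, with HMAC standing in for the card's real cryptography (this is not the actual EMV scheme, just the shape of it):

```python
import hmac, hashlib, os

CARD_SECRET = os.urandom(16)  # provisioned into the card's secure chip

def card_respond(challenge: bytes) -> bytes:
    # The card proves knowledge of its secret without ever revealing it
    return hmac.new(CARD_SECRET, challenge, hashlib.sha256).digest()

# A legitimate terminal that shares the secret can verify the response
challenge = os.urandom(16)
response = card_respond(challenge)
expected = hmac.new(CARD_SECRET, challenge, hashlib.sha256).digest()
assert hmac.compare_digest(response, expected)

# An eavesdropper who recorded (challenge, response) gains nothing
# reusable: a fresh random challenge yields a completely different response.
assert card_respond(os.urandom(16)) != response
```

A promiscuous reader that doesn't hold the card's secret can record exchanges all day without capturing anything it can replay against a terminal that issues fresh challenges.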
I think "SoftCard EMV standard" meant: the EMV standard that SoftCard uses.
So "Google networking standard" is the network protocol that Google uses (HTTP)?
You can transfer pictures, files, and anything else apps care to support.
Apple already has that capability; but it's far less caveman-like. It's WIRELESS. So, if you have already gotten into your respective vehicles, you can still transfer the information... Yes, I'm talking about the almost ubiquitous "Share" button. Your data-transfer method reminds me of the Zune's "Squirting" feature. How quaint. We iOS users have the internets for that.
You didn't read my post. Android also uses the Internet for file transfer; it uses NFC to make choosing which device to send to as easy as tapping the phones together. Obviously there are other options if you like typing or picking from lists, the way you have to on iOS.
Oh, and if NFC were the actual data transfer mechanism, it would also be WIRELESS, because it is wireless. Radio frequency. But at just shy of 1 megabit per second it would be a little slow for moving large files.
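Back-of-the-envelope, using the ~1 Mbit/s figure above:

```python
# Time to move a 100 MB video over a ~1 Mbit/s NFC link
file_bytes = 100 * 1024 * 1024
link_bps = 1_000_000  # bits per second, roughly NFC's ceiling

seconds = file_bytes * 8 / link_bps
print(f"{seconds / 60:.0f} minutes")  # about 14 minutes
```

Which is exactly why the tap is used only to identify the peer, while the bulk bytes travel over a faster link.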
Paypass terminals will accept Apple Pay payments.
Oh, Google? You mean the Google Wallet that isn't available in large parts of the world? They need to deploy terminals? That's a fail right there.
Apple Pay will be able to use Google Wallet / ISIS (SoftCard) terminals, and vice versa. They all use the same protocols and base technology.
Apple Pay will be successful, and Apple will garner much praise for that success from people like you who don't know the industry, but what's really going to make it successful isn't anything Apple is doing or has done, but what Visa and MasterCard did two years ago, when they announced that the liability shift will be imposed in the US in October 2015. That policy change by the card networks will give merchants huge financial incentive to get all of the necessary terminals deployed, which is why many of them are now (and have been for some time) gearing up to integrate and deploy chip-capable point-of-sale terminals.
And, if you want to look at the causes for Visa and MasterCard's decision... the biggest single factor was almost certainly the deployment of Google Wallet, which moved NFC payment in the US from a "someday" possibility to "people are using it now". At the end of the day, Apple Pay will owe most of its success to Google.
I don't want to disparage Apple too much here, though, because they have been able to do one thing of huge significance, and it is their market position and clout with the mobile network operators (MNOs) that made it possible: They helped push through the deployment of network-level tokenization. This is a somewhat abstruse technical detail, but it's pretty important.
Right now Apple Pay, SoftCard and Google Wallet all use different approaches to pushing the transaction through the networks. Google Wallet uses a "proxy card". When you pay with Google Wallet, you're actually paying with a Google-issued MasterCard debit card; that's what the merchant sees. Then Google turns around and charges whatever backing instrument you've specified (Wallet balance, bank account, debit card or credit card). This approach offers maximum flexibility: if someone dreams up some new payment mechanism and Google integrates it, you can get your payments directed to it. The downsides are that (1) it's the same credit card number every time, which means that if it gets stolen and used fraudulently (which is far harder than for a magstripe card) then Google has to take on the fraud liability; (2) the point-of-sale transaction is "card present" while Google's transaction with your payment instrument on the back end is "card not present", which means that if the backing instrument is a credit card, Google has to eat the difference between the front- and back-end transaction fees; and (3) all transactions pass through Google, which means Google sees how much you spend through Wallet and where (which has some upsides as well; I like the payment notifications it enables, the ability to look up my payment history on any device, and the level of control it offers me). Note that Google can't see what you bought, but obviously a lot can be inferred from location.
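The proxy-card flow described above amounts to two separate charges per purchase. A sketch (all names hypothetical, purely illustrative):

```python
from dataclasses import dataclass

@dataclass
class Charge:
    card: str
    amount: float
    card_present: bool

def wallet_purchase(amount: float, backing_instrument: str):
    # The merchant sees only the Google-issued proxy MasterCard, card-present
    front = Charge(card="google-proxy-mastercard", amount=amount,
                   card_present=True)
    # Google then charges the user's chosen backing instrument, card-not-present
    back = Charge(card=backing_instrument, amount=amount, card_present=False)
    return front, back

front, back = wallet_purchase(25.00, "users-visa-credit")
assert front.card != back.card  # the merchant never learns the backing card
```

The front charge is card-present at the point of sale; the back charge against whatever the user selected is card-not-present, which is exactly where the fee difference Google has to eat comes from.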
SoftCard (née ISIS) uses "issuer tokenization". You can only pay with credit cards from a certain (and still fairly small -- AMEX, Chase and Wells Fargo) set of issuing banks. The banks issue "tokens" which look like credit card numbers but are only good for a single use. These are stored in the secure element on your phone and transmitted to the merchant when you pay. Security is arguably better than with Google Wallet, and there are some corner cases that are less problematic. SoftCard doesn't get involved in your transactions, although there are some indications that the app may deliver information about them to SoftCard and to your carrier, though they don't provide that information back to you as a convenient transaction log the way Google does. The reason the list of cards you can use is small is that each individual issuer has to get its systems set up to support token issuance. That's actually not quite as limiting as it sounds, because the majority of credit card-issuing banks in the US outsource their operations to one of a couple of service companies, so as soon as those companies (First Data and Total System Services) get set up, the card options will grow dramatically.
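Issuer tokenization boils down to single-use stand-in numbers. A toy model (real tokens are provisioned to the secure element and validated by the issuer's systems, not a Python set):

```python
import os

# The issuer pre-generates single-use tokens that look like card numbers;
# each one is stored on the phone and is valid for exactly one transaction.
issued = set()

def issue_token() -> str:
    token = "4" + "".join(str(b % 10) for b in os.urandom(15))
    issued.add(token)
    return token

def redeem(token: str) -> bool:
    # The issuer accepts a token only once, then burns it
    if token in issued:
        issued.discard(token)
        return True
    return False

t = issue_token()
assert redeem(t) is True    # first use clears normally
assert redeem(t) is False   # a replayed or stolen token is rejected
```

Stealing a token that has already been spent gets a fraudster nothing, which is the security win over a reusable proxy-card number.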
Apple Pay uses "network tokenization". You can pay with credit cards from any integrated card network, which right now means Visa and MasterCard. The tokens are generated by the network, not by the issuer, so issuers don't have to change anything. Apple isn't involved in the transactions, and they say they don't receive any information about them. They're obviously trying to use privacy as a way to differentiate from Google Wallet, and maybe from SoftCard as well. Note that the reason I said Apple used their clout with the MNOs to make this happen is that the MNOs have been fighting to retain control of the secure element chip needed to make it work. SoftCard didn't have a problem because the MNOs are part of SoftCard. Google Wallet ultimately had to find a way to avoid using the SE, instead falling back on something called "Host Card Emulation", where the security-sensitive stuff happens on a server (though you don't need a data connection at the point of sale; it still works offline). Apple, of course, tells the MNOs how it's going to be and, beyond that, since Apple manufactures all the iPhones itself with no MNO participation, Apple will have the secret keys that provide access to their secure elements and the MNOs will not. So from a technical perspective the MNOs have zero ability to block or influence what Apple is doing, even if they dared. Which they don't.
Now that network tokenization is available, it seems pretty likely that SoftCard and Google Wallet will switch to it, at least for credit card-backed transactions. I expect Google Wallet will continue to support an expanding array of back-end payment options, and the proxy card approach will still be required for non-credit-card backing instruments. So Apple did make a major technical contribution, and actually helped Google solve one of the problems Wallet has.
Still, Apple's technical achievements are incremental at best, and really Apple didn't design or build the network tokenization infrastructure, they just used their clout to get others to do it. That's worthwhile, but not nearly as groundbreaking as what Google did with the first version of Wallet.