It's the same reason that your own small company shouldn't be trying to implement its own CRM (assuming that's not its core business), or to drastically change the way it handles sales compensation. Being revolutionary in one area is hard enough - don't make anything even harder than it has to be.
Seriously. Good, readable names for everything make code far more self-documenting than otherwise, don't cost the compiler a single cycle, and make it far easier to understand when someone comes across it five years down the road.
Other than that, I'd add methods that only perform one action, with no side effects, and that only work at one level of abstraction.
Finally, code that matches its method name - don't say "if thing.checkValues()", say "if thing.isValid()" - and if isValid() does anything like trimming whitespace, it should do it on its own transient copies of things, since it's not obvious that the method would ever change state.
And so on. It all boils down to code that doesn't surprise you.
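A minimal sketch of the idea in Python (the Order class and its fields are hypothetical, just to illustrate the naming and no-side-effects points):

```python
class Order:
    """Hypothetical example: a method named for the question it answers,
    working only on transient copies so the caller's state never changes."""

    def __init__(self, item, quantity):
        self.item = item
        self.quantity = quantity

    def is_valid(self):
        # Trim whitespace on a local copy; never mutate self.item.
        item = self.item.strip() if self.item else ""
        return bool(item) and self.quantity > 0


order = Order("  widget  ", 3)
if order.is_valid():
    print("processing order")

# The validity check left the caller's data untouched.
assert order.item == "  widget  "
```

Reading "if order.is_valid()" five years later tells you everything; "if order.checkValues()" tells you nothing about what true means or whether anything changed.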
Have you ever read Programming Pearls? It's full of great examples of doing elegant work with almost no resources, and they apply just as much today (just prepend "giga" to all of the numbers used). From your comment I think you'd enjoy it.
Having a very lightweight laptop get the same performance as a high end desktop is still newsworthy, isn't it?
So you can hook up to an external monitor OR charge your iPhone OR give a PowerPoint presentation! In 2016, it will be even lighter when they reduce the number of letters in the alphabet for the keyboard.
Or they can just introduce a slipstream charger adapter so that you can plug the power cord and the monitor cord into the same thing, leave that on your desk, and only connect one cable when you get home. And once monitors start supporting USB-C natively, they'll just do the same.
Just as with dropping PS/2, floppy drives, and optical drives, someone always has to go first.
Keep in mind that the target audience for the Macbook is far less likely to use an external monitor, or even to plug it in at all during the day (made possible by using that port space for more battery).
Other than putting the first mass-market GUI onto a UNIX product that people could actually use, giving us the first real "year of *NIX on the desktop" ever?
Oh, and redesigning the mobile phone (go back and watch the iPhone launch keynote and remember just how much was new - even things like "visual voicemail").
Launching products that look obvious in retrospect yet were somehow not readily available before is a mark of good design.
Yup, which only had standing because Apple started to include sound support as part of the OS in the Mac. There's a reason that Apple named one of their sounds "Sosumi", after all.
They're also incurring a shit-ton of liability in storing magstripe copies - something that's a PCI violation however you interpret the standards. That means that in the case of fraud the cardholder (phoneholder?) will be considered liable instead of the bank or merchant, and as soon as that happens the inevitable class-action lawsuit against Samsung (far more lucrative than going after LoopPay would have been) will be a doozy.
Google also insisted on getting in the middle of the transaction with their Wallet, and if my understanding is correct, the result was an inferior experience for merchants, since the cards ultimately got recognized as card-not-present. Apple, on the other hand, worked closely with everyone in the chain rather than trying to muscle their way in. If you're a merchant, the difference between paying card-present and card-not-present rates is often around 1% (because CNP has a lot more fraud associated with it). Apple's use of tokenization adds more security as well - it's just a better thought out system.
Having said all that, your point about the launch country is also completely valid. Again, Google was (IMO) far too arrogant for their own good.
In a lot of ways it's similar to the 3G issues. Many dinged the original iPhone for launching without 3G at a time when that was cutting edge and not very well supported. Once the iPhone 3G came out, all the issues had been resolved and it was a very smooth experience for users (and had been done in close partnership with AT&T). Some other phones had loudly claimed 3G before then and often had frustratingly poor network connectivity, sometimes amusingly slower than the iPhone's 2G was at the same time.
System integration: it's important in all sorts of areas.
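The security win from tokenization can be sketched in a few lines of Python. This is an illustrative toy, not any real payment network's scheme: the merchant only ever handles a token, and only the network's vault can map it back to the real card number.

```python
import secrets

class TokenVault:
    """Toy stand-in for a card network's token service (illustrative only)."""

    def __init__(self):
        self._tokens = {}

    def tokenize(self, pan):
        # Issue a random, meaningless token for the real card number (PAN).
        token = secrets.token_hex(8)
        self._tokens[token] = pan
        return token

    def detokenize(self, token):
        # Resolve and invalidate: a stolen token can't be replayed.
        return self._tokens.pop(token, None)


vault = TokenVault()
token = vault.tokenize("4111111111111111")  # well-known Visa test number

# The merchant submits only the token; the network resolves it exactly once.
assert vault.detokenize(token) == "4111111111111111"
assert vault.detokenize(token) is None  # a replay of the same token fails
```

A merchant breach under this model leaks tokens, not card numbers - which is exactly why storing raw magstripe data instead is such a liability.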
It's really aggravating too that US banks refuse to offer chip-and-PIN even though they could. Chip-and-signature is better than magstripe, but it's still a royal PITA whenever you travel in Europe.
Their PCI wording on the website is intentionally deceptive, I feel. When asked about PCI they talk specifically only about the fact that their datacenter is compliant in the storage of card numbers.
It's strictly against PCI requirements to store track data in any way, with the single exception of reading it in-memory and relaying it upstream to another PCI-compliant service provider. Since storing it is exactly what their product does, I fail to see how they can claim it's compliant (and, as I mentioned, they very carefully never actually say that it is).
That still bites me every now and then after all these years, and continues to spark discussions about where in the backend code the typo should be fixed.
If a checkbox is not checked, does it still not come in on the POST/GET?
If not, then it is still broken on arrival.
Is a textarea still not a text control?
If not, then it is still broken on arrival.
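For reference, browsers omit unchecked checkboxes from the submitted form data entirely - the server sees no key at all, not a "false" value. A quick illustration of the resulting POST/GET body, simulating the browser's serialization in Python (field names are made up):

```python
from urllib.parse import urlencode

def serialize_form(fields):
    """Serialize (name, value, successful) triples the way a browser does:
    only 'successful' controls (e.g. checked checkboxes) are included."""
    return urlencode([(name, value) for name, value, successful in fields
                      if successful])

checked = serialize_form([("newsletter", "on", True), ("user", "alice", True)])
unchecked = serialize_form([("newsletter", "on", False), ("user", "alice", True)])

print(checked)    # newsletter=on&user=alice
print(unchecked)  # user=alice
```

So the server has to treat a missing key as "unchecked" - which is a quirk of HTML form semantics carried over HTTP, not something HTTP itself defines.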
I think you're confusing HTTP with HTML.
HTTP/2 over TLS could have been made mandatory. But for some easy-to-guess reason they "decided" otherwise.
Because it's the wrong "they" to be making that decision. The working group for HTTP/2 should never be dictating how they feel its use should be restricted. There are plenty of other opportunities for people at the appropriate levels of the chain to make that recommendation. This is a big part of the point of a layered technology.
4 000 000 000 is 10 characters once the whitespace is stripped out. Roughly the same number in hex is FF FF FF FF - a saving of 20%.
You're kidding, right? The number 4 billion can be represented in 32 bits, or the same total space as 4 ASCII characters.