Building anything of complexity in a profit-driven, timing-sensitive marketplace requires concessions—some really desirable stuff isn't going to make the cut, and other stuff is going to have to be simplified. This is the result of balancing what is technically possible against practical constraints on labor, cost, supply chain, and so on.
When something has to go, do you keep A or B in the product? And when C has to be simplified, do you simplify it using method C1 or C2?
These are the things that Jobs tended to get right, often with counterintuitive decisions. People often say that Apple is all about ease of use, but this can encompass a lot of different things:
- Intuitive use for those with no prior knowledge
- Use that requires the fewest steps or user-initiated actions
- Use that requires the fewest adjustments relative to existing expectations and habits
- Use that maximally shortens the absolute time until results arrive
- Use that has the highest possible correlation between inputs and desired, complete results

...and so on.
And these things are often at odds, and they're often the kinds of decisions that line up with the aforementioned A/B/C1/C2/etc. decisions in multiple, complex ways. Steve Jobs had a knack for balancing these in such a way that:
- Those with no prior knowledge were not alienated or intimidated, even if they had to learn
- The number of steps or actions was not onerous
- Existing expectations and habits were managed in a way that minimized cognitive load
- Results were accomplished reasonably quickly
- Correlation between inputs and desired results was relatively high
I say that his decisions were often counterintuitive because he often thought outside the box of mere feature delivery. For example, if it was proving tough to design for existing expectations and habits, the choice might be instead to change things more, rather than less—so that the new feature was taken *out* of the realm of existing expectations, even if some design alternatives could have preserved a minimal overlap. Most companies would go for "we'll meet existing expectations and habits as well as we can, and a 15% overlap is better than a 10% overlap if that's what we can bring to market effectively."
Apple in its heyday would say, "A 15% overlap is poor; let's revamp this so that it doesn't bring to mind any expectations or habits. We could design with some familiarity, sure, but if it's only 15% match, some familiarity is actually worse than 0% familiarity, since in the second case we don't fool the user into thinking they know more than they already do, and they understand from the start that it is something new that they will need to learn [even if it wasn't actually new at all, as people here would often point out]."
Similar counterintuitive decisions apply to the other bullet points. Maybe the right thing isn't to deliver something that produces results "as quickly as we can make it do so," but in fact not to deliver it at all if the net result is frustration because it's still just too slow or the correlation between inputs and desired results is too low. The traditional strategy would be to make it "as good as we can make it" and release it.
Jobs' famous "knowing when to say no" thing is really a subset of this larger sphere of judgment. Not just knowing when to say "no" but also knowing when to reshape something as an entirely new feature (from the UX perspective) without reference to previous similars, even if there were many; knowing which framings of new things intimidate new users vs. excite new users (even if in both cases the net effect is that new learning is required), and so on.
This is the sort of thing where user research is often misleading. Most users will say "I prefer that one, at least it's a little bit familiar" when in fact the familiarity, combined with the ultimate variation of the totality of the product from their expectations, might ultimately lead to less use or suboptimal use—yet an "ugh, that's strange and new" might become an indispensable, optimally used product once users get over the learning hump.
During its best years, Apple's design prowess was lauded because the picture was very big—it was UX design plus human factors done at billboard scale, with an eye toward the big picture of how the products would integrate into users' entire conceptual ecosystems, across all of their devices, habits, living context, and across time as well—over which learning and experience occur.
Design isn't often done that way; more often it's just more mechanical than that, and more rote. That's what I see in Apple right now—a company that does design by the numbers and never does anything that is both counterintuitive and right anymore. Like most other companies, they're now in the business of "optimizing their compromises," once again by the numbers, rather than in the business of thinking about the big picture beyond the device to decide how to make the compromises.
Too much "design principles" and "user research" and not enough in the way of the bigger social picture and ethnographic-quality research and understanding.
Basically, Jobs had an innate talent for social-science perspectives and thinking that is underappreciated to this day, and it is now missing from Apple, which can only see in aesthetic/technological/UI/UX perspectives at this point.
TL;DR version—
The single word "design" is often used to reference aesthetics, UI, UX, and human factors.
At its best, Apple balanced these well, and Jobs had a particular innate vision that privileged human factors, then UX, then UI, then aesthetics, in that order.
Now, Apple privileges aesthetics, then UI, then UX, and human factors least of all.
So everything is prettier than it's ever been. Yet at first touch it's not quite as great to use as it once was. And over time, it gels far, far less effectively and completely than it used to.