Standard deviations go both ways: above & below the mean. "High end," on the other hand, is synonymous with "above average" - i.e. better than what most people use/need.
I'd say that IDEALLY it's better to build high end, but in reality we build according to requirements, financial constraints, time to market, return on investment, etc. Using the original topic as an example, you could build high end 4K Ultra High Definition monitors that would certainly work for almost everyone. However, at this time, to make a profit you'd probably have to sell your monitors at a price point that everyone within 2-3 standard deviations of the mean can't afford.
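For perspective, here's a quick back-of-the-envelope sketch of how much of the population falls within 1, 2, or 3 standard deviations of the mean - assuming, purely for illustration, that the variable in question (income, here) is normally distributed (real income is skewed, but the arithmetic still makes the point):

```python
import math

def share_within(k: float) -> float:
    """Fraction of a normal population within k standard deviations of the mean."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"within {k} SD of the mean: {share_within(k):.1%}")
# within 1 SD of the mean: 68.3%
# within 2 SD of the mean: 95.4%
# within 3 SD of the mean: 99.7%
```

In other words, a price that's out of reach for everyone within 2-3 standard deviations of the mean is out of reach for roughly 95-99% of your market.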
Actually, history has proven just the opposite... When you build for the average person, you maximize the selling potential of your product. That's why everything from ramen noodles to cars is designed for an "average" person. Moreover, in many situations you're required to design & build products to conform to some regulation written around an average person. Not meeting those requirements immediately dooms your product to failure.
Sure, there's some variability in what is considered average, depending on your country and/or market, and even the level of technology, but the point is that by not designing products for the average person, you're designing for a niche or specialty market, or experimenting with new or novel technology.
Obligatory car analogy: That's why there are a whole lot more people driving Toyota Corollas than there are driving Bugatti Veyrons - even though, side by side, most people would probably notice a clear difference and prefer the Veyron.
'If all you see in your workday are your co-workers and all you see out your window is the green perimeter of your carefully tended property,' Mozingo writes, and you drive to and from work in the cocoon of your private car, 'the notion of a shared responsibility in the collective metropolitan realm is predictably distant.'
So where did this guy get his geek credentials from anyway? I don't even have a window... I get enough UV radiation from my wall o'monitors down here in the basement, thanks. The only scenery I look at is rendered through a pixel shader... And my commute is never lonely - since my mom does all the driving.
This guy really needs to get in touch with reality...
Also, in this case let's not forget that Apple has a history of forcing innovation on the public and taking (borrowing and even stealing) cutting edge technologies & making them practical, popular & even necessary. For instance, before the iPhone, I remember trying to decide if I should toss my Palm Treo & go with a BlackBerry. Back then, there weren't really any other smartphone options for me. I remember people laughing at the idea of a touch screen phone, harping about all that was wrong with the iPhone's simplistic interface, and wondering how successful Apple could be by not allowing the iPhone to be carrier-branded.
Now look at how far we've come. Almost every phone out there today has benefitted in some way from the iPhone - not to mention that there are a lot more smartphones on the market today & they all resemble the basic iPhone design.
So come on, the iPhone saga has been a hot topic since its vaporware days, and even if you don't use an iPhone, your inner geek has to be at least curious to see what the next version will have to offer.
Is that the Lion share?...
And HP, Motorola, Samsung, RIM, Microsoft, Nokia, HTC... these are fly-by-night upstarts, new to the industry?
...No, but as far as iPad-style tablets go, they're all a little late to the party. I'm sure that they'll catch up eventually - just like they did with the iPhone - but it's gonna be more of an uphill battle this time because tablets aren't as closely tied to cell phone carriers.
Fair enough, in a general/abstract sense... But that's not my point. The point I was trying to make is that online education is ripe for "gamification", because video games - especially online video games - are incredibly efficient mechanisms for learning, while online instruction is not. That's true even though many of the high-level patterns you encounter in video game design are very similar to the patterns you find in online instruction... In fact, the basic problem domain seems to be pretty much the same for the two.
I tend to believe that this has less to do with content than with the fact that the technology used for online instruction is relatively primitive compared to even simple online video games: online instruction generally doesn't implement many of the elements we'd consider "normal" in online games - immediate feedback, scoring, levels & progression, adaptive difficulty, and so on.
Of course, probably the greatest challenge to bridging the two worlds is adapting various course materials to work as games. Although I've seen a lot of great "expert systems" that are good at introducing a concept, then following up with questions until it's determined that the student understands the concept, these are usually text-based and not very engaging - not much better than the early text-based computer games from back in the '80s.
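To make that pattern concrete, here's a minimal sketch of the concept-then-quiz loop such a system runs - the concept text, the question list, and the mastery threshold are all made up for illustration:

```python
import random

# Hypothetical mini-drill: present a concept, then quiz until the student
# answers correctly MASTERY_STREAK times in a row - a crude mastery check.
CONCEPT = "A standard deviation measures spread around the mean."
QUESTIONS = [
    ("Does a larger standard deviation mean more spread? (y/n)", "y"),
    ("Can a standard deviation be negative? (y/n)", "n"),
]
MASTERY_STREAK = 2  # assumed threshold; a real system would tune this

def drill() -> None:
    print(CONCEPT)
    streak = 0
    while streak < MASTERY_STREAK:
        question, answer = random.choice(QUESTIONS)
        reply = input(question + " ").strip().lower()
        if reply == answer:
            streak += 1
            print(f"Correct ({streak}/{MASTERY_STREAK}).")
        else:
            streak = 0  # a miss resets the streak & re-shows the concept
            print("Not quite. " + CONCEPT)

if __name__ == "__main__":
    drill()
```

Wrap that loop in graphics, sound, scoring & a progression system, and you've basically got a game.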
As paper becomes more & more obsolete, it would be nice to see textbook publishers think beyond e-books & online CMSs and get into game development. Then maybe we'd see some truly immersive, interactive games that can help students learn.
"When in doubt, print 'em out." -- Karl's Programming Proverb 0x7