Standard deviations go both ways; above & below the mean. "High end" on the other hand, is synonymous with "above average" - i.e. better than what most people use/need.
I'd say, IDEALLY it's better to build high end, but in reality we build according to requirements, financial constraints, time to market, return on investment, etc. Using the original topic as an example, you could build high end 4K Ultra High Definition monitors that would certainly work for almost everyone. However, at this time, to make a profit you're probably going to have to sell your monitors at a price point that people within 2-3 standard deviations of the mean can't afford.
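To put rough numbers on "2-3 standard deviations of the mean": assuming a normal distribution (the classic 68-95-99.7 rule, which is an assumption about how the population is spread), that covers the overwhelming majority of potential buyers. A quick sketch:

```python
import math

def coverage(k: float) -> float:
    """Fraction of a normal distribution lying within k standard deviations of the mean."""
    return math.erf(k / math.sqrt(2))

# The classic 68-95-99.7 empirical rule:
for k in (1, 2, 3):
    print(f"within {k} SD of the mean: {coverage(k):.1%}")
```

So pricing out everyone within 2 standard deviations means writing off roughly 95% of the market.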
Actually, history has proven just the opposite... When you build for the average person, you maximize the selling potential of your product. That's why everything from ramen noodles to cars is designed for an "average" person. Moreover, in many situations you're required to design & build products to conform to some regulation designed for an average person. Not meeting those requirements immediately dooms your product to failure.
Sure there's some variability to what is considered average, depending on your country and/or market, and even level of technology, but the point being, that by not designing products for the average person, you're designing for a niche or specialty market, or experimenting with new or novel technology.
Obligatory car analogy: That's why there's a whole lot more people driving Toyota Corollas than there are driving Bugatti Veyrons - even though, side by side, most people would probably notice a clear difference and prefer the Veyron.
'If all you see in your workday are your co-workers and all you see out your window is the green perimeter of your carefully tended property,' Mozingo writes, and you drive to and from work in the cocoon of your private car, 'the notion of a shared responsibility in the collective metropolitan realm is predictably distant.'
So where did this guy get his geek credentials from anyway? I don't even have a window... I get enough UV radiation from my wall o'monitors down here in the basement, thanks. The only scenery I look at is rendered through a pixel shader... And my commute is never lonely - since my mom does all the driving.
This guy really needs to get in touch with reality...
Also, in this case let's not forget that Apple has had a history of forcing innovation on the public and taking (borrowing and even stealing) cutting edge technologies & making them practical, popular & even necessary. For instance, before the iPhone, I remember trying to decide if I should toss my Palm Treo & go with a BlackBerry. Back then, there weren't really any other smartphone options for me. I remember people laughing at the idea of a touchscreen phone, harping about all that was wrong with the iPhone's simplistic interface, and wondering how successful Apple would be by not allowing the iPhone to be carrier-branded.
Now look at how far we've come. Almost every phone out there today has benefited in some way from the iPhone - not to mention that there are a lot more smartphones on the market today & they all resemble the basic iPhone design.
So come on, the iPhone saga has been a hot topic since its vaporware days, and even if you don't use an iPhone, your inner geek has to be at least curious to see what the next version will have to offer.
Is that the Lion share?...
And HP, Motorola, Samsung, RIM, Microsoft, Nokia, HTC... these are fly-by-night upstarts, new to the industry?
...No, but as far as iPad-style tablets go, they're all a little late to the party. I'm sure that they'll catch up eventually - just like they did with the iPhone, but it's gonna be more of an uphill battle than with the iPhone because the tablets aren't as closely tied to cell phone carriers.
Fair enough, in a general/abstract sense... But that's not my point. The point that I was trying to make is that online education is ripe for "gamification". This is because video games - especially online video games - are incredibly efficient mechanisms for learning, while online instruction is not - even though many of the high-level patterns that you encounter in video game design are very similar to the patterns that you find in online instruction... In fact, the basic problem domain seems to be pretty much the same between the two.
I tend to believe that this has less to do with content than with the fact that the technology used for online instruction is relatively primitive compared to even simple online video games; online instruction generally doesn't implement many of the technologies that we would consider "normal" elements of online video games.
Of course, probably the greatest challenge to bridging the two worlds is adapting various course materials to work as games. Although I've seen a lot of great "expert systems" that are good at introducing a concept, then following up with questions until it is determined that a student understands the concept, these are usually text-based and not very engaging. Not much better than the early text-based computer games from back in the '80s.
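For what it's worth, the basic "drill until mastered" loop those expert systems use is trivial to sketch. The concept, questions, and mastery threshold below are all made up, just to show the pattern:

```python
# Minimal sketch of a text-based "expert system" drill loop:
# present a concept, then cycle through questions until the student
# gets enough consecutive answers right. All data here is hypothetical.

CONCEPT = "A byte is 8 bits."
QUESTIONS = [
    ("How many bits are in a byte?", "8"),
    ("How many bits are in two bytes?", "16"),
]
MASTERY = 2  # consecutive correct answers required

def drill(answers):
    """Feed a scripted list of answers; return True once mastery is reached."""
    print(CONCEPT)
    streak, i = 0, 0
    for attempt in answers:
        question, expected = QUESTIONS[i % len(QUESTIONS)]
        print(question)
        if attempt.strip() == expected:
            streak += 1
        else:
            streak = 0  # a miss resets the streak, so the drill continues
        i += 1
        if streak >= MASTERY:
            return True
    return False
```

For example, `drill(["8", "16"])` reaches mastery, while `drill(["7", "8"])` doesn't. The whole "engaging game" part is exactly what's missing from this skeleton - which is the point.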
As paper is becoming more & more obsolete, it would be nice to see textbook publishers think beyond e-books & online CMSs and get into game development. Then maybe we'd see some truly immersive, interactive games that can help students learn.
Failure in the PC marketspace doesn't mean that having great hardware gets you nowhere. It simply means that the Mac wasn't better than a Wintel PC by a large enough margin for people to consider switching.
Exactly! And the same was true of the iPod in the beginning... There just wasn't enough of a compelling reason to really separate it from the pack; at least until the iTunes Music Store - this was the true innovation of the iPod, something that nobody else had been able to pull off (legally).
You're right that what makes them successful is the ecosystem. But within that ecosystem, it's the end-to-end integration focused on the user experience that makes them successful, not the consumer lockdown.
The way I see it, the "consumer lockdown" is responsible for the successful user experience. This is nothing new for Apple, of course. They've been doing business this way for a long time, providing locked-down hardware with really tightly integrated software - and, in my opinion, a whole lot less bloatware.
Why do I say that?
Because when the iPod came out of nowhere and took over the market, there were three parts: The Mac (desktop), iTunes (the app), and the iPod (device).
Not exactly the way I remember it... iTunes was introduced for Mac OS 9, then the iPod came out later that year. Originally, the iPod only worked on Macs; it wasn't until the second-generation iPod that Windows support was added (through third-party software). Even then, the iPod was far from taking over the market. That didn't happen until iTunes 4 (with the iTunes Music Store) was released for Windows. This was when the rules of the game changed, and the iPod + iTunes Music Store created an entirely new market - an all-in-one ecosystem where, for the first time, buying, building and playing back your music library became a seamless experience.
What did it compete against? The Rio and the Creative Nomad. The Rio was over parallel port and limited to around 32MB. The Nomad devices were mostly USB1 and limited to around 256MB.
When I first tried iTunes, I used it on OS 9 with a portable CD player that could play MP3s - a great little gadget that allowed me to hold hundreds of songs on one CD. I later bought a Rio that did integrate nicely with iTunes on OS 9 (it even had a custom icon in iTunes when you plugged it in). My Rio used USB 1 to connect. The slow transfer rate wasn't very noticeable because the Rio's memory modules were so small. I basically used it like a cassette player, keeping different songs on different memory modules. Still, the Rio memory was expensive and I eventually got rid of it in favor of some off-brand MP3 player that used standard flash memory and connected with USB 1.1. Although iTunes didn't recognize it, it wasn't too hard to drag & drop MP3s into it. This was the last standalone MP3 player that I bought. To tell the truth, out of all of them, I liked the CD player the best - sure it was a little bulkier, but I could use those CDs in just about any CD player or computer - even in my car.
The Rio had a great user interface and it was integrated with iTunes. Although the iPod promised "1,000 songs in your pocket", I didn't like the fact that you couldn't upgrade the memory or change out the battery. I also avoided the FireWire interface - which was pretty much exclusive to Macs at that time.
Thanks for the fun chat. I've enjoyed the trip down memory lane.
Yes, seriously. Little to do with hardware... Don't get me wrong, I'm not arguing that Apple hasn't led the industry with innovative hardware (they always have and continue to), but where did all the cool hardware get them before the iPod? At best, only some marginal share of the consumer PC market.
There've been plenty of cool devices that have come & gone over the years (some were even Apple products). So, it's certainly not the hardware alone that makes a great product, or makes a product great.
No, the success of the iPod & later iOS devices wasn't the hardware (not that having great hardware didn't help). It was Apple's ability to control the whole ecosystem: hardware, software *and services*. Love it or hate it, it's the success of the "walled garden" approach that's given these products such a lead on the competitors. Moreover, this strategy has been able to either create the entire market for these products, or at least redefine it to the point where everyone else is forced to play catch up.
I think there may be a misunderstanding here. If we're talking about open standards, we're not necessarily talking about training people to use different software, just different standards. You don't have to use Writer, just make sure that your people are saving their Word documents in ODT, XML, HTML or RTF.
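Part of why those open formats are a safe bet: an ODT file is just a zip archive of plain XML, so the content stays recoverable with nothing but standard tooling. A rough sketch of the idea - this builds a toy single-entry archive rather than a real ODT (a real one has a mimetype entry, styles.xml, etc.), just to show that ordinary zip + XML tools are enough to get your text back:

```python
import io
import xml.etree.ElementTree as ET
import zipfile

# Build a toy ODT-like container in memory: a zip holding one XML entry.
# The element names and text here are made up for illustration.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("content.xml", "<doc><p>Quarterly report text</p></doc>")

# Years later, any zip + XML tooling can still recover the document text,
# with no particular vendor's software (or paid upgrade) required.
with zipfile.ZipFile(buf) as z:
    xml_data = z.read("content.xml")
text = ET.fromstring(xml_data).findtext("p")
print(text)  # -> Quarterly report text
```

That recoverability is exactly what a proprietary, undocumented binary format can't promise.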
I completely agree with your reasoning here. The original article claims that the BSA is getting all bent out of shape because of a new procurement policy note that recommends using Open Standards when "purchasing software, ICT infrastructure, ICT security and other ICT goods and services." For the most part, it seems that the note was well received & makes a lot of sense. However, the BSA seems to take issue with the part that states: "Government defines “open standards” as standards which... [among other items listed] have intellectual property made irrevocably available on a royalty free basis... "
Maybe I'm having some difficulty parsing the meaning of this, but I interpret it as "any IP that you create using said product should remain viewable/usable in the future without having to pay additional money for another upgrade/version of said product." Maybe I'm wrong (I'm American and speak a different dialect of English)... Or maybe the BSA thinks that requiring customers to pay for upgrades in order to access their IP is important to their constituents...
The aim of science is to seek the simplest explanations of complex facts. Seek simplicity and distrust it. -- Whitehead.