"...The only bad news? It runs XP."
OK, don't get me wrong, I'm all for a good old fashioned bashing against the almighty iSteve with my Ballmer signature series chair thrower, but c'mon, seriously? Do we have to consider every damn application that runs XP a bad thing?
Seems the "damned" OS has managed to survive in the corporate world years past Vista (we're STILL ordering brand-new systems with XP), and Netbooks have seen their own resurgence of support for the aging yet stable and predictable OS.
I run a MacBook for school. What do I have loaded on Fusion? Yup, you guessed it. XP, for when I MUST run a Windows app. Students come marching in every year with new Vista- or OSX-loaded laptops, yet my entire computer lab is still running...yup, right again. Good ol' XP. Old, yet functional.
And rounding out this volley back to the subject at hand: for some simple devices (like a microscope) I would rather NOT have to worry about the bullshit bloat of some other OS, especially when you consider your target audience is USED to seeing XP.
Ok. I'll bite. I am a scientific instrument engineer, and I have worked in government and university labs. When you have an expensive instrument like an NMR or mass spec priced in excess of $100K, you expect a long service life. In fact, the more you use an instrument, the more valuable it becomes, as you gain trust in its capabilities through multiple calibrations. When the instrument is controlled through an OS like XP, you limit the lifetime of the instrument. You also have manufacturers unwilling to provide support on untested variations of the OS (installing service packs, etc.). A solution I have seen is to put instruments running ancient OSes (I have seen Win98 and SunOS 4.0 in current use) on a subnet that is NATed and firewalled off from the rest of the network.
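For what it's worth, that isolation setup can be sketched on a Linux gateway box with iptables. This is a minimal config fragment, not the poster's actual setup: the interface names, subnet, server address, and allowed port are all assumptions for illustration.

```shell
#!/bin/sh
# Hypothetical sketch: isolate legacy instrument PCs (Win98/XP/SunOS)
# on their own subnet behind a Linux gateway. All names and addresses
# below are assumptions, not taken from the original comment.

LAB_IF=eth0               # interface facing the lab/campus network
INST_IF=eth1              # interface facing the legacy instrument subnet
INST_NET=192.168.50.0/24  # assumed legacy instrument subnet

# NAT (masquerade) outbound traffic from the instrument subnet so the
# old machines never appear directly on the lab network
iptables -t nat -A POSTROUTING -s "$INST_NET" -o "$LAB_IF" -j MASQUERADE

# Default-deny forwarding, then punch only the holes the instruments need
iptables -P FORWARD DROP

# Example: let instruments push data to one collection server and nothing else
iptables -A FORWARD -i "$INST_IF" -o "$LAB_IF" -s "$INST_NET" \
         -d 10.0.0.5 -p tcp --dport 445 -j ACCEPT

# Allow return traffic for connections the instruments initiated
iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT
```

The key idea is the default-deny FORWARD policy: the unpatchable machines can reach only what you explicitly allow, and nothing on the wider network can initiate a connection to them.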
"From there to here, from here to there, funny things are everywhere." -- Dr. Seuss