Comment Re:Already they're misrepresenting the past. (Score 1) 54
The machine he is running on (an IBM Portable PC) *only* came with an amber monitor. Parts of the machine have been changed, but that part is very authentic.
Doubt it. There were a handful of decent games provided on cartridge from Imagic and some games that took advantage of PCjr-specific video or sound, like King's Quest and MS Flight Simulator. That level of interest does not indicate an entire industry was hoodwinked.
Spinnaker's educational games for pre-schoolers were terrible, and they deserved to go out of business on their own merits.
Mike
Regards, Mike (yes, the one that owns the page referenced in the summary)
No, that is only what they were guaranteed.
I don't remember hearing about low yield problems. Sony took delivery of quite a few chips.
No, to be quite blunt, a big part of the problem was a lack of vision. Without a roadmap nobody was going to use the product. IBM stumbled when it did not back up the roadmap with real dollars to fund the new chips and programming tools.
Disclaimer: I used to teach Cell programming classes for people who were looking to do HPC on the blades.
Cell failed. But the reasons behind the failure are more interesting.
The obvious answer is that it was hard to program. On a single chip you had the PowerPC processor and 8 SPUs. Communication was through mailboxes for small messages and DMA transfers for larger messages. To get the most out of a chip you had to juggle all 9 processor elements at the same time, try to vectorize all of your ops, and keep the memory moving while you were doing computation. That is the recipe for success for most architectures - keeping everything as utilized as possible. But it is also hard to do on most architectures, and the embedded nature of Cell made it that much more difficult.
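That juggling act, keeping the memory moving while you compute, is the double-buffering pattern. Here is a minimal portable C sketch of it; on a real SPU the memcpy calls below would be asynchronous mfc_get DMA transfers (from spu_mfcio.h) into the 256KB local store, so the prefetch of the next chunk would genuinely overlap the loop over the current one. memcpy is just a stand-in so the shape of the code is visible.

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

#define CHUNK 32  /* elements per "DMA" transfer; tiny, for illustration only */

/* Double-buffered streaming sum. On Cell, the SPU would kick off a DMA
 * for the next chunk, then compute on the current chunk while that
 * transfer is in flight. Here memcpy stands in for the DMA engine. */
static long sum_double_buffered(const int *src, size_t n)
{
    int buf[2][CHUNK];          /* two local buffers: compute on one, fill the other */
    long total = 0;
    size_t done = 0;
    int cur = 0;

    size_t first = n < CHUNK ? n : CHUNK;
    memcpy(buf[cur], src, first * sizeof *src);      /* "DMA in" chunk 0 */

    while (done < n) {
        size_t len = (n - done) < CHUNK ? (n - done) : CHUNK;
        size_t next_off = done + len;
        int nxt = cur ^ 1;

        if (next_off < n) {                          /* prefetch the next chunk */
            size_t nlen = (n - next_off) < CHUNK ? (n - next_off) : CHUNK;
            memcpy(buf[nxt], src + next_off, nlen * sizeof *src);
        }

        for (size_t i = 0; i < len; i++)             /* compute on the current chunk */
            total += buf[cur][i];

        done = next_off;
        cur = nxt;                                   /* swap the buffers */
    }
    return total;
}
```

On the SPU you would also vectorize the inner loop with the SIMD intrinsics and wait on DMA tag completion before swapping buffers; this sketch omits both to keep the structure of the pattern clear.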
There were better software tools in the works for people who didn't want to drop down to the SPU intrinsic level to program. There were better chips in the works too; more SPUs, stronger PowerPC cores, and better communications with main memory. Those things did not come to fruition because IBM was looking to cut expenses to keep profits high (instead of boosting revenue). The Cell project was killed when a new VP known for cost cutting came in. We finally had a good Cell blade to sell (QS22 - two chips, 32GB RAM, fully pipelined double precision, etc.) and that lasted four months before the project got whacked. And we lost a lot of good people as a result. (That VP, Bob Moffat, was part of the Galleon insider trading scandal.)
So yes, Cell failed. But not necessarily for the obvious reasons. IBM has been on a great cost cutting binge the past few years - it lets them meet their earnings per share targets. But it causes collateral damage.
DOS and FreeDOS are still relevant in some niche areas:
- Turn-key and embedded hardware often use DOS
- Retro-computing: Some of us like dragging out our old hardware to play with it
- Learning to code closer to the metal; DOS gives you enough services to get you going, while giving you a feel for embedded programming
FreeDOS runs on almost everything from an original IBM PC (1981) to a virtual machine under VMware and VirtualBox. People (hobbyists) are continuing to work on the utilities to keep it refreshed. For example, in the last year there was a new set of TCP/IP programs added, a utility for sharing folders with a VMware host, and a new web browser based on Dillo.
It's not for everyone, but if you are curious check it out - it's pretty painless to run in a VM. (Or you can drag out your XT or Pentium 90 for the full effect.)