Sorry, just to clarify
PC industry == kind of open
I can't honestly say that things are where I'd like them... we still have patent issues and even the EISA bus had to be licensed... chips are usually very closed... so we are a far cry from a truly "Open PC".
I'm not suggesting that these companies do anything for any other reason than profit. And I'm not saying that Apple stuff isn't better than PC stuff.
But Apple still stands out as a poster child for proprietary hardware and software, even over M$ and IBM, which is an accomplishment.
Well, you are not "straight" yet. The poster said that in 1986 "most home computers probably still had built-in chips" and talked about "IBM clones that only did text." I corrected him on that (by 1987 you had the Mac II and VGA).
Yes, Macs were closed systems. Are you disputing this? That's nuts. Microchannel? Really? I get that you might be one of these rabid Apple fanbois I've heard about, but that doesn't make me an IBM apologist. What does MCA have to do with this discussion? Ease up. I've owned an Apple II, TI 99/4a, Trash 80, PCs, Mac, a VAX in the basement at one point, and even a Mac Mini... computers are computers, and corporations do what corporations do...
It is true that IBM never intended the PC to be "open" - that was an accident. The MCA was an attempt to go back. At first the VGA may have been MCA-only (way to miss the point) but CGA and EGA were very popular open standards and the VGA standard became the most popular of all. The ISA bus was a mistake as far as IBM was concerned. But by this point in time, the "industry" (the "I" in ISA and EISA) had gotten together to clone all the BIOS and create open standards. Apple did not (and doesn't) have open standards, and I remember leaning towards PCs because of that. Everybody knows this, and many people feel like this hurt Apple in the late 80s and 90s, regardless of their "superior video cards". Some people (me) think that while Apple may have done some very cool things lately, this very old (and common) attitude ("it's just like 1986 again") will hurt them, and even if it doesn't hurt Apple it will hurt consumers and the industry. And so far, we've only been talking about hardware. Don't even get me started on software development.
Oh, and the point?
Open >> Closed
PC industry != Microsoft or IBM
Facts, facts, and more facts.
In this way, all "faults" are hardware faults. Now some "faults" - such as the "Double Fault" - can cause the OS to hang... Windows, for example, will blue screen on a double fault. What is the "cause" of the double fault? Well, it's either a software error (a kernel bug) or something physically wrong with the machine, such as bad memory. But I think by "fault" you mean something like the "blue screen of death." So are you saying "most blue screens are caused by physical hardware problems" or "most blue screens are caused by microprocessor faults"? Because "most faults are hardware faults" is at best trivially true, at worst flat-out wrong, because ALL faults are hardware faults by definition, i.e., they are exceptions raised by the microprocessor. I would suggest something like "while most kernel failures are caused by hardware faults (by definition), and a correct kernel could theoretically prevent 100% of failures caused by kernel bugs, there will always be kernel failures due to hardware issues that you cannot prevent with formal review." But I guess that's so obvious nobody would bother posting it. On the other hand, nobody will call "bullshit" either.
Those who can, do; those who can't, simulate.