Comment Re:FIXED THAT FOR YOU (Score 1) 126
Frak that. Forget about PowerPC. Let's take it way back.
IBM should have chosen the 68000 for the PC.
How would starting the 32-bit age in 1981 sound to you?
While the 68K had a 24-bit address bus and a 16-bit data bus, all of its internal registers were 32-bit, aside from the CCR. That meant any code that wasn't braindead would run just fine when the 68020 was released with full 32-bit capability.
If the cost was too high (and that's utter BS right there; we're talking about an $80 part in a $5,000 heap of shit), Motorola later released the 68008, which was more in line with the 8088 that IBM did select: a 20-bit address bus and an 8-bit data bus. Still 32-bit inside*.
NB: The Macintosh was an example of that braindead code; the original ROM was NOT 32-bit clean. Motorola explicitly warned developers in the 68K literature that the upper 8 bits of the address registers would be connected to address lines in the future. Commodore and Atari (er, I think) listened. Apple did not. Neither did some game developers.
640k of segmented memory can bite my shiny metal ass.
(* = raving intel fanbois often point out that the original 68000 lacked a 32x32 multiply, and that the ALU really only processed 16 bits at a time in most cases. That's largely irrelevant: the 68000 wasn't matched against the 80386, it was matched against the 8086/8088. Also, the 16x16 multiply produced a 32-bit result, from which you can build a 32x32=32-bit answer, which is all that 99% of high-level languages can handle anyways. The '020 addressed this with a 32x32=64-bit multiply, still before the 386 hit the market. The ALU issue mattered even less, since it was utterly invisible even to low-level programmers, and the '020's release in 1984 fixed that too.)
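To spell out the footnote's point: here's a sketch, in C rather than 68K assembly, of how a compiler would synthesize a 32x32=32-bit multiply out of 16x16=32 hardware multiplies like the 68000's MULU. The high-half product shifts entirely out of the low 32 bits, so three multiplies suffice.

```c
#include <stdint.h>

/* 32x32 -> 32-bit multiply from 16x16 -> 32 partial products.
 * With a = ah*2^16 + al and b = bh*2^16 + bl:
 *   a*b = ah*bh*2^32 + (al*bh + ah*bl)*2^16 + al*bl
 * The ah*bh term vanishes mod 2^32, leaving three multiplies. */
static uint32_t mul32(uint32_t a, uint32_t b) {
    uint32_t al = a & 0xFFFF, ah = a >> 16;
    uint32_t bl = b & 0xFFFF, bh = b >> 16;
    uint32_t cross = (al * bh + ah * bl) << 16; /* wraps mod 2^32, fine */
    return al * bl + cross;
}
```

The result is bit-for-bit what a native 32-bit multiply truncated to 32 bits gives you, which is exactly what C's `unsigned` multiplication (and most high-level languages of the era) asked for anyways.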