"identically specced" Only for very liberal interpretations for "identically specced". The problem is that when you actually try to build one identically-specced, in some cases, you'd find you spend more money on a PC than a Mac. There are specs you may not care about: small form factor, workstation processors, etc which may drive the price down. However ignoring them means you don't have an identically specced machine.
Take for example the cost of the video cards in the Mac Pro. It is actually cheaper to buy Mac Pro upgrades than discrete cards. The D300 is roughly equivalent to the FirePro W7000 (~$750), the D500 to the W8000 (~$1,250), and the D700 to the W9000 (~$3,200); prices are from Newegg. Upgrading from D300 to D500 costs $400 from Apple, and upgrading to D700s costs $1,000. If you had two discrete W7000 cards, the upgrade price to dual W8000s would be $800, and going from dual W7000 to dual W9000 would be $5,500.
Now you may say you don't need workstation-level cards, but that's the problem with your argument. Using a consumer-level card would be cheaper; however, the Mac Pro is not designed for consumers. It's designed for professionals.
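To make the comparison concrete, here's a back-of-the-envelope sketch using the approximate single-card Newegg prices cited above. The "buy two new cards outright" model is an assumption of mine (it ignores resale of the old pair), so the exact deltas differ from the upgrade figures quoted, but the gap versus Apple's upgrade pricing is the point:

```python
# Approximate 2014-era Newegg prices for single FirePro cards, as cited above.
newegg_single = {"W7000": 750, "W8000": 1250, "W9000": 3200}

# Apple's upgrade prices from the base dual-D300 configuration, as cited above.
apple_upgrade_from_d300 = {"D500": 400, "D700": 1000}

for tier, pc_card in [("D500", "W8000"), ("D700", "W9000")]:
    pc_pair = 2 * newegg_single[pc_card]   # two new discrete cards, no resale
    mac = apple_upgrade_from_d300[tier]
    print(f"dual {pc_card}: ${pc_pair} outright vs Apple {tier} upgrade: ${mac}")
```

Even before factoring in selling the old cards, a pair of W8000s ($2,500) or W9000s ($6,400) costs several times what Apple charges for the equivalent upgrade.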
- 10.0 "Cheetah": $0
I don't think there was a price as it was the first OS X to be installed on new machines.
- 10.1 "Puma": $129
- 10.2 "Jaguar": $129
- 10.3 "Panther": $129
- 10.4 "Tiger": $129
- 10.5 "Leopard": $129
- 10.6 "Snow Leopard": $29
- 10.7 "Lion": $29
- 10.8 "Mountain Lion": $19
- 10.9 "Mavericks": $0
- 10.10 "Yosemite": $0
- Division Gross Margin (% of revenue)
- Devices and Consumer Licensing: 93.8%
- Computing and Gaming Hardware: 1.25%
- Phone Hardware: 2.72%
- Devices and Consumer Other: 23.72%
- Commercial Licensing: 91.75%
- Commercial Other: 30.54%
However, when it comes to hardware, MS barely makes any profit.
More importantly, the size and scale of the conspiracy would have to be massive. Tens of thousands of people who worked on the program would have to be fooled. How many hundreds of thousands more would it take to keep those ten thousand fooled or silent? Neil deGrasse Tyson put it best: only three people knew about Bill Clinton's sex scandal, yet it got out. It would also have cost the US government more money to create a hoax than to actually go to the moon. And they would have done such a piss-poor job of it anyway.
They also tend to fixate on a few things that seem to be smoking guns but, when examined in detail, are not as definitive as they seem. For example, photos from NASA show shadows that are not parallel. According to conspiracy theorists, this must be because more than one light source was present; ergo, it was staged. This does not take into account that the surface of the moon is not flat. Mythbusters verified this.
Another is that the crosshairs (+) in the photos sometimes appear behind the subject instead of in front. Supposedly this can only be because the crosshairs were added to the photos later rather than captured by the "moon" camera. Anyone with sufficient photography expertise knows this can be caused by overexposure, and overexposure was necessary in some photos to get a decent image. And the list goes on.
You should really read the paper and not just the press release. This line in the press release hides a dirty little secret:
I have and there is no secret. The press release does a good job of summarizing the results.
Of the over 10,000 scientists contacted and the over 3,000 that replied they narrowed down the "climatologists who are active in research" to 79 individuals. The 97% figure represents just 77 people out of those 79.
Now that is a gross mischaracterization of the data. Over 10,000 scientists from many disciplines were contacted, and 3,146 responded. Two questions were asked, with 90% and 82% answering "Yes" respectively (2,831 and 2,579 respondents). The list of 3,146 was then narrowed to scientists who were actively publishing and had more than 50% of their papers in climate science. That eliminated most of the respondents, leaving the 79 who are basically the experts in the field.
Even if you discard the 97% number, the 90% and 82% are hard to ignore.
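The percentages above follow directly from the respondent counts the survey reported. A quick sketch of the arithmetic (using only the figures already cited here):

```python
# Survey figures as cited above.
responded = 3146
yes_q1, yes_q2 = 2831, 2579        # "Yes" answers to the two questions
experts, experts_yes = 79, 77      # actively publishing climate specialists

print(f"Q1, all respondents: {yes_q1 / responded:.0%}")   # ~90%
print(f"Q2, all respondents: {yes_q2 / responded:.0%}")   # ~82%
print(f"Expert subset:       {experts_yes / experts:.0%}") # ~97%
```

So the 97% headline comes from the narrow expert subset, while the 90% and 82% figures cover all 3,146 respondents.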
I'm amazed that anyone would answer no to either, particularly a "climatologist active in research".
Yet two experts did. There are biology professors, like Michael Behe, who argue for Intelligent Design over evolution based on very little evidence. Thankfully they are in the minority.
Macs didn't "make USB", they forced it on their users while giving a big "fuck you" to all of their old customers running anything else. It's not like the old stuff was horrible either (ADB, SCSI).
The move to USB was a practical matter: one interface for low-bandwidth connections (USB) and one for high-bandwidth ones (FireWire). It was more about future-proofing than legacy. It also brought down costs, since the same peripherals could be used on both Mac and PC as long as the drivers were there.
In the meantime, USB was everywhere on PCs. It just wasn't forced down everyone's throats. Even recent systems with USB3 quietly included will still include interfaces from the dark ages.
It also wasn't well supported until long after Apple made the change. Oh, it was there, but adoption was poor and drivers were non-existent or buggy. Even Windows didn't have proper USB support until Windows 98. As for the dark ages: yes, you can still get motherboards with PS/2, but I haven't used that port in a decade or more, just as I haven't used a serial or parallel port. I don't have a need.
At the time, PowerPC chips were more powerful than x86 in terms of raw computing power. I believe the G5 Mac was technically classified as a supercomputer under an old FLOPS-based standard and could not be exported until the US government updated the definitions.
The reason for the switchover to x86 had more to do with power efficiency, customization, and logistics. While the PowerPC architecture did lend itself to better overall computing performance, it lagged in power efficiency and ran hot. For a desktop that's not a major problem, but it is for laptops. It's a problem IBM never really solved: they never released a mobile G5, and Apple was stuck with mobile G4s until the Intel switchover. This is one area where Intel was way ahead.
The two other related issues have to do with Apple's needs and IBM's and Motorola's manufacturing logistics. Apple, despite ordering millions of chips a year, was always going to be a small customer in terms of volume. Yet Apple needed a heavily customized consumer PowerPC chip that had to be updated almost every year, while most other PowerPC customers wanted the server/workstation chips IBM used in its own products. These things can be done, but they cost time and money. I can see why Motorola and IBM (and also Apple) would be reluctant to invest in new chips.
On the flip side, the Xbox 360's Xenon processor is more the model of what IBM/Motorola wanted. Although it was heavily customized, the basic design went unchanged for the 8 years until the Xbox One launched, with estimated sales of 40+ million units. That gave IBM enough time to do a die shrink to cut costs.
The change to Intel gave Apple many advantages. First, faster and more efficient mobile processors were available. Second, most of the features Apple wanted were already in the x86 designs, since they were built for consumer PCs. Third, any customization Apple requested, Intel could later sell to competitors like Dell. For example, the first MacBook Airs used customized Intel Core processors whose chip package had been shrunk 40%. Intel didn't mind investing the money in this customization because it could immediately sell the result to other customers; many of the features from that collaboration became part of the Ultrabook specification.