I remember playing around with my dad's 10MB MFM drive on the Wang PC-clone (80386 at 20MHz and a Turbo button that would take it to 25MHz) that he borrowed from work in '85 when I was 7 (he worked at Wang as a computer imaging scientist and engineer). It was a half-height drive (which, for those who don't know, means it only took up a single 5.25" slot) and could store oh-so-much more than I could throw at it at the time.
He also brought home an 85MB MFM full-height drive (two 5" bays) for me to play with to see if I could get it to work with that same computer. After struggling for a week he brought me a DIP and said "here, try swapping this with the one that's installed" (it was an experimental PROM BIOS chip, though I didn't realize it at the time). Worked fine after that.
Not quite old enough to remember the FM drives. The IDE drives were a godsend; the MFM controllers' ISA expansion cards were massive (>12" long?) and a pain to deal with (all those jumpers -shudder-).
By the way, anyone care to make a guess how big my Windows partition is?
Bigger than your penis?
My 20GB Windows partition is on an 80GB Western Digital drive, so it should be possible to somehow figure out the length it takes up. By length, I mean the longest straight line that can be placed against the physical area taken up on the platter(s).
Assuming the 20GB tracks are on the outside of the 3.5" drive, the longest such line would be *at most* the platter diameter - call it 3.5"? I'm not certain you would necessarily want to advertise that....
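For fun, that longest-line question can be estimated with a little geometry. A minimal sketch, using illustrative numbers none of the posts above confirm: a ~95 mm platter, an innermost data radius of ~15 mm, uniform areal density, and the partition sitting on the outermost tracks. If the line has to stay over the data area itself (rather than crossing the spindle hole), the answer is the chord tangent to the partition's inner edge, which comes out well short of 3.5":

```python
import math

# All numbers below are illustrative assumptions, not specs of the actual drive.
R = 95.0 / 2      # outer data radius (mm), assuming a ~95 mm platter
r_hub = 15.0      # innermost data radius (mm), assumed
frac = 20 / 80    # fraction of capacity in the Windows partition

# Assume uniform areal density and the partition on the outermost tracks:
total_area = math.pi * (R**2 - r_hub**2)
part_area = frac * total_area
r_inner = math.sqrt(R**2 - part_area / math.pi)   # inner edge of the partition

# Longest straight line inside that annulus: a chord tangent to the inner edge.
chord = 2 * math.sqrt(R**2 - r_inner**2)
print(f"{chord:.1f} mm, about {chord / 25.4:.1f} in")  # ~45.1 mm, about 1.8 in
```

So under these (made-up) assumptions the partition's longest line is only about 1.8" - which may or may not be a better number to advertise.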
What you say is true for power-users, but an average user has neither the requisite understanding, nor desire, nor availability to do the manual labor necessary.
My wife doesn't know the first thing about auto-updates beyond asking me "hey, I'm getting this pop-up in the bottom-right corner of my screen, do you know why?" And I just don't have the time to do it on her laptop regularly. I don't auto-update everything on her laptop; periodically I'll update her software by hand (about every three or four months, like the 4 hours I spent on it yesterday), but for some things auto-updating is a necessity in order to get the bona fide security patches she actually needs in a timely manner.
The problem with having everyone use only a single version is that while known problems would get patched, unknown problems would bite the ENTIRE network and take it down again all at once. Diversity has its downsides, but a modest amount of it is a good way to prevent that.
Midshipmen majoring in Computer Science at the US Naval Academy (my major and alma mater, class of '00) are indeed cognizant of Admiral Hopper, though I don't think anything specifically teaches her contributions. Part of this (and here I start to hypothesize) is relative age - ADM Hopper's contributions, though extremely important and noteworthy, are relatively recent in comparison to the rest of what goes on at USNA - the goal is, after all, to produce highly technically-trained graduates who drive ships, not ones who go on to academic careers. Much of the infrastructure and heritage stems from the people and events of the Revolutionary War (aka the "War for American Independence") through World War II, heavily favoring the mid- to late-1800s. Operational topics before and after that era (and during it, to give meaning and context to the heritage) are taught in classroom settings. But though ADM Hopper's contributions to the field of computer science are important, at best it's the contributions that are taught (not the name), and definitely not in an operational context (she spent her entire career as a reservist and was rarely operational).
Several other comments talk about a pair of particles being created out of nothing, one gets absorbed and the other flies away. This is basically right, but can be confusing (the one that gets absorbed has negative energy in order to conserve energy). Here's an easier mental model....
Stephen Hawking came up with an idea a while ago (in the '70s). He was thinking about black holes whose event horizon was around the size of an atom. Then he put them up against the Heisenberg Uncertainty Principle. He realized that particles in these black holes would have such a high degree of certainty about their position that there would be a correspondingly low certainty about their velocity. Therefore, some of them would be REALLY fast. Not fast enough to escape the pull of the black hole, but fast enough to get just above the event horizon. There, they could give off a high-energy photon and fall back in. This photon, since it was emitted outside the event horizon, would actually escape. This radiation has never been directly detected, but it's what causes the process known as evaporation. http://en.wikipedia.org/wiki/Hawking_radiation#Black_hole_evaporation
Ironically, this means that smaller black holes (which pin down a particle's position more tightly) evaporate faster. Large-ish black holes absorb more energy from the cosmic microwave background radiation than they emit in Hawking radiation, but if they have small enough mass (I believe roughly less than the mass of our Moon), they emit more Hawking radiation than they receive from the cosmic background.
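That mass threshold can be sanity-checked with a back-of-the-envelope calculation: a black hole emits more than it absorbs once its Hawking temperature exceeds the CMB temperature. Using the standard formula T = ħc³/(8πGMk_B) and solving for the mass where T equals 2.725 K (the "Moon mass" comparison is the claim being checked, not an input):

```python
import math

# Physical constants (SI units)
hbar = 1.0546e-34   # reduced Planck constant
c    = 2.998e8      # speed of light
G    = 6.674e-11    # gravitational constant
kB   = 1.381e-23    # Boltzmann constant

T_cmb  = 2.725      # K, cosmic microwave background temperature
M_moon = 7.35e22    # kg, mass of the Moon

# Hawking temperature: T = hbar*c^3 / (8*pi*G*M*kB).  Smaller M means hotter,
# which is why small black holes evaporate faster.  Solve for M at T = T_cmb:
M_crit = hbar * c**3 / (8 * math.pi * G * kB * T_cmb)
print(M_crit, M_crit / M_moon)  # ~4.5e22 kg, about 0.6 Moon masses
```

So the break-even mass comes out a bit lighter than the Moon, which squares with the "roughly the mass of our Moon" figure above.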
Q: How many IBM CPU's does it take to execute a job? A: Four; three to hold it down, and one to rip its head off.