Yes, you used to be able to do everything you described in 256MB of RAM. But to attribute the biggest increases in web browser memory usage to programmer laziness is to ignore a drastic change in the way we (and by we, I mean the general internet-using public) use web browsers. It's no longer enough to display static web pages. Web applications are mainstream, and JavaScript and Flash are practically inescapable.
I was curious, so I just checked the memory usage of a web browser (Firefox 3) and an office app (Word 2007). Total memory usage, with four tabs open to fairly intensive sites (Slashdot, Ars Technica, Gmail, Facebook) and a 10-page document open in Word? 150MB. I do almost all of my web browsing and general computing on a computer with a 1.8GHz Celeron processor and 1GB of RAM. The P4 system you described should be doing just fine.
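If anyone wants to repeat the check, here's a minimal sketch in Python using the third-party psutil package (the process names 'firefox' and 'winword' are my guesses; check your own process list). Summing RSS across processes overcounts shared pages a bit, so treat the result as a rough upper bound.

    # Rough per-application memory check: sum resident memory (RSS) across
    # every process whose name matches. Needs: pip install psutil
    import psutil

    def total_rss_mb(name_fragment):
        """Total resident set size, in MB, of processes matching name_fragment."""
        total = 0
        for proc in psutil.process_iter(['name', 'memory_info']):
            name = proc.info['name'] or ''
            mem = proc.info['memory_info']  # None if access is denied
            if mem and name_fragment.lower() in name.lower():
                total += mem.rss
        return total / (1024 * 1024)

    if __name__ == '__main__':
        # 'firefox' / 'winword' are guesses at process names; adjust as needed.
        for app in ('firefox', 'winword'):
            print(f"{app}: {total_rss_mb(app):.0f} MB")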
Even small schools almost always have different math courses based on skill level (I went to a tiny high school, and we certainly did). You misunderstand the issue.
Unless you have classes with only 2-3 students of equal ability, you're going to have this problem. Even in advanced classes some people learn faster than others, and those who learn faster are almost always forced to sit through lectures and do work that is, for them, pointless.
The prevailing attitude in US education is that people who learn slowly are most helped by being in the same classes as those who learn quickly. This isn't wrong, but it does mean that those who learn quickly are slowed down to help others keep up.
This isn't a problem unique to math education though--it's an issue for almost everything. Unsurprisingly, things like art classes and music classes are least susceptible to this problem. The people who excel can do so, and the people who don't are still able to learn from those who do.
Maybe, but part of the goal is also to not fry the motherboard with static.
Which I had a friend do when he vacuumed his computer out.
3 and a half minutes to boot XP!?
My old computer (6-7 years old now, I think) used to boot XP in about 30s. And it wasn't a very expensive computer.
It was already overpriced, had too little storage and awkward, annoying controls, and played far too few music formats.
So now, instead of remedying any of the above, Apple's gone and made it so I can't even use the good, expensive earphones I already have with it? *And* made the controls worse?
Yeah, this would be nice.
Fedora 10 detects multiple monitors perfectly, but Ubuntu steadfastly refuses to recognize any display not built into my laptop, which makes it completely unusable for me.
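In case it helps anyone stuck the same way, here's a rough workaround sketch, assuming X11 with the xrandr tool installed. The output names ('VGA-0', 'LVDS') are placeholders that vary by driver, so run xrandr -q first and substitute whatever your system reports.

    # Ask X11 which outputs it actually sees, then force an external one on.
    # Assumes the xrandr command-line tool; output names vary by driver.
    import subprocess

    def connected_outputs():
        """Names of the outputs that `xrandr -q` reports as connected."""
        out = subprocess.run(['xrandr', '-q'],
                             capture_output=True, text=True).stdout
        return [line.split()[0] for line in out.splitlines()
                if ' connected' in line]  # leading space skips 'disconnected'

    def enable(output, relative_to='LVDS'):
        """Turn `output` on at its preferred mode, to the right of the panel."""
        subprocess.run(['xrandr', '--output', output, '--auto',
                        '--right-of', relative_to], check=True)

    if __name__ == '__main__':
        print('X sees:', connected_outputs())
        # enable('VGA-0')  # uncomment with a real name from the list above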
As someone who used to use Ruckus occasionally, I can say it really was pretty terrible, for a lot of reasons:
-very little music from independent artists. I couldn't find 3/4 of what I wanted on there. (Although I can't find a third or so of what I listen to on Amazon either, so your mileage may have varied.)
-absolutely horrific client software that only worked on Windows (because the DRM was available only there). This was a big deal when 60-70% of your campus was running OS X.
-WMAs don't play on iPods, which are far and away the most popular MP3 players.
-you had to pay extra to put the songs on an MP3 player that *did* support the DRM'd WMA format (it was something like $5 a semester, but still)
-the catalog metadata was a mess: tracks and albums were frequently mislabeled, and albums were often missing songs. (Whoever applied the explicit tag also apparently decided it was fun to mark about a third of the purely instrumental music 'explicit', which was really quite obnoxious.)
I had one friend who still used it, I think. She's sorry to see it go, but I don't know of anybody else who is.
So, to summarize: I'm just about as close to the opposite of an Apple fanboy as one can get, but when I saw that article summary I just nodded my head in agreement.
"I think Michael is like litmus paper - he's always trying to learn." -- Elizabeth Taylor, absurd non-sequitir about Michael Jackson