I used to reinstall XP every year or two to get back to a fresh copy, but I've run the same install of 7 from 2009 until... well, the present, though that computer is now my secondary machine instead of my main one. That includes installing games and, at one point, switching from an Nvidia to an ATI video card. It runs as well as ever. This problem has basically been solved... SSDs and huge amounts of RAM help.
I just wish people would stop linking to Daily Currant articles. Their version of "satire" is posting articles that aren't funny but ARE plausible, just to incite a reaction. It's not like The Onion, where the humor is usually right in the headline.
I wouldn't have a problem with it if they were skilled writers and I cracked up laughing while reading the article. Except it's crap like "Sarah Palin: 'Eat Less Chinese Food' to Reduce Trade Deficit" or "Hillary: I'm running!"
I wouldn't be surprised if Sarah Palin said something like that, and I wouldn't be surprised if Hillary Clinton were running for president. Except there's no humor in fabricating plausible stories.
While it's true that cameras with large sensors tend to have shallower depth of field, it's actually a side effect of needing to use longer focal length lenses to get the same field of view. You might need 70mm on a 35mm camera to frame a subject for a portrait, but only 12mm on a point and shoot to frame the same subject. A longer focal length means a bigger physical lens aperture at the same f-stop, and that's what decreases depth of field.
For example, a 35mm f/2 lens on a full frame camera will have the same depth of field as a 35mm f/2 lens on a 2/3" CCD point and shoot, but 35mm on the full frame camera is going to be a standard angle of view, while 35mm on the point and shoot is going to be considerably telephoto.
People generally don't use the same range of focal lengths on full frame cameras as they do on tiny-sensor point and shoots (or cell phones), and that's why it seems like it's easier to achieve shallow depth of field with a bigger imager.
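If it helps to see the arithmetic, here's a minimal Python sketch (the focal lengths are just the ones from the portrait example above; the entrance pupil diameter is simply focal length divided by f-number):

```python
# At the same f-stop, a longer focal length means a physically larger
# aperture, and that larger aperture is what shrinks depth of field.

def aperture_diameter_mm(focal_length_mm: float, f_number: float) -> float:
    """Entrance pupil diameter: focal length divided by f-number."""
    return focal_length_mm / f_number

# Same subject framing, same f/2, different sensor sizes:
for label, focal_mm in [("full frame, 70mm", 70.0), ("point and shoot, 12mm", 12.0)]:
    print(f"{label} at f/2 -> {aperture_diameter_mm(focal_mm, 2.0):.1f} mm pupil")

# full frame, 70mm at f/2 -> 35.0 mm pupil
# point and shoot, 12mm at f/2 -> 6.0 mm pupil
```

A 35mm pupil versus a 6mm pupil is the whole difference: the f-stop is identical, but the physical aperture on the full frame setup is almost six times wider.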
Did the same myself. I even said I know they can't and won't reply for legal reasons, but that I hope the words reach his eyes. Shame, shame, shame.
Or $227: http://www.amazon.com/Crucial-...
The problem is that anyone can get a loan, even people who definitely have no prospect of paying it back. With guaranteed loan money, schools can charge whatever they want and you'll just have to take out a bigger loan. And of course 18-year-olds fresh out of high school don't understand the power of compound interest; they just know that they "have" to go to college to get a good job, and that they'll get a better job if they go to a fancy private school.
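To make the compound interest point concrete, here's a tiny Python sketch with made-up numbers (the $30,000 principal and 6.8% rate are assumptions for illustration, not figures from anywhere in particular):

```python
# Hypothetical student loan accruing interest, unpaid, during four years
# of school. Principal and rate are assumed, purely illustrative.

principal = 30_000.00   # assumed loan amount
annual_rate = 0.068     # assumed interest rate
years = 4

balance = principal * (1 + annual_rate) ** years
print(f"Balance at graduation: ${balance:,.2f}")              # ~$39,030.69
print(f"Interest accrued:      ${balance - principal:,.2f}")  # ~$9,030.69
```

Nine thousand dollars of interest before the first paycheck, and that's before the repayment period even starts. That's the part an 18-year-old doesn't see coming.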
While you can't get a bachelor's from our local community college, it only costs $2,500 a year in tuition and you're earning credits that can transfer to any state school. Why can a community college offer actual college classes for that little, while a 4 year school can charge $10,000, $20,000 or more for largely the same education? It's just insane.
The 4MB RAM upgrade I put in my 386 in 1992 only cost $200 at the time... so RAM prices were discontinuously high in 1995 if that's the case. My ~$1300 Compaq Presario 7180 came with 8MB of RAM in November 1995, and that included a 1.2GB hard drive and a P100 processor. I doubt the RAM was the majority of the price of that system.
Do I think you're making it up? No. Do I think you might have been looking at some weirdly expensive memory? Probably.
Oh yeah, also, my 16MB upgrade cost about $150 the year after, in 1996... so if it was down to a dollar a megabyte by then, I certainly got ripped off.
My own personal recollection of jumpy RAM prices is that I paid about $20 for 256MB of RAM in October 2001 and $100 for 256MB a mere few months later. The prices definitely shot up after a shortage. However, I remember keeping my eye on advertised RAM prices back in '95/'96, and $100 a megabyte sounds like way too much while $1 a megabyte sounds like way too little. It had to fluctuate around $10-$50 a megabyte, but I doubt it was ever as high as $100 or as low as $1 in the time span you said.
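Just to put my own data points side by side, here's the per-megabyte arithmetic in Python (nothing assumed beyond the prices I already listed):

```python
# Dollars per megabyte for the purchases mentioned above.

purchases = [
    ("1992: 4MB 386 upgrade",  200.00,   4),
    ("1996: 16MB upgrade",     150.00,  16),
    ("Oct 2001: 256MB stick",   20.00, 256),
]

for label, price_usd, megabytes in purchases:
    print(f"{label}: ${price_usd / megabytes:.2f}/MB")

# 1992: 4MB 386 upgrade: $50.00/MB
# 1996: 16MB upgrade: $9.38/MB
# Oct 2001: 256MB stick: $0.08/MB
```

Going from $50/MB in 1992 to about $9/MB in 1996 is a steady decline; a spike to $100/MB in 1995 would have to be a huge outlier against that trend.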
Obviously, it's a problem when "Season Pass" doesn't actually get you the whole season. If I hadn't RTFA'd, I might have presumed the guy was complaining either that he didn't get access to all 16 episodes, including the ones that hadn't even aired yet (that would be absurd), or that he didn't get access to the first 8 plus whichever of the later episodes had already aired (not absurd, but I wouldn't be on his side).
If Apple's intention was that buying a season pass to season 5 of Breaking Bad would get you the first 8 episodes now and the last 8 episodes when they were released to DVD/Blu-ray/download, it would just be a matter of patience and I'd still be on Apple's side on this one.
Except, from the sounds of it, Apple was selling a season pass to "Season 5" and not listing it as "the first 8 episodes of season 5." They had no intention of ever giving him access to the last 8 episodes of season 5 for that price, making it not really a season pass. Clearly this is a problem, and the guy just wants his money back over misleading advertising. If I were him, I'd be OK with a gift card in the amount of the price of the first 8 episodes, since the second 8 will presumably be priced the same anyway, effectively getting me what was advertised: the whole season for one price.
Here's the real benefit I see to 3840x2160 (or 3840x2400). Whatever; I'll call it 4K like everybody else does.
The real benefit is that you can start treating your monitor like a CRT again, feeding it arbitrary resolutions. First off, 1080p would work fine on a 3840x2160 panel, and with any luck the monitor would just display it pixel-doubled, so it wouldn't be any blurrier than a native 1080p monitor. That would be awesome. You can also run 1280x720 natively, as 3840x2160 is triple that, just like it's double 1080p.

But here's the real kicker: say you have some old game that tops out at 1280x1024 or something. You'll have to accept black bars on the sides for games that aren't widescreen, but given that, you can upscale 1280x1024 to 2700x2160 or thereabouts. It'll still look good because there are so many excess pixels, more than double in each dimension.

Back when we were switching from CRTs to 15" and 17" LCDs (or maybe a 19" if you were lucky), we had the problem that 800x600 looked like junk on a 1024x768 monitor and 1024x768 looked like junk on a 1280x1024 one. At 3840x2160, we can display 1080p and 720p with literally no artifacts, and anything in between with minimal artifacts. In fact, the dot pitch of a 3840x2160 24" monitor is smaller than that of a typical 21" fine dot pitch aperture grille CRT: 3840x2160 at that size works out to only about 0.14mm.
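For anyone who wants to check the scaling and dot pitch math, here's a small Python sketch (the resolution list and the 24" panel size are just the examples from above):

```python
# Which classic resolutions fit into 3840x2160 as exact integer multiples,
# plus the upscale factor for a pillarboxed 1280x1024 game and the dot
# pitch of a 24" 16:9 panel at this resolution.

import math

NATIVE_W, NATIVE_H = 3840, 2160

def scale_info(w, h):
    # Largest uniform scale that still fits inside the native panel.
    factor = min(NATIVE_W / w, NATIVE_H / h)
    scaled = (round(w * factor), round(h * factor))
    return factor, scaled, factor == int(factor)

for w, h in [(1920, 1080), (1280, 720), (1280, 1024), (800, 600)]:
    factor, scaled, exact = scale_info(w, h)
    kind = "integer, pixel-perfect" if exact else "fractional"
    print(f"{w}x{h} -> {scaled[0]}x{scaled[1]} at {factor:.2f}x ({kind})")

# 1920x1080 -> 3840x2160 at 2.00x (integer, pixel-perfect)
# 1280x720 -> 3840x2160 at 3.00x (integer, pixel-perfect)
# 1280x1024 -> 2700x2160 at 2.11x (fractional)
# 800x600 -> 2880x2160 at 3.60x (fractional)

# Dot pitch of a 24" 16:9 panel at 3840x2160:
width_in = 24 * 16 / math.hypot(16, 9)
print(f"dot pitch: {width_in * 25.4 / NATIVE_W:.3f} mm")  # ~0.138 mm
```

The pixel-perfect cases are the whole argument: 1080p and 720p land exactly on the grid, and everything else gets more than two panel pixels per source pixel, so the scaling artifacts stay small.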