A wardriver would know how to spoof a MAC. A MAC whitelist would stop your basic neighborhood internet mooch, but simply setting a WPA/WPA2 password prevents the same thing and is much easier to maintain.
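To show how trivial spoofing is, here's a minimal sketch (my own illustration, not from the original post) that generates a random "locally administered" MAC of the kind spoofing tools hand out; on Linux you'd then apply it with something like `ip link set dev <iface> address <mac>` (interface name is whatever your system uses).

```python
import random

def random_spoofed_mac():
    """Generate a random locally administered, unicast MAC address.

    Setting bit 1 of the first octet marks the address as locally
    administered (not burned in by a vendor); clearing bit 0 keeps
    it unicast. Spoofing tools such as macchanger do essentially
    this before applying the address to the interface.
    """
    first = (random.randint(0, 255) | 0x02) & 0xFE
    rest = [random.randint(0, 255) for _ in range(5)]
    return ":".join(f"{b:02x}" for b in [first] + rest)

print(random_spoofed_mac())  # e.g. 06:3a:9f:12:c4:7d
```

Once the AP sees a whitelisted MAC (sniffed from the air in cleartext), the whitelist is defeated — which is why the encryption-based defense is the one worth maintaining.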
This is great if you already have one, but DDR2 was most popular during the Core 2 era. If you don't already have an LGA775 DDR3 motherboard, procuring a quality used one will cost you practically as much as a new board.
The caveat to sticking with the Socket 775 platform is DDR2 memory, which usually goes for twice as much as comparable DDR3. With 2GB being the maximum practical size for a DDR2 DIMM, many boards are limited to 4-8GB total.
Some might entertain the notion of going with an AMD AM3+ board. Going from a low-end dual-core Intel solution to an AMD quad-core solution with 8GB of RAM for around $150-$175 is a nice performance boost. You could put that money towards a Q6600 and some more RAM, but then you have effectively maxed out your system, and the next time you upgrade you will have to rip everything out anyway. If you wanted to jump to Intel's new lineup, you would be spending $150-$175 on the CPU alone to see a performance increase.
One could argue that it lowers the perceived cost of living by allowing employers to pay non-livable wages, so the prices of products are lower at the cash register.
I think the part about him faking the loss of this fortune and leaving the USA to avoid a wrongful death lawsuit also suggests that he is not terribly trustworthy.
Is there anything more recent? Six years is a long time ago for graphics cards.
Which has nothing to do with the point the person you're replying to was making.
He is in Australia; something could be said about exchange rates and cost of living...
This is a great quote from the wiki:
"Fulton first used instrumented dummies as he prepared for a live pickup. He next used a pig, as pigs have nervous systems close to humans. Lifted off the ground, the pig began to spin as it flew through the air at 125 mph (200 km/h). It arrived on board uninjured but in a disoriented state. Once it recovered, it attacked the crew."
I don't know about that...
Me: What are the names of team rocket?
Cleverbot: Sufian Stevens and Elvis Prestly.
It didn't just stop in the 1950s.
Not all disks in the Google study were highly utilized 24/7. Arguably it might be better to turn a hard drive on and leave it on than to park and re-initialize the heads every day. A controlled data center environment is more likely to be beneficial to a hard drive than sitting on or under someone's desk getting knocked around or collecting dust.
I am not doubting that SSDs are still experimental and have failures, but the notion that HDDs are way more reliable is overblown. Seagate has released many crap firmware updates, or drives shipped with bad firmware, that tank performance or brick the drive. Hitachi (previously IBM) was known for the "DeathStar" drive. Some manufacturers try to tell you to only run your drive 6-8 hours a day. Warranties are also shrinking.
Jeff Atwood is awesome for creating Stack Overflow, but I am not taking him as the end-all, be-all SSD guru. Again, I could look through Newegg reviews and give you 40 anecdotal cases of DOA drives or drives that just died. You could probably do the same for SSDs. A blog post has a terrible sample size.
If you look at Intel SSDs, you'll actually find them to be about 6-8 times more reliable than HDDs, and they will blow any hard drive or RAID setup out of the water in terms of performance.
If anecdotal evidence on SSDs scares you, perhaps you should re-review Google's hard data on hard disk failures. Certain brands of SSDs are already many times more reliable than hard drives when looking at failure rates over time. Hard drives are no more reliable. You will find plenty of anecdotes in Newegg reviews of people buying X number of hard drives and having Y of them arrive DOA or die within three months.