Comment Re:I skipped Windows 7... (Score 1) 681
I challenge you to build a Mac that can accept a PCIe x16 video card at any price.
I think I've got at least one of everything you mentioned laying around.
If power consumption is a concern, I would try for an early Coppermine P3 processor. These usually have the "E" suffix, such as the 600EB. These processors usually used less than 20W, which was pretty good compared to the P2s, which were more like 45W chips.
If you get your hands on a Dell OptiPlex or XPS from this era, these are generally good, solid machines, but keep in mind that while the motherboards use the ATX power connector, the pinout is not ATX, so don't mix and match or you'll blow something up. HP Vectras aren't bad machines either.
Keep in mind that a lot of these P2/P3 boards won't accept 512MB SDRAM modules (these modules seemed to be primarily for the early socket 478 boards that used SDRAM). You can try, but unless you find documentation that says otherwise, assume the max memory is 256MB times the number of memory slots.
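The rule of thumb above is simple enough to sketch; the 256MB-per-slot figure is the assumption, and a real board's manual always overrides it:

```python
# Rough rule of thumb for P2/P3-era SDRAM boards:
# assume 256MB per slot unless the board's manual says otherwise.
def max_memory_mb(num_slots, mb_per_slot=256):
    return num_slots * mb_per_slot

print(max_memory_mb(3))  # typical 3-slot board -> 768
print(max_memory_mb(4))  # 4-slot board -> 1024
```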
Hard drives are kind of a crapshoot. I've found a lot of drives from the era that have been sitting for the past few years will still start up, run, and seem to work fine for a few days, then will just crap out. Usually just long enough for you to get everything set up.
That was true until BMW decided to Bangle them all up, of course. Now, it's just... ugh.
I'm not so sure about televisions. A lot of people replaced their CRTs with LCDs and plasmas, so most of the people I know have televisions that are less than 10 years old, many of them less than 5. From what I've seen of the build quality of most modern TVs, they'd be lucky to get 10 years out of them if they are used regularly, so I think the era of buying a TV and keeping it for 20-30 years is probably over for most people.
Java is still slow. But unless you're using it for heavy number crunching, you won't notice it on a 2014 computer like you did on a 1998 computer.
The big thing that I thought the K1000 lacked in terms of a student camera was a DOF preview lever. That's something I always found to be valuable, and for a student learning the concepts, being able to see in the camera what changing the aperture does is a good learning tool. The self timer? Nice to have, but probably not as necessary.
I've always liked the Pentax "M" cameras. One of the best viewfinders on any SLR I've ever laid my hands on, even on the more basic ME.
The original Celeron was a Pentium II without the L2 cache (which was a separate chip back then in the Slot 1 package, so it could be easily eliminated). This pretty much got laughed out of the marketplace.
The next Celeron was a different chip, with an onboard 128k L2 cache that ran at the CPU speed, as opposed to the Pentium II's 512k that ran at half speed. For many people, the smaller, faster L2 cache performed just as well as the larger, slower L2 cache on the Pentium II. Furthermore, these were some of the most overclockable chips ever made; most could easily overclock 50% without any special considerations. And as you mentioned, they could be used in a dual-CPU setup. What you probably remember is people buying the Celeron 300A and then bitching when they got one that wouldn't overclock.
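The famous 300A overclock is just bus-speed arithmetic: the multiplier was locked at 4.5x, so the core clock is FSB times multiplier, and raising the FSB from 66MHz to 100MHz gives the 50% bump. A minimal sketch of that math:

```python
# Celeron 300A: locked 4.5x multiplier, core clock = FSB * multiplier.
MULTIPLIER = 4.5

def core_clock_mhz(fsb_mhz, multiplier=MULTIPLIER):
    return fsb_mhz * multiplier

stock = core_clock_mhz(66.6)   # ~300 MHz at the stock 66MHz bus
oc = core_clock_mhz(100)       # 450 MHz at a 100MHz bus
print(round(oc / stock - 1, 2))  # -> 0.5, i.e. a 50% overclock
```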
The next Celeron was basically a P3 with half the L2 cache disabled, and for a long time hobbled with a 66MHz bus speed. They were also sure to disable the ability for them to be used in a dual-CPU setup. These were okay chips, which got better once they bumped the bus to 100MHz, but as far as I know there is no way to turn them back into a P3.
Crucial had a bad run of memory back in the DDR2 era, easily identified by the yellow heat spreaders. Not sure what the problem was, but I knew several people who bought that memory for vastly different computers (some Intel, some AMD, some name-brand, and some white-box), and all had problems. Since then I've avoided Crucial. Corsair is good, and I've never had problems with Kingston either.
I believe that you can get a manual in the Honda CR-Z, though apparently you take a noticeable mileage hit compared to the CVT. Though the most disappointing thing about the CR-Z to me is that it actually gets worse mileage than the CRX from the late '80s.
Also, by buying a new car you can get what you want, as opposed to what you manage to find on the used car lot, which may be important if you have to live with the car for 15 years. With that said, though, nowadays you just don't have the option to order it exactly how you want from the factory like you could 50 years ago, with some high-end exceptions.
It's a bit more like the house has a toaster in it, and the wiring was upgraded to handle the current the toaster draws, which gives you a rough idea of how old the house must be because of how old the toaster is. Except now you discover that the upgraded wiring may have been done to power some other appliance before the toaster showed up.
Maybe I should try a car analogy?
Perhaps to keep it from transmitting and interfering with radio astronomy?
The world population is still growing at just over 200k people per day. That's like 200 Enterprise-Ds' worth of people every day that you would need to transport off the planet. Barring some crazy new technology straight out of science fiction, there's just no way we could move that many people off the planet.
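The back-of-envelope math behind that figure, assuming the Enterprise-D's roughly 1,000-person complement from the show:

```python
# Daily population growth vs. Enterprise-D capacity.
# The ~1,000-person complement is an assumption from the show's figures.
daily_growth = 200_000   # net new people per day (just over 200k)
ship_capacity = 1_000    # approximate Enterprise-D complement

ships_per_day = daily_growth // ship_capacity
print(ships_per_day)  # -> 200 ship-loads, every single day
```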
I have wondered, with those server boards that take an i3 and support ECC, whether the error checking still works with the consumer processors.
Since the memory controller is part of the CPU, you can't just drop in a regular consumer processor and get ECC this way. You're stuck with whatever models Intel decides to turn on the ECC bit for, which is pretty much the Xeons and a few oddball embedded versions.
My experience is that it's decent enough hardware, but you pay through the nose for it. On the other hand, their software is the worst bloatware I've ever seen, which basically installs a whole interdependent ecosystem of NI drivers and services on top of Windows, with many of the drivers, libraries, and services doing little more than duplicating the functionality that's already there.
On the other hand, it is pretty simple to get started. It's likely you could connect that device up to a computer, fire up LabVIEW, drop in a few VIs and drag some wires, and have it plotting data and turning LEDs on and off in a few minutes, which in an education environment may be the way to go. In many ways, it's a lot like doing things in Excel - you can whip up something to solve a problem quickly and easily, but it may not be a good solution for building a complex application.
I have hardly ever known a mathematician who was capable of reasoning. -- Plato