Learning how to use older/simpler machines is an excellent way to learn a number of fundamental concepts. Modern computing, for all its advances, still operates on the same fundamental principles as it did fifty years ago; it has simply become orders of magnitude more complex.
I don't understand. Soldiers don't train with halberds, swords and crossbows. Officers don't train with cavalry formations, trebuchets and castles. Engineers don't start off recreating Stephenson's Rocket or the Wright Flyer. Cooks don't start off rubbing sticks together to light a bonfire and roast mammoth meat. Architects do not stick to cathedrals until they "get their chops".
In fact, science is about the only profession that does anything remotely similar for training, and even then, if old experiments are recreated, the setup has as many non-crucial elements as possible replaced with modern equivalents. We didn't bother using 19th-century batteries for the Ohm's Law experiment, because it didn't matter where the current came from. We didn't bother sucking up chemicals with our mouths through Pasteur pipettes for cholesterol extraction when we had automatic pipettors. We didn't bother using Bunsen burners instead of hot plates for chemistry.
Low-level programming for ICs and the like aside, I don't see what about high-level programming is so difficult to teach with, say, Eclipse. Admittedly, linking up the libraries can seem confusing at first, since a default project comes with a bunch of them already (and generated code for the GUI), but it's not like it won't let you start a basic command-line app project, as the sketch below shows.
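For what it's worth, here's roughly what that first command-line project looks like, a minimal sketch in plain Java (the class name is just an example, nothing Eclipse-specific is assumed):

    // A complete, runnable first program: create a plain Java project,
    // add this class, hit Run, and one line appears in the console.
    public class HelloWorld {
        public static void main(String[] args) {
            System.out.println("Hello, world!");
        }
    }

No linker flags, no legacy toolchain, no GUI scaffolding. A student can type that in on day one and see output immediately.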
This class would definitely teach a lot of history, but just because you love reminiscing about the old days doesn't mean you should inflict it on everyone. Not only does it make the student's job unnecessarily difficult, but it also takes away the motivation that comes from going home and being able to immediately apply what you learned in class on your own computer.