I remember the first time I saw Unix, in 1976. The first step in installing it was to compile the C compiler (supplied, IIRC, in PDP-11 assembler), then compile the kernel, and then the shell and all the utilities. You had the option of whether to put the man pages online, since they took up a significant (in those days) amount of disk space. Make had not yet been released by AT&T, so all of this was done either by typing at the command line or (once the shell was running) from shell scripts.
Fascinating. Moderators: mod this post up. It's knowledgeable, balanced, multi-viewpoint science. Something we don't see a lot of here.
Interesting that in the '70s a "computer" exhibit was an analogue computer. Sounds like it was an evolution of the AT&T "VODER" system at the 1939 World's Fair: a simulation of the human vocal tract, it had four controls run by trained operators (all cute young girls, given the sensibilities of the time) who used their hands and feet to "speak" to visitors. In the '50s and early '60s, computations by analog computers were cheaper, although generally less accurate. Keep in mind that computation then meant solving differential equations, something that amplifiers, capacitors, and inductors do naturally. Also, round-off error was poorly understood and bits were expensive. By the early '70s the price of digital circuitry was coming down fast, and digital was clearly the computer of the future. Analogue components have to be consistent over their whole range to be usable in mass-produced hardware; digital components just need to switch consistently.
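To make the "differential equations come naturally" point concrete, here is a minimal digital sketch of what an analog computer did with op-amp integrators: solving a damped oscillator by wiring integrators in a feedback loop. The equation, constants, and function name are illustrative, not from any actual exhibit; the stepping scheme is a simple (semi-implicit) Euler stand-in for continuous integration.

```python
def simulate(k=1.0, c=0.2, x0=1.0, v0=0.0, dt=0.001, steps=20000):
    """Digitally integrate x'' = -k*x - c*x', mimicking an analog
    computer's chain of two integrators plus a summing amplifier."""
    x, v = x0, v0
    for _ in range(steps):
        a = -k * x - c * v   # summing amplifier: forms the acceleration
        v += a * dt          # first integrator: acceleration -> velocity
        x += v * dt          # second integrator: velocity -> position
    return x

if __name__ == "__main__":
    # After 20 simulated seconds the damping term has shrunk the swing well
    # below its starting amplitude of 1.0.
    print(simulate())
```

On an actual analog machine each of those three lines was a physical circuit running in real time, which is why, for this class of problem, it was cheap; the digital version instead pays per time step in arithmetic.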
This was a different winged dinosaur from the one on the PBS show.