It's funny.  Laugh.

How Computers Work... in 1971

prostoalex writes "A recent submission to my free tech books site included a title that I thought many Slashdotters would enjoy. How It Works: The Computer (published in 1971 and republished in 1979) is an exciting look into this new thing called the computer. The site presents the scanned pages of the 1971 and 1979 editions, and you can see how the page on computer code changed over eight years, from punch cards exclusively to magnetic tapes."
  • by Anonymous Coward on Friday November 12, 2004 @09:19AM (#10797002)
    I recently met someone I hadn't seen in twenty years. He used to be a programmer where I worked and now he's teaching at a college.

    He told me that his students call him 'the old fart' and accuse him of being antiquated. I told him that the solution was to prefix anything he said with the word 'embedded'. All of the stuff that he used to do on mini-computers in the seventies is exactly what we are doing on chips today. In fact, some chips have exactly the same architecture as the minis that he used to program. Plus ça change ...
  • Wonder how much (Score:3, Interesting)

    by gargonia ( 798684 ) on Friday November 12, 2004 @09:23AM (#10797024)
    I wonder what the cost of one of these babies was? I have an old magazine ad for the "all new 386!" with a CGA monitor, 16MB of RAM, and a 40MB hard drive for several thousand dollars. It always cracks me up to think about what fools we'll feel like in the future for paying top dollar for the latest and greatest hardware now.

    "Yeah, I remember paying almost a thousand dollars for just a ONE TERABYTE hard drive!"

  • by Average_Joe_Sixpack ( 534373 ) on Friday November 12, 2004 @09:26AM (#10797033)
    The first programmers [witi.com] were women who worked on the ENIAC during WWII.
  • Always ENIAC (Score:4, Interesting)

    by Draoi ( 99421 ) * <draiocht&mac,com> on Friday November 12, 2004 @09:26AM (#10797040)
    "1943 saw the need for computing artillery firing charts, and ENIAC [...] was born. [...] And so the modern electronic computer came into being."

    I guess we now know different, with Atanasoff/Turing/Flowers. We were always taught that ENIAC was first when I did my studies back in the early '80s ....

  • by suso ( 153703 ) on Friday November 12, 2004 @09:27AM (#10797045) Journal
    Actually, my parents have the original book from 1971. It was part of a series.
  • by Alioth ( 221270 ) <no@spam> on Friday November 12, 2004 @09:32AM (#10797066) Journal
    ...is that this book goes into quite some detail (like the method of magnetic polarity changes on a tape). Now you might not think that particularly remarkable - but the book was published by Ladybird - i.e. it was a children's book published in Britain, aimed at children between 8 and 10 years old!

    I remember Ladybird books from my childhood - starting with "Magnets, Bulbs and Batteries." That book advised that to test a battery, you stick the terminals on your tongue (but it admonished you never to do it with a large battery). Just imagine trying to publish that advice now :-) I still test 9v PP9 batteries on my tongue!
  • by Draoi ( 99421 ) * <draiocht&mac,com> on Friday November 12, 2004 @09:50AM (#10797169)
    Not to take away from what those women achieved, but Grace Hopper [about.com] was programming computers two years before ENIAC came along. Indeed, she had a major hand in producing the COBOL language.
  • Computer magic (Score:3, Interesting)

    by Sai Babu ( 827212 ) on Friday November 12, 2004 @10:11AM (#10797296) Homepage

    Born in the early '50s, with a hand in electronics since messing with old radios in my grandfather's chicken coop at age 4, I've never felt any 'magic' associated with computers. Adders, registers, programs 'written' in wire on a card were all easy to understand. I messed with early RTL ICs in high school and have played with computer hardware ever since.

    However, while computers are grand tools, they've never seemed 'magical'. Not like radio. Radio has always held a much greater fascination. I attribute this to the deterministic nature of the computer as opposed to the 'fishing' aspect of radio. With radio, you never really know if it is going to do what you ask it to. A computer does exactly what you ask it to.

    Yet I see this aura of magic in the eyes of others when they work with computers. Where does it come from? The humorous answer is that their computers don't seem to behave in a deterministic way (spare me the Mr. Softie humor). But many postings on /., from people who appear to know how their computers work, reek of this sense of magic. What gives? Does one have to be born with a plutonium atom in the center of one's brain, or am I misreading an enormous appreciation for the power of the tool as a fascination with some quality that I fail to perceive?

    These comments apply to digital electronic computers. I can't help but see some magic in wetware (mouse brains flying airplanes).

  • On a similar note: (Score:5, Interesting)

    by ledow ( 319597 ) on Friday November 12, 2004 @10:17AM (#10797339) Homepage
    On a similar note, I can remember the series that was published by Marshall Cavendish called INPUT. This was a fantastic bi-weekly serial magazine published in the early 80's that focused almost exclusively on programming for the early micros.

    I owned about six or seven issues, and it was the best explanation of programming I've seen. It also contained loads of example programs for about six different home machines, so that no matter what machine you had, you could use the same program as everyone else. The learning curve was perfect when I was a kid, and it isn't patronising now that I'm an adult re-reading them. Those issues almost single-handedly started my love of computing (along with the ZX Spectrum).

    My brother found the entire first volume at a boot sale some years back and I read through them all again, despite knowing several languages by then (the books primarily focused on BASIC and assembly for the relevant micros, Z80 or 6502 etc.).

    Recently, I purchased the missing volumes off eBay and they are fantastic. I only wish I still had the enthusiasm to actually sit and type out my programs any more. One text adventure had about 10 pages full of encrypted hexadecimal that you had to type in by hand, perfectly, for it to work (see the sketch after this comment)! I don't miss those days...

    Reading back through them, like this book, the parts that were generic to computing - i.e. hardware, peripherals, storage etc. - were very quickly outdated. However, the computing and programming principles still stand strong, and many's the time that my understanding of binary, assembly and the deeper workings of the computer has helped me.

    But it's still amazing how quickly something can go from being state-of-the-art to back-of-the-cupboard.
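
    Those hex type-ins worked by converting each printed pair of hex digits back into one byte of machine code, and many magazines printed a per-line checksum so that a single mistyped digit was caught before anything ran. Here is a minimal sketch of the idea in Python; the line format and the mod-256 checksum are hypothetical, not INPUT's actual scheme:

    # Decode one line of a hex type-in listing and verify its checksum.
    # Hypothetical format: a load address, the data bytes, then a
    # checksum equal to the sum of the data bytes modulo 256.
    def decode_line(line):
        fields = line.split()
        addr = int(fields[0], 16)                       # load address
        data = bytes(int(f, 16) for f in fields[1:-1])  # machine code
        checksum = int(fields[-1], 16)
        if sum(data) % 256 != checksum:
            raise ValueError(f"typo in line at address {addr:04X}")
        return addr, data

    # A line of Z80-flavoured bytes; the final 46 is the checksum.
    addr, data = decode_line("4000 3E 05 C6 02 32 00 40 C9 46")
    print(f"{len(data)} bytes OK at {addr:04X}")

    With ten pages of bytes to enter, a check like this was the difference between a playable game and a mysterious crash.
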
  • trivia (Score:3, Interesting)

    by museumpeace ( 735109 ) on Friday November 12, 2004 @10:36AM (#10797481) Journal
    Isn't the picture on pg 8 of the 1971 edition actually an IBM 360? I operated one as a student and this sure looks like a 360 without the power supply cabinet or tape drives. That would not have been considered a small system even in the early 70's. Looks like a 1403 line printer with it too.

    Having signaled that I am ancient, it may not surprise a few of you when I note that the quaint and amusing quality of the book in the article is misleading if you take it as history. The development of computing is both a technical and a human story of considerable depth, and much more interesting reading is available.

    Anybody who actually finds this stuff interesting need not confess. Just quietly make your way to the library and look up Paul Ceruzzi's A History of Modern Computing [MIT Press], which gets all the facts and personalities straight, as well as properly labeling the pictures. If you are in a hurry to waste time, there are tons of documents online on the history of computing, for instance this page of links [iit.edu] from an IIT prof.
  • by peter303 ( 12292 ) on Friday November 12, 2004 @10:46AM (#10797550)
    My school got a teletype connection to some nearby college computer, probably a DEC machine. We were allowed to write BASIC programs. At the same time, John Conway's "Game of Life" was all the rage in Scientific American. So I coded it up in BASIC. It took about a minute to print each generation in asterisks and blanks.
    A few years later I implemented the algorithm in bipolar circuits for a digital electronics lab at the university. The display was blobs on an oscilloscope. I recall it did several hundred generations a second. CRT computer terminals didn't really become widespread until shortly after that, in 1975. They required the price of a half kilobyte of ROM to fall to $100 (thanks to that upstart Intel). Type font patterns were stored in ROM; a 5x7 character set required 320 bytes of ROM (64 characters at 5 bytes per character).
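
    For anyone who never saw it, the Life loop itself is tiny. Here is a minimal sketch in Python that prints each generation in asterisks and blanks, much as the teletype did; the grid size and the glider seed are illustrative, not taken from the original BASIC program:

    # Conway's Game of Life on a fixed, non-wrapping grid.
    def step(grid):
        rows, cols = len(grid), len(grid[0])
        new = [[0] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                # Count live cells among the eight neighbours.
                n = sum(grid[rr][cc]
                        for rr in range(max(0, r - 1), min(rows, r + 2))
                        for cc in range(max(0, c - 1), min(cols, c + 2))
                        if (rr, cc) != (r, c))
                # Birth with exactly 3 neighbours; survival with 2 or 3.
                new[r][c] = 1 if n == 3 or (grid[r][c] == 1 and n == 2) else 0
        return new

    grid = [[0] * 20 for _ in range(10)]
    for r, c in [(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)]:  # a glider
        grid[r][c] = 1

    for generation in range(5):
        for row in grid:
            print(''.join('*' if cell else ' ' for cell in row))
        print('-' * 20)
        grid = step(grid)

    Run as-is, it prints five generations of a glider drifting across the grid; a modern machine does the computation instantly, so the teletype's printing speed was the bottleneck.
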
  • Re:Computer magic (Score:2, Interesting)

    by cathouse ( 602815 ) * on Friday November 12, 2004 @10:48AM (#10797572)
    Pre-dating you by a decade, I remember very clearly the two semesters when I took consecutive 5-unit courses in programming *Unit Record Machines* by plugging [really] thousands of wires into holes in program boards as large as 1 x 1.5 meters. Each and every move of each and every character required plugging a wire from one hole to another. And any branching operation took twice as many, plus running the wires for the conditional, etc. No, there ain't no magic.
  • by Draoi ( 99421 ) * <draiocht&mac,com> on Friday November 12, 2004 @10:53AM (#10797611)
    One word: Ada

    Mmm. While Ada was cool and described how Babbage's Analytical Engine could be programmed, she never actually programmed a computer [st-and.ac.uk].

  • by jc42 ( 318812 ) on Friday November 12, 2004 @10:57AM (#10797636) Homepage Journal
    Actually, some of the first computers were women.

    My wife likes to tell people that one of her first job titles, back in the 1970's, was "computer". This was working for a survey department in the New York state government. She did have an electronic computer available as part of the departmental equipment, and the conflict in the terminology led to a change in the job title after a couple of years.

    She got the job partly because she'd done well in math classes in high school and college. While it was true that there was a lot of social pressure on girls to be technically ignorant, there was also a lot of counter-pressure from many parents and teachers, who often didn't agree with the "barefoot and pregnant" approach.

    Of course, we really haven't totally outgrown that attitude yet. Lots of young women would still agree with that Barbie doll who said "Math is hard." Lots of parents and teachers are still working hard to overcome all the pressures on kids (girls and boys) to remain technically ignorant. This social battle will go on for a long time.

  • Another Old Computer (Score:3, Interesting)

    by Mignon ( 34109 ) <satan@programmer.net> on Friday November 12, 2004 @11:00AM (#10797662)
    Apropos old computers, I've had a recent fixation on the Olivetti Video Display Terminal [tudelft.nl], which I saw in a book of Mario Bellini's industrial designs. It's probably just as well it hasn't shown up on eBay lately 'cause I sure don't have the space.
  • by wandazulu ( 265281 ) on Friday November 12, 2004 @11:10AM (#10797733)
    I don't think I ever read this book (born in 1970), but flipping through the pages, it makes me realize what computers still mean to my folks: batch cards, mag tapes, green-n-white printouts.

    Therein lies the rub; to my folks, any computer that can fit in a single box and doesn't live in a raised-floor room is a toy. It's actually very black and white for them... "yes it's all very nice what those toys can do for the movies, but it takes a *computer* to process GE's payroll."

    It also reminds me of when a friend of mine brought his dad in to work to show him what he did. His dad was a serious old-school programmer of custom chips for Navy jets. He looked it, too: checkered shirt, crew cut, pocket protector (first time I'd ever seen one). My friend showed him the *cough* Powerbuilder app we'd been working on, with its buttons and datawindows, etc., and his dad just went *pft* and waved his hand.

    The fact that I can run emulators of any of those systems and they run 10x faster has never made a dent in my folks' opinion. As far as they can see, and as far as my friend's dad can see, we're just playing with toys.

    Anyone else had that happen?
  • by Ronin Developer ( 67677 ) on Friday November 12, 2004 @11:15AM (#10797772)
    Funny that this should come up, as the ENIAC was a subject of conversation in our office, with one individual stating that it was the first digital computer. It wasn't (there are arguments about which actually came first, but it was not the ENIAC).

    The ENIAC officially made the history books as becoming fully operational in 1946. For those not knowing their history, this is AFTER WWII (which ended Sept. 2, 1945, with Japan surrendering on the USS Missouri in Tokyo Bay).

    "The ENIAC was placed in operation at the Moore School, component by component, beginning with the cycling unit and an accumulator in June 1944. This was followed in rapid succession by the initiating unit and function tables in September 1945 and the divider and square-root unit in October 1945. Final assembly took place during the fall of 1945." see http://ftp.arl.army.mil/~mike/comphist/eniac-story .html

    The question is whether or not it was completed and actually used for meaningful work during WWII (supposedly, it was used to calculate ballistic trajectory tables). According to the Army's story, it was not. Differential analyzers and calculators were the state of the art during the war. The teams that programmed those devices most certainly were then chosen to program the ENIAC. And this is not to say that programming the ENIAC did not begin prior to its completion.

    RD
  • Re:Student Flashback (Score:2, Interesting)

    by SenatorOrrinHatch ( 741838 ) on Friday November 12, 2004 @11:19AM (#10797803)
    I find it interesting that no student programming textbook I have ever seen states it so clearly: "In fact, computers do not have brains and they cannot really think for themselves." Actually, it seems to me that present-day AI researchers (at least the ones I've read and met) try to reject this idea. Of course, on their side, there is the fact that brains, insofar as they are physical objects, can be simulated by computers.
  • by ballpoint ( 192660 ) on Friday November 12, 2004 @11:56AM (#10798158)
    I did my first real programming in Fortran on punched cards. Nobody could punch 'CONTINUE' faster than I did at that time.

    I still remember the sound of the card reader (fla-bap, flapflapflapflapflap...) and of the line printer. To recognize when my job was done, I inserted a few carefully spaced cards full of '*'s in front of the deck, producing a unique rhythmic sound pattern when printed.

  • Slashdot in History (Score:3, Interesting)

    by LouisvilleDebugger ( 414168 ) on Friday November 12, 2004 @12:46PM (#10798753) Journal
    ca. 3500 BC: Calculi is dying

    ca. 2500 BC: Cuneiform is dying

    1835: Babbage Design: 1. Make a precisely-machined brass gear 2: now do it a million times 3. ??? 4. Profit!

    1837: The Analytical Engine is Dying

    1978: BSD is Dying
  • Re:Always ENIAC (Score:1, Interesting)

    by Anonymous Coward on Friday November 12, 2004 @01:12PM (#10799234)
    Or discover something.

    In math, calculus was either Newton or Leibniz, depending on who you listen to.

    In chemistry, oxygen was discovered by either Scheele, Lavoisier or Priestley, depending on whether you're Swedish, French or English.

    (IIRC Scheele was first but didn't publish, Priestley was first to publish, and Lavoisier was the first to recognize it as an element)
  • by wandazulu ( 265281 ) on Friday November 12, 2004 @01:51PM (#10799790)
    Because these books talk about a time when punch cards were still all the rage, and because my Ask Slashdot article was rejected, I'll ask here:

    Does anyone still use punch cards? I know some states used punch cards for the Nov 2 election, but I'm wondering if there are still decks of cards at companies waiting to be run through and the output printed on green-n-white paper.

    It's not a criticism or a putdown question; I can believe there are some jobs on some equipment that just can't (or won't) be ported to something newer, and "what worked for us back then works for us now."

    Just curious.

There are two ways to write error-free programs; only the third one works.
