It's funny.  Laugh.

How Computers Work... in 1971 (353 comments)

prostoalex writes "A recent submission to my free tech books site included a title that I thought many Slashdotters would enjoy. How It Works: The Computer (published 1971 and re-published 1979) is an exciting look into this new thing called computer. The site presents the scanned pages of 1971 and 1979 editions, and you can see how the page on computer code changes over 8 years from punchcards exclusively to magnetic tapes."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • for everything even remotely related to computation is the intellectual property of SCO.
  • Careful... (Score:5, Funny)

    by ideatrack ( 702667 ) on Friday November 12, 2004 @08:13AM (#10796980)
    you'll have SCO on your ass, you're distributing their code.
  • Sweet (Score:5, Funny)

    by hcob$ ( 766699 ) on Friday November 12, 2004 @08:15AM (#10796985)
    Now let me go get my soldering iron, a trained monkey and a monitor I can get a tan from and we got it made. The monkey is for fetching stuff and "debugging" btw..... (hands monkey a hammer)
  • by Anonymous Coward on Friday November 12, 2004 @08:19AM (#10797002)
    I recently met someone I hadn't seen in twenty years. He used to be a programmer where I worked and now he's teaching at a college.

    He told me that his students call him 'the old fart' and accuse him of being antiquated. I told him that the solution was to prefix anything he said with the word 'embedded'. All of the stuff that he used to do on mini-computers in the seventies is exactly what we are doing on chips today. In fact some chips have exactly the same architecture as the minis that he used to program. Plus ca change ...
    • by Anonymous Coward
      Just what I was thinking. Seriously, if we'd start kids off with a book like this they might grow up with a better understanding that it's just a machine and it can't read your mind.

      This principle is not just for technical works, either. I recently happened upon a copy of George Fischer's "Your Career in Computers" (Meredith Press, 1968[!]). The chapter list reads almost like a modern IT-career tome: "A computer in your life", "For high school grads", "From Wall Street to Main Street", "Opportunities in go
    • by jellomizer ( 103300 ) * on Friday November 12, 2004 @08:37AM (#10797101)
      General principles don't change, and sometimes retro principles come back. While the concept of how binary computers work hasn't changed much, many solutions may have. But it is funny, when a tech company comes up with this "new" method of doing something, how long it takes before "the old fart" goes, "Oh, I have seen this approach before; it was used on such-and-such a system for this. It actually worked well then, but they phased it out because this other method was faster. But now that computers are magnitudes faster, this old method should be great because of its added so-and-so." Actually, if you look at some of the old technology, I am amazed at some of the methods they used to accomplish the tasks, and how today many of them are done a lot easier.
    • by maxwell demon ( 590494 ) on Friday November 12, 2004 @08:53AM (#10797189) Journal
      He told me that his students call him 'the old fart' and accuse him of being antiquated. I told him that the solution was to prefix anything he said with the word 'embedded'.

      And in his next lecture, he taught the following:

      Now, we type this onto embedded punch cards and put it into the embedded mainframe. The embedded program will then run, and finally the embedded printer will print the embedded result.
    • and I would like to add..

      Introduction to Microcomputers
      Adam Osborne

      was written in 1979 and can still teach some things. I wish I knew who I loaned my old copy out to.
  • Is this... (Score:3, Funny)

    by ^Case^ ( 135042 ) on Friday November 12, 2004 @08:19AM (#10797003)
    ...what you call old news?
  • Student Flashback (Score:5, Informative)

    by Burb ( 620144 ) on Friday November 12, 2004 @08:20AM (#10797009)
    When starting an Electronics degree course in 1981 (was it really so long ago, sigh) the lecturers recommended this book as a start point for anyone who had no idea about computers.

    I presume it was the 79 edition they recommended.

    What a lovely nostalgia trip. Thanks!

    • My mother used to buy me lots of Ladybird books when I was very young in the hope I'd read them and learn lots (sorry to make you feel old :), but I think I'd have skipped over this one when I was 5 years old. ;)
      Reading it now shows how valuable these books really were. If only I had had more of an interest! Might have learned something...

  • by lottameez ( 816335 ) on Friday November 12, 2004 @08:20AM (#10797011)
    I think CmdrTaco is showing us the instruction booklet for the /. webserver
  • Cool! (Score:5, Funny)

    by Anonymous Coward on Friday November 12, 2004 @08:21AM (#10797012)
    My first thought when I saw this [brinkster.net] picture was:

    "Honey, what's this magnetic tape labelled 'pr0n'?"
  • Wow (Score:2, Funny)

    by Karpe ( 1147 )
    I really want to have one of those when I grow up. :)
  • Is it me, or were they a little optimistic that there would be just as many women as men working on computers?
    • The first programmers [witi.com] were women who worked on the ENIAC during WWII.
      • by Draoi ( 99421 ) * <{draiocht} {at} {mac.com}> on Friday November 12, 2004 @08:50AM (#10797169)
        Not to take away what those women achieved, but Grace Hopper [about.com] was programming computers two years before ENIAC came along. Indeed, she had a major hand in producing the COBOL language.
      • The military employed rooms of women to compute ballistic tables and the like using mechanical calculators. I recall Richard Feynman mentioning this in one of his autobiographies. I presume this is what led women into programming work on the early electronic calculators and computers.
      • Funny that this should come up, as the ENIAC was a subject of conversation in our office, with one individual stating that it was the first digital computer. It wasn't (there are arguments about which actually came first, but it was not the ENIAC).

        The ENIAC officially made the history books as becoming fully operational in 1946. For those not knowing their history, this is AFTER WWII (which ended Sept. 2, 1945 with Japan surrendering on the USS Missouri in Tokyo Bay).

        "The ENIAC was placed in operation at the M
    • If you include porn sites....
    • Working on computers does not equal "being in IT"

      With regard to people doing office jobs or data entry on computers, women most likely outnumber men.
    • by ljavelin ( 41345 ) on Friday November 12, 2004 @08:30AM (#10797060)
      were they a little optimistic that there would be just as many women as men working on computers?

      No, they were showing reality.

      Most (but not all!) programmers were men - they'd be writing the code.

      But most men weren't expected to type... at least not all that well or fast. So they had special purpose "keypunch operators" - mostly women - who would take the hand-written code (written on "coding sheets") and key it onto punchcards. Accuracy and speed in typing were key.

      In addition, operators would feed cards into the computers, etc etc.

      It wasn't a glamorous or creative job. As "on-line" systems and terminals like the 3270 and VT-100 were deployed, the keypunch operators slowly faded away.

      I'd assume that a few exceptionally interested keypunch operators learned to identify programming and machine errors and found their way into programmer ranks.
    • by Draoi ( 99421 ) * <{draiocht} {at} {mac.com}> on Friday November 12, 2004 @08:32AM (#10797068)
      Is it me, or were they a little optimistic that there would be just as many women as men working on computers?

      Probably, though back in the early days, the first programmers were women. Ada Lovelace has been described as the Founder of Scientific Computing [sdsc.edu], and Grace Hopper also comes to mind. Furthermore, back in the days of cracking Enigma codes, it was teams of women who programmed the bombes [demon.co.uk]. Somewhere along the line, computer programming was co-opted into professional studies as 'engineering' and 'science' and unfortunately, women were actively discouraged from entering those professions. Only now is this changing ...

      • Somewhere along the line, computer programming was co-opted into professional studies as 'engineering' and 'science' and unfortunately, women were actively discouraged from entering those professions. Only now is this changing ...

        I'm sick and tired of hearing this bullcrap. For the past 20-30 years, there's been nothing but active encouragement for women to denounce their traditional gender roles and perform tasks normally associated with men. I'm not saying that isolated instances of discrimination don
        • by Draoi ( 99421 ) *
          For the past 20-30 years, there's been nothing but active encouragement for women to denounce their traditional gender roles

          Not in my country, nor in my experience. When I say 'actively discouraged', I mean it. Been there, done that, saw it happen myself. Many of my contemporaries ( I graduated in '89) tend to concur, BTW.

        • ...Actually your reply is an excellent example of actively discouraging someone... Is it at all conceivable to you that this is one of the forms of discrimination the parent post was talking about?
        • It amazes me that you can have this attitude. Have you ever met women involved in CS or Engineering who can say they haven't felt "actively discouraged" in their field of choice?

          I haven't. Never, not one. There aren't many women in CS, and every single one I have ever met has confessed that she's felt discriminated against, made to feel stupid, been hit on, and just generally treated differently from everyone else.

          I suppose all these women are lying, eh? Just want attention, maybe? Oh, oh, I know, i
        • A man's perspective...

          Back in 1980 I was 'promoted' from office clerk to computer operator for a small manufacturing company, running a Burroughs B1700. I was to take over for a female computer operator who was retiring. I found out several years later that the reason I was given the job was not because they thought I would be better at it, but because I was not as good at my current job as my female counterpart was. Whether or not that was the real reason I don't know, but I do know that I was a better
      • back in the early days, the first programmers were women.

        Actually, some of the first computers were women. NACA, I guess a precursor to NASA, used to have women whose job title was "computer", because they would do calculations for things such as forces and pressures in wind tunnels.

        More about this here:

        http://www.centennialofflight.gov/essay/Evolution_of_Technology/Computers/Tech37.htm [centennialofflight.gov]
        • by jc42 ( 318812 ) on Friday November 12, 2004 @09:57AM (#10797636) Homepage Journal
          Actually, some of the first computers were women.

          My wife likes to tell people that one of her first job titles, back in the 1970's, was "computer". This was working for a survey department in the New York state government. She did have an electronic computer available as part of the departmental equipment, and the conflict in the terminology led to a change in the job title after a couple of years.

          She got the job partly because she'd done well in math classes in high school and college. While it was true that there was a lot of social pressure on girls to be technically ignorant, there was also a lot of counter-pressure from many parents and teachers, who often didn't agree with the "barefoot and pregnant" approach.

          Of course, we really haven't totally outgrown that attitude yet. Lots of young women would still agree with that Barbie doll who said "Math is hard." Lots of parents and teachers are still working hard to overcome all the pressures on kids (girls and boys) to remain technically ignorant. This social battle will go on for a long time.

    • Let's put it another way...what if they said, "Computers will continue to be a male dominated industry because women in general have no interest in hi-tech stuff." It's true, but they would have caught hell for actually saying it.
    • Notice that the mistake was corrected in the 1979 edition :)
    • Is it me, or were they a little optimistic that there would be just as many women as men working on computers?
      Actually, I think they were a little optimistic about the dress code for most of their programmers...
    • There have been plenty of surveys showing that the number of women in IT has been steadily decreasing in recent years. I started my first IT job in 1990 and some 35-40% of my colleagues were women, including four of the five group managers in my section. Admittedly, this was in local government, which may not be a very representative sample.

      I've worked two places since - at my last billet (1999-2000) there were no women. Now, I can take the blame for this as I was doing the recruiting, but I saw maybe 2-3 CVs
  • Wonder how much (Score:3, Interesting)

    by gargonia ( 798684 ) on Friday November 12, 2004 @08:23AM (#10797024)
    I wonder what the cost of one of these babies was. I have an old magazine ad for the "all new 386!" with a CGA monitor, 16MB of RAM and a 40MB hard drive for several thousand dollars. It always cracks me up to think about what fools we'll feel like in the future for paying top dollar for the latest and greatest hardware now.

    "Yeah, I remember paying almost a thousand dollars for just a ONE TERABYTE hard drive!"

    • Bah, that's nothing against an old ZX81 with a whole 1 KB (yes, that's 1024 Bytes) of RAM (you could upgrade to 16 KB by adding a memory pack to the extension slot).
      • Bah, that's nothing against an old ZX81 with a whole 1 KB (yes, that's 1024 Bytes) of RAM (you could upgrade to 16 KB by adding a memory pack to the extension slot).

        Don't forget to mention that this 1kB of memory *of course* included video memory with a worst case requirement of 768 bytes (24x32).

        Thus, the ZX81 detected whether it was running with the stock 1kB or more, and in the 1kB case used some basic video memory "compression" (i.e. a line did not necessarily occupy 32 characters but ended with the last non-s

        • Don't forget to mention that this 1kB of memory *of course* included video memory with a worst case requirement of 768 bytes (24x32)

          What video memory? The ZX81 generates screen output something like this: an interrupt routine eating 75% of the CPU time feeds character data to hardware shift registers, which produce a line of black & white dots on the screen. Repeat (carefully timed) until the screen is done, and then the remaining 25% of CPU time (the vertical blank period) is left for doing useful work until the new TV frame beg
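          The collapsed display file these two posts describe is easy to model. Below is a minimal Python sketch, added purely as an editorial illustration; the terminator value and byte layout are simplified assumptions based on common descriptions of the ZX81, not details taken from the comments.

          ```python
          # Rough model of the ZX81 "collapsed" display file in 1K mode (assumption:
          # each screen line stores only the characters up to its last non-blank one,
          # followed by a terminator byte; real hardware uses its own character codes).
          ROWS, COLS = 24, 32
          TERMINATOR = 0x76  # the NEWLINE/HALT byte that ends each line

          def collapsed_display_file(screen_lines):
              """Build a byte string holding only the used part of each screen line."""
              out = bytearray([TERMINATOR])          # display file starts with a terminator
              for line in screen_lines:
                  trimmed = line[:COLS].rstrip(" ")  # drop trailing blanks on this line
                  out += trimmed.encode("ascii")
                  out.append(TERMINATOR)
              return bytes(out)

          # A mostly empty screen with one short message uses only a handful of bytes.
          screen = [""] * ROWS
          screen[10] = "HELLO"
          df = collapsed_display_file(screen)
          print(len(df), "bytes vs", ROWS * COLS, "bytes fully expanded")  # 30 vs 768
          ```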

    • Are you sure it's 16MB?
      At the time of 40MB HDs and the early 386, 2-4MB was outrageous and 16MB unheard of.
      I still remember paying 400DM (at that time $300) for 4MB extra... for my 486...
    • It always cracks me up to think about what fools we'll feel like in the future for paying top dollar for the latest and greatest hardware now.

      The key word is NOW. Why is it foolish, if you need state of the art hardware to do work (or play games) on, to pay the current prices for it? Sure, it'll be 1/2 the cost in 1 year but that's in 1 year. You need it/want it immediately, so you pay the current market rate. If your need for the item is less urgent, or you have less money, you will perhaps wait and
  • by YetAnotherName ( 168064 ) on Friday November 12, 2004 @08:24AM (#10797027) Homepage
    ... is how well your site's holding up under the slashdotting!

    On topic, though, it is a quaint little trip you've provided. It's fun to see the historical context of a chosen career (a chosen passion, I should say). In 1971 I was 1 to 2 years old, and don't recall what my professional goal was. Later it would be "astronaut," until grade school, when video games (cf. this posting [slashdot.org]) made "computer programmer" the new (and final) choice.

    Apparently, the publisher, Ladybird Books [ladybird.co.uk], has had its own interesting history [theweeweb.co.uk], and is now part of Penguin.
    • Hmm This explains a lot about the British Police Force...

      From the Ladybird site [theweeweb.co.uk]...

      The Learnabout books of the 1960s helped children to develop new interests, but these books were not strictly read by children.

      How it Works: The Motor Car (1965) was used by Thames Valley police driving school as a general guide...

      Well made me laugh!
    • Thanks for the great link. I fondly remember the Ladybird books from when I grew up. Anyways, I found this in the second link, and as it relates to the book being discussed, thought it appropriate.

      How it Works: The Computer was used by university lecturers to make sure that students started at the same level. Two hundred copies of this same book were ordered by the Ministry of Defence. The MOD wanted the books to be bound in plain brown covers and without any copyright information, to save embarrassing

  • by palfreman ( 164768 ) on Friday November 12, 2004 @08:26AM (#10797034) Homepage
    I've still got that book. It's been pretty out of date for a long time (er, very out of date), but it is very good at explaining things like assembler, old-style core memory and flow charting for kids - it sets them on the right path, instead of messing them up with childish GUIs, talking elephants and suchlike.

    The people who wrote this book basically felt that a child of 8 should not have the inner workings of a computer hidden from them, but should be taught the technical side from day 1.

    Anyway, 20 years later this book is still where I first learnt about flow charts and cpu registers!

  • Always ENIAC (Score:4, Interesting)

    by Draoi ( 99421 ) * <{draiocht} {at} {mac.com}> on Friday November 12, 2004 @08:26AM (#10797040)
    "1943 saw the need for computing artillery firing charts, and ENIAC [...] was born. [...] And so the modern electronic computer came into being."

    I guess we now know different, with Atanasoff/Turing/Flowers. We were always taught that ENIAC was first when I did my studies back in the early '80s ....

    • Re:Always ENIAC (Score:2, Informative)

      by basingwerk ( 521105 )
      It often happens that the British invent something and Americans claim it. Everybody here (Cambridge, UK) would tell you that Logie Baird invented the television, but Americans learn it was some other bloke. It's mad.
      • Weeell, Atanasoff was American, as it happens & he was officially first (it went to a legal challenge). Arguably, Konrad Zuse - a German - produced the first programmable computer back in the '30s, though it was mechanical.

        Anyways - just look at some of the coolest computer products that came out of Cambridge; The Sinclair range of computers were the first affordable home computers. I started on a ZX80 kit as a child & had to assemble the thing myself. Programming starts with a soldering iron! :-)

    • Re:Always ENIAC (Score:3, Informative)

      by cybergrue ( 696844 )
      The main problem was that the work done by Turing (and many others) during the war on the Colossus machine (used to break the Enigma code) was classified for at least 30 years after the war. Hence we only started learning about these achievements in the mid-70's, after some of the influential "history of computing" texts had their first editions. Even after its declassification, the work done on Colossus was (and still is) not widely known.

      The other problem is the definition of "computer". Depending on how

  • Actually, my parents have the original book from 1971. It was part of a series.
  • by Alioth ( 221270 ) <no@spam> on Friday November 12, 2004 @08:32AM (#10797066) Journal
    ...is this book goes into quite some detail (like the method of magnetic polarity changes on a tape). Now you might not think that particularly remarkable - but the book was published by Ladybird - i.e. it was a children's book published in Britain, aimed at children between 8 and 10 years old!

    I remember Ladybird books from my childhood - starting with "Magnets, Bulbs and Batteries." That book advised that to test a battery, you stick the terminals on your tongue (but it admonished you never to do it with a large battery). Just imagine trying to publish that advice now :-) I still test 9v PP9 batteries on my tongue!
    • it was a children's book published in Britain, aimed at children between 8 and 10 years old!

      I think in general Ladybird Books were aimed at younger children, but this series in particular was not. A poster above [slashdot.org] mentioned the organisations that made use of this and other Ladybird Books:

      • How it Works: The Computer was used by university lecturers to make sure that students started at the same level. Two hundred copies of this same book were ordered by the Ministry of Defence.
      • How it Works: The Motor
  • "A small digital computer designed for the businessman." Very humorous indeed. LOL!
  • "Programming in machine code is a job for a highly trained person, whereas programming in a high level language is something most people can do provided they are given time to learn the rules that must be followed"

    That was optimistic. We have languages such as C++, Python, Java etc now (compared to FORTRAN and COBOL they mentioned in the book) and still programming is sort of a geek thing.
  • by mccalli ( 323026 ) on Friday November 12, 2004 @08:47AM (#10797148) Homepage
    Wow, a quick blast from my own past here as well. I had this book - it was part of a series, and they were all very well written.

    Obviously written for a young, general audience rather than technical people. Then again, that's exactly what I was part of at the time. I wasn't actually born in 1971, I was born in 1972. Strangely though, I remember the first cover, not the second - perhaps I had an old edition? Anyway, my point here is that despite being a supposedly non-technical book, look at the language and level of detail covered. Look at this page [brinkster.net], for example - do you get that in many introductory books these days? No, you don't. Interesting how depth of knowledge changes.

    Anyway, I can confirm that this piqued my interest enough to be excited about computers when the first wave of home computing hit the UK (about 1982, a ZX Spectrum 48k for me). Haven't really looked back - I now have a computing career, and whilst many factors led to me wanting that, it must be said that this book was the first to nudge me in the right direction.

    Cheers,
    Ian

  • Emma Peel! (Score:3, Funny)

    by mccalli ( 323026 ) on Friday November 12, 2004 @09:00AM (#10797228) Homepage
    Hmm. I didn't know The Avengers [brinkster.net] worked in computing. Admittedly, Steed seems to have changed a little but my god is that a dead-ringer for Emma Peel.

    Cheers,
    Ian

  • by sifi ( 170630 ) on Friday November 12, 2004 @09:09AM (#10797281)
    I wish my PC had a built in washing machine, like the one the guy is using on page 9 'mini computer system'.

    And they call it progress.
  • Computer magic (Score:3, Interesting)

    by Sai Babu ( 827212 ) on Friday November 12, 2004 @09:11AM (#10797296) Homepage

    Born in the early 50's, with a hand in electronics since messing with old radios in my grandfather's chicken coop at age 4, I've never felt any 'magic' associated with computers. Adders, registers, programs 'written' in wire on a card were all easy to understand. I messed with early RTL ICs in high school and have played with computer hardware ever since. However, while computers are grand tools, they've never seemed 'magical'. Not like radio. Radio has always held a much greater fascination. I attribute this to the deterministic nature of the computer as opposed to the 'fishing' aspect of radio. With radio, you never really know if it is going to do what you ask it to. A computer does exactly what you ask it to. Yet, I see this aura of magic in the eyes of others when they work with computers. Where does it come from? The humorous answer is that their computers don't seem to behave in a deterministic way (spare me the Mr. Softie humor). But many postings on /., from people who appear to know how their computers work, reek of this sense of magic. What gives? Does one have to be born with a plutonium atom in the center of one's brain, or am I misreading an enormous appreciation for the power of the tool as a fascination with some quality that I fail to perceive?

    These comments apply to digital electronic computers. I can't help but see some magic in wetware (mouse brains flying airplanes).

    • Re:Computer magic (Score:2, Interesting)

      by cathouse ( 602815 ) *
      Pre-dating you by a decade, I remember very clearly the two semesters in which I took consecutive 5-unit courses in programming *Unit Record Machines* by plugging [really] thousands of wires into holes in program boards as large as 1x1.5 meters. Each and every move of each and every character required plugging a wire from one hole to another. And any branching operation took twice as much, plus running the wires for the conditional, etc. No, there ain't no magic.
  • On a similar note: (Score:5, Interesting)

    by ledow ( 319597 ) on Friday November 12, 2004 @09:17AM (#10797339) Homepage
    On a similar note, I can remember the series that was published by Marshall Cavendish called INPUT. This was a fantastic bi-weekly serial magazine published in the early 80's that focused almost exclusively on programming for the early micros.

    I owned about six or seven issues and it was the best explanation of programming, also containing loads of example programs for about six different home machines, so that no matter what machine you had you could use the same program as everyone else. The learning curve was perfect when I was a kid and isn't patronising now that I'm an adult re-reading them. Those issues almost single-handedly started my love of computing (along with the ZX Spectrum).

    My brother found the entire first volume at a boot sale some years back and I read through them all again, despite knowing several languages by then (the books primarily focused on BASIC and assembly for the relevant micros, Z80 or 6502 etc.).

    Recently, I purchased the missing volumes off of eBay and they are fantastic. I only wish I had the enthusiasm to actually still sit and type out my programs any more. One text adventure had about 10 pages full of encrypted hexadecimal that you had to type in by hand, perfectly, for it to work! I don't miss those days...

    Reading back through them, like this book, the parts that were generic to computing, i.e. hardware, peripherals, storage etc., were very quickly outdated. However the computing and programming principles still stand strong and many's the time that my understanding of binary, assembly and the deeper workings of the computer have helped me.

    But it's still amazing how quickly something can go from being state-of-the-art to back-of-the-cupboard.
  • by __aavljf5849 ( 636069 ) on Friday November 12, 2004 @09:25AM (#10797399)
    Although of course wildly outdated even when it was published (as all useful computer books always are), a good book explaining the basics is never wrong. And the basics still are the same. There is still loads of information in these old books that would be useful to anyone getting into computers, surprisingly enough... :-)

    I held a course in TCP/IP in the early nineties. The part that most clearly divided the class was the net mask. People that had studied computer science, or were self-taught nerds, of course already knew binary arithmetic. They found using net masks trivial. The people who had ended up as network administrators by mistake (most of them, really) had huge problems. After holding this course a couple of times, I simply extended it with teaching everybody binary arithmetic first. That made it easier for most people.

    You don't need to know how a computer works to use it anymore, but a good network manager should still know it, and a programmer won't last two weeks without understanding what actually goes on.

    Well, maybe if he is using Python. ;)
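    To make the binary-arithmetic point above concrete, here is a small Python sketch (an editorial illustration, not material from the course the poster describes). Applying a netmask is nothing more than a bitwise AND, so once you can read the mask in binary the rest is mechanical; the addresses below are made up.

    ```python
    # A netmask is just a 32-bit pattern: 1-bits select the network part of an
    # address and 0-bits select the host part.

    def to_int(dotted):
        """Convert dotted-quad notation to a 32-bit integer."""
        a, b, c, d = (int(x) for x in dotted.split("."))
        return (a << 24) | (b << 16) | (c << 8) | d

    def to_dotted(n):
        return ".".join(str((n >> s) & 0xFF) for s in (24, 16, 8, 0))

    addr = to_int("192.168.37.205")
    mask = to_int("255.255.240.0")        # a /20 mask: twenty 1-bits, twelve 0-bits

    network = addr & mask                 # bitwise AND keeps the network part
    host = addr & ~mask & 0xFFFFFFFF      # the complement keeps the host part

    print(bin(mask))           # 0b11111111111111111111000000000000
    print(to_dotted(network))  # 192.168.32.0
    print(to_dotted(host))     # 0.0.5.205
    ```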
  • from an old late book aimed at kids about robots, computers and the like, went something like -

    "A programmer is somebody who converts problems from the real word into a language the computer can understand"

    I just love it! not 'creats solutions' but 'converts problems' LOL!

    (wish i could look it up but everything is packed away at the mo)
  • trivia (Score:3, Interesting)

    by museumpeace ( 735109 ) on Friday November 12, 2004 @09:36AM (#10797481) Journal
    Isn't the picture on pg 8 of the 1971 edition actually an IBM 360? I operated one as a student and this sure looks like a 360 without the power supply cabinet or tape drives. That would not have been considered a small system even in the early 70's. Looks like a 1403 line printer with it too.

    Having signaled that I am ancient, it may not surprise a few of you when I note that the quaint and amusing quality of the book in the article is misleading if you take it as history. The development of computing is both a technical and a human story of considerable depth, and much more interesting reading is available.

    Anybody who actually finds this stuff interesting need not confess. Just quietly make your way to the library and look up Paul Ceruzzi's A History of Computing [MIT PRESS], which gets all the facts and personalities straight as well as properly labeling the pictures. If you are in a hurry to waste time, there are tons of documents online on the history of computing, for instance this page of links [iit.edu] from an IIT prof.
  • It's the Longhorn manual.
  • Displayed proudly in my cube. I tend to refer people to it when they ask about BGP/MPLS/EIGRP.
  • by peter303 ( 12292 ) on Friday November 12, 2004 @09:46AM (#10797550)
    My school got a teletype connection to some nearby college computer, probably a DEC machine. We were allowed to write BASIC programs. At the same time John Conway's "Game of Life" was the rage in Scientific American. So I coded it up in BASIC. Took about a minute to print each generation in asterisks and blanks.
    A few years later I implemented the algorithm in bipolar circuits for a digital electronics lab at the university. The display was blobs on an oscilloscope. I recall it did several hundred generations a second. CRT computer terminals didn't really become widespread until shortly after that, in 1975. They required the price of a half kilobyte of ROM to fall to $100 (thanks to that upstart Intel). Type font patterns were stored in ROM. A 5x7 bit character set required 320 bytes of ROM.
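    For anyone who never typed it in themselves, the program described above fits in a few lines today. Here is a modern Python stand-in for the original BASIC, added for illustration; the grid size and the glider pattern are my own choices, not from the post. (As an aside, the 320-byte figure quoted for a 5x7 font is consistent with one common layout, 64 characters x 5 column bytes per character = 320 bytes, though the actual ROM encoding isn't stated.)

    ```python
    # One generation of Conway's Game of Life, printed as asterisks and blanks
    # like the teletype listing described above.

    def step(cells):
        """cells is a set of live (row, col) pairs; return the next generation."""
        counts = {}
        for (r, c) in cells:
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if dr or dc:
                        p = (r + dr, c + dc)
                        counts[p] = counts.get(p, 0) + 1
        # A cell is live next turn with 3 neighbours, or with 2 if already live.
        return {p for p, n in counts.items() if n == 3 or (n == 2 and p in cells)}

    def show(cells, rows=8, cols=8):
        for r in range(rows):
            print("".join("*" if (r, c) in cells else " " for c in range(cols)))

    glider = {(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)}
    for _ in range(4):
        show(glider)
        print("-" * 8)
        glider = step(glider)
    ```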
  • Another Old Computer (Score:3, Interesting)

    by Mignon ( 34109 ) <satan@programmer.net> on Friday November 12, 2004 @10:00AM (#10797662)
    Apropos old computers, I've had a recent fixation on the Olivetti Video Display Terminal [tudelft.nl], which I saw in a book of Mario Bellini's industrial designs. It's probably just as well it hasn't shown up on eBay lately 'cause I sure don't have the space.
  • by wandazulu ( 265281 ) on Friday November 12, 2004 @10:10AM (#10797733)
    I don't think I ever read this book (born in 1970), but flipping through the pages, it makes me realize what computers still mean to my folks; batch cards, mag tapes, green-n-white printouts.

    Therein lies the rub; to my folks, any computer that can fit in a single box and doesn't live in a raised-floor room is a toy. It's actually very black and white for them... "yes it's all very nice what those toys can do for the movies, but it takes a *computer* to process GE's payroll."

    It also reminds me of when a friend of mine brought his dad in to work to show him what he did. His dad was a serious old-school programmer of custom chips for Navy jets. He looked it too... checkered shirt, crew cut, pocket protector (first time I'd ever seen one). My friend shows him the *cough* PowerBuilder app we'd been working on, with its buttons and datawindows, etc., and his dad just went *pft* and waved his hand.

    The fact that I can run emulators of any of those systems and they run 10x faster has never made a dent in my folks' opinion. As far as they can see, and as far as my friend's dad can see, we're just playing with toys.

    Anyone else had that happen?
    • I'd agree.. Especially after Microsoft teamed up with Fisher Price on the design work for Windows XP...
    • As far as they can see, and as far as my friend's dad can see, we're just playing with toys.

      My mom worked as Wire Chief (read: senior technician with some management responsibilities) for the Burlington Northern Railroad in the '80s when they installed a Xerox Star network. It was the first GUI I'd ever seen. Well, actually, it was the first GUI that pretty much anyone had ever seen. Anyway, there she was back at the start of the Reagan era using a graphical networked workstation with remote file stora

  • by ballpoint ( 192660 ) on Friday November 12, 2004 @10:56AM (#10798158)
    I did my first real programming in Fortran on punched cards. Nobody could punch 'CONTINUE' faster than I did at that time.

    I still remember the sound of the card reader (fla-bap, flapflapflapflapflap......) and of the line printer. To recognize when my job was done, I inserted a few carefully spaced cards full of '*'s in front of the deck, producing a unique rhythmic sound pattern when printed.

  • by wandazulu ( 265281 ) on Friday November 12, 2004 @12:51PM (#10799790)
    Because these books talk about a time when punch cards were still all the rage, and because my Ask Slashdot article was rejected, I'll ask here:

    Does anyone still use punch cards? I know some states used punch cards for the Nov 2 election, but I'm wondering if there are still decks of cards at companies waiting to be run through and the output printed on green-n-white paper.

    It's not a criticism or a put-down question; I can believe there are some jobs on some equipment that just can't (or won't) be ported to something newer, and "what worked for us back then works for us now."

    Just curious.
