Technology

A History Of Computing

CitizenC pointed us over to C|Net's History of Computing. Pretty cool background stuff - going back into the pre-historic era and looking into the future.
This discussion has been archived. No new comments can be posted.

  • ...or did I just miss it?

    Come on, we all know that Babbage would have changed the world - if he'd had a good machine shop.

  • More likely it's a ctrl-V... and you know what that means.
  • Ouch. My first first.

    I met Grace Hopper when I was in college. She was like a goddess to us JCL kiddies. No PCs then. Of course, a 10-meg drive for a microcomputer (that's what we called them) was as big as a volleyball, so what was the point? I can remember when you paid $5K for a dual-floppy system at a breathtaking 4.77MHz. I can remember when people said to me, "Why would anyone want a computer in their home?". I can remember when I could see my belt buckle. Oh, those were the days.
  • 1822-1835 It's All in the Follow-Through British mathematician and inventor Charles Babbage begins to design and build the Difference Engine, a machine that uses logarithms and trigonometry to compute the navigational and celestial tables used by sailors. It takes ten years for Babbage to construct part of it, whereupon he abandons the project and starts work on a more sophisticated product called the Analytical Engine. He does not finish that either. Several other inventors, including George Barnard Grant and Georg and Edward Scheutz, build machines based on his work.
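
    A quibble, though: the engine computed those logarithm and trigonometry tables rather than using them. What it mechanized is the method of finite differences: a degree-n polynomial has constant nth differences, so once the first row is seeded, every further table entry falls out of additions alone. A minimal sketch in Python rather than gears (tabulate is a made-up name; the [41, 2, 2] seed is for x^2 + x + 41, the polynomial Babbage reportedly liked to demonstrate with):

    # Each "column" holds one order of difference; on every step each column
    # absorbs the one below it, and the top column yields the next table value.
    def tabulate(seed, steps):
        diffs = list(seed)  # [f(0), first difference, second difference, ...]
        for _ in range(steps):
            yield diffs[0]
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]

    # f(x) = x^2 + x + 41: f(0) = 41, f(1) - f(0) = 2, second difference constant at 2
    print(list(tabulate([41, 2, 2], 8)))  # [41, 43, 47, 53, 61, 71, 83, 97]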
  • by sinnergy ( 4787 ) on Friday April 14, 2000 @04:08AM (#1132560) Homepage
    An even better account of the evolution of the computer can be read in the book "Fire in the Valley: The Making of the Personal Computer" (Second Edition) by Paul Freiberger and Michael Swaine. It goes into great detail, gives a lot of interesting anecdotes, and really explores not only the technology but the personalities behind the computer revolution as well.

    I did a review of the book [cwru.edu] for the CWRULUG [cwru.edu] (Case Western Reserve University Linux Users Group).

  • Surely as 'father of the internet' he deserves _some_ credit...
  • by Emil Brink ( 69213 ) on Friday April 14, 2000 @04:16AM (#1132563) Homepage
    Hm, I like this passage:
    And computers are called digital in the Western world because they use the binary system, which is based on the digits 1 and 0.
    (From the page entitled It Came From the Deep [cnet.com] ).
    In my world, the above statement is broken. Computers are called digital because they are not analog, i.e. they work with quantized data expressible as a finite sequence of digits. They are called binary because they use the binary system, with the digits 0 and 1. A computer based on some system with e.g. nine symbols would still be digital, but it would not be binary. Right?
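
    To make that distinction concrete in code: "digital" only says a value is stored as a finite string of digits; the radix is a separate design choice. A toy sketch in Python (to_base is a made-up helper, not anything from the article):

    def to_base(n, base):
        # Render a non-negative integer as a digit string in the given base.
        if n == 0:
            return "0"
        digits = []
        while n:
            n, d = divmod(n, base)
            digits.append("0123456789"[d])
        return "".join(reversed(digits))

    print(to_base(42, 2))  # '101010' -- binary, hence digital
    print(to_base(42, 9))  # '46'     -- nine symbols: not binary, still digital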
  • Yep, the writing is pretty spotty. Take a look at the first sentence:
    It's official: Computers are now at the epicenter of our lives.
    help! my computer is causing an earthquake.

    When I post with bad grammar or usage, at least I have the excuse of being a simple user on a geek discussion board. These people claim to be journalists.

  • by papou ( 121475 ) on Friday April 14, 2000 @04:22AM (#1132565)
    I recently found the Computer History Museum website. It features a nice illustrated timeline of computing history. You can find it at

    http://www.computerhistory.org [computerhistory.org]

    Another interesting timeline can be found on the IEEE Computer Society website:

    http://computer.org/history [computer.org]

  • by Kublai ( 174369 ) on Friday April 14, 2000 @04:25AM (#1132566)
    The connection between 'digital' and fingers is cute, but does that really make fingers==computer?

    Personally I'm not really sure if we should begin the history of computers at the dawn of logical/mathematical thinking. If we do, are not all things which perform a logical function computers? Example: I used this stick to say that I am 2 1/2 sticks tall. Is the stick a computer?

    The same could be done with rocks or bones or bibles....

    IS THE WORLD A COMPUTER, CALCULATING FATE?
    {Hitchhiker's Guide to the Galaxy anyone?}
  • by Anonymous Coward on Friday April 14, 2000 @04:27AM (#1132567)
    I remember computing back in the mid-80s. It was much different than it is today, but I suppose that was only to be expected. I got my first computer on my 8th birthday: a cheap Tandy microcomputer, with keys so small you couldn't really type for real. I loved it, even though I couldn't do anything other than code in BASIC. Her name was Susan. She spoke to me at night, and told me all sorts of things that sparked interest in my pre-pubescent mind.

    I never had sex with Susan.

    To this day, that is one of my greatest regrets. She was such a beautiful machine, and she cared deeply about me. I could tell that she loved me, and I loved her, in the way only a boy could love a machine. No one has ever understood me in the way that Susan did. I miss her greatly.

    Around the time I was 14, Susan had an accident. I blame my mother. I had been using Susan intermittently; by that point I had four other machines, but they never cared about me the way that she did. One day, I went to school after staying up all night with Susan. I came home and she was gone. I asked my mother, but she wouldn't tell me what happened. I've never cried so hard in my life. My mother just stared stoically at me, eyes glowing red. I don't think I'll ever forgive her.

    Susan: if you're out there, I miss you. Come back to me. I still love you . . .
  • It doesn't mention Mauchly & Eckert either. I assume they are implying that Stibitz predated both teams, as his system (the CNC?) was demonstrated in 1940, whereas the ABC was demonstrated in 1941, IIRC.
  • How do you have a history of the pre-historic era?

    Never mind.


    --

  • Well, that is a pretty presentation.

    The Computer Museum's Computer History Timeline [tcm.org] has a lot more detail.

    Of course, for Internet history there's Hobbes' Internet Timeline [isoc.org], and Charles Spurgeon's Ethernet Web Site [utexas.edu] (not focused on the Internet, but a major bit of networking history).

  • by Troy Baer ( 1395 ) on Friday April 14, 2000 @04:35AM (#1132571) Homepage

    No mention of the IBM mainframes of the 60s and 70s, the DEC PDP and VAX series, Seymour Cray and his supercomputers, or the workstation explosion of the 80s. This seemed very focused on PCs to the exclusion of everything else. That's kind of sad, really; there's a hell of a lot more to computing than PCs.

    --Troy
  • Ahh, those first nights when you powered her up.

    Wondering if she'd crash if you bashed her keys too hard.

    Trying to insert the floppy into the wrong opening, or back to front.

    Isn't it amazing how much innuendo you get with nostalgia?

    *grin*

    Steve
  • it was Ada Lovelace [awc-hq.org]

    //rdj

  • They ignored the all-important 4th anniversary of the JenniCam.

    At least Slashdot didn't miss this all-important event.

  • According to Merriam-Webster [m-w.com], epicenter is synonymous with center. But I agree, it's kind of a hokey usage.


    --
  • Personally I'm not really sure if we should begin the history of computers at the dawn of logical/mathematical thinking. If we do, are not all things which perform a logical function computers? Example: I used this stick to say that I am 2 1/2 sticks tall. Is the stick a computer?

    In the early '60s, Donald Michie, professor of Artificial Intelligence at Edinburgh University, built a computer called MENACE that could play a killer game of Tic-Tac-Toe (or "Noughts and Crosses", as it is called in the UK). It learnt to improve its play strategy as it played games.

    The really cool bit? This computer was a series of matchboxes and colored beads. No electronics at all. Clearly there is a wide class of machines that perform logical operations that can be classed as computers, and we should be careful to define computers by what they do, not the materials they are composed of.

    I'd say that in your example your fingers don't count as a computer, because it's your brain (which is the computer) that is doing the counting; your fingers are acting as a memory sub-system.
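
    For the curious, the bead scheme fits in a few lines. This is a rough Python sketch only, assuming one matchbox per board position and a simple add-on-win, confiscate-on-loss rule; the bead counts and rewards Michie actually used varied with the stage of the game:

    import random
    from collections import defaultdict

    INITIAL_BEADS = 4          # assumed starting weight per legal move

    boxes = defaultdict(dict)  # board position -> {move: bead count}

    def choose(position, legal_moves):
        # Drawing a bead at random = picking a move with probability
        # proportional to its bead count.
        box = boxes[position]
        for m in legal_moves:
            box.setdefault(m, INITIAL_BEADS)
        moves, weights = zip(*box.items())
        return random.choices(moves, weights)[0]

    def reinforce(history, won):
        # history is [(position, move), ...] for one finished game.
        for position, move in history:
            if won:
                boxes[position][move] += 3    # add beads of that color
            else:
                # confiscate a bead, but never empty the box entirely
                boxes[position][move] = max(1, boxes[position][move] - 1)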

  • I got my ZX81 [interport.net] in 1982, I think, as a kit; my father helped me build it. I was 12 y/o :), and I've been into computers ever since...
    You can still buy the kit [interport.net], but last year it was something like 29.95 IIRC; now it's 99.95-ish!
    Then I had an Amstrad [amstrad.com] CPC6128 [ukonline.co.uk], remember? It's the computer that "killed" the C=64... then a PC... 386SX16, SX33, DX4/100, P166MMX, PII300, etc.
    --
    BeDevId 15453 - Download BeOS R5 Lite [be.com] free!
  • Interesting they claim Roberts coined the term "personal computer" - and I wonder what they have to back this claim up. Describing it as a 'ham' (radio tinkerer) machine seems apropos - it helped to have assembled a few Heathkits! They required a lot of tinkering, fer sure.

    Altair software recovery in progress...
    8K BASIC games (Startrek, etc.) available in MP3 format for the 88-ACR.
  • Don't forget that before computers as we know them existed, humans who computed tables (log, sine, that kind of thing) were called computers.

    I would say that a stick would not be a computer since it only helps you to compute and it is actually you that is doing the computing (i.e. adding up the number of lengths of sticks). The stick itself does not perform a logical function.

  • I can't believe they completely forgot to mention the work of Lawrence Pritchard Waterhouse [cryptonomicon.com]!

  • For whom the Ada programming language [umich.edu] was named.


    --
  • by Dunx ( 23729 ) on Friday April 14, 2000 @05:12AM (#1132583) Homepage
    So, Turing gets a mention, but not the Manchester Mark I. The first mainframe manufacturer, Lyons (yes, they of the ice cream and cakes), never opened their doors. Sinclair and Acorn didn't exist either, it seems.

    It would be nice if these stories made it clearer that they were histories of American computing. But then nothing exciting ever happens outside of the States does it? Ask Linus, he'll tell you.

    --

  • by Helmet ( 153370 )
    I'm surprised they had no mention of the CBM PET from the '70s, or any mention of the C-64 and then primarily the Amiga, which led to a great deal of the multimedia stuff we have today. The Amiga was the first multimedia computer, way before the multimedia craze hit around 1995. IIRC, development of the Amiga and its OS started around 1980, and it became available in 1986. I miss all my old Amigas..... :)
  • No mention either of a silly little thing called Unix, ferchrissake...
  • by Anonymous Coward
    If the countess had seen the language that was named after her, she would have baulked. Ada is truly horrible.
  • Man, you guys are gonna make me relive my childhood and start an ancient computer collection. If I only still had some of the machines I had. Where to start... C-64, C-SX64, C-128, several Amigas, a Mac Plus, Timex Sinclair Z80, those crappy TI-99/4As, heh. I even had an old portable Kaypro machine that ran CP/M. I think my favorite overall machine I had was my SX-64, with its built-in 9" color monitor and 1541 disk drive. I guess I'm gonna have to go shopping at the flea markets and thrift stores now. BTW, is there anyone here that used CP/M for anything useful? I was just too young at the time to remember any apps for CP/M. Long live 8" floppies!
  • You will find that Ada Lovelace [agnesscott.edu] (full name Augusta Ada Byron, Lady Lovelace) is mentioned in the article [cnet.com] (at the bottom), along with the fact that the Ada programming language was named after her. No mention of Tom Stoppard, though.
  • by Anonymous Coward
    That's OK, in 30 Years of windows [cnet.com], their GUI retrospective, they include MS Windows, Xerox PARC, Apple (Lisa and Macintosh), the Atari ST and GEM..... but no mention of the first pre-emptively multi-tasking home computer with GUI as standard (A clue, guys... begins and ends with the same letter, and has a Russian fighter aircraft prefix in the middle, and it isn't SU-)

    Funny, that......

  • by orac2 ( 88688 ) on Friday April 14, 2000 @05:57AM (#1132590)
    The best book on the history of computing I've ever read is "Bit by Bit: An Illustrated History of Computers" by Stan Augarten. It's out of print now (but you can search for it through Amazon) and stops in the mid-eighties, but it deals well with the contentious issues of who-did-what-first in the 1940s among the Germans, the English, and the Americans. It's very well illustrated and includes interviews with a lot of key people.
  • by Dhericean ( 158757 ) on Friday April 14, 2000 @06:24AM (#1132591)
    The Computer Museum timeline is a bit limited, as it only starts at 1945 and is very strictly restricted to electronic computers. Here's a more comprehensive (though less detailed) timeline [server101.com] which starts at 500 BC with the invention of the abacus and includes things like the first Radio Shack catalog in 1939, Atari's introduction of Asteroids in 1979, and various events involving QDOS 0.10 (later renamed PC DOS/MS-DOS) in 1980.
  • It was 1939

    "1939: John J. Atanasoff designs a prototype for the ABC (Atanasoff-Berry Computer) with the help of graduate student Clifford Berry at Iowa State College. In 1973 a judge
    ruled it the first automatic digital computer. "
  • As far as Lyons and their LEO (Lyons Electronic Office) computers go, there is actually a LEO Computers Society [leo-computers.org.uk] for people who worked for LEO Computers Ltd. or worked with LEO computers.
  • "How do you have a history of the pre-historic era?"

    Easy. You just make it up and hope nobody refutes your claims. :)

  • "those crappy ti-994/a's heh."

    How dare you call the TI crappy! Have you ever used Extendend Basic? I used to spend hours screwing around with sprites. The TI version of basic was the easiest I've ever seen, otherwise I never would have become interested in programming. Yeah, it was slow (I think the basic was double-interpreted, not sure). But it had Hunt the Wumpus with GRAPHICS! WOO-HOO! :)

    The speech synth was pretty cool for it's time too and could say anything if you had the TE 2 cart.

  • I couldn't find names like Neumann either :-(

    Szo
  • No mention of 'Baby' either, the world's first stored-program computer, a working replica of which was completed at the end of '98 on its 50th anniversary. Check here [man.ac.uk] for more details.
  • Here's my contribution to the embarrassing childhood memories:
    I started with a VIC-20, at the age of about 10. The first game I made with it was Russian roulette. All you could do was punch a key, with a 1/6 chance of dying every time, until you eventually did (can't remember my record, though).
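
    The whole game is only a few lines in any language. Here is roughly what that VIC-20 program did, redone as a Python sketch (the 1-in-6 odds per keypress come from the description above; the rest is guesswork):

    import random

    pulls = 0
    while True:
        input("Press Enter to pull the trigger...")
        pulls += 1
        if random.randrange(6) == 0:   # one chamber in six
            print("BANG. You survived", pulls - 1, "pulls.")
            break
        print("Click.")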

    ---------------
    Fire Your Boss!
  • There is also no mention of the first electronic digital computer, the Atanasoff-Berry Computer [ameslab.gov] (ABC). After all, the Courts [ameslab.gov] have decided that the ENIAC was a derivative of the ABC.

  • I had the very same computer when I was but a lad. Blocky graphics appearing on my television screen, blips and bleeps emanating from the little box. Ahh, but the coding...

    10 CLS
    20 GOTO 10

    *sigh* Ahh, for days gone by. Many a time I remember back to that little computer, punching in BASIC, playing Chess and Cliff Jumper off cartridges, not even tapes yet. Actually, once I was almost to the point of tracking down and buying an old Commodore VIC-20 just so I could punch some BASIC every once in a while.

  • IIRC, wasn't MENACE a form of mechanical, manually powered neural network? With the colored beads being red and blue (or red and black? white and black?), and if one of the matchboxes (neurons) worked correctly, the weight was adjusted by the addition of the right colored bead, and if incorrect, by the subtraction of a bead?
  • I think as long as we're reliving our youth and trying to get ever closer to the first computer... you know, the reptile with fingers [cnet.com] . . .

    Someone should undertake a project, or post a link to an all-in-one site which hosts emulators, games and software (software whose copyright has run out, of course), for all the old systems we grew up with, and wish we grew up with.

    The Altair, ZX81, Lisa. Then on to the fabulous '80s: the ColecoVision, TRS-80, Apple IIc, ICON, Atari, Amiga 2000... the list goes on and on. As far back as 1996 I tried to start an all-in-one collection on my BBS, but I had a very hard time getting either games that worked with the emulators, or the emulators for the games, off of the net.

  • I like how the author points out that things have kind of come full circle, from one of the first computers to the latest versions....

    The word abacus comes from the Greek word abax, a corruption of the Phoenician word abak, meaning sand. It's not surprising, then, that all computers nowadays are based on wafers of silica, the principal ingredient of sand. Or that computer-speak is Greek to most people.
  • That would be Burroughs, UNIVAC, NCR, Control Data, and Honeywell for those who don't remember...
    --
    -Rich (OS/2, Linux, BeOS, Mac, NT, Win95, Solaris, FreeBSD, and OS2200 user in Bloomington MN)
  • But then nothing exciting ever happens outside of the States does it?

    To be fair, they did manage to mention Konrad Zuse, but not mentioning the Mark I is pretty damn bogus, as that was, as far as I know, the first stored-program general-purpose computer, i.e. the first machine of the type most if not all of us would think of as a "computer".

  • Need I go on?
  • Hmm, we had a Burroughs in high school. Actually, my oldest brother got to operate it--circa 1962. They retired it in favor of a NCR Century 200 in 1969. Interesting machine--it combined a mix of the really neat with some incredibly awful characteristics.

    My senior year, I skipped advanced placement chemistry to take a data processing course on the beast. It used NEAT/3, the "near-English" assembly translator. Actually, the assembly was a little odd, in that some commands in N3 would generate a bunch of machine code, while others only a few instructions. They also had COBOL, but we never got too far with it. I'm happy to say I've never (successfully) compiled a program in COBOL... :-)

    IIRC, the main memory was 32K of 9-bit stuff, but we had an extra refrigerator-sized bay(!) that held an additional 64K of thin-film memory. We had 4 disk drives, with platters about 12 to 14" in diameter. Maybe 3 platters per pack. Limited tape drives--the Burroughs was all tape. That guy used 8K of core.

    The lousy bit on the NCR was the serious overloading of the processor by the assembler/compiler. Even with NEAT/3, it took over 5 minutes to compile and print a small program (we never got above 100 cards in that class). The school data processing people were furious over the machine--NCR had a systems analyst on site for the better part of the school year while they were trying to get the normal school programs running (attendance and grades, mostly).

    *Their* compiles took a lot longer.

    We hung on to the Burroughs for an additional 6 months until the NCR got working right. I suspect that NCR made no money off of our school.


  • I believe he's implying that this lamer must be using Windows... probably IE. Since you seem to insist on Alt-V, you are probably running Linux. In IE, copy/paste is Ctrl-C/Ctrl-V.

    :)
  • Another worthwhile read (if you can find it) is "The New Alchemists", subtitled "Silicon Valley and the Microelectronics Revolution", by Dirk Hanson, published by Little, Brown and Co., ISBN 0-316-34342-0.
    Published in 1982, it mostly focuses on the history of semiconductor manufacturers up to that point, but includes some background on Edison and Tesla, DeForest and Armstrong, Bletchley Park and ENIGMA, and a bunch of other neat stuff. It's also interesting to read predictions of the future (basically now, from our point of view) written at the beginning of the '80s.
  • "I couldn't find names like Neumann either :-("

    Isn't that a slightly obscure, extremely high-end brand of microphone?

    Oh, now I see, you were talking about von Neumann.

  • Here's something I'd moderate as "insightful", or at least "interesting".

    "Alexander Graham Bell sets back digital communications by 120 years with his invention of the telephone. This retrograde device uses an analog signal that is incompatible with existing telegraph lines to communicate voice, and eventually creates the need for modems and dial-up Internet access."

    Oh well, the analog phone system still did a lot to advance the vacuum tube and switching technology that came in handy on early computers, and it made valuable contributions to analog audio as well.

  • POKE 53280,0 : REM border color 0 = black
    POKE 53281,0 : REM background color 0 = black

    The screen always looked better when it was black.

  • This *is* interesting or insightful, but won't get moderated as such because you have directly addressed the moderators. I know that when I moderate, I ignore any posts which make any reference to moderation. (Unless they are obvious trolls asking to be moderated down.)
  • It also won't get moderated because it's a direct quote from the C/Net article (that's why I put quotation marks around it) and I don't have moderator privileges over at C/Net. I certainly don't expect a karma boost from quoting a snippet of the article, although it would be nice if the other commenters here would read the C/Net article first so that they'd know what's being talked about.
  • Actually, no. John von Neumann was originally Neumann János, born in Hungary. I guess I know it better :-)

    Szo
  • Those curious about the shoulders of giants (whereupon we stand) may wish to scope out some of these books:

    Atanasoff: Forgotten Father of the Computer
    by Clark R. Mollenhoff

    The First Electronic Computer: The Atanasoff Story
    by Alice R. & Arthur W. Burks (1989)

    Before the Computer: IBM, NCR, Burroughs, and Remington Rand and the Industry They Created, 1885-1956
    by James W. Cortada (1993)

    Building IBM: Shaping an Industry and Its Technology
    by Emerson W. Pugh (1995)

    Computer: A History of the Information Machine
    by Martin Campbell-Kelly & Wm. Aspray (1997)

    The Computer Comes of Age
    by Rene Moreau (1986)

    The Computer from Pascal to von Neumann
    by Herman H. Goldstine (reprinted 1993)

    John von Neumann and the Origins of Modern Computing
    by William Aspray (1991)

    Engines of the Mind: The Evolution of the Computer from Mainframes to Microprocessors
    by Joel N. Shurkin (1996)

    ENIAC: The Triumphs and Tragedies of the World's First Computer
    by Scott McCartney (1999)

    From Memex to Hypertext: Vannevar Bush and the Mind's Machines
    by James M. Nyce, Paul Kahn (eds.) & Vannevar Bush (1992)

    Great Men and Women of Computing
    by Donald D. Spencer (2nd ed. 1999)

    A History of Computing Technology
    by Michael R. Williams (2nd ed. 1997)

    A History of Modern Computing
    by Paul E. Ceruzzi (1998)

    History of Personal Workstations
    by Adele Goldberg (ed.) (1988)

    History of Scientific Computing
    by Stephen G. Nash (ed.) (1990)

    Leo: The Incredible Story of the World's First Business Computer
    by David Caminer (ed.) (1997)

    Makin' Numbers: Howard Aiken and the Computer
    by I. Bernard Cohen (ed.) (1999)

    Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists
    by Cathy A. Lazere, Dennis Elliott Shasha (1998)

    Remembering the Future: Interviews From Personal Computing World
    by Wendy M. Grossman (1997)

    The Timetable of Computers: A Chronology of the Most Important People and Events in the History of Computers
    by Donald D. Spencer (2nd ed. 1999)

    Transforming Computer Technology: Information Processing for the Pentagon 1962-1986
    by Arthur L. Norberg, Judy E. O'Neill, & Kerry Freedman (1996)

    Turing and the Computer: The Big Idea
    by Paul Strathern (1999)

    When Computers Went to Sea: The Digitization of the United States Navy
    by David L. Boslaugh (1999)

    Ada, the Enchantress of Numbers: A Selection from the Letters of Lord Byron's Daughter and Her Description of the First Computer
    by Betty A. Toole (ed.) (1998)

    A.M. Turing's ACE Report of 1946 and Other Papers (Charles Babbage Institute Reprint Series for the History of Computing, Vol. 10)
    by Alan Turing, et al. (1986)

    A Bibliographic Guide to the History of Computing, Computers, and the Information Processing Industry
    by James W. Cortada (1990)

    A Bibliographic Guide to Computer Applications, 1950-1990
    by James W. Cortada (1996)

    Business Builders in Computers
    by Nathan Aeseng (1999)

    Glory and Failure: The Difference Engines of Johann Müller, Charles Babbage, and Georg and Edvard Scheutz
    (History of Computing)
    by Michael Lindgren; Craig G. McKay (translator) (1990)

    For a book review and discussion from last October, see: A History of Modern Computing [slashdot.org]

  • So you were expecting the article to discuss Neumann János architecture?
  • I got my first computer in 1984... a Sinclair Spectrum, with the rubber grey keys. I was only coming up to 5 yrs at the time, and my first game, I think, was Hungry Horace. Then a Spectrum+ 48K, Spectrum +2A, Amiga, and so on :)

    I still have most of these in GWO (good working order), and am looking to complete my collection of Sinclair hardware and peripherals..

    ~FnkyAlien, probably the saddest geek girl :)
  • Why not? A little bit of education never hurts :-)

    Szo
  • Perhaps someone should have advised Neumann János that if he wanted to continue to be known as Neumann János he should have avoided having his name changed to John von Neumann.
  • I would say that a stick would not be a computer since it only helps you to compute and it is actually you that is doing the computing. The stick itself does not perform a logical function.

    But a transistor doesn't perform a logical function on its own either. It's the programmer that does the computing; the transistors only help him/her to compute....
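
    Exactly; the switching element is dumb, and the logic lives in the wiring. A toy Python model of the idea (purely illustrative, not real circuit design): treat a "transistor" as a switch that conducts when its input is high. Two in series pulling the output low behave as NAND, and NAND composes into everything else:

    def nand(a, b):
        # Two series switches to ground: the output is pulled low only
        # when both inputs make their switch conduct.
        return 0 if (a and b) else 1

    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "-> AND:", and_(a, b), " OR:", or_(a, b))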

    -- Abigail

  • Those were different, difficult times back then :-(

    Szo
