
Updating the Computer, Circa 1969

Coudal points out a "Swell article from UK Magazine 'Design' from 1969," excerpting "Designing a computer is a continuous process in which technological breakthroughs must be matched by new hardware, and new hardware by new software, without invalidating the systems already in use."
This discussion has been archived. No new comments can be posted.

  • by (1+-sqrt(5))*(2**-1) ( 868173 ) <1.61803phi@gmail.com> on Friday June 23, 2006 @08:51PM (#15593785) Homepage
    From TFA:
    The 1903A [...] can handle conversational computing on nine remote consoles.
    “Conversational computing” is a fantastic euphemism for command-line-interaction; more sophisticated, in any case, than the point-and-grunt interface of today's hoi polloi.

    My theory is that computing and humanity interrelate: in an environment where Latin is taught alongside math, your users and developers are sharper and more humane.

    • by CRCulver ( 715279 ) <crculver@christopherculver.com> on Friday June 23, 2006 @09:06PM (#15593860) Homepage

      My theory is that computing and humanity interrelate: in an environment where Latin is taught alongside math, your users and developers are sharper and more humane.

      Why Latin? It's really no different from any other language: it doesn't make you more intelligent or allow you to express concepts any other language can't. All human languages are functionally equal, and while some might have ideas encoded in single lexical units (although please don't believe that myth about Eskimos and snow [amazon.com]), all languages can express all concepts through circumlocutions. And if you want to say that Latin teaches the learner something special about structure due to its synthetic nature, Russian or any other Slavonic language (or half of the languages worldwide) would do just the same.

      Latin is vital for two things: being able to read Roman literature or works in fields influenced by Latin-speaking culture such as law, and understanding the genetic affiliation of languages in the Indo-European language family. Otherwise, it's nothing special and shouldn't be taught to anyone just as a matter of course.

      • by Arker ( 91948 ) on Friday June 23, 2006 @09:44PM (#15594024) Homepage
        No, of course the language per se is not particularly special, at least not in ways that are unique to it in broad outline. The point, however, is what that language connects you with. The literature - a HUGE chunk of the literature of Western civilisation. Classical and medieval European literature is overwhelmingly in Latin, because that was quite simply the language literate people all over western Europe wrote in. And, of course, it also helps to understand the underpinnings of ALL the Romance languages. On top of that, it's crucial to understanding much of the more formal parts of the English language - even though English is not, actually, a Romance language - because so much of our terminology, in science, in law, and so forth, comes from Latin.
        • by Anonymous Coward
          Actually, that's debatable. English draws much of its vocabulary from French (thanks to the Norman invasion of England in 1066), while it gets a lot of its grammar and structure from its Germanic roots. English is probably more accurately considered a Romance-influenced Germanic tongue. Interestingly, the language we know as English is really a creation of the linguistic forces that existed in England a thousand or more years ago. A little Anglo, a little Saxon, a little French, and a good measure of the
          • Actually, that's debatable.

            I'm unsure what 'that' refers to, what you think is debatable.

            English draws much of its vocabulary from French (thanks to the Norman Invasion of England in 1066), while it gets a lot of its grammar and structure from its Germanic roots. English is probably more accurately considered a Romance-influenced Germanic tongue.

            Which does not contradict what I wrote at all.

            Norman French is one of the (many) ways that Latin roots have crept into English. More have come in via Class

      • all languages can express all concepts through circumlocutions.

        So you don't subscribe to the Sapir-Whorf hypothesis? Fair enough; you can certainly poke it quite full of holes - though somehow, the shreds remain basically intact.

        For example, even though the Pirahã (and others) have no words for numbers over two and as a result cannot grasp even basic arithmetic, you could probably form some extended sentence to express "one and one and one". From that "circumlocution", someone already familiar wi
        • by CRCulver ( 715279 ) <crculver@christopherculver.com> on Friday June 23, 2006 @10:29PM (#15594256) Homepage

          So you don't subscribe to the Sapir-Whorf hypothesis?

          No, nor do most linguists.

          For example, even though the Pirahã (and others) have no words for numbers over two and as a result cannot grasp even basic arithmetic

          The Pirahã situation has recently been attacked as wishful thinking on the part of its major researcher. There's plenty out there that's critical of it.

          Or to gain a deeper understanding of any writings in a language derived from it - Including English

          Yes, this was included in my professing the usefulness of Latin for understanding culture influenced by Latin.

          though it has too much from the Germanic side of the family, with a Greek uncle sneaking in the mix somewhere along the way, to count as a proper Romance language

          A language's genetic affiliation is decided by phonological correspondences in the morphology, so English would be a Germanic language no matter how many French words it absorbed. To give a similar situation as an example, Armenian is still in its own branch even though most of its lexicon was replaced by Persian loans.

      • by Tri0de ( 182282 ) <dpreynld@pacbell.net> on Saturday June 24, 2006 @12:56AM (#15594818) Journal
        Respectfully, I think Latin is a very effective language for all kinds of communication. Many Latin words have more universal meaning than English, French or Russian phrases. "Ad hoc" is a great example of a concise term with a precise meaning that needs no translation.
        There is an advantage to a dead language: unlike English, meanings aren't mutating, so you can often get a better feel for exactly what someone from a different era and a different culture was saying, without the problem of words such as "gay", "stupid" or "sick" having very different meanings a few generations later.
        'Good' language is that which communicates what and as you intend, be it technical jargon, slang, Oxford English, Spanglish; if the sender and receiver send and receive the right message, it's good language. Within that context Latin can be, and often is, highly effective.
        • Many Latin words have more universal meaning than English, French or Russian phrases.

          Gah! EVERY language has specific words or phrases that have become common across language barriers.

          You mentioned French, so try:
          à la carte
          agent provocateur
          attaché
          carte blanche
          cliché
          décor
          déjà vu
          dossier
          entrepreneur
          faux
          genre
          laissez-faire

          etc. From: http://en.wikipedia.org/w/index.php?title=List_of_French_phrases_used_by_English_speakers&oldid=58698093#A [wikipedia.org]

          • I respectfully beg to suggest that while English has indeed appropriated many French words and phrases, Latin has contributed many more, to many more languages. While I am the first to grab a word, phrase or concept from ANYPLACE if it will help me get my point across, and love the French culture and language (I live in the middle of the Sonoma-Napa wine country and can spend entire weekends contemplating terroir), Latin is like a big ugly box of tools, coated with grease and blood and a fair amount of rust,
        • Perhaps this is partially because people with poor language skills rarely use grammar.

          could you imagine...

          AH MEUS DEUS! EQUUS PARVULUS!!!

          (sorry, i'm sure someone can come and correct my latin)
      • Why Latin?
        Though I'm more of a Hellenist myself, I find glossal relativism repugnant; and subscribe to the credo of culture's singularity: that failing to detect the difference between psyche and anima is a culpable insensibility.
      • I don't agree. There is at least one good reason to teach Latin: to make sure that once we're done with them, nobody will ever be interested again in the subject, and so we can give them a dose of Cicero, the Roman slumlord who had "national security risks" strangled in prison without due process or trial and declared martial law. Yes, Cicero was a dirty slumlord, and his income came from a couple of dozen hazardous, unhealthy tenements throughout the city, multistoried buildings made of wood that
    • If you're going to drop ancient Greek into your posts to sound like an intellectual then you should make sure your grammar is correct. ;)

      Your use of the definite article is ugly and redundant. What you wrote translates into English as "... than the point-and-grunt interface of today's the people".
      • You're joking, right? Even professors of Greek use the phrase with the redundant article (and, having taken a B.A. degree in Classics, I've heard it dozens of times). It's well known that "hoi polloi" has been taken into English as a distinct lexeme, and therefore the English definite article may be added to it. To "correct" this by saying it's wrong is just misinformed and obnoxious pedantry.
      • What you wrote translates into English as "... than the point-and-grunt interface of today's the people".

        Actually, no. It means "(the) masses" or "(the) general populace". That's what makes language so interesting: it tends to transmute over time.

      • What you wrote translates into English as "... than the point-and-grunt interface of today's the people".

        A couple things:

        • Polloi derives from polus, meaning: “many;” and “people” only by extension.
        • There are many places in Greek where it's not advisable to translate the definite article: t'auton, for instance, may mean simply “same.”

        The feater translation, therefore, would be “today's manifold;” with an implicit scilicet: “today's manifold [people, indw

    • ...fantastic euphemism for command-line-interaction; more sophisticated, in any case, than the point-and-grunt interface...

      Wow... maybe someday computers will become powerful enough to use this "command-line-interaction" you speak of...

    • I hope that was supposed to be funny, cuz i can't stop laughing. That was great! Makes me think of UVA. lol
    • "Conversational computing" is a fantastic euphemism for command-line-interaction; more sophisticated, in any case, than the point-and-grunt interface of today's [mouse and GUI's].

      Maybe for most tasks, but for porn, pointing and grunting is a perfect fit.
           
    • (1+-sqrt(5))*(2**-1), 1.61803phi@gmail.com

      Obsessed with the Golden Ratio, are we?

    • in an environment where Latin is taught alongside math, your users and developers are sharper and more humane.

      Which clearly explains the rise of the largest violent empire the West had ever known: the Roman Empire.
    • My brother was a French and Latin teacher. He was a Reagan/Bush fanboy and ate chicken-fried steaks. Hardly humane.
  • Oh, sure (Score:4, Informative)

    by ScrewMaster ( 602015 ) on Friday June 23, 2006 @08:52PM (#15593790)
    without invalidating the systems already in use.

    Everyone knows that Intel and Microsoft have never invalidated a system already in use.
    • Everyone knows that Intel and Microsoft have never invalidated a system already in use.

      They just wait a few hours for it to crash first :P
    • Totally off-topic, but your .sig prompted me to find that story online. Thank you!
    • Everyone knows that Intel and Microsoft have never invalidated a system already in use.
      Neither has Apple. (For the record, IaAFB... I'm an Apple Fan Boi)
  • by Who235 ( 959706 ) <secretagentx9@cia3.14.com minus pi> on Friday June 23, 2006 @08:56PM (#15593812)
    I read about that 37 years ago on Digg.
  • by Ankou ( 261125 ) on Friday June 23, 2006 @08:58PM (#15593825)
    If you put your ear against it you can hear the hamsters running!
  • by bunions ( 970377 ) on Friday June 23, 2006 @08:58PM (#15593827)
    The girl in the photo on the first page is H-A-W-T HOT!
  • The Story of LEO (Score:3, Informative)

    by Sponge Bath ( 413667 ) on Friday June 23, 2006 @09:28PM (#15593949)

    The article's mention of ICL (formerly ICT) made me think of the book "LEO: The Incredible Story of the World's First Business Computer". The 1968 ICT merger with English Electric Computers to form ICL connects the company with LEO, a computer designed by a bakery company in the late 1940s/early 1950s. A bizarre and entertaining tale, if you are into obscure computer history.

  • by marciot ( 598356 ) on Friday June 23, 2006 @09:31PM (#15593964)
    Bah.... I'd much rather have a portable computer [wikimedia.org]...

    • I don't know about you... but I think they should have put a roof on that trailer... Last I heard, water was bad for computers and I'm sure the highway speed winds weren't good for the magnetic tape reels.

      And dude! What's with the half door with no latch? I wouldn't want someone playing Commander Keen in my computrailer...
  • It might be a little easier to read if the paragraphs were INDENTED.
  • heh (Score:3, Funny)

    by icepick72 ( 834363 ) on Friday June 23, 2006 @09:38PM (#15593998)
    1969 called and they want their article back
    • To put this in perspective, modern electronics was being invented. The hardware advances were huge at the time.
      Designing your computer you had the choice of something like:
      Rockwell's 6500 (8-bit 1 MHz CPU)
      Motorola's very first 6800
      Intel's (Who's Intel? Never heard of them) 8080 was under development or mebbe in prototype
      The next kid on the block was Zilog with the Z80 in 1973 or thereabouts.
      When Motorola introduced the 16-bit 68000 (at a blistering 15 MHz eventually) hey, that was for
      minicomputers &
  • You'd think... (Score:2, Insightful)

    by owlnation ( 858981 )
    that a page about a design magazine might just, maybe, break up that wall of text into something designed to be easier to read.
  • clearly... (Score:3, Funny)

    by blackcoot ( 124938 ) on Friday June 23, 2006 @09:47PM (#15594036)
    ... these people never heard about vista ;)
  • by Anonymous Coward on Friday June 23, 2006 @10:04PM (#15594108)
    Yeah, 1969 was about the last time attractive women in skirts [ahds.ac.uk] were seen anywhere near a data center... :)
    • Yea, we're a lot more sensible nowadays, and wear pants.
      • I think that pants are almost an obligation today (and it's a shame). It's not a question of being sensible, though. Those of us who were around then - older and wiser now - can remember the cold-air ducts in the false ceiling being the norm. Today, of course, the cold air enters the computer room as an upward blast from the false floor. More efficient for cooling, but less pleasing to the eye (in my humble opinion, at any rate).
  • by Illbay ( 700081 ) on Friday June 23, 2006 @10:25PM (#15594234) Journal
    ...a mere TEN YEARS LATER, one could purchase a TRS-80 at Radio Shack, featuring 4K of RAM and using a cassette tape recorder for storage, for only a thousand bucks or so.
    • They were sweet! I played a game on ours that featured synthesised speech. True, only one 5-letter word (Weird, the name of the game) but still. To hear it you'd have to connect an external radio/amplifier, about the size of the TRS80 at the time...

      Oh, the memories...
  • by solitas ( 916005 )
    http://vads.ahds.ac.uk/diad_search.html [ahds.ac.uk] Thanks! Quite a resource (for some of us).
    • cheers! I worked on this project - it was about 1995-1997, so it's really fun to see it's still alive and useful. All praise to the Arts and Humanities Data Service [ahds.ac.uk] for keeping it up there.

      So for folks wondering why it's so basic, a little more info... Short answer: it was a small project and it was about ten years ago.

      I think we started about 1995 or 1996 - description here [ukoln.ac.uk]. Pat Batley is a visionary librarian who could see the value of digitisation and pushed to get archive resources digitised. She got in conta
  • Wow! An old article on computers. Big deal!
    • You know how people watch old movies, learn history, carry on traditions, things like that? It's called culture. Now I don't know if you're a professional, or even just a dedicated hobbyist, but if either is true then this is your culture. Knowing who Atanasoff and Berry are, or what ENIAC stands for and what it was used for, or what a Hollerith card is, or who Charles Babbage and Lady Lovelace (Ada Byron) are and what they did is maybe not a necessity, but I personally don't see how you can take real pride

      • And if you do happen to work in IT professionally, or want to, then knowing what mistakes people have made in the past, and what they have done that has worked well, can help you immeasurably.

        But then, who wants to do something well when you can do a half-arsed job, spend twice as long ironing out the bugs, and then get a reputation for being a fuckwit?
  • by wbean ( 222522 ) on Friday June 23, 2006 @10:58PM (#15594391)
    That article is a typical piece of sales-department puffery. If you really want to know what it was like to design a computer in those days, read Tracy Kidder's "The Soul of a New Machine". It chronicles the efforts of Data General engineers to create a new computer. At the time I was working as an engineer for Honeywell's EDP (Electronic Data Processing) division, and I can vouch for the accuracy of Kidder's reporting. I recognized all the problems and all the actors, even though it was a different company.

    At a given point in the development of computers a lot of people end up working on the same problems and often come up with similar solutions. While I was at Honeywell they bought GE's computer division and we got to see the design documents for GE's new computer. It was very interesting reading since we could look at each turning point in the design and say: "Oh, they decided to do it that way." All of the problems were ones that we'd worked on and the solutions were all ones that we'd considered. For the most part they'd made the same decisions we had. It was an experience that's given me a real respect for the notion that an invention is "in the air." It isn't necessarily because the problems are being widely discussed but more that a given state of technology dictates certain questions and that the solutions follow logically from the questions.
    • As a writer of fiction and screenplays, I would say that's true of art and the creative process as well. It's something I've believed for a while.
  • Real Soon Now (Score:5, Interesting)

    by hob42 ( 41735 ) <.moc.liamg. .ta. .24opuj.> on Saturday June 24, 2006 @12:07AM (#15594652) Homepage Journal
    I found this other article even more interesting - 1974, issue 311, "In Praise of Hydrogen [ahds.ac.uk]". It talks about how easily the School of Automotive Studies converted a traditional internal combustion engine to hydrogen, and how, with only one major area of research remaining (storage of hydrogen), we should expect our dependence on gasoline to be quickly and easily eliminated.

    Talk about vaporware (pun not intended, though also funny).
  • A few numbers later, make sure you have a look at the new Olivetti teleprinter [ahds.ac.uk].

    When you look at it, you wonder why the US designers were so retarded as to design ugly stuff like the KSR-33 [kekatos.com].

    That Olivetti unit looks like it was made 20 years later...

    • by bmo ( 77928 ) on Saturday June 24, 2006 @12:28AM (#15594736)
      "That Olivetti unit looks like it was made 20 years later..."

      Probably because the Olivetti extensively used plastic or die-cast white metal in the case. If you look at the old ugly stuff like the KSR, the cases were _steel_, which is why they look so bland. You can't get the same shapes by stamping steel as you can with plastic injection molding or die-casting, and the style of the Olivetti simply screams "molded parts".

      Back then it was a cultural thing. Plastic was "cheap" and steel meant quality. If the case wasn't heavy enough to kill someone with, it wasn't quality.

      --
      BMO
      • by NoMaster ( 142776 ) on Saturday June 24, 2006 @03:48AM (#15595220) Homepage Journal
        Plastic was "cheap" and steel meant quality. If the case wasn't heavy enough to kill someone with, it wasn't quality.
        Tell that to the Honeywell Rosy 26 teleprinter in my garage. Plastic case, 20 years younger than the 2 Model 100's sitting next to it, much the same feature set, but 3x the weight. I was going to throw it out today, but damned near killed myself just trying to lift it!

        FWIW, I suspect the real reason that Teletype Model 33 looks so ancient is that, from looking at the internals, it appears to be a clone/ripoff of a Siemens Model 100 [iprimus.com.au] or a Creed Model 47 [iprimus.com.au] - both much earlier models - updated with an "electronic" keyboard. IIRC, Teletype Corp bought (or maybe partnered with) the UK-based Creed.

        (Slashdotters with a mechanical bent really should look into the old electromechanical teleprinters. They're amazing machines; a real tribute to the ingenuity of their designers. Given a motor spinning at 3000 RPM, and no electronics, how would you convert a 5-bit code to printed text?)
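As a software analogue of that mechanical puzzle, here's a rough sketch (not from the thread; only a handful of code points are filled in, using one common MSB-first presentation of the table) of decoding the 5-bit ITA2 "Baudot" code those teleprinters handled, letters/figures shift states and all:

```python
# Decoding 5-bit ITA2 ("Baudot") code in software - the job the old
# teleprinters did with cams, clutches, and a 3000 RPM motor.
# Partial, illustrative code tables only; bit patterns shown MSB-first.

LTRS, FIGS = 0b11111, 0b11011  # shift-state control codes

LETTERS = {0b00011: "A", 0b00001: "E", 0b10000: "T", 0b00100: " "}
FIGURES = {0b00001: "3", 0b10000: "5", 0b00100: " "}  # same codes, other shift

def decode(codes):
    """Decode a stream of 5-bit values, tracking the current shift state."""
    table, out = LETTERS, []
    for c in codes:
        if c == LTRS:
            table = LETTERS      # switch to letters case
        elif c == FIGS:
            table = FIGURES      # switch to figures case
        else:
            out.append(table.get(c, "?"))
    return "".join(out)

print(decode([LTRS, 0b10000, 0b00001, 0b00011]))  # → TEA
```

The shift codes are why a garbled line on a real teleprinter could turn into a run of digits and punctuation: lose one FIGS or LTRS character and every following code lands in the wrong table.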

        • That's not what I meant. I meant that it was a cultural thing, irrespective of the facts. Plastic cases are more durable in the long run if built correctly, but you couldn't tell that to someone in 1965 who just dropped his brand-new Sony transistor radio.

          --
          BMO
        • (Slashdotters with a mechanical bent really should look into the old electromechanical teleprinters. They're amazing machines; a real tribute to the ingenuity of their designers. Given a motor spinning at 3000 RPM, and no electronics, how would you convert a 5-bit code to printed text?)

          Sounds interesting - given enough time, I could even probably come up with a solution (something involving solenoids, cams, clutches, and ratchets - among other things - would be needed). If you want to see something simila

          • something involving solenoids, cams, clutches, and ratchets - among other things - would be needed

            That's pretty much it - the trick is all in the timing (and keeping the timing in sync).

            Oh yeah, I forgot to mention - no relays allowed, at least not in the receive signal path. You can't go building a shift register / buffer that easily ;-)

            If you want to see something similar, look into old "reproducing" player pianos.

            Yup, very similar technologies. In fact, I'd be prepared to bet money that the early e

            • I wonder why more people don't find this old technology fascinating, given the popularity of "steampunk" fiction amongst the Slashdotting class. They were building huge text-based addressable store-and-forward networks before the advent of the microprocessor - or even electronics - y'know...

              Actually, it seems like there is a dearth of interest in anything historically related to computers prior to about 1990 by most people, even among self-described "computer geeks". I personally find the history of compu

    • When you look at it, you wonder why the US designers were so retarded as to design ugly stuff like the KSR-33.

      To be fair, the IBM 2741 [columbia.edu] isn't quite so ugly, and the IBM 1050 was also a bit less clunky-looking (i.e., they, like the Olivetti, look more as if they actually belong in the Swinging '60's than did the Models 33 and 35 Teletypes). Even Teletype came out with the Model 37 eventually....

  • by Anonymous Coward on Saturday June 24, 2006 @01:48AM (#15594968)
    International Computers Ltd. whose highly successful 1900 Series computer... Blah blah blah. That series is dead.

    However, the Colt 1911 model still works fine - not really a computer, unless it involves questions where the answer is BANG!

    • However, the Colt 1911 model still works fine - not really a computer, unless it involves questions where the answer is BANG!

      Oh, I don't know... It's a dandy analog machine for solving parabolic trajectory equations of quite a few variables.

      -jcr
  • by Clark_Griswold ( 692490 ) on Saturday June 24, 2006 @02:34AM (#15595070)
    Q: Why don't the British make computers anymore? A: Because they couldn't find a way to make them leak oil.
  • by Quiberon ( 633716 ) on Saturday June 24, 2006 @02:59AM (#15595117) Journal
    I can still program in PLAN (its assembler), and CES-Basic. And FORTRAN.
    • After all, FORTRAN is basically assembler, only less readable
    • I always enjoyed #UPPER and #LOWER. The 1900 had such short instructions (24 bit words, 4 6-bit bytes) that you could only access beyond the first 1024 (I think) bytes by using a register (accumulator in those days) as a modifier. #LOWER memory was directly addressable; #UPPER required the addition of a 24 bit accumulator to the base address. And doing IO by initiating the card (or whatever) moving, then doing as much as you could before calling SUSBY to wait for the IO to complete. Happy days. I'm not
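For readers who never met the 1900, here is a purely illustrative sketch of the base-plus-modifier addressing the comment above describes (the names, limits, and word size are taken from the comment, not from ICL's actual instruction format):

```python
# Hypothetical sketch of ICL 1900-style address formation.
# A short instruction's address field can reach only a small #LOWER
# region directly; #UPPER locations need a register (accumulator)
# added in as a modifier. Values are illustrative.

LOWER_LIMIT = 1024          # directly addressable #LOWER region
WORD_MASK = (1 << 24) - 1   # 24-bit words

def effective_address(addr_field, modifier=0):
    """Direct for #LOWER; base plus accumulator modifier for #UPPER."""
    return (addr_field + modifier) & WORD_MASK

# #LOWER access: the instruction's address field suffices.
assert effective_address(100) == 100

# #UPPER access: a 24-bit accumulator supplies the rest of the address.
assert effective_address(100, modifier=LOWER_LIMIT) == 1124
```

The same trick - a small direct window plus register modification for everything else - shows up in many short-word machines of the era, since the address field simply couldn't span the whole store.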
  • So, here's the metric I want to know...

    I have a computer under my desk. If you go backwards in time, computers get worse and worse. Until, finally, you reach this interesting point, where, if you look at the aggregate computation power of every computer on the planet in active use, my single computer arguably has them all collectively beat. In terms, say, of mathematical operations per second. I'd like to know what year that is. It's going to be later than 1950, 1960, 1970... Could it be as high as 19
    • Well, let's start with Moore's law, which tells us that computers double in speed every 18 months or so. This means in 30 years they go up in speed by a factor of a million. Of course, a business computer in 1976 would be a lot more powerful than a home computer, but I'd imagine there weren't a lot of them. There were a number of home computer kits as well, but once again, I can't imagine that there would be a huge number of people wanting to build kits, and certainly not as many as a million, so you pro
      • Mainframes were definitely a part of the equation back in the late 70s (speaking as an old-timer). Every company, payroll department, government agency, etc. used mainframes. Heck, even the school district in my small town had its own mainframe. Not to mention the non-Apple small computers at that time - Commodore PET, Radio Shack TRS-80, etc. My bet is you'd have to go back until at LEAST the 1950s (probably the early 50s) to match all the computing power on the planet with a single PC today.
        • But there were only a certain number of companies big enough to use a mainframe, and most of these would have been in the US and Europe until very recently. Other companies would have outsourced anything that needed a computer. I'd say world computing power was definitely higher than a 4GHz Pentium 4 by the time the home computer market had started, but the mainframes do mess things up

          But this is where the problem is. With a little research, it's quite possible to get a good estimate for the speed of,
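The doubling arithmetic running through this thread is easy to check; a quick sketch, assuming the 18-month doubling period quoted above:

```python
# Sanity-checking the Moore's-law arithmetic from the thread:
# doubling every 18 months (1.5 years) for 30 years.

def speedup(years, doubling_period_years=1.5):
    """Speed multiplier after `years` of periodic doubling."""
    return 2 ** (years / doubling_period_years)

factor = speedup(30)
print(f"{factor:,.0f}")  # 30 years = 20 doublings = 1,048,576, roughly a million
```

So "a factor of a million in 30 years" is right on the nose: 30 / 1.5 = 20 doublings, and 2^20 ≈ 10^6.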
