IBM

What Does It Take To Keep a Classic IBM 1401 Mainframe Alive? (ieee.org)

"Think your vintage computer hardware is old?" writes long-time Slashdot reader corrosive_nf. "Ken Shirriff, Robert Garne, and their associates probably have you beat.

"The IBM 1401 was introduced in 1959, and these guys are keeping one alive in a computer museum... [T]he volunteers have to go digging through historical archives and do some detective work to figure out solutions to pretty much anything!" Many things that we take for granted are done very differently in old computers. For instance, the IBM 1401 uses 6-bit characters, not bytes. It used decimal memory addressing, not binary. It's also interesting how much people could accomplish with limited resources, running a Fortran compiler on the 1401 with just 8K of memory. Finally, working on the 1401 has given them a deeper understanding of how computers really work. It's not a black box; you can see the individual transistors that are performing operations and each ferrite core that stores a bit.
"It's a way of keeping history alive," says one of the volunteers at Silicon Valley's Computer History museum. "For museum visitors, seeing the IBM 1401 in operation gives them a feeling for what computers were like in the 1960s, the full experience of punching data onto cards and then seeing and hearing the system processing cards....

"So far, things are breaking slowly enough that we can keep up, so it's more of a challenge than an annoyance."
This discussion has been archived. No new comments can be posted.

  • Oh, man! (Score:5, Interesting)

    by Anonymous Coward on Saturday November 10, 2018 @02:06PM (#57622306)

    The first computer I ever used was a 1401 (at Rice University in 1963). As an undergrad, I only got computer time early in the morning (0700hrs), and usually had to turn the thing on, then wait 30min for it to warm up.

    It was punched-card input and punched-card output; only grad students could use the console typewriter (a real flailing-arm typewriter, not a Selectric).

    But I taught it to do my engineering homework (FORTRAN; not even FORTRAN II). I graduated with a master's in ME, but I was an aerospace programmer for 46 years. Never did a lick of engineering.

    My first job (before I graduated) was as a computer operator trainee, working for IBM at the Manned Spacecraft Center in Houston, in the Real Time Computing Complex, 1st floor of bldg 30, Mission Control. That room had 5 IBM 7094 mod II machines, to do the real computations, and 2 IBM 1460 machines (1401 big brother) to process the input and output magnetic tapes for the 7094s, so my 1401 experience paid off.

    One day I was about to be late for work, for an important test. I ran into the building, down the hall, and turned the corner to the right, and ran smack into someone running around the same corner in the opposite direction. I knocked him down. I reached my hand out to help him up, about to apologize profusely, when I realized it was Gus Grissom. Being a computer operator, working on a very cold floor, we all wore Hush Puppy shoes (thick foam soles; insulators). He was wearing leather-soled brogans, so he went down. As I helped him up, I said, "Ohmygod, I've broken government property. Sorry; I'm late for a test." He replied, "Me, too! You do good, OK?" Four years later, he burned to death in the Apollo fire :(.

    I went on to work in Flight Software for the Viking Program Mars lander, and to install a computer at Cape Canaveral Air Force Station, on the occasion of shuttle Discovery's maiden launch. It was stupendous.

    So, thank you lowly 1401 for giving me a jump-start to a wonderful career.

    • Re:Oh, man! (Score:4, Interesting)

      by AlanObject ( 3603453 ) on Saturday November 10, 2018 @02:19PM (#57622338)

      As kids we had access to and played with the IBM 1401 at Lawrence Berkeley Labs. It was a pretty amazing machine, but it was already obsolete by then, so no "real" work was getting done on it. We learned a lot, though.

      But my first real computer was a contemporary of that, the IBM 1620. The CHM has a working one of those as well, though last I saw the typewriter was being simulated with a PC.

      The problem with keeping these things running isn't so much the core processors and logic. The mechanical pieces they relied on required a complete industrial base to produce them, and that base simply doesn't exist any more. Can you still find typewriter ribbons and punched card supplies? What about all those little pieces that go into punched tape reader/writers or vacuum column tape drives?

      For that matter, can you still buy 1/2 inch tape reels?

      It can continue for a while but eventually the only hope for keeping these things running will be 3-D printing the parts they need.

      Programming and using one of those systems for a real job is an experience that nobody will ever have again. I can well understand those old codgers working to preserve them.

      • You can still find printer ribbons, and re-ink the ones that are available, for a while at least. I just got a box of IBM-style Hollerith cards from a friend, and probably should send them somewhere they'd be useful.

        Mag tape is a bit harder to get, but there is a rumor that a manufacturer is starting up, and while theirs is an audio tape project, they may be able to make data-quality tape. But that will be a problem: tape is possibly the most fragile medium, and the hardest to produce.

    • Very interesting read. Thank you.

    • I started in those days too, though I was on a 1410, the byte-oriented business mainframe. In those days your code was all on cards, and every program read its data from one 7-track tape and wrote processed data to another tape. Real-time operations, as opposed to this 'batch' paradigm, lay in the unguessable future.

      And when you ran into anyone, you didn't want to be carrying a program card deck.

      • In those days your code was all on cards

        The scrap ones came in useful for relighting the boiler.

    • Re:Oh, man! (Score:4, Insightful)

      by Anonymous Coward on Saturday November 10, 2018 @05:00PM (#57622862)

      For all the whiners who cry for AC posting to be banned... I present the parent post.

      Some of the highest quality submissions to slashdot are from ACs. Granted, some of the lowest too, but you get those from registered accounts as well.

    • ... a year later, at Marquette University (Milwaukee), as an EE sophomore, I got put in with the IBM 1620, and in 1965 on the IBM 7040, which is pretty much the same as your 7094. The 7040 had magnetic core memory. You've heard the joke about the IBM repairman who showed up, opened the panel, and tapped a memory card? He charged $1000: $5 for the tap, $995 for knowing where. Well, the exact thing happened there at MU. I was there, although I was not privy to the billing part. Interestingly, next time,

    • Have you thought of writing a book about your career? This would be very interesting history.

    • This is why I read slashdot. Thanks.

  • by Anonymous Coward

    That's what a college consortium computing center did where I had my first IT job in 1973, as a programmer-analyst supporting the student record systems for the colleges. I came in fresh from community college with a 2-year "Data Processing" degree and a summer internship writing some COBOL programs for a local industry. So, knowing some COBOL, FORTRAN, and IBM 1130 assembler, I had to teach myself 1401 Autocoder to maintain the existing system. All data input was via 80-column Hollerith cards, updating rec

  • They do use bytes (Score:2, Informative)

    by Anonymous Coward

    They just defined bytes to be 6 bits. Bytes haven't always been 8 bits. Historically some machines used 9-bit bytes or 36-bit bytes.

    • Tell that to the kids who grew up on 8-bit computers.
      • 6-bit bytes? Heaven. When I was a lad we had 5-bit bytes, a punched card for breakfast, Dad would thrash us to within 1/2" of our lives with mag tape, and we slept in the hole in a punched tape. Ay, we 'ad it 'ard.
        • Selectrics use a 5-bit word to select characters, all mechanical of course, though the printers and electronic models (MT/ST, MC/ST, Memory 50/100, for instance) used reed switch mechanisms (transmit blocks) and multiple switches for functions to be electrified, and worked fine.

    • by shoor ( 33382 )

      In the mid-1970s at Florida Atlantic University in Boca Raton, Florida, we shared a Univac 1106 with a couple of other colleges in southern Florida, and it used 36-bit words with 6-bit bytes. FIELDATA was a 6-bit character format that was used because it allowed 6 characters per word with no bits left over.

      This was the first computer I learned to program in assembly. I remember how odd the stack architecture of the Intel 8080 seemed at first when I was learning its assembly language, especially for calling subro

    • They just defined bytes to be 6 bits. Bytes haven't always been 8 bits. Historically some machines used 9-bit bytes or 36-bit bytes.

      Bytes were 6 and later 8 bits. A totally distinct family of 'scientific' machines took 32-bit groupings to be computational 'words' that corresponded to your Fortran variables. Bytes had no status in hardware, and had to be implemented little by little by software that treated words as sets of 4 bytes.
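
      A minimal sketch, in modern terms, of the kind of software byte handling that comment describes: one machine word treated as four 8-bit fields, picked apart by shifting and masking. The 32-bit word width and the helper names here are illustrative assumptions, not anything from those machines.

      # Sketch: treating one machine word as four 8-bit bytes entirely in software,
      # the way word-oriented machines had to before hardware byte addressing existed.
      # (The 32-bit word width and these helper names are illustrative assumptions.)
      WORD_BITS, BYTE_BITS = 32, 8
      BYTES_PER_WORD = WORD_BITS // BYTE_BITS

      def get_byte(word: int, index: int) -> int:
          """Extract byte `index` (0 = most significant) by shifting and masking."""
          shift = (BYTES_PER_WORD - 1 - index) * BYTE_BITS
          return (word >> shift) & 0xFF

      def set_byte(word: int, index: int, value: int) -> int:
          """Return a new word with byte `index` replaced by `value`."""
          shift = (BYTES_PER_WORD - 1 - index) * BYTE_BITS
          mask = 0xFF << shift
          return (word & ~mask & 0xFFFFFFFF) | ((value & 0xFF) << shift)

      word = 0x48454C4F                        # 'H', 'E', 'L', 'O' packed into one word
      print(chr(get_byte(word, 1)))            # -> 'E'
      print(hex(set_byte(word, 3, ord('P'))))  # -> '0x48454c50', i.e. 'H', 'E', 'L', 'P'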

    • Re:They do use bytes (Score:4, Interesting)

      by AJWM ( 19027 ) on Saturday November 10, 2018 @06:56PM (#57623274) Homepage

      Burroughs was clever and designed its mainframe series (B6700 etc) to use 48-bit words consisting of either 8 6-bit or 6 8-bit bytes. The hardware could handle either (when dealing with character strings).

      Quite a few years later I was working on a Control Data Cyber series which still used 6-bit characters, in a 60-bit word. Text processing on that was so painful I wrote my own, in Pascal, which handled everything internally as ASCII.
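
      A rough sketch of why that was painful: with 6-bit characters packed ten to a 60-bit word, every string operation turns into shifting and masking. The 6-bit code table below is a made-up subset for illustration, not actual CDC display code.

      # Sketch: ten 6-bit characters packed into one 60-bit word, Cyber style.
      # (The code table is a hypothetical subset, not real CDC display code.)
      ALPHABET = " ABCDEFGHIJKLMNOPQRSTUVWXYZ"      # blank = 0, A = 1, B = 2, ...
      CODE = {c: i for i, c in enumerate(ALPHABET)}
      DECODE = {i: c for i, c in enumerate(ALPHABET)}
      CHARS_PER_WORD = 10                           # 60-bit word / 6 bits per character

      def pack(text: str) -> int:
          """Pack up to ten characters into a single 60-bit integer 'word'."""
          word = 0
          for ch in text.ljust(CHARS_PER_WORD)[:CHARS_PER_WORD]:
              word = (word << 6) | CODE[ch]
          return word

      def unpack(word: int) -> str:
          """Recover the ten characters from a 60-bit word, high-order character first."""
          return "".join(
              DECODE[(word >> (6 * (CHARS_PER_WORD - 1 - i))) & 0o77]
              for i in range(CHARS_PER_WORD)
          )

      print(repr(unpack(pack("CYBER"))))  # -> 'CYBER     '; changing one character means
                                          # masking out its 6-bit field and OR-ing in a new code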

      • Burroughs was clever and designed its mainframe series (B6700 etc) to use 48-bit words consisting of either 8 6-bit or 6 8-bit bytes. The hardware could handle either (when dealing with character strings).

        Quite a few years later I was working on a Control Data Cyber series which still used 6-bit characters, in a 60-bit word. Text processing on that was so painful I wrote my own, in Pascal, which handled everything internally as ASCII.

        The B6700 had 52-bit words; the extra bits were for flags and parity!
        see: http://www.vcfed.org/forum/arc... [vcfed.org]

        We had one when I was at University: https://www.cs.auckland.ac.nz/... [auckland.ac.nz]

    • Technically yes. However, 8 bits to a byte has become the standard in nearly all teaching of technology.
      This was the Wild West of computing; the fact that the machines worked was more important than coming up with good terminology for them.

  • by CAOgdin ( 984672 ) on Saturday November 10, 2018 @03:09PM (#57622470)

    In 1963, I joined the (now long-gone) first "service bureau" in the country, C-E-I-R, strategically positioned in Arlington, VA, within walking distance of the Pentagon...their first major client. Our "Computer Center" had one IBM 709 (a big boxy group of racks and 12 tape drives; disk drives hadn't yet been invented) and an adjacent IBM 1401.

    To make the costly (then $800/hour) 709 more efficient, all written programs (in "assembly code" unique to the computer model, or FORTRAN, if you were lucky) were manually typed, line by line, into "punch card" decks. The decks were read and copied on the 1401 to magnetic tape reels, the reels then tagged with the project name and carried to the other end of the room to be loaded into IBM 709 memory for execution.

    During that execution, the tape drives would whirl and the programs and starting data were copied to the 709's memory. Memory capacity was, in today's terms, "vanishingly small," but the program performed its computations and produced results. It could be minutes, hours, or even (occasionally) days in duration. Then, the output of the program would be written to tape(s), and returned to the 1401 for printing of results. (One was always suspicious of quick results, because it inevitably meant that the program had a fatal "bug" in it, which had to be diagnosed and repaired.) I've witnessed piles of printed output from some programs that stood taller than any person in the Center. Then, we'd likely find some gibberish in those piles of paper, necessitating finding and fixing the errant instructions on the original punched cards...and the process would be repeated until the results were deemed "bug free."

    These were the fastest and sleekest way to produce meaningful results of the day. Of course, all those military projects were "Classified," so programmers and computer operators all had to have quite high-level security clearances...largely because the projects were all related to military strategy and/or predictions of likely outcomes of warfare under varying conditions. It also meant there was no sharing of programming techniques or skills outside the computer center or the clients' premises.

    Then we got a magical new product: The faster, sleeker, more powerful IBM 7090, and more reliable upgrades.

    But, for all that, it was an exciting time to be engaged in the design, development, and coding of new mathematical algorithms (e.g., "Linear Programming") that could yield reports that shaped major decision-making in corporations and government agencies.

    The cadre of programmers at C-E-I-R even created the first (to my knowledge) shareware. It was called "CELIB"
    (C-E-I-R Library, sometimes "CLIB"). It collected all the basic tools a programmer on an IBM 1401 needed, including stock tools for writing punched card data to tape, printing data from tape to the IBM 1402 printer, or even sorting data from one order (e.g., by title) into another (e.g., by date). We all carried our "deck" and its sparse manual around from project to project. It was published via IBM's SHARE project, where tools like these were made freely available to peers. On my first trip to Australia, I was amazed to see, 12 years after CELIB had been shared, that it was still in use, had been adopted by a university in Sydney as the common convention for all of its IBM 1401 programmers, and was taught to students as an example of good coding style and practice.

    Smaller, simpler projects (e.g., inventory management for a chain of retail stores) were programmed for the IBM 1401, with its arcane idea of variable-length words (an arbitrary number of adjacent 6-bit characters, marked off from one another by the "word mark," a special bit that separated groups of characters into meaningful fields). We did everything from helping customers decide how a perishable product (like fruits and vegetables) should be priced to the consumer, to figuring out how many submarines the Pentagon should plan to buy in the next decade...and
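
    A minimal sketch of that word-mark idea, assuming the usual 1401 convention that a field is referenced at its low-order (rightmost) end and scanned toward lower addresses until the word mark on its high-order character is reached. The memory model and addresses below are invented for illustration; this is not Autocoder.

    # Sketch: 1401-style variable-length fields delimited by word marks.
    # Memory is modelled as one character per address plus a word-mark flag.
    # (Addresses and field contents here are invented for the example.)
    chars = {}         # address -> character
    wordmarks = set()  # addresses that carry a word mark

    def store_field(start_addr: int, text: str) -> int:
        """Store `text` starting at start_addr; word-mark its first (high-order) character.
        Returns the address of the rightmost character, which is how the field is referenced."""
        for i, ch in enumerate(text):
            chars[start_addr + i] = ch
        wordmarks.add(start_addr)
        return start_addr + len(text) - 1

    def load_field(addr: int) -> str:
        """Scan leftward from `addr`, collecting characters until the word mark is reached."""
        out = []
        while True:
            out.append(chars[addr])
            if addr in wordmarks:
                return "".join(reversed(out))
            addr -= 1

    price = store_field(500, "001995")  # a 6-character field occupying addresses 500-505
    print(load_field(price))            # -> '001995', however long the field happens to be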

  • I'd be more interested in learning what it takes to keep an XDS Sigma 5, 6, 7, 8, or 9 alive.
  • I like Harry Porter's Relay Computer: https://www.youtube.com/watch?... [youtube.com]
  • Never had the pleasure of using a 1401, but I still have my 1959 Faber-Castell 2/82 sliderule and a pad of SDS symbolic coding sheets. Also a couple of pads of FORTRAN coding forms. A few years later I spent many hours coding assembler language and FOCAL language for a DEC PDP-8/L in our data acquisition and processing lab. It had paper-tape input and typewriter output with just 4K of core memory, and some of my programs bumped up against that 4K limit. Man, did I learn fast about optimizing my code!
    • by thomn8r ( 635504 )
      I still have my 1959 Faber-Castell 2/82 sliderule

      Hey, I have one of those too!

  • Memories (Score:5, Interesting)

    by riverat1 ( 1048260 ) on Saturday November 10, 2018 @05:21PM (#57622950)

    This story brings back fond memories for me. In my first term of "data processing" we did our work on an IBM 1401 (circa 1982). The teacher was an old IBM guy who brought it to the school. As the story says, 6-bit "bytes" with a parity bit and a checkmark bit. You had to write your code to bootstrap your program in. The "OS" consisted of loading your card deck in the reader and punching the start button. It would read the first card into memory from address 0 to 79 then start executing at address 0. After that you were on your own. Output was to a card punch (address 100-179) or the 1403 printer (address 200-331). The rest of the total of 4K of memory we had was available for use. The word length was defined by the checkmark bit. You could put in two-thousand-digit numbers and add or subtract them. It was humorous sometimes when someone would make a programming error during the printing part of their program and get into a fast loop of page feeds. The 1403 printer could shoot the paper to the ceiling when it was feeding the paper that fast, and the best solution was to put your foot on top of the box of paper to rip the paper and stop the feed.

    The last program we wrote was a fairly complex inventory problem. The 1401 only did add and subtract and the input data had one item that needed multiplying by a large number in the 200,000 range. You could tell when we ran the program who had written multiply routines in their programs and who just wrote a loop to do that many additions. The multiply routines would finish in a second or two but the add loops took over a minute to complete. I had great fun running and programming that thing.
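
    A minimal sketch of that difference on a machine with only add and subtract: the naive loop performs roughly as many additions as the multiplier's value, while a decimal shift-and-add routine needs at most nine additions per digit. The function names and the addition counter are invented for illustration.

    # Sketch: multiplying on an add/subtract-only machine, two ways.
    # Counting the additions shows why the naive loop ran for over a minute
    # while a real multiply routine finished in a second or two.
    def multiply_by_repeated_addition(a, b):
        """Add `a` to an accumulator `b` times; return (product, number of additions)."""
        total, adds = 0, 0
        for _ in range(b):
            total += a
            adds += 1
        return total, adds

    def multiply_shift_and_add(a, b):
        """Decimal shift-and-add: for each digit of `b`, add the shifted multiplicand
        up to nine times. The `a * shift` stands in for moving the multiplicand one
        column to the left, which on the real machine is a data move, not a multiply."""
        total, adds = 0, 0
        shift = 1
        while b:
            for _ in range(b % 10):   # at most 9 additions per decimal digit
                total += a * shift
                adds += 1
            b //= 10
            shift *= 10
        return total, adds

    print(multiply_by_repeated_addition(37, 200_000)[1])  # 200000 additions
    print(multiply_shift_and_add(37, 200_000)[1])         # just 2 additions for 200,000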

    • One could construct instruction set curiosities on the 1401. On a 16K machine, the maximum length divide instruction required around 57 seconds to complete.

    • Your memory is only okay (faulty in some areas).

      Those read and print areas ran from 1-80, not 0-79. The print area started at 201. (The machine addressing was truly decimal and user friendly; no counting from zero required.)

      All the 1401's I worked on had hardware multiply/divide. I know they made them without, but I never worked somewhere that tried to save money that way.

      The characters were definitely not "bytes". The term bytes was introduced with S/360.

      There was no "checkmark" bit, it was called a wordmark.

      I

      • Yes, my memory had failed me a bit. Wordmark is the correct term and you are correct that the addressing didn't start at 0. Funny how that happens as you age but I haven't done anything with a 1401 since about 1984. I know bytes is not the correct term but I think it's more relatable to most people here to denote the set of bits that make up a character.

        As for hardware multiply/divide, we didn't have that on the one I worked on. As I said, it was purchased for the school I attended by an old ex-IBM guy

  • We have two in daily operation.
  • ,008015,022029 ...

    The above character sequence (minus the ellipsis) begins the first card of nearly all 1401 binary decks. Explain why.

    2000 vs 2980: explain the significance of these numbers in later 1401 configurations.

      Hmm, you stumped me. Neither of those rings a bell, but it's been 35 years since I worked on one. Usually the first part of the cards in my deck would move my program code from toward the end of the card into memory starting at address 500, then read the next card and branch to 0 for the next bit of program. Of course the 1401 wasn't binary but decimal.

  • by Charcharodon ( 611187 ) on Saturday November 10, 2018 @05:44PM (#57623040)
    An Arduino, a bunch of amber colored LEDs, and pre-recorded BEEEP BOOP Star Trek sounds.
  • Old school (Score:5, Funny)

    by Impy the Impiuos Imp ( 442658 ) on Saturday November 10, 2018 @05:58PM (#57623086) Journal

    Imagine a beowulf cluster of these!

  • ...not that there's any bright-line definition. But 1401's were considered "small computers" and the main use I knew for them was as satellite computers--auxiliary equipment used together with real "mainframes." For example, an IBM 7090 might perform input and output only to magnetic tape. The tapes were then mounted on the tape drive of a 1401, which read the tape and printed the contents on a line printer.

"The medium is the massage." -- Crazy Nigel

Working...