What Does It Take To Keep a Classic IBM 1401 Mainframe Alive? (ieee.org) 60
"Think your vintage computer hardware is old?" writes long-time Slashdot reader corrosive_nf. "Ken Shirriff, Robert Garner, and their associates probably have you beat.
"The IBM 1401 was introduced in 1959, and these guys are keeping one alive in a computer museum... [T]he volunteers have to go digging through historical archives and do some detective work to figure out solutions to pretty much anything!" Many things that we take for granted are done very differently in old computers. For instance, the IBM 1401 uses 6-bit characters, not bytes. It used decimal memory addressing, not binary. It's also interesting how much people could accomplish with limited resources, running a Fortran compiler on the 1401 with just 8K of memory. Finally, working on the 1401 has given them a deeper understanding of how computers really work. It's not a black box; you can see the individual transistors that are performing operations and each ferrite core that stores a bit.
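The two quirks just mentioned, 6-bit characters and decimal addressing, can be shown with a toy model (plain Python; the digit-string store below is an invention for illustration, not the 1401's actual core layout):

```python
# Toy illustration of the two points above. This models the idea
# only, not the 1401's actual character encoding or core layout.

assert 2 ** 6 == 64      # a 6-bit character set: 64 possible symbols
assert 2 ** 8 == 256     # an 8-bit byte: 256 possible symbols

# Decimal addressing: an address is a string of decimal digits that
# names a storage position directly; no binary conversion involved.
core = {f"{i:03d}": " " for i in range(1000)}   # toy 1,000-position store

core["500"] = "A"        # store the character A at decimal address 500
assert core["500"] == "A"
assert "1F4" not in core # hexadecimal addresses simply don't exist here
```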
"It's a way of keeping history alive," says one of the volunteers at Silicon Valley's Computer History Museum. "For museum visitors, seeing the IBM 1401 in operation gives them a feeling for what computers were like in the 1960s, the full experience of punching data onto cards and then seeing and hearing the system processing cards....
"So far, things are breaking slowly enough that we can keep up, so it's more of a challenge than an annoyance."
Oh, man! (Score:5, Interesting)
The first computer I ever used was a 1401 (at Rice University in 1963). As an undergrad, I only got computer time early in the morning (0700hrs), and usually had to turn the thing on, then wait 30min for it to warm up.
It was punched-card input and punched-card output; only grad students could use the console typewriter (a real flailing-arm typewriter, not a Selectric).
But I taught it to do my engineering homework (FORTRAN; not even FORTRAN II). I graduated with a master's in ME, but I was an aerospace programmer for 46 years. Never did a lick of engineering.
My first job (before I graduated) was as a computer operator trainee, working for IBM at the Manned Spacecraft Center in Houston, in the Real Time Computing Complex, 1st floor of bldg 30, Mission Control. That room had 5 IBM 7094 mod II machines, to do the real computations, and 2 IBM 1460 machines (1401 big brother) to process the input and output magnetic tapes for the 7094s, so my 1401 experience paid off.
One day I was about to be late for work, for an important test. I ran into the building, down the hall, and turned the corner to the right, and ran smack into someone running around the same corner in the opposite direction. I knocked him down. I reached my hand out to help him up, about to apologize profusely, when I realized it was Gus Grissom. Being computer operators working on a very cold floor, we all wore Hush Puppy shoes (thick foam soles; insulators). He was wearing leather-soled brogans, so he went down. As I helped him up, I said, "Ohmygod, I've broken government property. Sorry; I'm late for a test." He replied, "Me, too! You do good, OK?" Four years later, he burned to death in the Apollo fire :(.
I went on to work in Flight Software for the Viking Program Mars lander, and to install a computer at Cape Canaveral Air Force Station, on the occasion of shuttle Discovery's maiden launch. It was stupendous.
So, thank you lowly 1401 for giving me a jump-start to a wonderful career.
Re:Oh, man! (Score:4, Interesting)
As kids, we had access to and played with the IBM 1401 at Lawrence Berkeley Labs. It was a pretty amazing machine, but it was obsolete by then, so no "real" work was getting done on it. We learned a lot, though.
But my first real computer was a contemporary of that, the IBM 1620. The CHM has a working one of those as well, but last I saw, the typewriter was simulated with a PC.
The problem with keeping these things running isn't so much the core processors and logic. The mechanical pieces they relied on required a complete industrial base that simply doesn't exist any more. Can you still find typewriter ribbons and punched-card supplies? What about all those little pieces that go into punched-tape reader/writers or vacuum-column tape drives?
For that matter, can you still buy 1/2 inch tape reels?
It can continue for a while but eventually the only hope for keeping these things running will be 3-D printing the parts they need.
Programming and using one of those systems for a real job is an experience that nobody will ever have again. I can well understand those old codgers working to preserve them.
Re: (Score:2)
You can still find printer ribbons, and re-ink the available ones, for a while. I just got a box of IBM-style Hollerith cards from a friend, and probably should send them somewhere they'd be useful.
Mag tape is a bit harder to get, but there is a rumor that a manufacturer is starting up, and while that is an audio tape project, they may be able to make data-quality tape. But tape will remain a problem: it is possibly the most fragile medium, and the hardest to produce.
Re: (Score:2)
Very interesting read. Thank you.
Re: (Score:2)
I started in those days too, though I was on a 1410, the byte-oriented business mainframe. In those days your code was all on cards, and every program read its data from one 7-track tape and wrote processed data to another tape. Real-time operations, as opposed to this 'batch' paradigm, lay in the unguessable future.
And when you ran into anyone, you didn't want to be carrying a program card deck.
Re: (Score:3)
The scrap ones came in useful for relighting the boiler.
Re:Oh, man! (Score:4, Insightful)
For all the whiners who cry for AC posting to be banned... I present the parent post.
Some of the highest quality submissions to slashdot are from ACs. Granted, some of the lowest too, but you get those from registered accounts as well.
A year later ... (Score:2)
... a year later, at Marquette University (Milwaukee), as an EE sophomore, I got put in with the IBM 1620, and in 1965 on the IBM 7040, which is pretty much the same as your 7094. The 7040 had magnetic core memory. You've heard the joke about the IBM repairman who showed up, opened the panel, and tapped a memory card? He charged $1000: $5 for the tap, $995 for knowing where. Well, the exact thing happened there at MU. I was there, although the billing part I was not privy to. Interestingly, next time,
write a book (Score:1)
Have you thought of writing a book about your career? This would be very interesting history.
Re: Oh, man! (Score:2)
This is why I read slashdot. Thanks.
Emulate it on NCR Century 100 "Mainframe" (Score:1)
That's what a college consortium computing center did where I had my first IT job in 1973 as a programmer-analyst supporting the student record systems for the colleges. I came in fresh from community college with a 2-year "Data Processing" degree and a summer internship writing some COBOL programs for a local industry. So, knowing some COBOL, FORTRAN, and IBM 1130 assembler, I had to teach myself 1401 Autocoder to maintain the existing system. All data input was via 80-column Hollerith cards, updating rec
They do use bytes (Score:2, Informative)
They just defined bytes to be 6 bits. Bytes haven't always been 8 bits. Historically some machines used 9-bit bytes or 36-bit bytes.
Re: (Score:1)
You're lucky (Score:1)
Re: (Score:2)
Selectrics use a 5-bit word to select characters, all mechanical of course, though the printers and electronic models (MT/ST, MC/ST, Memory 50/100, for instance) used reed-switch mechanisms (transmit blocks) and multiple switches for the functions to be electrified, and worked fine.
Re: (Score:1)
In the mid 1970s at Florida Atlantic University in Boca Raton Florida we shared a Univac 1106 with a couple of other colleges in Southern Florida and it used 36 bit words with 6 bit bytes. FIELDATA was a 6 bit character format that was used as it allowed 6 characters per word with no bits left over.
This was the first computer I learned to program in assembly. I remember how odd the stack architecture of the Intel 8080 seemed at first when I was learning its assembly language, especially for calling subro
Re: (Score:3)
They just defined bytes to be 6 bits. Bytes haven't always been 8 bits. Historically some machines used 9-bit bytes or 36-bit bytes.
Bytes were 6 and later 8 bits. A totally distinct family of 'scientific' machines took 32-bit groupings to be computational 'words' that corresponded to your Fortran variables. Bytes had no status in hardware, and had to be implemented little by little by software that treated words as sets of 4 bytes.
Re:They do use bytes (Score:4, Interesting)
Burroughs was clever and designed its mainframe series (B6700 etc) to use 48-bit words consisting of either 8 6-bit or 6 8-bit bytes. The hardware could handle either (when dealing with character strings).
Quite a few years later I was working on a Control Data Cyber series which still used 6-bit characters, in a 60-bit word. Text processing on that was so painful I wrote my own, in Pascal, which handled everything internally as ASCII.
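Packing ten 6-bit characters into one 60-bit word, Cyber-style, looks roughly like this sketch (the 6-bit alphabet is invented for illustration, space = 0 and A = 1 through Z = 26; the real CDC display code assigns different values):

```python
# Sketch: ten 6-bit character codes packed into a single 60-bit word,
# as on a CDC Cyber. The character mapping below is made up for
# illustration; the real CDC display code differs.

def encode(ch):
    """Map space->0, A..Z->1..26 (toy code, not CDC display code)."""
    return 0 if ch == ' ' else ord(ch) - ord('A') + 1

def pack(text):
    """Pack up to ten characters into one 60-bit integer word."""
    assert len(text) <= 10
    word = 0
    for ch in text.ljust(10):          # pad to a full 10-character word
        word = (word << 6) | encode(ch)
    return word

def unpack(word):
    """Recover the ten characters from a 60-bit word."""
    chars = []
    for i in range(10):
        code = (word >> (6 * (9 - i))) & 0o77   # 0o77 masks one 6-bit char
        chars.append(' ' if code == 0 else chr(code + ord('A') - 1))
    return ''.join(chars)

w = pack('HELLO')
assert w < 2 ** 60                     # fits a 60-bit word
assert unpack(w) == 'HELLO     '
```

Text handling is painful in exactly the way described: changing a single character means shifting and masking inside the word, which is why converting everything to one character per storage unit (the Pascal/ASCII approach above) was worth the trouble.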
Re: (Score:2)
Burroughs was clever and designed its mainframe series (B6700 etc) to use 48-bit words consisting of either 8 6-bit or 6 8-bit bytes. The hardware could handle either (when dealing with character strings).
Quite a few years later I was working on a Control Data Cyber series which still used 6-bit characters, in a 60-bit word. Text processing on that was so painful I wrote my own, in Pascal, which handled everything internally as ASCII.
The B6700 had 52-bit words; the extra bits were for flags and parity!
see: http://www.vcfed.org/forum/arc... [vcfed.org]
We had one when I was at University: https://www.cs.auckland.ac.nz/... [auckland.ac.nz]
Re: (Score:1)
Technically yes. However, 8 bits to a byte has become the standard in nearly all teaching of technology.
This was the Wild West of computing: the fact that the machines worked was more important than coming up with good terminology for them.
Fond Memories...my first computer! (DRAFT) (Score:5, Interesting)
In 1963, I joined the (now long-gone) first "service bureau" in the country, C-E-I-R, strategically positioned in Arlington, VA, within walking distance of the Pentagon...their first major client. Our "Computer Center" had one IBM 709 (a big boxy group of racks and 12 tape drives; disk drives hadn't yet been invented) and an adjacent IBM 1401.
To make the costly (then $800/hour) 709 more efficient, all written programs (in "assembly code" unique to the computer model, or FORTRAN, if you were lucky) were manually typed, line by line, into "punch card" decks that were read and copied on the 1401 to magnetic tape reels. The reels were then tagged with the project name and carried to the other end of the room to be loaded into IBM 709 memory for execution.
During that execution, the tape drives would whirl as the programs and starting data were copied to the 709's memory. Memory capacity was, in today's terms, "vanishingly small," but the program performed its computations and produced results. A run could be minutes, hours, or even (occasionally) days in duration. Then the output of the program would be written to tape(s) and returned to the 1401 for printing of results. (One was always suspicious of quick results, because it inevitably meant that the program had a fatal "bug" in it, which had to be diagnosed and repaired.) I've witnessed piles of printed output from some programs that stood taller than any person in the Center. Then we'd likely find some gibberish in those piles of paper, necessitating finding and fixing the errant instructions on the original punched cards...and the process would be repeated until the results were deemed "bug free."
This was the fastest and sleekest way to produce meaningful results in those days. Of course, all those military projects were "Classified," so programmers and computer operators all had to have quite high-level security clearances...largely because the projects were all related to military strategy and/or predictions of likely outcomes of warfare under varying conditions. It also meant there was no sharing of programming techniques or skills outside the computer center or the clients' premises.
Then we got a magical new product: The faster, sleeker, more powerful IBM 7090, and more reliable upgrades.
But, for all that, it was an exciting time to be engaged in the design, development, and coding of new mathematical algorithms (e.g., "Linear Programming") that could yield reports that shaped major decision-making in corporations and government agencies.
The cadre of programmers at C-E-I-R* even created the first (to my knowledge) shareware. It was called "CELIB" (C-E-I-R Library, sometimes "CLIB"). It collected all the basic tools the programmer on an IBM 1401 needed, including stock tools for writing punched-card data to tape, printing data from tape on the IBM 1403 printer, or even sorting data from one order (e.g., by title) to another (e.g., by date). We all carried our "deck" and its sparse manual around from project to project. It was published via IBM's SHARE project, where tools like these were made freely available to peers. On my first trip to Australia, I was amazed to see, 12 years after CELIB had been shared, that it was still in use; it had been adopted by a university in Sydney as the common convention for all IBM 1401 programmers, and was taught to students as an example of good coding style and practice.
Smaller, simpler projects (e.g., inventory management for a chain of retail stores) were programmed for the IBM 1401, with its arcane idea of variable-length words (an arbitrary number of adjacent 6-bit characters, marked off from one another by the "word mark," a special bit that separated groups of characters into meaningful fields). We did everything from helping customers decide how a perishable product (like fruits and vegetables) should be priced to the consumer, to figuring out how many submarines the Pentagon should plan to buy in the next decade...and
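The word-mark scheme described above can be sketched as a toy model (Python lists stand in for core storage, plain indexes stand in for the 1401's decimal addresses, and fields hold digits only):

```python
# Toy model of 1401-style variable-length fields: every storage
# position holds one character plus a word-mark bit, and arithmetic
# addresses the low-order (rightmost) digit, scanning leftward until
# the word mark ends the field. List indexes stand in for the real
# machine's decimal addresses.

def set_field(memory, addr, digits):
    """Store a digit string ending at addr; word mark on the high digit."""
    for i, d in enumerate(reversed(digits)):
        memory[addr - i] = [int(d), False]
    memory[addr - len(digits) + 1][1] = True   # set the word mark

def read_field(memory, addr):
    """Collect digits right-to-left until the word mark is found."""
    digits, i = [], addr
    while True:
        d, mark = memory[i]
        digits.append(str(d))
        if mark:
            break
        i -= 1
    return ''.join(reversed(digits))

memory = [[0, False] for _ in range(100)]      # 100 toy core positions
set_field(memory, 50, '12345')                 # a 5-digit field ending at 50
assert read_field(memory, 50) == '12345'
```

The field is exactly as long as the word mark makes it, which is how the same add instruction could handle a 3-digit count and a 20-digit amount alike.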
more interested... (Score:2)
I like relays (Score:2)
Coding sheets, sliderules and optimizing code (Score:2)
Re: (Score:2)
Re: (Score:2)
Hey, I have one of those too!
Memories (Score:5, Interesting)
This story brings back fond memories for me. In my first term of "data processing" we did our work on an IBM 1401 (circa 1982). The teacher was an old IBM guy who brought it to the school. As the story says, 6-bit "bytes" with a parity bit and a checkmark bit. You had to write your code to bootstrap your program in. The "OS" consisted of loading your card deck in the reader and punching the start button. It would read the first card into memory from address 0 to 79, then start executing at address 0. After that you were on your own. Output was to a card punch (addresses 100-179) or the 1403 printer (addresses 200-331). The rest of the total of 4K of memory we had was available for use. The word length was defined by the checkmark bit. You could put in two thousand-digit numbers and add or subtract them.

It was humorous sometimes when someone would make a programming error during the printing part of their program and get into a fast loop of page feeds. The 1403 printer could shoot the paper to the ceiling when it was feeding the paper that fast, and the best solution was to put your foot on top of the box of paper to rip the paper and stop the feed.
The last program we wrote was a fairly complex inventory problem. The 1401 only did add and subtract and the input data had one item that needed multiplying by a large number in the 200,000 range. You could tell when we ran the program who had written multiply routines in their programs and who just wrote a loop to do that many additions. The multiply routines would finish in a second or two but the add loops took over a minute to complete. I had great fun running and programming that thing.
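The gap described above, a multiply routine versus a bare loop of additions, can be sketched by counting additions (Python, not 1401 Autocoder; the decimal shift-and-add below is the generic textbook technique, not any particular student's routine):

```python
# The base 1401 had no multiply instruction, so a product was built
# from additions. Two approaches, with addition counts: a bare loop,
# and generic decimal shift-and-add.

def multiply_by_looping(a, n):
    """Add a to itself n times: n additions."""
    total, adds = 0, 0
    for _ in range(n):
        total += a
        adds += 1
    return total, adds

def multiply_by_digits(a, n):
    """Decimal shift-and-add: at most 9 additions per digit of n."""
    total, adds = 0, 0
    for digit in str(n):
        total *= 10              # decimal shift (cheap on a decimal machine)
        for _ in range(int(digit)):
            total += a
            adds += 1
    return total, adds

slow = multiply_by_looping(7, 200_000)
fast = multiply_by_digits(7, 200_000)
assert slow[0] == fast[0] == 1_400_000
assert fast[1] <= 9 * 6          # a handful of adds for a 6-digit multiplier
assert slow[1] == 200_000        # versus one add per unit of the multiplier
```

With a multiplier around 200,000 that is a few additions against two hundred thousand, which lines up with the seconds-versus-minutes difference described above.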
Re: Memories (Score:3)
One could construct instruction set curiosities on the 1401. On a 16K machine, the maximum length divide instruction required around 57 seconds to complete.
Re: (Score:2)
In my real work life I was sys admin for VAX VMS systems (I loved VMS) and Solaris with Oracle DB and a large ERP system. I left work 3 years ago to a comfortable retirement in a home I own free and clear and no debt. No diapers for me yet. Eat your heart out sonny.
Re: (Score:2)
Your memory is only okay (faulty in some areas).
Those read and print areas ran from 1-80, not 0-79. The print area started at 201. (The machine's addressing was truly decimal and user-friendly; no counting from zero required.)
All the 1401s I worked on had hardware multiply/divide. I know they made them without it, but I never worked somewhere that tried to save money that way.
The characters were definitely not "bytes". The term bytes was introduced with S/360.
There was no "checkmark" bit, it was called a wordmark.
I
Re: (Score:2)
Yes, my memory had failed me a bit. Wordmark is the correct term and you are correct that the addressing didn't start at 0. Funny how that happens as you age but I haven't done anything with a 1401 since about 1984. I know bytes is not the correct term but I think it's more relatable to most people here to denote the set of bits that make up a character.
As far as hardware multiply/divide we didn't have that on the one I worked on. As I said it was purchased for the school I attended by an old ex-IBM guy
What about a 1978 HP 3000? (Score:2)
1401 midterm exam (Score:2)
The above character sequence (minus the ellipsis) begins the first card of nearly all 1401 binary decks. Explain why.
2000 vs 2980: explain the significance of these numbers in later 1401 configurations.
Re: (Score:2)
Hmm, you stumped me. Neither of those rings a bell, but it's been 35 years since I worked on one. Usually the first part of the cards in my deck moved my program code from toward the end of the card into memory starting at address 500, then read the next card and branched to 0 for the next bit of program. Of course, the 1401 wasn't binary but decimal.
easy pease (Score:3)
Old school (Score:5, Funny)
Imagine a beowulf cluster of these!
Re: (Score:2)
Which brings up something that puzzles me. I'm a little surprised that you're allowed to plug it in. The chance that it complies with today's electrical safety regulations has to be close to zero...
I don't think "mainframe" is right... (Score:2)
...not that there's any bright-line definition. But 1401s were considered "small computers," and the main use I knew for them was as satellite computers: auxiliary equipment used together with real "mainframes." For example, an IBM 7090 might perform input and output only to magnetic tape. The tapes were then mounted on the tape drive of a 1401, which read the tape and printed the contents on a line printer.