Computer Historian? 209
mike sollanych writes: "Is there any sort of job in the world for someone who's really interested in computer history? I love it, myself, but I'm just approaching the end of high school, and it's time to make some life decisions. So, is there any place in the industry for a computer historian?" How about it? Many businesses and government agencies employ company historians to record activities which might otherwise get overlooked as mundane. What skills would most benefit a computer historian, and where are such people needed? Does such a job exist in any but the largest of companies now? Tell us what you think.
Smithsonian (Score:1)
Well... (Score:3)
I don't know if you could get a formal position, but by all means, start a web site! Even a lucid history with pointers to resources would be nice.
I have a good book from ~'86 that goes over the languages and the computer internals of the day (specs on the C64 hardware, a basic memory layout of the TRS-80, etc., etc.), and I'm sure you can find more of that at your local library. I got that one from a library book sale, actually!
---
pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
nope (Score:1)
Be an author (Score:4)
Personally, I've always been a bit of a computer history geek myself (as my .sig probably attests) and I'd sure as hell be willing to buy yet another book on the subject... so write it.
My only suggestion is to start at Alan Turing (or if you go back to Babbage, at least include him). Most people look at the pre-dawn of computers as a hardware-only affair and tend to skip over Good Ol' Al's contribution on the software front....
Library Science / Information Science (Score:3)
Good luck, maybe you can set Hollywood straight (Score:2)
Thanks,
Internet Historical Resource (Score:2)
Years ago, when the world wasn't interconnected, this may have been a viable hobby (note, 'hobby', and not 'profession', as computer historians are generally hobbyists), but not today. Anyone can hit up a search engine and search for... say, ENIAC [google.com] or EDVAC [google.com], and be presented with truckloads of material. The internet was spawned from computer history, so it's only natural that it has plenty of reference material regarding its roots.
In the age of the Internet, and its vast amounts of computer-related historical data, a person trying to do the same job would be pretty bored.
-- Give him Head? Be a Beacon?
Might be very handy (Score:2)
Our secret is gamma-irradiated cow manure
Mitsubishi ad
I would guess... (Score:4)
I think if it's REALLY interesting to you, you should consider entering academia and studying (and teaching) it there. I would have LOVED having a Comp. History course, but, as of yet, few professors are young enough to NOT remember when computers were a "new thing".
I think it might be interesting to see the very specific patterns and progression of computing throughout history. If you wrote some papers on it, I'd certainly read them!
Best of luck!
Re:Smithsonian (Score:1)
-*-*-*-*-*-*-*-*-*-*-*-*-*-*
Life decisions... (Score:1)
To answer your question: I haven't seen anything like that, but most computer-related programs include some sort of "history of computing" course, so I can easily see that there will probably be some openings in that field in the future. If this is what interests you, just go ahead and make your way...
Re:Get a head start (Score:1)
He wanted to be employed by a company as a computer _historian_
not
_hysterically_ (as in panicked) employed by a computer company.
not really (Score:1)
Tough competition... (Score:2)
Computer people are just like that. They type all day, keep an eye on the developments around them, and have good memories, so it's no big deal for them to sit down one day and type up the highlights of all the developments they have seen (within a narrow focus) in the last 20 years of their career.
I'd as soon start a business based on creating a new desktop OS!
---
Despite rumors to the contrary, I am not a turnip.
Reference (Score:2)
college professor/writer? (Score:1)
In my college experience, the professors that were easiest to learn from (and best to take) had extensive knowledge about where computers came from (maybe because they could predict where they were going?).
As was mentioned in an above post, a good base in computer history can help you with current projects, which is excellent for beginning computer students. And every university has a required class of some kind to prove that the geeks can string a sentence together (so they can post on slashdot).
Re:Well... (Score:1)
Outdated vs. out of date (Score:1)
Re: (Score:2)
Re:Be an author (Score:1)
yeah, yeah, it's "history"! (Score:1)
Re:Good luck, maybe you can set Hollywood straight (Score:2)
As a side note... (Score:3)
This *should* be a position at every University. (Score:5)
IMO, students should be required to take detailed courses in computer history. Why? Not for the trivia, but to understand why decisions were made, and what has been tried before.
Too many students come out of school thinking they know it all, but understanding little about computers beyond the present generation they learned to program on. Understanding the computers of the past would be useful.
Alas, I have found no such position, or I would apply for it tomorrow.
Computer Preservation (Score:1)
Geeky.org [geeky.org]
Re:Internet Historical Resource (Score:1)
Oh, that's right. Not everybody has the time to find it themselves, and some are just too lazy or too dumb to know where to look. Having somebody on hand who knows from personal experience how to do the job right, especially with archaic computer hardware, is a very valuable thing.
Andrew Walbert
The job is... (Score:1)
Is there any place for a computer historian? Yeah: maintaining legacy systems is all about history :)
baldeep
Oh man... (Score:1)
Re:Be an author (Score:2)
The great news is that studying computer history is dirt cheap. Old machines are sold for next to nothing. You can probably buy an old Cray 1 if you convince the old owner that you're going to take good care of it. The biggest expense will be putting in the right power and cooling equipment.
It can also be fun. I remember a good friend of mine had an old PDP-8 in his living room. He just flicked it on one night and said, "Go ahead, it doesn't need to boot." I still think of the speed of core memory every day when I turn on my machine and wait for everything to load. (Of course, rebooting is a real bear. He needs to reload the OS from a paper tape!)
My favorite folks in computer history are the ones who make emulators of old machines. It's possible to have a virtual Commodore 64 on your desk if you run the right emulator. That means you can run your old software without futzing over old hardware. This kind of virtual collecting is pretty cool and arguably the right way that cyber savvy folks should be collecting. Who wants to get into the artificially introduced scarcity of physical goods? Let's leave that for the baseball card and stamp fanatics.
Become a Java Programmer (Score:1)
Computer Historian (Score:1)
The Importance of Computer History (Score:1)
in academia. The perspective of the properly trained historian/sociologist/economist is different from that of the participants. For example, the key decision makers during the Vietnam war have written books about what they did, but the best books on the topic are by professional historians (usually academics) who do a much better job than the participants of synthesizing the relevant information and being relatively objective.
If you combined computer history with some policy work, the project could be really cool. History can teach great lessons about what to do (and not to do) in the future.
Re:Well... (Score:1)
Lucid means clear. Lux means light in Latin. Sorry for picking on language.
Computer history (Score:1)
You also have to watch out for the Smithsonian - they don't always have their facts straight either (http://www.yale.edu/scimag/Archives/Vol71/Tesla.
So a private, complete, objective history of all the computer happenings, devoid of corporate influence, would be a really Good Thing.
Antique Radio and TV as precedents. (Score:3)
Nah, I doubt it. I'm sure that most of the old information and stuff that might have been on vintage machines has long been rendered obsolete or transferred off onto a newer computer.
Antique computers, like antique radios or antique TV sets, will never have any value except to Hollywood for use as props and as toys for hobbyists and collectors.
Let's face facts: my Trinitron uses a lot less power than my 1954 General Electric TV set. The Sony has stereo sound, a remote control, goes beyond channel 13 and - get this - it's color! But the old GE is a really neat piece of history, and while I only ever turn it on every now and then, it has a prominent place in my living room.
Now, here's a funny thing: ubiquitous as the TV set is, it has, perhaps, been a victim of its own success. There are fewer pre-WWII TV sets out there now than there are Stradivarius violins. 1950s and 1960s TV sets are getting rare, too. People tend to hang onto old radios because they're usually rather small or have more decorative cabinets.
There are lots of antique radio museums and collectors around the world, but there are only a handful of antique TV collections. (One of the best is the MZTV Museum in Toronto [mztv.com].)
Early computers are even less useful, from a practical standpoint, than a 40-year-old TV set; at least anyone can figure out how to use the 40-year-old TV, but few of us here could use even a 20-year-old computer effectively. Old TV sets often had gorgeous woodwork and great polished brass and chrome accents that were futuristic for their day. Early computers had that sort of retro feeling of "high-tech" too - a plastic prop out of the movie "Tron". But they lack the handmade qualities of earlier antique electronics.
So, what's the fate of my Commodore 64 in twenty years? Cherished museum piece that people will love to turn on, try out and admire; or will it be reviled and ridiculed for its age, simplicity and primitive design?
the past leading to the future (Score:3)
Let me state that again. Look at game designers. There are some very good game developers and companies that spend serious money looking at old games, examining how successful they were from different angles and trying to determine why.
If you like the history of computing, I'd say try to find an application of it that looks at the computing of yesterday to determine what the computing of tomorrow will be like.
How to do this? Research, write articles, and create a demonstrated need. Show companies what they'd gain by reading your articles and getting your opinion into their R&D.
It's a neat idea. Takes some work, but there will probably be a strong demand for it in the coming days.
Re:Well... (Score:2)
I don't think this is the case anymore, at least as far as computer history is concerned. The Computer Museum History Center broke off from The Computer Museum in Boston as early as 1996 or so and moved to a building on Moffett Federal Airfield in Mountain View, California. You can visit, but you need to get clearance to get onto the site.
Website: http://www.computerhistory.org/ [computerhistory.org]
Also of interest (and closer to Boston): the Retrocomputing Society of Rhode Island. Their website is here [osfn.org]. There are more museums scattered here and there, but I believe these two (and, perhaps, the Smithsonian) are the foremost.
--Tom
Re:Internet Historical Resource (Score:1)
You are so right. We desperately need computer historians. Otherwise, how will I get by when my PDP-7 breaks down!? I might have to buy a $20 pocket calculator to replace it. What a waste!
Thank you, God bless you, and God bless the United States of America.
Someplace that might need this position... (Score:1)
... is a large university. Especially ones with strong Computer Science departments.
Part of the duty of such a historian would be to help provide a means of translating old stored data to newer media, as well as similar duties for the transcription of old code.
If industry has perceived a serious need for this, then they have probably pulled a few monetary strings, and courses on such subjects might be appearing. A lot of colleges work closely with industry to determine what skills need to be taught. (I'm not a CS major, so I can't speak from personal experience for that field.)
It also makes sense that a university might have enough space and funds to maintain a library/museum of old data and data storage devices, if only to maintain their own records and to provide (sell) this service to smaller companies that can't afford to do it themselves.
Failing that, I can guess with about 90% certainty that the Smithsonian or the Library of Congress (which collects all forms of media) might have departments dedicated to the history of computers. (If not, they should!)
Good Luck with your search!
Patent prior art searches (Score:1)
Take EE in college and you'll be on your way.
University Degree (Score:2)
Plenty of other posters are speculating on what one could do as a professional computer historian, but a better question is how you'd get there. May I suggest that you find a university with a CS department that offers lots of ethics courses and then joint-major with History. Or get a CS degree and a diploma or MA in museum curatorship. The latter is available at Trent [trentu.ca] where you can get a Computer Studies [trentu.ca] degree and a Museum Curatorship [trentu.ca] diploma in just 4 years.
Ask the Father of the Internet (Score:3)
TN3270 anyone? (Score:1)
Personally, I'd like to have rows of blinking lights on my workstations. :( I see no reason that had to die. Why!?! Why!?!
/. Readers' Two Favorite Institutions Have Museums (Score:1)
Microsoft Museum [microsoft.com]
The Microsoft Museum is mainly focused on the history of Microsoft, although it does have quite a bit of information and exhibits on the history of computing and computer software. The whole place is decorated in Microsoftie colors. It's located on the Microsoft campus in Redmond, WA. Unfortunately it's not open to the public, but I got to attend a party there while I was interning for The Great Satan of Software. However, they do have a fairly nice website that's available to the public.
National Cryptologic Museum [nsa.gov]
The NCM is run by the NSA and is located on/near Ft. Meade in MD. It gives a good overview of the history of crypto and includes a lot of information on early computing and the role it played. They also have a small public library with plenty of old books that deal with crypto. It's open to the public and has a gift shop where you can buy plenty of things with the NSA logo on them.
Re:Well... (Score:1)
I was going for the first definition; unfortunately, there is no convention for this in English, so I will use the convention used in unix(7) man(1) pages.
Even a lucid(1) history* with pointers to resources would be nice(1).
*BASH_BUILTINS(1) SEE ALSO bash(1), sh(1)
---
pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
internet history is just stories (Score:1)
Job? Maybe. Decent wage? Probably not. (Score:2)
As a young man, I was flirting with becoming a dedicated coin collector and I was already an avid photographer. The Smithsonian had a position available for just such a person. The position required studying their collection, documenting it both in words and photos, and acting as a resource person for all things numismatic. Even the educational requirements weren't too high which is understandable since many/most/nearly all top-drawer numismatists are largely self-educated.
The catch? The job paid, IIRC, about USD$15K per year. Living in DC, that would have meant camping out under a bridge somewhere.
My point? I imagine that many of the same forces would be at work when it comes to a position as a computer historian. Such a job would be fascinating but the market value of such services would likely be low. The people who would employ you would be doing it as a service to the hobby population. As a self-employed person, you might be able to deal in collectible computers, if and when such a market ever develops. (And it's out there, actually. There are people paying premium dollars for rebuilt Tandy 102s and the like. But it's certainly not yet a huge market.) I think the best feedback so far is from folks suggesting academic pursuits.
Of course, if old computers suddenly start to get fashionably "collectible," all bets are off.
:-)
I don't think so Tim :-) (Score:1)
Take, for example, XT-based computers. Believe it or not, these machines are still being used by some corporations for the simple fact that they can do what they need to do, and it's a waste of money to replace them with other machines when there is no need. But when there is an error and such a machine does crash, the company sure won't call in a 'historian' to tell them what they need in order to fix it. That's simply too expensive. It's far easier to buy a new PC (which is of course backwards compatible with XT machines (IOW, any other PC)) and off you go.
I'd advise you not to focus on this subject from a 'business point of view', but I sure would advise you to spend some time on it in your spare time. If you like the idea of seeking out information on older computers and also focus on the way they work, you can learn much more about the whole 'computer aspect' than you can from studying books like "computer internals" and such. It's much easier to begin simple (XT) and work your way up to the current machines. Of course this will take more time than it does to study a book, but it sure will pay off more. For starters, it will give you more insight into the workings of the machines, which automatically can lead to more insight into the internals of OSes, which can lead to..... Well, I guess you get the picture :) Good luck!
Re:Well... (Score:1)
Their main attraction back then had to do with calculating path lengths between cities in Pascal on a very *large* computer; it looked pretty impressive!
---
pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
Re:This *should* be a position at every University (Score:1)
This could be a graduate thesis for Anthropology or Philosophy/Sociology of Science if CS Depts won't take it. And the academic discipline of History is more interested in technique than content, so they'll let you do anything that requires the right kind of research.
Re:Well... (Score:1)
Even a turbid history with pointers to resources would be nice.
or perhaps
Even an obfuscated history with pointers to resources would be nice.
________________
They're - They are
Their - Belonging to them
Speaking of Steven Levy and AI (Score:1)
Cheers,
Rick Kirkland
IBM (Score:3)
Stick w/ the computer part, no money in History (Score:1)
Journalism, history of sci & tech, comp sci (Score:2)
* Get a good general education. Learn to WRITE. Being a historian is an academic job, and you're going to have to write papers. Even if you avoid that horror, you're going to have to write grant proposals and such.
* Take some journalism classes. Learn to write for a popular audience. If the history thing doesn't work out, you can become a pundit.
* Learn the history of science and technology. It's fascinating stuff, and it will put the history of computers and related technologies into perspective.
* And take some computer science courses! Programming I and Programming II (or their equivalents), data structures, and most importantly an Operating Systems course. Both of the OS courses I took had a LOT of history built in.
Stefan
Start a museum.... (Score:2)
I always thought that if they marketed it better (and housed it in a nicer-looking building) it could be a draw unto itself. What it needs is an energetic person who can build it up and market it the way it should be done. Of course you would have to sell the owners on it, but if you have a good vision it wouldn't be that hard. I don't think it would be a HUGE draw right away, but it would break new ground. Perhaps it could even be the home of the (AFAIK) first computing hall of fame. The possibilities are only limited by your energy and vision...
--
Quantum Linux Laboratories - Accelerating Business with Linux
* Education
* Integration
* Support
Re:This *should* be a position at every University (Score:1)
Re:Internet Historical Resource (Score:2)
Walk up to a Mechanic, and ask him what year the first car was built, where it was assembled, and what kind of engine it had. Or ask him what year the Bel Air was introduced.
Now, switch places. Walk up to your Network Administrator, and ask what year ENIAC was built, and what it was used for. Or ask what the first commercially available personal computer was, the year it was introduced, and how much it cost.
Pretty stupid, eh? Your argument is flawed.
Mechanics have a skill, and are trained to provide that skill as a service. History is simply reference information. You can easily find the history of the automobile without consulting a mechanic, just as you can locate historical information without the assistance of a Network Administrator. MY point is, you don't need a Computer Historian either. It would take more trouble to contact a Computer Historian than to hit a search engine and find the information yourself.
-- Give him Head? Be a Beacon?
AT&T has a Corporate Historian (Score:1)
Go to a technically-minded fuzzy school (Score:1)
If not Stanford, find another top computer science department: UIUC, Carnegie Mellon, MIT, etc. If the CS department is strong, it will flow into other departments that want to ride the wave.
As far as coursework goes, most schools allow majors to be designed if they don't have one which follows your exact path. Definitely take some CS courses to broaden your knowledge of technology, but a couple of history and economics courses wouldn't do any harm either. Just remember to get as much out of college as possible, since it's only four years.
OK. I'm on my soapbox; but I am a senior, so nostalgia has set in. Good luck and feel free to e-mail me if you have any questions.
Re:not really (Score:1)
Yes, I will never forget what a hottie Ada Lovelace was in her younger days. I really miss her.
For the sarcasm impaired: Even if you are an aging boomer who was writing for mainframes back in the 60's, most of "computer history" happened long before you were born.
Perhaps this kid can teach you a thing or two after all, v4mpyr. :)
Re:not really (Score:1)
I would have to argue,
Computers, while relatively new, are a rapidly changing medium. And one that has entered into just about every facet of modern life. (There are people who study the history of mass media!)
To some, all the subtle changes from one system to the next might seem like an exercise in detail.
But ask any engineer and they'll tell you that for a project of any significant scale, the devil is in those details! And there is probably a good number of people willing to pay for someone who knows how to find those details.
^$1 for the chalk mark, $49,999 for knowing where to put it!^
Recording Motives (Score:1)
Another role for computer historians, one that is crucial at the present time, would be to record the status of the industry and related industries, along with, again, the motives behind decisions in relation to technology laws such as the DMCA and UCITA, to keep a clear record of why they were passed and why we should abolish them.
--Drew Vogel
Freshmeat Crossposting Time! (Score:2)
There's lots of other info out there too, like FOLDOC [foldoc.org], which could probably be incorporated into a project like this.
---
pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
Re:Internet Historical Resource (Score:1)
As in any "somesubject historian" profession, certain historians will over time gain respect for being very accurate and knowledgeable, which will lend credence to their judgements on the reality of computer history.
I'm all for it. I'd love to know for sure the details of a lot of those hazy-ish computer legends I hear about all the time.
Now is a really good time to start as well, since there are still a lot of founding fathers alive to be interviewed.
checkout IBM Endicott, NY (Score:1)
-psxndc
Re:Internet Historical Resource (Score:1)
Which came first, the Z1 or the Colossus?
Personal Experience (Score:2)
The great news is that the prospects in 10 years will be great. Especially with things like the NSA, Echelon, and other nice nasty things out there, computers will increasingly play a role in history, and conversely, historians will have to know about CS to tell it as it is.
Unfortunately, the typical definition of history is 25 years ago. This is just now hitting the computer age. Give it a bit yet...
Come over to the dark side... (Score:5)
Yes. Picture it. Spend your days on a college campus, teaching classes on the history of computers. You just come up with some random BS thesis on the ways in which computers have affected and changed society, and run with it.
The advantages? It's tough to get a job as a college professor, but once you do, you're good to go. Plus, you spend the rest of your life around college-age women.
Come over to the dark side, Luke.
Journalism (Score:2)
Screen Writing and Historical Accounts (Score:2)
http://www.geocities.com/jim_bowery/potc.html
Sherwin Gooch's Account of John Bardeen's Lecture (Score:1)
by Baldrson [mailto] (jabowery@netcom.com) on Tuesday December 28, @08:58AM EST
(User Info [slashdot.org]) http://www.geocities.com/jim_bowery [geocities.com]
In any case, I'll check with Sherwin Gooch to see if he has any more direct evidence from Bardeen himself to support the controversial account of the hide-away experimental stand.
I did, and here is Sherwin's response:
Jim,
Thank you for alerting me to your discussion.
To provide a more solid foundation, one should be aware that I heard this story from the horse's mouth.
John Bardeen himself gave a talk one evening at Altgeld Hall on the University of Illinois campus, circa 1978, in which he related various experiences surrounding his inventing the transistor. At the time, people suspected that the scheduling of this presentation may have been related to Bardeen's health.
Professor Bardeen showed us the B&W 16mm film BB&S had made at Bell Labs immediately after they got the first transistor to work (and, presumably, before Bardeen's boss got to work the next morning...) I have seen individual frames and out-takes of this film since, but I don't know if the entire film still exists. The "rolly-cart" with their experimental set-up is plainly in evidence on the film.
It was John Bardeen himself, at Altgeld Hall, who related that his boss had said that the "solid-state amplifying device" which they wanted to develop was "not feasible," and that, "even if it were possible, it would have no practical application." Dr. Bardeen related that sometimes, when his boss stayed at work past 5 p.m., the three of them would become very impatient waiting for him to leave so they could roll their setup out of the coat-closet, and get busy on what they, apparently, thought was the greatest "cool hack" of the day.
I wonder who Bardeen's boss was. His boss should be immortalized in history next to the NASA manager who advised the last engineer withholding approval of the Challenger launch to "put on your management hat!"
One of the anecdotes John Bardeen related was how he had left his set of photographic slides in the taxi which took him to the ceremony to collect his Nobel prize, and all the trouble to which he and the Swedish government had gone in trying to recover them. But their efforts were unsuccessful; the slides were never recovered. Professor Bardeen was extremely apologetic that he didn't have them to use in his presentation, and so we would just have to make do with his relating the incidents to us.
With my background in computer music, I found one of the pieces of supporting paraphernalia that Dr. Bardeen didn't lose in Sweden quite interesting. He brought along a transparent plexiglas box, approximately the shape of a 6" cube, with randomly distributed 3/4" or so holes (apparently for cooling?) in the sides. On the top were a number (6 or so) of black SPST N.O. push buttons. A small loudspeaker was mounted inside. (There must have also been a battery of some kind, but I don't recall it.) The box contained a collection of electronic components, their leads soldered to one another ("tacked together") and hanging in "free space." (He hadn't bothered to use a prototyping board or connecting strip.) There were resistors, capacitors, possibly some coils, and these ~1" long bar things (which were the transistors), of which there were 3. Dr. Bardeen explained that he had chosen to build this device because it embodied what he considered to be the fundamental 3 types of circuit: an amplifier, an oscillator, and a filter. He remarked that he thought that pretty much covered everything you could do with electronics. Each of these had been implemented as a single-transistor circuit. Dr. Bardeen then demonstrated the device (which still worked!) by playing "a drinking song of the time, which some of you may recognize" by pressing the few buttons on top of the box in the proper sequence. He apologized because it had gone so badly out of tune (which it had). He apologetically related that he had never re-tuned it. (I'm afraid I didn't recognize the song, nor did anyone sitting around me. I believe he said he had chosen it, in part, because the chorus could be played using a minimal number of different notes. I got the impression that he was somewhat embarrassed by the song, and that's the reason he didn't tell us its name. I wish I knew what it was.)
Even though this makeshift musical instrument was out of tune, I believe the monotonicity of pitches, as one traveled from one end to the other of the row of buttons, still held. The pitches were also all still of a central musical frequency.
Professor Bardeen then passed this device around the audience for everyone to examine, which amazed me at the time, and still does. I wish I had a picture of it. I think this first all solid-state device -- an electronic organ -- should be in the Smithsonian. After all, it contained 3 of the first transistors ever made, AND THEY WERE STILL WORKING!
But I wax nostalgic. Jim, if your point was that Bell Labs did not support Bardeen's research into solid state amplifying devices, you are in good company; John Bardeen, himself, was certainly in agreement. If there were teams being supported to research that area, perhaps he just wasn't lucky enough to be on one of them. I have no idea. All I know is what he told us.
Please feel free to copy this e-mail (less my e-mail address) into any discussions in which you were involved. I find it particularly upsetting when people or organizations fraudulently assume credit.
While it is true that many research facilities can be viewed as "sand boxes" which, independent of management, enable invention, and that many great breakthroughs could not have been accomplished without the collections of tools and talent amassed therein, in reality the role played by management in R&D is much closer to what Scott Adams has chronicled in "Dilbert" than it is to any accepted management text or theory.
Sherwin Gooch
991227
Nor are you advised to allow anything like this to shape your dreamwork:
http://shop.barnesandnoble.com/booksearch/isbnInquiry.asp?isbn=0471048852#customerReviews
Jim Bowery (jim_bowery@hotmail.com), 46-year-old network architect, August 18, 2000,
The Rise and Fall of Midwest Computing Sans PLATO
This is a great book, conveying much of the flavor of what it was like to be in the midwest's computing culture in its heyday of the 60's through the 70's. What it failed to do was tell the real story of the midwest's demise as computing leader of the world -- which isn't the story of Seymour's obsession with packaging over on-chip integration, as implied by this book. Rather it is the story of the failure to deploy the network revolution, now embodied in the Internet, to the mass market 20 years early on Seymour's matured hardware via the PLATO networking project at Control Data Corporation. PLATO was a $1 billion 'bet the company' investment by Bill Norris, the farmer/CEO of CDC who put a windmill pump from his Nebraska farm in front of CDC's corporate towers to remind people where they came from. That is the story of epic proportions only grazed on by this book. PLATO was ready to go to mass market, but Wall Street combined with classic middle mismanagement killed the mass market version of PLATO before it could even be test marketed -- for which it was ready. Had it gone otherwise, Seymour probably would never have left the midwest, and his supercomputer architecture would have focused more on the directions now being taken by Sun and Hewlett Packard -- except with Seymour's inimitable qualities.
I personally worked with the PLATO project and tested a version of it that would have leased a network computer with a Macintosh-like interface, including network service, for a flat rate of $40/month with capital payback in 3 years. It had everything -- email, conferencing, user-programmable electronic commerce, multiuser realtime graphics games, not to mention thousands of hours of computer based education courseware for which the PLATO system was originally designed. We could get this performance because the culture surrounding the land grant colleges of the midwest, such as the University of Illinois where PLATO originated, combined with Seymour's astounding performance levels created the right tradeoffs between hardware and software. Some of us were looking forward to incorporating Seymour's newly marketed Cray-1 as the foundation for the next generation of mass-market PLATO system -- and initial benchmarks looked to provide an outstanding bang for the buck as an information utility hub -- even without some of the more obvious architectural optimizations that would help in this new kind of application of his systems. This would have shielded Seymour from the vagaries of the government-dominated supercomputer market and driven his architectures into higher levels of silicon integration faster -- possibly providing the kind of capital in the kind of organization that could have delivered on gallium arsenide's potential, unlike the disaster that occurred when Seymour left his farm and went cheek-to-cheek with the military in Colorado Springs, CO.
If you look at your Internet Explorer Help menu and select About Internet Explorer, you'll notice it is based on the NCSA Mosaic web browser and that it was developed at the University of Illinois -- right across the street from where PLATO was invented. This was no fluke. PLATO had a profound impact on the culture of the University of Illinois, particularly its young students who wanted to push the envelope in networking. The NCSA also gave rise to the most widely used web server, Apache, and to the founders of Netscape. The loss of possibly 20 years of 'new economy' is incalculable, but suffice it to say, comparable losses have been suffered as the result of open war.
There are a lot of anecdotes this book doesn't tell that will probably die with the people who lived the tale. Just one, to capture a bit of what will be lost to history:
People looking for Cray Research's facility in the fields of Wisconsin could drive up to a farm house and ask where 'Cray Research' was located, and a friendly neighbor would say, 'Oh, you mean Seymour's place...' and then give directions to an area surrounded by an almost invisible network of intelligence agency surveillance equipment -- protecting what was seen as a national treasure from potential espionage. In a speech to one of these agencies, Seymour told them they could come out and protect his folks, but only if they never got in the way, and that meant not even letting anyone know they were around. Well, you could tell they were around, but at least they didn't get in the way!
--------
Good luck -- and when they ask you to please consider castration -- tell them that went out with the pharaoh's eunuchs (which has nothing to do with UNIX, as obvious as such an association may be to them).
Re:Hrm. (Score:2)
I agree, Computer Historians are the ones who generate the web pages, but the Ask Slashdot article was asking if there was a *JOB* in existence for a computer historian.
As a hobby, yes. Profession, No.
-- Give him Head? Be a Beacon?
Re:Journalism, history of sci & tech, comp sci (Score:3)
A lot of this history can be found in everybody's favorite textbook: "Computer Architecture: A Quantitative Approach" by Hennessy and Patterson.
Vintage Computer Festival (Score:2)
If you're local to the Silicon Valley, you might care to check out the Vintage Computer Festival [vintage.org] at the San Jose Convention Center on the weekend of 30 September. Nothing less than ten years old will be on display. You may even be able to see working models of Altair and IMSAI machines. A couple of years ago, a friend of mine brought his DEC PDP-8. And there was also the Wall-O-Mac, with every Macintosh model released up to 1993 or so.
I plan on being there. As the owner of two SOL-20 machines (one with a Helios drive), I have a soft spot for the old machines I cut my teeth on.
Schwab
Re: Same here for Azusa Pacific University =) (Score:2)
Sustaining an imaginative grasp of posterity! (Score:3)
Historians not only analyze the past; they also often catalogue the present. This is vital in a field in which massive change over small amounts of time is a matter of course.
As a designer, I'm fascinated by the effect the internet has had on the history of my discipline. When there is no physical record, there is little in the way of history beyond oral tradition. When websites are redesigned (all of them every day, it seems), the old versions simply vanish.
I imagine everyone who works with, on or around computers has similar issues to face.
How will future students investigate history without a physical record? The answer would seem to be found in people like the kid who asked the topic question, people who can archive, catalogue, analyze and synthesize information about the information age as it happens. There's no time for traditional history, in which we sit back years later and dissect a great battle or read through ancient manuscripts in search of insight... because the record will be gone after the next daily big breakthrough.
I think there's a great deal of promise for this pursuit. Computer historians will ensure that we will continue to be able to learn from "the experience of our predecessors, [and] to sustain an imaginative grasp of posterity*"
*quote from Rick Poynor
Books about Apple... (Score:2)
> _The Second Coming of Steve Jobs
How do you know if it's a good book? I thought it wasn't out yet. Or am I thinking of another book.
I'd seriously warn anyone against taking any book with a history of Apple too seriously, however. It seems that there's not a writer out there who can put his feelings about Steve Jobs aside and simply write a history of the company.
Owen Linzmayer did a fairly decent job of remaining detached in "Apple Confidential". But that one doesn't read like a history, so much as a collection of mostly independent essays, from which you can draw out a sense of the company's history.
Pretty much everyone else, however, uses their "history of Apple" book as a personal soap box, either to praise Jobs for his genius, or to tell the world how much they despise the man.
For a good example of the first, see Steven Levy's books; in particular "Insanely Great: The Life and Times of Macintosh, The Computer That Changed Everything". The title pretty much says it all, eh? Jobs is almost the messiah in this one. Or you could read Pogue or Kaplan and get much the same.
For the other side, this "Second Coming" book has already been widely described as a "hatchet job". Robert Cringely, in "Accidental Empires", calls Jobs "the most dangerous man in Silicon Valley", and compares him to the likes of Jim Jones and Saddam Hussein. And don't even bother with Gil Amelio's rag.
Not having met the man, I can't say if he's as great as Levy thinks, or the physical incarnation of evil, as Cringely would have you believe. I suppose you could just read them all, and try to pick a middle ground to believe. Just don't expect anything resembling objectivity from ANY of them... with perhaps the singular exception of Linzmayer.
john
Resistance is NOT futile!!!
Haiku:
I am not a drone.
Remove the collective if
IEEE (Score:2)
I would advise going to a good University library, finding this journal, then finding articles that interest you in it. Then, figure out what the backgrounds of the authors are, or even contact them for advice. Clearly, the best procedure will be to find out who's doing the work you're most interested in and finding out how they got to do it.
We'll always want old data (Score:2)
But it's different when you look at IT.
Punch cards are still in use. My overclocked Celeron has a 5.25" disk drive. My friend just bought a record player.
Why? Because there's information in old formats that's valuable. And as long as that data's out there, the equipment that can read it is valuable.
Study the History of Science (Score:2)
Alas, this was back during the recession of the late '80s, when getting a degree in the History of Science looked like it would perfectly suit me for a job saying "you want fries with that?"* So I opted for something else.
Anyhow, there is an entire discipline out there regarding the history of science and technology. I don't know which schools are big on the history of science. But I can tell you that the big academic journal for the History of Science is called Isis. I assume you can get some information out of that journal about academic programs.
(* note to those interested in funky majors like the history of science. The truth is, if you have computer skills, you can get a job even if you don't have a major in it. I suspect that I could have gone on to have the career I have right now, even if I did get a diploma in the history of science. Oh well.)
Re:Smithsonian (Score:2)
Not to be too rude here, but look into a media job if this is what interests you. I don't see much value in a historian at a company, except maybe if you can scam an ombudsman job for a support group. I figure that is the sort of job that makes the most sense in the real world, that might do what you want. Or try the writer thing for real.
Oh yeah, if you go into tech writing, get used to getting flamed on
--
$you = new YOU;
college degree (Score:2)
-tim
Re:This *should* be a position at every University (Score:2)
There are a lot of really good ideas that have gone to waste over the years because the h/w wasn't capable enough to let people run what they wanted to run. Look at what we're using today. Many of the major ideas are very old. Many of the things thought of as "new" had been talked about, or at least mentioned, by the likes of Turing, von Neumann, Zuse, et al.
It is very valuable to go back and read some of the old papers. I have a copy of Newman's "The World of Mathematics" which I inherited from my father. This has lots of papers from major figures in computing. Ever wanted to read George Boole on logic? Or Turing's paper where he proposes the "Imitation Game"? This is a great set of books. There are also some very good papers on general mathematics in there too. It's a bit old now but is very interesting in any case.
Another good source of information is the ACM SIGPLAN "History of Programming Languages" conferences. Alan Kay's "History of Smalltalk" presentation at the 2nd is fascinating. And for Unix devotees it also has Dennis Ritchie on the history of C. This conference is one of the best for historical recollection. The people who did things are telling you about it.
I also often go through my old copies of Dr. Dobb's from the 1970s and 1980s. It's very interesting to see the types of things people were proposing for microprocessor-based computers. Dr. Dobb's also got quite a few good papers from well-known CS types. For instance, there's a paper from Knuth on TeX in which he states "...I'm going to write a book about the program..." (or somesuch), and some nice articles by the Bell Labs folks on C, Unix, and algorithm design (Jon Bentley).
Re:Come over to the dark side... (Score:2)
Actually, I should have tried to be serious. If you want to be a "computer historian," you're not going to go into the industry. You're probably not even going to get a job in CS. You're probably going to work in history.
There are many interesting questions to be asked (Score:4)
I think there's definitely a need for computer historians. They probably belong in Universities (I don't know whether it would be in the CS or the History department, though). Computer science may be a young field, but that doesn't mean there aren't interesting questions to be studied.
Possible areas of study include:
Re:Antique Radio and TV as precedents. (Score:2)
Re:Journalism, history of sci & tech, comp sci (Score:2)
-----
http://movies.shoutingman.com
There are computer historians already (Score:2)
Such as Paul Ceruzzi, who works for the Smithsonian, and has written several books on the subject. He's also involved with SHOT, the Society for the History Of Technology [jhu.edu].
You might also be interested in the slightly less formal Vintage Computer Festival [vintage.org], taking place at the end of September. There will be plenty of history and historians there. The VCF web site also has a long list of links to museums, collectors, etc.
And, of course, I would be denying my own conceit if I did not mention my own collection [sinasohn.com] of classic computers.
Computer history is a growing field, but not one that I think you could ever get rich in, any more than any other similar field. Certainly it is fascinating to look back and see just how far we've come.
Re:Smithsonian (Score:3)
The resident computer historian at the Smithsonian [si.edu] is Paul Ceruzzi [nasm.edu]; a very knowledgeable guy. So they already have someone, but other museums might not.
Re:This *should* be a position at every University (Score:2)
And so they blissfully* reinvent the wheel, over and over again...
*as in "ignorance is bliss"
Re:Internet Historical Resource (Score:2)
Oh sure, and then you find a page like this one [newmedianews.com] which is factually wrong on several levels. (The Gavilan was preceded by the GRiD Compass, and possibly the Sharp PC-5000.) So you can leave such misinformation alone, or you can rely on a computer historian to correct it.
Meanwhile, can you find out what the first PC was? If you're lucky, you might come across this page [blinkenlights.com] which will test your knowledge and probably surprise you -- it was put together by a computer historian. That same historian has done quite a bit of research into the first pen-based portable, but it's not on the web (yet).
So don't knock computer historians, unless you don't care whether or not your history is correct.
How to earn a living as a computer historian (Score:2)
Being somewhat involved [sinasohn.com] in the computer history field myself, I know several people who have made a few bucks off their knowledge -- through providing that knowledge to legal firms for use in patent cases. Prior art is a very big part of proving a patent should not have been given, and having the obscure knowledge of old systems that might have had a particular feature can be very valuable.
And if you'd like to pick up some of that knowledge, check out the Vintage Computer Festival! [vintage.org]
Computer History as a Business (Score:2)
Things are not good in the computer history business, in part because the main-line companies that felt this was important have faded into oblivion (think mainframes and minis), and the dot coms are too interested in wasting their venture capital on roll-out parties.
The saddest example of the problem is the death of the Boston Computer Museum [tcm.com]. It was strongly supported by DEC, and when DEC went away, so did their funding (and yes, there were other reasons, including some idiots for executive directors). I was in it several weeks before it closed, and it was a pretty sad thing to see. It has been 'moved' to the Boston Science Center.
The actual museum for the BCM is in California and can be found at Computer History Center [computerhistory.org]. It looks to be alive and interested in history, not 'gee, look, computer interactive toys for school bus loads of children to play with instead of learning how to add, subtract, multiply or heaven forbid divide without a calculator'.
Probably the most respected computer history place at the moment is the Charles Babbage Institute [umn.edu] at the University of Minnesota.
In any case, learn more, subscribe to IEEE Annals of the History of Computing [computer.org], and remember that the dot coms have mostly forgotten/ignored all of this, so you can make money consulting on 'NEW' ideas that are actually old things revisited.
--multics.
hmmm (Score:2)
--
Our university already has such a person (Score:2)
On a more general level, I believe that "computer history" is a job for both CS people *and* historians. Professional historians have learned a few tricks over the years about understanding the past, and trying to write history without their skills leads to amateurish, sloppy work. If historians were trying to use computers for their job, should they get help from an expert or should they try and write the code themselves?
Cringely tells a bit of history (Score:2)
A historian of the computer business has to dig beyond corporate statements.
__
Re:Be an author (Score:3)
Incidentally, Kenneth H. Rosen's 'Discrete Mathematics and Its Applications, Third Edition' (ISBN 0-07-053965-0) provides great computer history-related biographical and historical footnotes. It's also a must-read for its coverage of, um, discrete mathematics.
Indirectly related job (Score:2)
This job consists of being aware of the latest relevant technologies in order to advise corporate buyers about potential updates.
Computer history knowledge is used here to help evaluate how far a product has advanced and to estimate the actual possibilities its use may bring to the company.
Chosen products are then extensively tested and compared to currently used ones before they can be deployed in a production environment.
Of course, I used the word product, but this could also be anything that could have an effect on workers' productivity (a method, etc.).
--
Academic perspective (Score:2)
There is a small, but growing, collection of historians of science and technology exploring the history of computing/computer technology (I'm just halfway through my master's program here: The Institute for the History and Philosophy of Science and Technology [utoronto.ca] at the University of Toronto). There's only a couple of us doing computers, but it's a start :)
You might want to start at the library reading the Annals of the History of Computing [computer.org]. Off the top of my head, Michael Mahoney [princeton.edu] (who started in the History of Mathematics) has done a lot.
Historians of computing have looked at Babbage, Turing, and Wozniak, but you can start just about anywhere. The field has barely been touched - there are plenty of unexplored areas. And the great thing about the history of technology is that everybody can help: from engineers to economists.
Myself, as a recent University of Waterloo [uwaterloo.ca] CompSci grad, I thought I'd return to my roots and write my MA thesis about the early computer science program there. In particular, I'm thinking about looking at the birth of WATFOR and the related successes achieved in undergraduate education. Hint: if you have a story to tell about WATFOR, email me! [mailto]
Re:Come over to the dark side... (Score:2)
If he's talking about personal computer history, he might be less lucky. Most hardware courses feature a history component which is geared toward state-of-the-art-then. If he's teaching at a liberal arts school with an integrative studies program of some kind, he could probably "switch hit", teaching both operating systems and hardware in the CS department, and then teaching (or team-teaching) a "Sociology/History of Computing" class. The sociology of computing angle would be more in tune with personal computer history; the history angle would be more "Turing and the Enigma" perhaps.
In any case, I'd say major in CS and maybe history or sociology, too. (A philosophy major probably wouldn't hurt, either, as long as you read Wittgenstein, Church/Turing, Frege, etc.) Then find a graduate program, get a Ph.D., and get a tenure-track position in a small enough (or forward-thinking enough) department so you can implement your ideas. Once you're tenured, start pushing your more radical ideas.
It's not a fast track, but it's a good track, and will be very rewarding if you stick it out.
~wog