What Can We Learn from the Computers of 1966? (harvardmagazine.com) 61
Harry R. Lewis has been a Harvard CS professor — teaching both Bill Gates and Mark Zuckerberg — and the dean of Harvard College. Born in 1947, Lewis remembers flipping the 18 toggle switches on Harvard's PDP-4 back in 1966 — up ("click!") or down ("CLACK"). And he thinks there's a lesson for today from a time when "Computers were experienced as physical things."
[T]he machine had a personality because it had a body you could feel and listen to. You could tell whether it was running smoothly by the way it sounded...
Unlike the unreliable mechanical contraptions of yore, today's computers — uninteresting though they may be to look at if you can find them at all — mostly don't break down, so we have fewer reasons to remember their physicality. Does it matter that the line between humans and the machines we have created has so blurred? Of course it does. We have known for a long time that we would eventually lose the calculation game to our creations; it has happened. We are likely to lose Turing's "Imitation Game" too, in which a computer program, communicating with a human via typed text, tries to fool the user into confusing it with a human at another keyboard. (ChatGPT and its ilk are disturbingly convincing conversationalists already.)
Our challenge, in the presence of ubiquitous, invisible, superior intelligent agents, will be to make sure that we, and our heirs and successors, remember what makes us human... All computers can do is pretend to be human. They can be, in the language of the late philosopher Daniel Dennett '63, counterfeit humans... The first error is suggesting that computers can be digitally trained to be superior versions of human intellects. And the second is inferring that human judgment will not be needed once computers get smart enough...
[N]o AI system can be divorced from the judgments of the humans who created it... Only hubristic humans could think that their counterfeits might completely substitute for human companionship, wisdom, curiosity, and judgment.
Even back in 1966, Lewis says he learned two lessons that "have stood the test of time. Be careful what you ask them for. And it can be hard to tell what they are doing."
One example? "In those pre-miniaturization days, the ordinary operation of the central processor generated so much radiation that you would put a transistor radio on the console and tune it in between AM stations. From the other side of the room, the tone of the static indicated whether the machine had crashed or not."
Things change, 'basic knowledge' evolves (Score:2, Offtopic)
Today, my understanding of how to start a fire without a kerosene brick and a lighter is purely for entertainment value. I've never started a fire with two sticks and some dried moss, but I MIGHT just be able to in the kind of life or death emergency that is never going to happen to me.
There is no practical point in the average person learning about transistors, or punch cards, or vacuum tubes, or toggle switches. Computers today are so much more complicated that understanding the basics will give you pre
Re:Things change, 'basic knowledge' evolves (Score:4, Insightful)
There are people who treat computers like they were essentially magic. They don't have any skepticism of it, and if the computer gives an erroneous answer or result then they implicitly trust it. Someone wants their social security number, they will hand it over because the computer said to. Now, knowing how a computer works doesn't remove gullibility, but it does help dismiss the feeling that computers are magic.
Now you might think I'm just talking about your average technologically illiterate home user. But I see professional programmers who have a similar deference to a computer's magic qualities. I.e., the compiler is always correct, it does not make mistakes, and my tests showing that there's a bug put them into confusion. Or the bug must be due to my new code because the existing code didn't exhibit bugs before. I have run into many, many cases of programmers making major blunders because they don't understand floating point numbers, how they work, what their accuracy is, how complicated they are to calculate, etc. There's a LOT of cargo cult programming out there, as well as cargo cult testing and design.
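The floating-point surprises mentioned above are easy to reproduce. A minimal Python illustration (nothing here is from the original comment, just the classic example):

```python
import math

# Binary floats cannot represent 0.1 exactly, so an "obvious"
# decimal identity fails -- the kind of blunder described above.
a = 0.1 + 0.2
print(a == 0.3)              # False
print(a)                     # 0.30000000000000004
print(math.isclose(a, 0.3))  # True: compare with a tolerance instead
```

The fix is not to distrust the machine but to know its representation: compare floats with a tolerance, never with `==`.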
Re: (Score:2)
Maybe I'm a gatekeeper, but I think those kinds of people are best harassed until they catch on or get out. /The only magic in computers is the smoke. If you let out the magic smoke, the computer stops working.
1960s software development ideas maybe beneficial (Score:2)
Maybe a few of the 1960s / 1970s / 1980s software development ideas on productive/unproductive software development practices can be used for reevaluating more modern technology...
Maybe magic strings as the core bridge between DOM elements and JavaScript code can be reevaluated...
document.getElementById("customerdata")
Built-in reuse and encapsulation without using heavy 1000+-package frameworks (Angular, React) and without cut-and-paste duplication of DOM/JS code
Re: (Score:2)
No, that smoke coming out is proof of The Theory Of Dark. Lights suck up the dark, send it to powerhouses, and they emit it as smoke. When you break electrical circuits, they leak the dark prematurely.
Re: (Score:3)
I can't believe how uninformed this post is. The magic smoke isn't dark, it's Blue! That's why everyone says You Blue a fuse!
https://en.wikipedia.org/wiki/Magic_smoke [wikipedia.org]
Re: (Score:2)
I must admit to having believed a "computer" once when the answer was clearly wrong. The "computer" wasn't actually a computer, it was a Casio fx-115 calculator (this [ebay.com] model) I bought in the mid-1980s because it had oct/hex/bin capabilities.
I was doing some debugging with another dev who came to my office wanting help. A number had been printed out (I don't recall where/why) in decimal and I wanted to confirm that it was 2^32 but couldn't recall the decimal value of that number. So I asked my calculator whi
Re: (Score:2)
Re: (Score:2)
That's a good observation. I wish I had added it to my comment below about the IBM 1620 teaching me how computers do basic stuff, and programmers who don't have a clue about efficiency. It really does go beyond that, to people who think computers can do no wrong. GIGO is unknown to them, and they'd argue that it's not true.
Maybe the common factor is a lack of curiosity. I remember one programmer who always found the most immediate cause of bugs, with zero curiosity of why the bug had been written that w
Re: (Score:2)
Re: (Score:2)
Back when I worked supporting computers for a defense contractor's labs, some scientists occasionally seemed confused about such things also, as well as some other obvious mathematics. I.e., one was baffled that a test of his program with O(N^3) performance, which ran in 10 minutes, was not completing even after a full day now with (N+1); I think he honestly expected O(N) performance. A different scientist arranged his array access in a way that guaranteed a page fault on every access, and when I said he sh
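The complexity confusion above is worth making concrete. A small Python sketch of cubic scaling (the function name and example sizes are invented for illustration, not taken from the anecdote):

```python
def extrapolate_cubic(t_seconds, n_old, n_new):
    """If runtime is c * N**3, predict the runtime at a new problem size."""
    return t_seconds * (n_new / n_old) ** 3

# Doubling the input multiplies an O(N^3) runtime by 8; a 10x larger
# input multiplies it by 1000 -- nothing like the O(N) intuition.
print(extrapolate_cubic(600, 1000, 2000) / 60)     # 80.0 (minutes)
print(extrapolate_cubic(600, 1000, 10000) / 3600)  # ~167 (hours)
```

The same back-of-the-envelope arithmetic would have told the scientist whether his job could ever finish.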
Re: (Score:2)
New story: a Chemistry Prof I used to support (I used to support only the science and maths profs at a university), when told for the third
Re: (Score:2)
There is no argument in there... (Score:1)
While I agree with most of the conclusions, I have actual arguments for them. The story just appeals to belief and some form of nebulously specified mysticism. A bit more than that is required to be credible.
Re: (Score:2)
who cares about "arguments" if they will never lead to a theory?
Re: (Score:2, Troll)
You came up with auxiliary thought-constructs that make sense to you and are using their claimed existence to assert your superiority, but you are not sharing them.
I'd have to be mentally challenged to take you seriously.
Re: (Score:3)
Re: There is no argument in there... (Score:2)
How so, AI bot?
Lots (Score:5, Interesting)
Being able to deposit into and examine memory via physical switches gives the user a much better idea of how a computer actually works. From there you can look at the schematics for the CPU and memory. Today a CPU is a monolithic chip, but back then it was discrete transistors on flip chips making logic gates. Then a bit later TTL chips became common building blocks.
If you want to experience toggling in bootstrap code via switches and blinky lights you could buy something like this https://obsolescence.wixsite.c... [wixsite.com]
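The deposit/examine cycle is easy to mimic in software. A toy Python sketch (the 18-bit word width matches the PDP-4 in the article; the memory size and interface are invented):

```python
class FrontPanel:
    """A toy front panel: memory words set and read via switch bits."""

    def __init__(self, words=16, width=18):
        self.memory = [0] * words
        self.width = width

    def deposit(self, address, switches):
        """switches: list of 0/1 toggles, most significant bit first."""
        assert len(switches) == self.width
        word = 0
        for bit in switches:
            word = (word << 1) | bit
        self.memory[address] = word

    def examine(self, address):
        """Read a word back as a row of lamp states."""
        word = self.memory[address]
        return [(word >> i) & 1 for i in range(self.width - 1, -1, -1)]

panel = FrontPanel()
panel.deposit(0, [0] * 16 + [1, 1])   # toggle in the value 3
print(panel.memory[0])                # 3
print(panel.examine(0) == [0] * 16 + [1, 1])  # True
```

Toggling in even a short bootstrap this way, word by word, makes it obvious why front-panel veterans remember the ritual so vividly.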
Re: (Score:3)
You know, people who used to toggle in code via the front panel don't recall it being a very fun affair. It was an annoying necessity that had to be done because, despite the computer being expensive, they couldn't put in a simple ROM chip for the CPU to boot from.
So you spent half an hour toggling in the magic incantations so you could boot from the punched card reader or paper tape reader, which would at least have more code so you could boot from tape or disk.
And you had to do that every time you powered up.
Gates the intellectual visionary /s (Score:2)
Re:Gates the intellectual visionary /s (Score:4, Insightful)
Not a quote, although a Google search might turn it up, and I'm sure I have some of the irrelevant details wrong.
He wrote the original BASIC for the IMSAI and/or Altair 8-bit computers (the product that started Micro Soft) by copying a university's BASIC system from a dumpster. Yes, he literally dumpster-dove to copy somebody else's code.
He then complained about all those users who gave friends copies of his BASIC, accusing them of being thieves.
That's his level of software competency.
Re: Gates the intellectual visionary /s (Score:3)
Both the IMSAI and the MITS Altair used 8080 CPUs. If Gates copied the code for his BASIC from a university machine, then that machine used the 8080 too, or something roughly compatible, like the Intel 8008. What was that mysterious university machine?
Yes, I'm aware that Gates designed his BASIC to copy the look and feel of Dartmouth BASIC, but this doesn't mean he stole anything.
Re: (Score:2)
He didn't have to copy it literally character for character. What mattered was not the individual instructions, but the basic (!) flow, how to look up symbols, handle comparisons, store the data. That's not a trivial part of any program.
Re: Gates the intellectual visionary /s (Score:2)
IMHO, what you're describing would be a pure waste of time and effort. The first 8080 systems for which BASIC was supported were extremely limited, having only 4 KB of memory or so. Copying Dartmouth BASIC conceptually would at best result in a large program that didn't fit its target.
Re: (Score:2)
Re: (Score:2)
Why, An Open Letter to Hobbyists [wikipedia.org] is insightful enough. It urges people to respect intellectual property: regardless of the perceived theft of others' IP in Microsoft products, Microsoft still made an effort to create these products, and this effort should be respected. To people accusing Microsoft of theft, Mr. Gates replies that everyone is free to repeat what Microsoft did. Too hard? Then pay for the convenience.
The "640K should be enough for everyone" line is more controversial.
Re: (Score:2)
Interesting Wikipedia article, but it doesn't mention the dumpster diving, which was "common knowledge" at the time. We'll probably never know now if it was true or not.
Re: (Score:2)
Billg 1211!1997 Presentation to Level 14+ MS People [edge-op.org]
“Talk about strategy mostly today, some on public perception.”
“A glimpse, to inspire your confidence in senior management, about how we develop strategy" [Bill rolls video of BillG and SteveB in a VW Golf - a take-off on the VW commercial - with the "da, da, da" soun
Re: (Score:2)
640K is good enough for anybody.
What a load of clap-trap (Score:2)
Hopefully this statement came from the PR tool who wrote this thing. If the statement came from Harry R. Lewis, though, hopefully he's already retired - because he's living in some fantasy world now and has no business making real-world decisions anymore.
Yes, they "felt" different (Score:2)
Using DDT on a PDP-10, a debugger that ran directly in the address space of the subject program, I always had the sensation that I was "inside" the machine. I've never gotten that from ICE or IDE-based systems. It made patching on-the-fly more exciting, too.
... put a transistor radio on the console...
There was a PDP-10 program that would accept music notation and "play" the resulting music on such a radio. It could also print player piano rolls on a plotter, taking paper thickness into account (though one required some skill with an Xacto knife
Dear kids today (Score:5, Insightful)
Dear kids today,
Back in my day, holes were dug by men with shovels. You could tell if they were working efficiently by the smell of the sweat on their bare backs. Some would always break down, a few terminally, but knowing blood went into building a thing makes it human. The mistake is thinking that a mere machine, hydraulic fluid for blood, metal for bones and sinew, could ever build a bridge, road or a house with soul, like a man's hands can.
Sincerely,
Old dude who believes in magic
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
My laptop has absolutely no indicators other than a single one that lights white when it is charging and amber when the battery is critically low. With an SSD you can't even rely on the sound of the drive spinning to tell you it's doing something.
I use Glances [readthedocs.io] to keep on top of memory and CPU usage and to monitor the temperature sensors. I also run Wavemon [github.com] to monitor network IO.
These two applications occupy the bottom half of my "Engineering" workspace (yes, my workspaces are Star Trek themed) and both can
Physical things are satisfying. (Score:5, Informative)
Kids these days will never know the deep satisfaction of hanging up an actual telephone on some arsehole. Slamming down that handset is so much better than pressing the "end call" button.
Put a transistor radio on the console (Score:2)
Re: (Score:2)
Re: (Score:2)
Amen! Das blinkenlights and the sound of the disk drives were vital clues as to the progress of programs.
And as valuable as I think my 1620 learning was, I have no desire to go back those days. It was a great learning tool, but not very productive.
It also made me wary of claims for how good computers are. This 1620 had 20,000 decimal digits, 10,000 characters, and I wrote a poetry generator for it. Most was pure nonsense, but every once in a while, one looked half reasonable, and it was all too easy to
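A poetry generator in the spirit of the one described takes only a few lines in modern Python; the word lists and template below are invented here, not recovered from the 1620 program:

```python
import random

# Invented word lists and a fixed template -- the same idea in
# miniature: most output is nonsense, some looks half reasonable.
NOUNS = ["moon", "river", "engine", "sparrow"]
VERBS = ["sings", "sleeps", "burns", "wanders"]
ADJS = ["silent", "golden", "restless", "pale"]

def poem_line(rng):
    """Fill the template with randomly chosen words."""
    return f"the {rng.choice(ADJS)} {rng.choice(NOUNS)} {rng.choice(VERBS)}"

rng = random.Random(1620)   # seeded so runs are repeatable
for _ in range(3):
    print(poem_line(rng))
```

The occasional plausible line is pure chance meeting a grammatical template, which is exactly why it tempts readers into seeing meaning that isn't there.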
Re: (Score:2)
And thus, TEMPEST was born.
Flipping switches (Score:3)
Nearly 50 years later, I still remember some of the 8080 opcodes from flipping switches to load a bootloader on an IMSAI 8080 my freshman year in college. Pulled my first all-nighter writing a small program to allow typing them in from a terminal. Of course, it wasn't long after that that we got a North Star floppy drive with a ROM bootloader, making all that unnecessary.
It was also a popular thing to get programs that played music on transistor radios placed near the microcomputers of the day, from the emissions...
Duh, I've seen it on TV (Score:3)
My first experience was a field trip to the bowels of IBM's facility in Portland in 1977. A father of a classmate worked there. It really was like what you'd see on TV, depicted as military and bionic and supervillain-lair stuff. It got me started with an insatiable curiosity about how it all worked. I was told some boxes that resembled washing machines were hard drives, and by the time I was able to delve back in, hard drives could fit in your Big League Chew pocket.
IMO, the more foundational information one has, the better equipped you are to frame the higher, abstract stuff. In any case you can recognize and hopefully respect the complexity of the level you choose to settle at, interest-wise. I started by learning binary. Gotta start somewhere. If we're just talking industrial curiosities, have fun with niche interests! I'm not going to piddle in your cereal.
Humans still stand in the way (Score:2)
I was watching a speech that Steve Jobs gave in the early 80s and noted the story of how Apple wanted to put one computer in every school. The motivation, he said, was to give curious students a chance to get their hands on one. As someone who has been fortunate to be in proximity to computers in school since 1978, I can tell you that such access was always, ALWAYS, restricted by someone. The curious never really got unfettered access. Even as the technology progressed and became cheaper, access was still
Re: (Score:2)
We had Apple IIs in my elementary school in Aptos, CA. I can't remember if there were two or four of them in the library, but I was the only one who ever used them. I was able to pick through the floppies and load what I wanted at lunch recess time. Played a bunch of Oregon Trail, Mickey's Space Adventure, Carmen Sandiego, etc. etc.
efficiency (Score:4, Interesting)
Most of today's programmers are idiots, incapable of actually writing real code and certainly not capable of making efficient use of a computer. On the old hardware, you wrote your own code - and you fully understood it and could debug it. Too many today slop together piles of code written by others and harvested from the internet, and add some garbage code to link the borrowed hunks together - they do NOT fully understand what they have supposedly written, cannot properly debug it, and can NEVER get rid of all the bugs. NOBODY today is writing code that maximizes the use of the hardware, instead they are counting on gigabytes of memory and multi-core multi-gigahertz processing to make their shoddy sloppy claptrap "code" run at a usable speed.
It takes an entirely different caliber of programmer to write full applications that run in 16K or 32K of RAM at 2MHz than it takes to write an "app" that needs 2GHz and 32GB.
The very sad part of all of this is that the programmers working on the old hardware were coding in more primitive languages (often assembly) often with simple text editors (no suggestions, no code coloring, no API reference stuff, no language-specific plugins, etc) and doing manual debugging - not a single IDE in sight. Today's programmers are mostly incapable of operating in such an environment as they (using tools early coders could barely dream of) write worse code.
Re: (Score:2)
I agree with you to an extent. But efficiency comes in different flavors. A program which runs half an hour a week but takes a year to write well is not necessarily better than a program which takes an hour to run but was written in a week.
Bullshit Re:efficiency (Score:3)
The difference is software complexity. What people wrote that ran in 16K or 32K of RAM was so much simpler than the things done today. These days you can hammer out in a day something that's more complex than everything you did as a programmer 40 years ago combined.
Did doing dry runs on paper for mainframe code that you knew you needed to submit to get compiled make us better programmers? Did debugging on paper, or with prints because there were no debuggers, make us better programmers? I don't think so. The fact
What I learned most from all dass blinkenlight (Score:3)
I learned to program on an IBM 1620 Mod I Cadet (Can't Add, Doesn't Even Try -- it added with a table lookup -- 1xy, or 2xy if there was a carry in). Hundreds of lights, you could single step through every single memory load and store (6 cycles just to read a 12 digit instruction), and what I got the most out of it was how computers do things. I've interviewed programmers who measured their program efficiency in lines of code, as if combining two statements on one line with a semi-colon between them made it faster, as if function calls were equally expensive as adding two variables. Most aren't like that, but too many were. Computers were a black box mystery and the inner details of no consequence. Cache memory, slow memory, opening and closing files for no reason, with absolutely no comprehension that system calls were expensive.
In the long run, it really doesn't matter all that much, since computers keep getting faster, optimizing compilers can work around most of that, and they can certainly optimize better than humans for modern computers. But I've always liked knowing my tools. It's like using a modern drill and not knowing what the different settings do, or trying to hike with a 50 pound pack in flip flops. Sure, hiring contractors and driving cars make that knowledge useless, but ....
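The 1620's lookup-based addition ("1xy, or 2xy if there was a carry in") can be sketched in Python; the table below is keyed by (carry, x, y) as a simplification and is not the machine's real memory layout:

```python
# Precompute every single-digit sum, indexed by (carry-in, x, y) --
# a rough stand-in for the "1xy or 2xy" address trick described above.
ADD_TABLE = {(cin, x, y): divmod(cin + x + y, 10)
             for cin in (0, 1) for x in range(10) for y in range(10)}

def add_decimal(x_digits, y_digits):
    """Add equal-length little-endian decimal digit lists by pure lookup."""
    result, carry = [], 0
    for x, y in zip(x_digits, y_digits):
        carry, digit = ADD_TABLE[(carry, x, y)]
        result.append(digit)
    return result, carry

# 99 + 2 = 101: digits are little-endian, final carry is the overflow
print(add_decimal([9, 9], [2, 0]))   # ([1, 0], 1)
```

No arithmetic circuit is involved at all: the machine "can't add" because every sum is just a memory fetch.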
Nothing. (Score:2)
Let's be real. There's nothing to learn from them. They sucked.
Talk about touchy feely (Score:3)
Re: (Score:2)
I see no objective reason why electronic devices cannot perform all neural activities. If you want to claim humans have souls, or an existence somehow above or beyond the physical, you should do so openly rather than make touchy-feely comments. To me this is reminiscent of the time when creating organic compounds was thought to be beyond human capability.
I would take as an objective reason that we see zero electronic devices in nature which can perform any neural activities at all. Organic compounds are formed and transformed all of the time by natural processes. There are very few areas in which human ingenuity has reached anything near parity with natural systems, possibly none.
Nothing to be learned. Just an AI comment. (Score:2)
Honestly, that's what we can learn? Aren't there like another million attempted arguments for the same thing?
Pre-Napster... (Score:3)
"...put a transistor radio on the console and tune it in between AM stations. From the other side of the room, the tone of the static indicated whether the machine had crashed or not."
We would write programs specifically to play music through the radio.
Linc 8 (Score:3)
Analog History (Score:2)
On DECsystem-10s you hooked your console up with RS-422 or 20 mA current loop, usually to an ASR-33 teletype. Anytime a timesharing user sent a message to an operator, it would show up on the console. Send enough Control-Gs (bells, for you noobs) on a console connected via current loop and you'd crash the system, because a dedicated hardwired console was necessary and there was a circuit fault in the design. The ASR-33s were noisy and the bells were especially loud. DEC came up with an ECO that fixed it eventually.
Get off my lawn! (Score:2)
Kids nowadays! They have no idea how much work it is to have a well trimmed lawn.