
"Tech Heroes" From Ada Lovelace to Jamie Z

Posted by CmdrTaco
from the all-know-kung-fu dept.
An anonymous reader writes "The Web 2.0 Journal has launched a search for what it calls "the all-time heroes of i-Technology" (its own shorthand for 'Internet technologies'), reaching as far back as The Countess of Lovelace, though whether or not Ada Lovelace is truly the first programmer is not discussed. As an exercise in reminding ourselves whose shoulders we are standing on when hurtling towards the 21st-century, richer Web, it's not a bad start. Naturally there are sins of omission..."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • well (Score:4, Funny)

    by macadamia_harold (947445) on Sunday February 04, 2007 @10:24AM (#17880808) Homepage
    The Web 2.0 Journal has launched a search for what it calls "the all-time heroes of i-Technology"

    In the search for heroes, they should talk to a Mr. Mohinder Suresh. I hear he has a list.
  • by Anonymous Coward on Sunday February 04, 2007 @10:24AM (#17880810)

    a Web 2 "journal" that doesn't even validate and uses tables for presentation (not to mention 20+ adverts per page), spread over 18 pages

    if that's what Web 2 is all about, I'm dreading Web 3
    • by rucs_hack (784150)
      Jesus, that site is horrifically designed. The first thing I got was an auto-playing video, then a floating advert that followed me down the screen.

      I may be wrong, but this strikes me as 'hey, let's make something Slashdot might put up and fill it with adverts'. What a heap.

      Oh, and web 2.0 is, so far as I've been able to tell, all about making money, and that means advertising, so yes, expect worse to come.
      • Re: (Score:3, Insightful)

        by QRDeNameland (873957)

        Even worse, that crap pops up even if you have Adblock on.

        Despite that, I hope whoever invented Adblock is on the list. My vote for best technology of the "Web 2.0" era, by far.

        • by rucs_hack (784150)
          I do have Adblock, which was why it was so shocking; normally I don't see the cruft of the interwebs
          • Re: (Score:3, Funny)

            by jpardey (569633)
            Me too. I decided to see if it was doing anything or not, so I popped open the sidebar and saw it had already blocked about 20 items! I added an exception for the video ads, but I hope I never go to that site again.
    • This is by far the worst one.
    • by fm6 (162816)

      Why is that ironic? "Web 2.0" is about hyping "interactive" web applications — most of which are badly designed.

      Can someone explain to me why Jamie Z is a hero? I only know him from reading his comments in the Netscape keyboard resource file when I was trying to get the browser to behave under Linux. These left me with a permanent dislike for the dude: instead of explaining the format of the file, he put in lengthy sarcastic (and misinformed) rants about the "mistakes" made by various Unix vendors i

      • The word "hero" should of course be used sparingly, and probably not in conjunction with "tech", but JWZ holds his place among the Big Hackers, IMHO. Some of his accomplishments, in no particular order:
        • XEmacs. He was one of (the?) main people making a user-friendly version of GNU Emacs.
        • XKeyCaps. This little application has really helped me getting a sane keyboard layout under X a few times.
        • Mosaic. I believe he was the main hacker on the Unix version of the first "real" browser. And one of the first emp
        • by Ilgaz (86384)
          Don't forget XScreensaver, and he is a much more social/responsive guy compared to some other people of that stature.

          Also interesting: his club runs OSS (as much as possible) and he shares the stuff. The club is run much like an open source project, with even the deepest details shared, including how to get a license for a club, as well as the source for the software that makes some things run.

          http://www.dnalounge.com/backstage/ [dnalounge.com]
        • by fm6 (162816)
          None of which is all that impressive. XEmacs is just a GUI-aware port of Emacs (I wouldn't call it "user friendly".) XKeyCaps is useful (or was, when he was still maintaining it), but not a major achievement. I don't believe JWZ contributed that much to the creation of Mosaic or Netscape; certainly his comments in the resource files that I mentioned suggest he'd be more of a hindrance than a help in a major software project, whatever his technical talents.
  • by antifoidulus (807088) on Sunday February 04, 2007 @10:37AM (#17880864) Homepage Journal
    Yeah, like no CowboyNeal option!
  • Web 2.0 Journal? (Score:5, Informative)

    by matt me (850665) on Sunday February 04, 2007 @10:40AM (#17880878)
    A journal with that name just has to be a joke. Yes I did try to read the fucking article, but it was obscured by a large photograph of a bridge. I guess this was an advert.

    Well I'm glad to see this web 2.0 is so user friendly.
    • I did try to read the fucking article, but it was obscured by a large photograph


      Think of it this way: you were looking at page 1. Of 22. Now, do you feel better? If you *had* read the fucking article, you would have had to click 22 times on that "close this window" button. That's what you get when you try to read an article about the inventor of Ada, the most overhyped language until Ruby.

      • by prandal (87280)
        Except that "close that window button" is not accessible if you're on an 800x600 display. And when you try to scroll the page that flipping ad moves with you.

        Consigning "Web 2.0 Journal" to the trashcan where it so obviously belongs.
      • Ada and Ruby (Score:5, Interesting)

        by krischik (781389) <`krischik' `at' `users.sourceforge.net'> on Sunday February 04, 2007 @11:34AM (#17881146) Homepage Journal

        Ada, the most overhyped language until Ruby.
        Ada was not overhyped - Ada delivered everything it promised. Ada was rather underestimated by those who never learned Ada.

        Of course that was the problem: when Ada came out, only very powerful systems were able to run an Ada compiler, so not many programmers could actually try the language.

        But that's not a problem any more; grab yourself an open source Ada compiler [1] and see for yourself.

        As for Ruby: that seems a nice enough language as well. It's never given me any problems. So where actually is your problem?

        Martin

        [1] http://en.wikibooks.org/wiki/Ada_Programming/Installing [wikibooks.org]
        • by PCM2 (4486)

          Of course that was the problem: when Ada came out, only very powerful systems were able to run an Ada compiler, so not many programmers could actually try the language.

          Actually, the problem as I understood it was that Ada compilers were required to undergo a strict certification process, managed in part by the U.S. government. This process was very expensive for compiler vendors, and the compilers were therefore themselves very expensive. Getting your hands on an Ada compiler cost several thousand dollars, c

          • by Archtech (159117)
            The cost of Ada compilers should not have been a problem, but a valuable feature. Ada was not meant to be used for writing games, trivial utilities, or ephemeral database apps a la VB. Instead, it was designed for making software as reliable and error-free as possible, so that it could be used in support of business, or in other important/critical applications. When you fly the Atlantic, do you try to do so using a paper aeroplane you whipped up yourself with parts from a DIY store, or do you pay for a flig
          • Indeed they had - but that too has been abolished. Getting an Ada compiler costs you $0 these days (thanks to the US Navy sponsoring one). And it is not a crippled compiler like the Delphi for home users. It's fully functional, with an IDE (also fully functional) and even a CORBA ORB. Ada has learned from its mistakes.

            However, there is an advantage here as well: all compiler vendors - even today - comply with the standard. There is an ISO standard describing the official test harness.

            All unlike C/C+
    • I do so like the Firefox Nuke Anything Enhanced [mozilla.org] extension. I don't use it often, but for web sites like TFA it is nice to have its "remove this object" choice on the right click menu.

      That said, you didn't miss much by not RTFA. I waded through the first few paragraphs, but stopped when I realized that the author was in love with the English language, but not in a healthy way...

    • Re: (Score:3, Informative)

      by PCM2 (4486)

      A journal with that name just has to be a joke.

      Sys-Con Media is known for this sort of thing. They whip up publications devoted to the latest trends, then scrap them when the ad dollars dry up.

  • They forgot one (Score:5, Informative)

    by TodMinuit (1026042) <todminuit&gmail,com> on Sunday February 04, 2007 @10:42AM (#17880890)
    Douglas Engelbart [wikipedia.org], the true father of desktop computing. At a time when computers were used merely for data processing, he envisioned they could be used in everyday life.
    • Re: (Score:3, Funny)

      by Anonymous Coward
      So he was the first person to put pr0n on a computer?
  • Woaaah (Score:2, Funny)

    by Anonymous Coward
    Ad gangbang!!!!!

    I can't believe it, gazillion ads on one page (they topped tom's hardware)
  • Ouch! (Score:4, Funny)

    by JamesTRexx (675890) <m@nystrom.mbitz@nl> on Sunday February 04, 2007 @10:47AM (#17880908) Homepage Journal
    when hurtling towards the 21st-century, richer Web

    I think I'll stick to plain HTML 4.01 if Web 2.0 is going to hurt that much.
    • by CerebusUS (21051)
      Also, aren't we already in the 21st century? shouldn't it be "through" instead of "towards?"
  • Misplaced credit (Score:1, Interesting)

    by Anonymous Coward
    My question is: Who has been given credit for things that other people invented? Who are the unsung heroes and who are the rip-off artists?

    For instance, I always gave credit for the invention of spread spectrum to Hedy Lamarr (a movie star). Then I found this little gem:
    "Frequency hopping spread spectrum was a public domain idea by 1917. The Germans used it in WWII. Hitler wanted to win by bluff and before the war started, invited public figures from England and the US to see how invincible his military w
    • The Lamarr Patent (Score:4, Informative)

      by westlake (615356) on Sunday February 04, 2007 @01:18PM (#17881772)
      When the group got back to the US, they applied for a patent and possibly as a joke put only Hedy's name on it.

      Lamarr was in Hollywood in 1937.

      U.S. Patent Number 2,292,387, August 11th, 1942, [was awarded to Hedy Lamarr] under the name 'Hedy Kiesler Markey' (her married name) and George Antheil, for a 'Secret Communications System.' Nomination for the EFF Pioneer award [ncafe.com]

      Lamarr's first husband was an independent munitions maker interested in control systems whose European properties were confiscated by the Reich in 1938. George Antheil was an avant-garde composer interested in the related problem of synchronizing non-traditional "instruments" in concert performance. Advanced Weaponry of the Stars [americanheritage.com]

      Hitler wanted to win by bluff and before the war started, invited public figures from England and the US to see how invincible his military was.

      Hitler was always alert to the propaganda value of massive displays of troops and guns and planes.

      But he was not such a fool as to prematurely expose the secret technologies of jet propulsion, radar, guided missiles, the Enigma, etc., that, in the end, might prove decisive.

      • by kfg (145172)
        U.S. Patent Number 2,292,387, August 11th, 1942, [was awarded to Hedy Lamarr] under the name 'Hedy Kiesler Markey' (her married name) and George Antheil. . .

        I knew how much to trust the guy the second he said this:

        ". . .as a joke put only Hedy's name on it."

        KFG
    • by Picass0 (147474) on Sunday February 04, 2007 @09:28PM (#17884912) Homepage Journal
      I don't know James Long, Ph.D., but he seems to have an ax to grind. Most people who met Hedy Lamarr would verify she was extremely intelligent. Her husband in the early 30's, Fritz Mandl, was an engineer and producer of aircraft, artillery, and early weapons guidance systems. It would appear Hedy learned a thing or two during their time together.

      There are many accounts of Lamarr explaining the process by which she and George Antheil invented the concept of frequency hopping. At the outbreak of WWII, Hedy had an idea for a torpedo guidance system. Antheil suggested a way to sync the necessary systems together using a roll of punched paper (as in a player piano).
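      The synchronization trick described above can be sketched in a few lines of Python. This is a loose modern analogy, not the patent's actual mechanism: a shared PRNG seed stands in for the matched punched-paper rolls, and the seed value, hop count, and 88 channels (echoing the 88 keys of the player-piano mechanism) are illustrative choices, not figures from the patent.

```python
import random

def hop_sequence(shared_seed, channels, n_hops):
    """Derive a pseudo-random frequency-hop schedule from a shared secret.

    Both sides step through an identical, hard-to-predict sequence of
    channels, the role played by the matched paper rolls in the
    Lamarr/Antheil design.
    """
    rng = random.Random(shared_seed)
    return [rng.choice(channels) for _ in range(n_hops)]

# 88 channels, echoing the 88 keys of the player-piano mechanism.
CHANNELS = list(range(88))

transmitter = hop_sequence(shared_seed=1942, channels=CHANNELS, n_hops=20)
receiver = hop_sequence(shared_seed=1942, channels=CHANNELS, n_hops=20)

# Both ends agree on every hop, so the receiver can follow the signal...
assert transmitter == receiver

# ...while a jammer parked on any single channel misses most hops.
jammed_channel = transmitter[0]
hits = sum(1 for ch in transmitter if ch == jammed_channel)
print(f"jammer hits {hits} of {len(transmitter)} hops")
```

      The point of the sketch is only the agreement property: without the shared secret, an eavesdropper or jammer cannot predict where the signal will be next.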

  • Claude E. Shannon (Score:5, Insightful)

    by z-man (103297) on Sunday February 04, 2007 @11:06AM (#17880974)
    How is it possible to create a list of the most important people in technology throughout history and _not_ include Shannon? Jeez, the guy is the father of information theory and digital circuit design!
    • Dubious paternity (Score:3, Informative)

      by Anonymous Coward
      "In 1924 and 1928, Nyquist and Hartley published the limits to communication over a noisy channel. In 1949, Shannon and Weaver published a book on the same subject. Shannon got the credit for Nyquist's and Hartley's work. He also claimed the 34-year-old sampling theorem as his own work.
      H. Nyquist, "Certain Factors Affecting Telegraph Speed," Bell Systems Tech. Jour., vol. 3, April 1924, p. 324
      H. Nyquist, "Certain Topics in Telegraph Transmission Theory," A.I.E.E. Trans., vol. 47, April 1928, p. 617
      R. V. L. Ha
      • Re: (Score:1, Informative)

        by Anonymous Coward

        Are you sure about that, or are you just believing that source that you've seen? I ask, because there doesn't seem to be much else out there implying that Shannon dubiously appropriated the work of Nyquist and Hartley and passed it off as his own original work. It would be scandalous if that were the case.

        As I recall from reading Shannon's paper years ago, Shannon does reference (rather than appropriating the work of) Nyquist in his 1949 paper, and what is generally regarded as his original contribution,

        • by Anonymous Coward
          http://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theorem [wikipedia.org]

          The above linked wiki article is excellent and shows the relation between Nyquist, Hartley and Shannon. AFAICT, you could make the argument either way.

          A similar question might be: Who is the father of radio? Marconi? Maxwell?

          Who discovered the electron? The ancient Greeks? Stoney? Thomson?

          In attributing credit for something, the guy favors the first one to posit an idea even if the practical implementation came much much later. He points out that
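          For anyone curious what the theorem under discussion actually says, the Shannon–Hartley capacity C = B log2(1 + S/N) is easy to evaluate. A minimal Python sketch, where the telephone-line figures (3 kHz, 30 dB) are standard textbook illustrations rather than anything from TFA:

```python
import math

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """Channel capacity in bits/s: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A plain telephone line: roughly 3 kHz of bandwidth at 30 dB SNR.
snr_db = 30
snr_linear = 10 ** (snr_db / 10)  # 30 dB is a power ratio of 1000
capacity = shannon_hartley_capacity(3000, snr_linear)
print(f"capacity ~ {capacity:.0f} bit/s")  # about 30 kbit/s
```

          That "about 30 kbit/s" ceiling is why analog voiceband modems plateaued in that neighborhood: the channel itself, not the hardware, sets the limit.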
  • by allikat_uk (1058258) on Sunday February 04, 2007 @11:11AM (#17881002)
    How could they forget Alan Turing? [wikipedia.org] The inventor of the Turing test for AI, and father of the modern computer?
    • by rucs_hack (784150)
      He was not the father of the modern computer at all. Dr Tommy Flowers was http://en.wikipedia.org/wiki/Tommy_Flowers [wikipedia.org] . He created Colossus, so in every sense *he* was one of the fathers of modern computing. However, he was constrained by the Official Secrets Act from ever discussing his creation, so his contribution was forgotten.
      Turing knew how to use Colossus, and did some very impressive things. Certainly he could be assigned the title of father of AI, but not of modern computing, by a long shot. There are people
      • Re: (Score:2, Informative)

        by Tablizer (95088)
        He was not the father of the modern computer at all. Dr Tommy Flowers was

        It appears they worked together, so it is hard to say. Turing used electro-mechanical relays, and Flowers replaced the designs with vacuum tubes because of his experience in phone systems. Thus, he may have simply "upgraded" the switches to faster technology rather than reinvent the entire computer design itself.
                         
        • by rucs_hack (784150)
          Dr Flowers actually built Colossus away from Bletchley Park, and away from Turing. Not that I disagree that Turing may have dabbled, but he was not there when Colossus was constructed.
          • by Tablizer (95088)
            Dr Flowers actually built Colossus away from Bletchley Park, and away from Turing. Not that I disagree that Turing may have dabbled, but he was not there when Colossus was constructed.

            But Flowers was mostly interested in increasing the speed via tubes instead of mechanical switches. He was not trying for a revolutionary kind of device or technique, other than using tubes. Sure, the difference in technology may have resulted in some innovations, but the concepts were not really different, it appears.
            • by rucs_hack (784150)
              I was more meaning that he personally financed and undertook the construction of the world's first computer, and yes, it was him who designed it; he was one smart cookie.
              • by Tablizer (95088)
                I was more meaning that he personally financed and undertook the construction of the world's first computer, and yes, it was him who designed it; he was one smart cookie.

                I guess it depends on how one defines "computer". Turing's mechanical devices were considered "computers". They just were not electronic. I guess you could say Flowers invented the *electronic* computer. But that is a different credit than inventing the computer. That credit would probably go to Mr. Pascal. The other invention that is uncl
        • Flowers replaced the designs with vacuum tubes

          Ahhh.. so he's the guy who invented the internet, then?
      • by Archtech (159117)
        1. Colossus (one "l" only) was not a computer in the modern sense of the word, although it was an important precursor.

        2. Turing's greatest contribution was theoretical, and dates back to his paper "On Computable Numbers, with an Application to the Entscheidungsproblem", which was written and published in 1936.
    • by Falladir (1026636)
      They didn't. He's third from the end as of 12:27, 04 Feb 2007.
    • by eokyere (685783)
      sorry, but Mauchly and Eckert got there first, not Turing ;P
      • by Tablizer (95088)
        sorry, but Mauchly and Eckert got there first, not Turing

        IIRC, Mauchly and Eckert got credit for the "stored program", that is, storing the program in memory rather than on wire-boards. But that is a different issue than "first electronic computer".

               
    • Turing on the list [web2journal.com]. Guess you must've missed that one...

    • did they get Tommy Flowers [wikipedia.org], who built Colossus?
  • Missing pair (Score:3, Informative)

    by Raul654 (453029) on Sunday February 04, 2007 @11:15AM (#17881022) Homepage
    At the risk of stating the obvious, the list is missing John Bardeen [wikipedia.org] and Walter Brattain [wikipedia.org], the guys who invented the transistor (With their manager, William Shockley, they won the Nobel prize in physics for it).
  • by nomadic (141991) <nomadicworldNO@SPAMgmail.com> on Sunday February 04, 2007 @11:19AM (#17881048) Homepage
    Andy Hertzfeld: Eazel developer and Macintosh forefather

    Jean Ichbiah: Creator of Ada

    Grace Murray Hopper: Developer of the first compiled high level programming language, COBOL

    Jordan Hubbard: One of the creators of FreeBSD; currently a manager of Apple's Darwin project

    Jean D Ichbiah: Principal designer, Ada language (1977)

    Ken Iverson: Inventor of APL, later J


    I've never used ADA, is it really so good that its inventor had to be listed twice in the same list?
    • Re: (Score:3, Informative)

      by NoNeeeed (157503)
      It's apt... The first mention was the specification, the second was the implementation body. Welcome to the world of Ada :)

      Paul (Ex-Ada coder)
      • The first mention was the specification, the second was the implementation body

        In the list posted (I, obviously, didn't RTFA), the first listing was as 'creator' and the second as 'designer' of ADA. This sounds more like the first listing is for implementation while the second is for design.

        This sounds like it would be a little bit more appropriate for Java or C# than ADA...

    • by PCM2 (4486)

      I've never used ADA, is it really so good that its inventor had to be listed twice in the same list?


      Though the Ada language seems destined to be forgotten, at one time the U.S. Department of Defense required that any significant code written for DoD projects be written in Ada.

  • Article text (Score:2, Informative)

    by Anonymous Coward
    Who Are The All-Time Heroes of i-Technology?

    I wonder how many people, as I did, found themselves thrown into confusion by the death last week of Jean Ichbiah (pictured below), inventor of Ada.

    Learning that the inventor of a computer programming language is already old enough to have lived 66 years (Ichbiah was 66 when he succumbed to brain cancer) is a little like learning that your 11-year-old daughter has grown up and left home or that the first car you ever bought no longer is legal because it runs on ga
    • There's a list about "i-Technology" and neither Jonathan Ive [wikipedia.org], nor even Steve Jobs is anywhere to be found!?
    • by mad.frog (525085)
      today Gay still guides Adobe's Flash's development

      No he doesn't. He hasn't been at Adobe for a long while now, and in fact, he and Robert Tatsumi have formed a new startup with other notable ex-Flash engineers.
      • Is there any way we could gather up all these ex-Flash engineers and do something terrible to them, like make them browse the web for four years without any Flash extensions? Or, maybe, force them to release the source for Flash rendering and submit the Flash formats to an open standards body?

        Until then, they really shouldn't even be mentioned in an article about 'Tech Heroes.'

        • by mad.frog (525085)
          Sigh... If only there was a "Mod: -1, Dumbass"...

          Look, tell you what: as soon as you conceive of and write a bit of code that is installed on a few hundred million machines around the world, and ends up producing a multibillion dollar corporate merger, let us know, OK?

          You may find Flash's success to be annoying to your ideology, but the monstrous technical success it's had, and failure of competing technologies, leaves no room for argument here.
          • We could substitute in 'ActiveX' or any number of other registered trademarks for 'Flash' and have a wholehearted discussion here, I guarantee.

            For God's sake, you make 'ideology' out to be a nasty word.
  • Heroes (Score:2, Insightful)

    by alexj33 (968322)
    With the patenting of other people's ideas, Microsoft could be the "Sylar" of Tech Heroes.
  • For crying out loud I hope that list was not supposed to be in order of importance.
  • Vannevar Bush (Score:4, Informative)

    by Aphrika (756248) on Sunday February 04, 2007 @11:34AM (#17881154)
    He's [wikipedia.org] an absolutely huge omission from the list.

    If you're unaware, he wrote an essay in 1945 titled 'As We May Think' [theatlantic.com] which laid down a lot of seminal ideas about information, computing devices (the Memex [wikipedia.org]) and the way in which we interact with them - specifically the concept of hypertext.

    If you haven't already read his essay, give it a shot. Along with Alvin Toffler's book 'Future Shock', this changed the way I view technology forever... oh, and stick Alvin Toffler on the list too, plus Bill Gates for 'commoditising' the PC, Gordon Moore, pretty much anyone who ever worked at Xerox PARC, and the guy who invented the MP3 codec. They're all important to why we're sat here today.
    • by pjones (10800)
      Bush's memo/article (published originally in The Atlantic Monthly) did have an effect in America. But the ideas in it as regards hypertext are far from original.

      From 1937, HG Wells' essay/lecture "The World Brain: The Idea of a Permanent World Encyclopedia" reflects a more accurate version of what we now call the World Wide Web. Bush's hypertext was mostly personal and barely social. http://sherlock.berkeley.edu/wells/world_brain.html [berkeley.edu]

      And even more important was Emanuel Goldberg, who actually had the machine
  • Bill Gates .. (Score:1, Offtopic)

    by rs232 (849320)
    Bill Gates for single handedly creating the Desktop computer, the GUI, the Web and the Internet .. :)
  • by Anonymous Coward
    By the time I logged in to read a Slashdot article about the creators of the Internet, not a single Al Gore joke had been posted.
    • by Kingrames (858416)
      I think more and more people are finally seeing his movie and getting the crap scared out of them.
      After I saw it I was eager for anything to cheer me up.
  • Clean link (Score:3, Insightful)

    by Sax Maniac (88550) on Sunday February 04, 2007 @01:05PM (#17881684) Homepage Journal
    YARGH! Mine eyes and ears are bleeding! This one even stumped adblock with filterset G. Here's the print version: http://web2journal.com/read/331813_p.htm [web2journal.com]

    We need a tag for "loaded up with ads to the point where you can't even RTFA if you wanted to", but I can't think of anything pithy. "adsoup"?
  • Some people are on his list just because they hold ranking positions at big companies, not for what they did.

    How the hell did Bill Gates get on a list with Vint Cerf, Jon Postel, Robert Metcalfe, and Niklaus Wirth? All he did was single-handedly pollute the Internet with spam and lower IT standards to the point of making IT the laughing stock of the technology sector. Truly an intellectual midget among giants.

    • by kfg (145172)
      Jano, Columbo and Lampredi were the engineering geniuses and Chinetti the marketing genius; but Ferrari is the one that had his name on the cars and the bulk of the subsequent biographies.

      Such is the way of the brand driven world.

      KFG
    • by westlake (615356)
      How the hell did Bill Gates get on a list with Vint Cerf.. All he did was single-handedly pollute the Internet with spam and lower IT standards to the point of making IT the laughing stock of the technology sector

      Bill Gates is a "laughing stock" only to the proto-Geek who laughed at the Model T Ford, so much less elegant a solution than the Stanley Steamer. But you could "afford a Ford" and so the Ford became ubiquitous.

      The PC is everywhere for the same reason that paved roads are everywhere. The market b

      • by chromatic (9471)

        The PC is everywhere for the same reason that paved roads are everywhere. The market became big enough and strong enough to bear the cost. That is Gates's achievement.

        Bill Gates worked at IBM, on the PC? That was his idea?

        • by mabinogi (74033)
          IBM's idea certainly wasn't a machine everyone could afford.

          If anyone is to get the credit for that (on the x86 side of things), then it's probably Compaq.
      • by MECC (8478) *
        Bill Gates is a "laughing stock" only to the proto-Geek who laughed at the Model T Ford

        There are so many of them still around. Actually, it's not BG who's the laughing stock - it's the "IT" sector that's a laughing stock to every other engineering and technical discipline. Almost entirely because of the phrase "Microsoft standard".

  • I wonder why people seem to forget the inventions of Douglas Engelbart. "What did he do?", you might ask. Or maybe you say something like "oh, the mouse guy, right?". Well, if I were only to point out one thing he did, I would mention what we call "the mother of all demos", which he gave in December 1968. There he demonstrated the use of a mouse, hypertext linking and video conferencing. Again: he demonstrated the use of a mouse and hypertext linking in documents more than 20 years before Tim Berner
  • by www.sorehands.com (142825) on Sunday February 04, 2007 @03:13PM (#17882440) Homepage
    I'd include Steve Wozniak. He was the one who designed the Apple. The Apple II and the Trash-80 were the real home computers available to the masses. The earlier computers, which you had to get from Heathkit or toggle in your boot loader, didn't quite make it in the home or the business.

    Also, I would add Jonathan Rotenberg. He founded the Boston Computer Society [wikipedia.org] in 1977. The BCS served as an incubator for new products and companies. Many of the large computer companies made presentations and announcements to the BCS. Several companies used groups of people at the BCS as a source for focus groups and for beta testing (back in the days when they didn't consider customers their alpha testers).

    • Woz was a genius of simplicity, but how come nobody ever gives the TRS-80 people credit for their, uh.. brilliant way of criss-crossing the address and data bus almost entirely unbuffered across the keyboard layout?

      Oh, never mind.

      (My personal and somewhat meagre innovation from the same era was using the degrees/radian slide switch on the SR-56 calculator as a hardware interrupt)
  • Inventor(?) of the "Ctrl-Alt-Del" key combination.


    I'm afraid the identity of the "Any" key creator (possibly the most useful one in all computing) has been lost to history.

  • Why are we acknowledging this article? Any site that refers to Web 2.0 as anything other than a stupid marketing buzzword has no clue who the real IT heroes are. How about a hurrah for the poor sap working the graveyard shift in the NOC, or the overworked sysadmin who needs to restore a server or correct daemon errors every time the hyped-up "Web 2.0" services break?
  • If anyone missed it, this is what the article looks like in FF 2.0: The most horrible adsoup of Web 2.0 [b166er.com]
  • It was nice to see Stewart Brand (and Larry Brilliant) there, the founders of the Well. But, Jeremy didn't mention the _author_ of the Well (PicoSpan), Marcus Watts. (A friend of mine.)
    1. The progenitor of "Internet glue" [wikipedia.org]
    2. Internet, IANA guy [wikipedia.org]
