Intel

The Chip That Changed the World (wsj.com) 97

The world changed on Nov. 15, 1971, and hardly anyone noticed. It is the 50th anniversary of the launch of the Intel 4004 microprocessor, a computer carved onto silicon, an element as plentiful on earth as sand on a beach. Microprocessors unchained computers from air-conditioned rooms and freed computing power to go wherever it is needed most. Life has improved exponentially since. From a report: Back then, IBM mainframes were kept in sealed rooms and were so expensive companies used argon gas instead of water to put out computer-room fires. Workers were told to evacuate on short notice, before the gas would suffocate them. Feeding decks of punch cards into a reader and typing simple commands into clunky Teletype machines were the only ways to interact with the IBM computers. Digital Equipment Corp. sold PDP-8 minicomputers to labs and offices that weighed 250 pounds. In 1969, Nippon Calculating Machine Corp. asked Intel to design 12 custom chips for a new printing calculator. Engineers Federico Faggin, Stanley Mazor and Ted Hoff were tired of designing different chips for various companies and suggested instead four chips, including one programmable chip they could use for many products. Using only 2,300 transistors, they created the 4004 microprocessor. Four bits of data could move around the chip at a time. The half-inch-long rectangular integrated circuit had a clock speed of 750 kilohertz and could do about 92,000 operations a second.

Intel introduced the 3,500-transistor, eight-bit 8008 in 1972; the 29,000-transistor, 16-bit 8086, capable of 710,000 operations a second, was introduced in 1978. IBM used the next iteration, the Intel 8088, for its first personal computer. By comparison, Apple's new M1 Max processor has 57 billion transistors doing 10.4 trillion floating-point operations a second. That is at least a billionfold increase in computer power in 50 years. We've come a long way, baby. When I met Mr. Hoff in the 1980s, he told me that he once took his broken television to a repairman, who noted a problem with the microprocessor. The repairman then asked why he was laughing. Now that everyone has a computer in his pocket, one of my favorite movie scenes isn't quite so funny. In "Take the Money and Run" (1969), Woody Allen's character interviews for a job at an insurance company and his interviewer asks, "Have you ever had any experience running a high-speed digital electronic computer?" "Yes, I have." "Where?" "My aunt has one."

This discussion has been archived. No new comments can be posted.

  • by nagora ( 177841 ) on Monday November 15, 2021 @10:44AM (#61989957)

    Still their best product.

  • by paulidale ( 6575732 ) on Monday November 15, 2021 @10:49AM (#61989975)
    The 4004 wasn't as revolutionary as many people think. Yes, sure, it was the _first_. The 8008 was far better, but it came later. The 8080 was where things really took off on this line. The 6800 and 6502 were also great but divergent -- they lost the CPU wars in the 70's and 80's. HP came up with their scientific calculator at around the same time -- a much nicer architecture in so many ways. Seriously, circa 6000 *bits* of ROM, effectively no RAM, and they implemented a full scientific calculator. None of this narrow integer rubbish -- nice wide registers and good floating point support.
    • by Anonymous Coward

      The 4004 wasn't even the first. First commercially available perhaps, but not the first CPU-on-a-chip.

      The 8008 was actually developed in parallel to the 4004 but the design was by the client, so NIH, so intel's IN HOUSE SO COOL 4004 team kept poaching engineers from the MADE TO ORDER UNCOOL 8008 team, making it later.

      So I don't particularly respect intel for these "YAY US WE ARE SO COOL, SO MANY YEARS NOW" stories. We're still stuck with wintendo crap on the desktop.

    • by 50000BTU_barbecue ( 588132 ) on Monday November 15, 2021 @11:06AM (#61990033) Journal

      https://en.wikipedia.org/wiki/... [wikipedia.org]

      "In 1971, Holt wrote an article about the system for Computer Design magazine,[8] but the Navy classified it, and finally released it in 1998. For this reason, the CADC and MP944 remain fairly obscure in spite of their historical importance. "

      • by Agripa ( 139780 ) on Tuesday November 16, 2021 @01:40AM (#61992451)

        "In 1971, Holt wrote an article about the system for Computer Design magazine,[8] but the Navy classified it, and finally released it in 1998. For this reason, the CADC and MP944 remain fairly obscure in spite of their historical importance. "

        If it was so secret that nobody was aware of it, then how historically important could it be? It could not lead to anything.

    • by AmiMoJo ( 196126 )

      Things were definitely heading in the direction of single-chip CPUs at the time. There were lots of discrete and multi-chip CPUs around and it was the development of new technologies like MOS for integrated circuits that made it possible. Others like Texas Instruments were developing single chip CPUs as well.

      The design was clever in some ways, and being the first, it didn't benefit from hindsight the way later designs did.

      • Well, Intel didn't learn much from hindsight. It kind of did, but I think it got caught in a trap when its chip got used in a PC. At that point there was a never-ending requirement for backwards compatibility. So the 8086/8088 was sort of compatible with the 8080, because there was a hope that 8080 programs could be translated to 8086, just like the hope that 8008 programs could be translated to 8080. But with the PC they were really stuck; it was selling well and they didn't want to shoot that golden goose. So eac

    • by JBMcB ( 73720 ) on Monday November 15, 2021 @11:28AM (#61990121)

      The 6800 and 6502 were also great but divergent -- they lost the CPU wars in the 70's and 80's.

      I'm pretty sure the 6502 *won* the CPU wars in the 80's. It was used in all of Commodore's 8-bit products, as well as Atari, Acorn, Apple, Nintendo, PC Engine, and a vast swath of embedded devices. It wasn't until the 90's that its use in general-purpose computers declined significantly, as its 16-bit part wasn't that great.

      • Was going to say the same, 6502 was huge. Also the Zilog Z80 ate intel's lunch on the 8080 based systems. 6800 was a clear loser in the personal computing market despite the 6809 being a really nice chip to work with.

        • by JBMcB ( 73720 )

          Was going to say the same, 6502 was huge. Also the Zilog Z80 ate intel's lunch on the 8080 based systems. 6800 was a clear loser in the personal computing market despite the 6809 being a really nice chip to work with.

          From what I've heard the 6800 was really flexible but very expensive, on a part basis and to implement. As you pointed out, it begat the 6809 which was mostly used in arcades I think, and the nearly ubiquitous 68HC11 micro (it's what I learned assembler on.)

          • Yeah, their ancestry story completely missed the Z80 and all those lovely CP/M "luggable" all-in-one machines.

            I remember learning the Z80 instruction set and writing out machine language programs in longhand, translating to decimal to POKE() them into memory in Sinclair ZX-81 BASIC.

            I didn't get an assembler or compiler 'til my first Amiga, but I did have a disassembler for the 6502 in my Atari that let me unlock all my cassette games and move them to floppy when I got one.

          • Actually, the 6809E processor was the one used for the Tandy/Radio Shack Color Computers, and allowed them to run the OS9 operating system -- which was really ahead of its time, back then!

      • The 6502 is still used in new applications. See here:

        https://en.wikipedia.org/wiki/... [wikipedia.org]

      • by hawk ( 1151 )

        >I'm pretty sure the 6502 *won* the CPU wars in the 80's.

        That probably depends upon your criteria.

        The 6502 certainly landed in more homes, and certainly in more "toy" grade computers.

        But the 8080/Z80 was what landed in businesses, particularly with CP/M. The Apple ][ was pretty much the *only* 6502 to make business penetration (and *that* was heavily dependent upon VisiCalc!).

        And most of the Z80 machines simply used them as faster 8080s--there was *very* little z80 specific software, other than the moni

      • by cotu ( 412872 )

        The 80's were about getting 32-bit processors: the 68000, the 32000, the 80386, etc. 8088's were very, very early 80's, but the 80386 came out pretty soon after. 6502's weren't really serious CPUs back then; they were used in the same way that Z80's were used: more for embedded and those kinds of uses.

      • I guess it's a tale of two cities. The home (not terribly important) side, where Apple and Commodore CoCo/64 (price and games) were the mainstays and then the business desktop world where 8080 and Z80 ruled the roost (CP/M). I'm not saying there wasn't Commodore PET, just saying "in general".

        And let's not forget the Amiga which was light years ahead of all of that.

        The IBM PC did change things, but maybe not for the better (IMHO). Sure, I-B-M, made PC-DOS popular, but we were heading toward much big
      • You're most likely using a computer that can trace a direct lineage back to the 8080. I think that constitutes a somewhat longer term win. The 6502 is a great processor, it just didn't evolve quite so effectively.
    • But that's not the point of the article. That's akin to saying well sure the first car was _first_ but it was completely surpassed by the Model T. The argument makes no sense. The first is almost always complete crap, but the first needs to happen so all the rest can follow. Credit where credit is due.
    • It wasn't the "first". Texas Instruments also legitimately makes the claim to have the first CPU on a chip. They were being developed simultaneously and were released at basically the same time.

    • > The 6800 and 6502 were also great but divergent -- they lost the CPU wars in the 70's and 80's.

      Considering a STB (Set Top Box) from only a few years ago STILL used [youtube.com] a 6502, it wasn't so much "lost" as "migrated" away from the desktop to the embedded space, where they were dirt cheap.

      > HP came up with their scientific calculator at around the same time ... None of this narrow integer rubbish -- nice wide registers and good floating point support.

      Indeed, the HP calculators were brilliant design. The HP48SX c

      • by kenh ( 9056 )

        When people talk about mobile phones lasting for a few hours of battery life I laugh. The HP48SX calculator, using 3 AAA batteries, would last a month! There were even programs to temporarily turn off the LCD screen to conserve battery life and give a nice 10% boost to performance.

        Apples and Oranges - I wasn't aware the HP48SX had wireless networking like a mobile phone does.

        • You missed the point of extreme battery life.

          While the HP48SX isn't a phone it does have an infrared sender and receiver (which chews up battery life.) You could even "capture" and playback codes from your IR TV remote back in the day.

    • The 4004 is interesting because it's a design to do a very simple thing (be a calculator) at a time when general-purpose computing was something the designers knew very well. It was a 4-bit CPU, but it had 16 mostly general-purpose registers. Essentially it was a basic ALU with simple instructions to control it, a compute engine on a chip, but not a design intended for a more general-purpose computer. Instead it was the 8008 and 8080 that tried to do that, with competition from others (Z80, 6502, etc), where

    • by tlhIngan ( 30335 ) <slashdot.worf@net> on Monday November 15, 2021 @04:54PM (#61991265)

      The 4004 led to the 8008 (which was still a calculator chip; it wasn't practical to build a PC using the 8008), which led to the 8080, a further refined 8085, and then the (16-bit upgrade) 8086, which the modern PC is derived from (the early PCs used the 8088, which was an 8086 with an 8-bit bus, allowing use of much cheaper 8-bit peripherals rather than 16-bit peripherals).

      6800 was Motorola, who wanted around $200 each for the chip, which is why MOS designed the 6500. However, the 6500 was not only pin and binary compatible with the 6800, it was a clone, and Motorola sued MOS over it. MOS then designed the similar-but-not-exactly-the-same 6502. The 6502 was extremely popular because MOS literally sold it cheap - $20 got you the chip and programming manuals. And while Motorola only sold chips to well-off companies willing to invest, MOS sold to the hobbyist - you could literally go up to their booth at COMDEX and pick up a chip right there.

      So the 4004 isn't really "the chip", you really needed the 4001-4003 as well (ROM, RAM, timer I/O etc) to go with it as a chipset.

      One interesting thing, though, is to realize how close everything was: the 8008/8080 was the basis of a lot of chips back in the day, with the Z80, 6800 and such deriving their processor programming model from it. Sure, each chip was different -- the instruction sets were similar but not identical, and there were plenty of instructions present in one but not the other. But it was close enough that Microsoft made source-code translation programs to take your Z80 CP/M source, for example, and turn it into 8086 MS-DOS source (this was an extremely popular program and was how many CP/M programs were ported to the IBM PC. The translator was helped out because MS-DOS 1.0 retained a lot of CP/M program semantics -- MS-DOS 2.0 added a bunch of the more modern semantics we associate with programs today).

      • The 4004 led to the 8008

        Incorrect. The 4004 was a design for a calculator (for the company Busicom).

        The 8008 was an LSI implementation of Datapoint's 2200 CPU. Datapoint built the 2200 (and later 5500 and 6600) computers using MSI "bit slice" chips. Datapoint wanted to reduce the cost and size of the 2200, so provided the design to Intel and contracted them to build a chipset. Intel delivered the design but it was way too slow for Datapoint's use, and so Datapoint literally gave Intel the design (big mistake!)

        I went to wo

        • by cstacy ( 534252 )

          My first computer was a Datapoint 2200, but I didn't write any software on it. It was what I used to edit and submit programs (FORTRAN) to the IBM/360. However, I did play the 3D-tic-tac-toe vector graphics game on the terminal. Given that it was programmable to that degree, and had both tape and printer peripherals (and more), I was surprised that it didn't get used as a real computer more than as a terminal.

          That was when the 2200 was brand new... a while later I was using the IBM 5100 to program APL, and

    • Comment removed based on user account deletion
      • by sjames ( 1099 )

        The funny thing there is that the 8080 got one-upped by the Z80 for a good while. Intel further backed into success with the 8086. Originally, the 8086 was supposed to be an I/O channel processor for systems based on the iAPX 432 [wikipedia.org], a complex object-oriented protected-memory CPU. The 432 turned out to be dog slow and had failed by 1986. Meanwhile, engineers found that the fastest way to run on a 432 system was to run the program on the channel processor (the 8086). So the 8086 became the main CPU (along w

  • by petes_PoV ( 912422 ) on Monday November 15, 2021 @10:49AM (#61989981)

    That is at least a billionfold increase in computer power in 50 years

    But with all the software inefficiencies, bloat, redundant code and security checking, most of that improvement has been absorbed.

    • by AmiMoJo ( 196126 ) on Monday November 15, 2021 @11:14AM (#61990067) Homepage Journal

      Maybe people forget how limited early computers were. Today we take watching a cat video on YouTube for granted, but it was really only in the late 90s that consumer-grade hardware became powerful enough to display video.

      Back in the 80s computers didn't even have enough memory bandwidth. Even if all you had to do was copy the image to the screen 30 times a second, you couldn't, because computers just couldn't copy data that fast.

      Storage media were not fast enough to stream uncompressed video, and decompressing it was too demanding for CPUs. In the 90s you used to get MPEG decoder cards with special chips dedicated to that task, because CPUs were too slow.

      Nowadays we have games that are close to photo realistic. There is bloat, but to say that most of the improvement has been absorbed is clearly not true.

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        "Maybe people forget how limited early computers were"

        Yes, like people who think the Amiga didn't take 15 seconds to GRONK-GRONK display GRONK-GRONK a GRONK-GRONK handful GRONK-GRONK of GRONK-GRONK icons GRONK-GRONK from GRONK-GRONK a GRONK-GRONK floppy.

        "Back in the 80s computers didn't even have enough memory bandwidth. Even if all you had to do was copy the image to the screen 30 times a seconds, you couldn't because computers just couldn't copy data that fast."

        And yet

        https://youtu.be/o6qK9b6lPDw?t... [youtu.be]

        You

        • by ceoyoyo ( 59147 )

          You didn't actually watch that video, did you?

      • That's the Wintel world though. To the mass market, those were the only computers, because that's all they saw. Video wasn't really that important at the time, but it was important enough that the Amiga got some major market share in that arena. But in the high end computing world even in 1980, there were workstations with 3D graphics. Automobile designers were using CAD software. They were extremely expensive of course. The microcomputer and PC world were lagging by a lot. Even with Windows NT the 3

      • by cotu ( 412872 )

        Huh? I designed the software for a laser printer controller which could drive a 100 ppm printer at speed. Video RAM was a thing back then, so you could easily drive displays. This was 1985.

        • by AmiMoJo ( 196126 )

          640x480, 24-bit colour, 30 frames per second. That's about 28MB/sec. Even in 16-bit colour, which increases computational overhead, it's beyond consumer stuff in the 80s.

          Most CPUs back then didn't have much cache, if any, so they would also need to fetch instructions from RAM. Computationally, FPUs were available (not cheap and not fast), but MMX and vector stuff was years away.
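
          If anyone wants to check that 28MB/sec figure, here is a minimal back-of-the-envelope sketch in C, using only the numbers above (640x480, 3 bytes per pixel for 24-bit colour, 30 frames per second; the variable names are just illustrative):

          #include <stdio.h>

          int main(void) {
              /* Uncompressed framebuffer bandwidth for the example above:
                 640 x 480 pixels, 24-bit colour (3 bytes/pixel), 30 frames/s. */
              const double pixels_per_frame = 640.0 * 480.0;
              const double bytes_per_frame  = pixels_per_frame * 3.0;   /* 921,600 bytes */
              const double bytes_per_second = bytes_per_frame * 30.0;   /* 27,648,000 bytes */

              printf("%.1f MB/s\n", bytes_per_second / 1e6);  /* prints about 27.6 MB/s */
              return 0;
          }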

    • by swilver ( 617741 )

      Nah, most of the improvement is still bottlenecked by the thing sitting between keyboard and chair.

    • While software has improved in many respects, the fast hardware can make programmers a little lazy. "Why spend a few hours making sure this function runs as fast as possible? Just tell the customer to get a faster computer!" seems to happen a lot. I have been developing a whole new kind of data engine that I often compare against other databases like Postgres. On my new Ryzen 5950x it can execute queries against a 7 million row table about 10x faster than PostgreSQL v.13! All because it was designed from th
  • > That is at least a billionfold increase in computer power in 50 years

    The 4004 was 92k ops, the M1 is 10.4T.
    10.4T / 92,000 = 113,000. Not even a million, where the hell do they see a billionfold?

    • OK, my bad math as well. It's 10.4T / 92,000 = 113,000,000. Not a billion, but not as far off.
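
      For what it's worth, a quick sanity check of that arithmetic in C, using only the figures quoted in the summary (92,000 ops/s for the 4004, 10.4 trillion FLOPS for the M1 Max). The units aren't really comparable, and as a reply below notes, neither are 4-bit and 64-bit operations, so the width-scaled line is just a rough illustration:

      #include <stdio.h>

      int main(void) {
          /* Figures quoted in the summary; not really comparable units,
             but good enough for an order-of-magnitude check. */
          const double ops_4004   = 92e3;     /* ~92,000 4-bit ops per second */
          const double ops_m1_max = 10.4e12;  /* ~10.4 trillion 64-bit FLOPS */

          double ratio = ops_m1_max / ops_4004;
          printf("raw ratio:            %.0f x\n", ratio);           /* ~113 million */
          printf("scaled by 64/4 bits:  %.0f x\n", ratio * 64 / 4);  /* ~1.8 billion */
          return 0;
      }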

    • The 4004 was 92k ops, the M1 is 10.4T. 10.4T / 92,000 = 113,000. Not even a million

      4-bit ops vs. 64-bit ops? You can do much more work with the latter.

    • The "neural engine" on the M1 alone does 11T ops/sec. That' without the image signal processor, NVMe and Thunderbolt storage controllers, security processor... or its 8 general purpose cores of course...

      A billion is a huge number, but it would be pretty fascinating to implement and benchmark H.265 compression of 4k video on a 4004 and see how fast it actually goes. There would be layers upon layers of gymnastics just in memory addressing and swapping, right down into the inner loops of the processing.

  • Life has improved exponentially since.

    I invite the reader to examine Figure 5.1 here [worldhappiness.report].

    • by mspohr ( 589790 )

      The big drop in happiness seems to be just after the 2008 crash when the monopoly capitalists and financial oligopolies took over the government.

      • That's not when they took over. That's when they realized their stranglehold on the government was so utterly complete that they could let the curtain drop and let the citizens, nay, consumers know who the real power was.

        • by mspohr ( 589790 )

          https://www.theguardian.com/co... [theguardian.com]

          But there’s a deeper structural reason for inflation, one that appears to be growing worse: the economic concentration of the American economy in the hands of a relative few corporate giants with the power to raise prices.

          If markets were competitive, companies would keep their prices down in order to prevent competitors from grabbing away customers.

          But they’re raising prices even as they rake in record profits. How can this be? They have so much market power they

          • And the government's letting them do it since they're completely incompetent lapdogs that can't make a move without checking with their corporate masters.

            • by mspohr ( 589790 )

              We have government by the rich and for the rich. Everyone else must fend for themselves but the system is stacked against them.

    • by ceoyoyo ( 59147 )

      Happiness and quality of life aren't the same thing.

      • Happiness and quality of life aren't the same thing.

        Indeed, one might call happiness a quality of life, rather than the only one. Often QoL measures include things like wealth, education, community and freedoms (which the UN also considers in their reports). Personally, I think happiness is one of the more fundamentally important ones. Are there qualities more important to you than happiness?

        • by ceoyoyo ( 59147 )

          Well, years of healthy life, child mortality, and proportion of the population that is secure in their basic needs are a few I'm fairly fond of. There are also things like the hazard of being the victim of violence. All of which have improved quite a bit since the 70s, even in the US.

          Happiness is indeed important, but it's also pretty fickle. If you're an optimist you can look at it as people being irrationally happy even in poor circumstances. If you're a pessimist you can interpret it as people being whin

  • by GameboyRMH ( 1153867 ) <gameboyrmh&gmail,com> on Monday November 15, 2021 @11:22AM (#61990091) Journal

    Microprocessors unchained computers from air-conditioned rooms and freed computing power to go wherever it is needed most. Life has improved exponentially since.

    Technology has improved exponentially since. Life...well it's...different...

  • Checks out (Score:4, Interesting)

    by hackertourist ( 2202674 ) on Monday November 15, 2021 @11:28AM (#61990123)

    World-changing breakthrough caused by engineers being lazy:

    Engineers Federico Faggin, Stanley Mazor and Ted Hoff were tired of designing different chips for various companies and suggested instead four chips, including one programmable chip they could use for many products.

    • The entire evolution of software is from people being lazy; there is nothing wrong with assembly. In fact, most computer cycles are used to feed this laziness.
      • The entire evolution of software is from people being lazy

        I would argue that just about every advance humanity has ever made was the result of people being lazy.

        • I was going to say just that: Many of the things we've invented have been to do more with less effort: Computers, cars, most home appliances...
      • At one point, I would have mostly agreed with you... but the harsh truth is, without languages like Java, C#, and Kotlin, and operating systems like Windows & Linux (including Gnome & Android frameworks), writing something like an app that dynamically animates the UI as you interact with it would be flat-out impossible to do in any sane amount of development time, and software would be somewhere between Protracker, DeluxePaint, or Sculpt/Animate-4D(*) and TempleOS or GEOS UI-wise.

        ---

        (*)SA4D is actua

    • by mspohr ( 589790 )

      My understanding was that the 4004 was designed to be used as a calculator chip. (Calculators were all the rage back then.)

      • A friend of mine was working at the university in the 70s when the first desktop calculator appeared and was shipped to them. It was big and took up a lot of space, but he relates that there was a gaggle of professors gathered around behind the chair, oohing and aahing at it, saying things like "try dividing!", "do the square root of pi!" and other things. And this wasn't the portable/cheaper CPU-on-a-chip thing from TI or Intel yet.

        • by mspohr ( 589790 )

          I graduated from engineering school in 1970. Used a slide rule throughout. (Slide rules are genius.)
          My senior year the library got a basic four function digital calculator. It was the size of a desktop computer with CRT (full of discrete logic boards). More of a curiosity than useful.

    • by hawk ( 1151 )

      While necessity is the mother of invention, laziness is the father . . .

  • Fire (Score:4, Funny)

    by Rei ( 128717 ) on Monday November 15, 2021 @11:40AM (#61990169) Homepage

    Back then, IBM mainframes were kept in sealed rooms and were so expensive companies used argon gas instead of water to put out computer-room fires. Workers were told to evacuate on short notice, before the gas would suffocate them

    "We also have a Halon fire extinguisher. Its always nice to have a fire extinguisher that kills people around.

    Next to my desk we have an 'Ire Extinguisher'. Our boss is... really assertive, so we like the idea of having it there." --Timster

    • by clovis ( 4684 )

      Back then, IBM mainframes were kept in sealed rooms and were so expensive companies used argon gas instead of water to put out computer-room fires. Workers were told to evacuate on short notice, before the gas would suffocate them

      "We also have a Halon fire extinguisher. Its always nice to have a fire extinguisher that kills people around.

      Next to my desk we have an 'Ire Extinguisher'. Our boss is... really assertive, so we like the idea of having it there." --Timster

      I was a so-called field engineer for mainframes back in the '70s and later. Some programmers and their director were in the computer room one day. One of the programmers complained about the halon - "deadly gas" - and asked why they were subjected to this risk. (Their cubicles were in the next room.)
      I pointed out that it only killed the slow and inattentive while saving the equipment.
      "Win win" their boss said. "One of the selling points."

      • As I have noted in previous threads, I have been in a high rise office building machine room during an accidental Halon discharge: a technician slipped while working on a sensor, and I am still alive decades later. After a few seconds of confusion in a maelstrom of swirling fog and blowing papers, where I was wondering if maybe an aircraft struck the building (1976), I figured out what had happened, left the machine room without inhaling more, did the equivalent of the SF6 voice deepening experiment, and conti
        • by clovis ( 4684 )

          As I have noted in previous threads, I have been in a high rise office building machine room during an accidental Halon discharge: a technician slipped while working on a sensor, and I am still alive decades later. After a few seconds of confusion in a maelstrom of swirling fog and blowing papers, where I was wondering if maybe an aircraft struck the building (1976), I figured out what had happened, left the machine room without inhaling more, did the equivalent of the SF6 voice deepening experiment, and continued forceful exhalations until the refractive index shimmer dissipated from my breath. I was familiar with fluorocarbon behavior from using Freon to blanket storage bottles of photo processing solutions.

          Thanks!
          That experience wasn't on my bucket list, but I've been Halon-curious for some time. I very much prefer having done that vicariously.

  • Been ported to the 4004?
    • Hmm... 4 bit Linux. That would be a challenge.
      • #include <stdio.h>

        int main(int argc, char **argv)
        {
                printf("Linux\n");
                return 0;
        }

        $ gcc linux.c -o linux
        [something magical happens to transfer this to 4004 external storage]
        $ ./linux
        HAHAHA! You wish!

        I think that's the best we could hope for.

      • What about a Beowulf cluster of 4004s? In the Cloud!

  • It was a gamble for Intel: they were not sure they could sell enough to justify development and manufacturing costs. Other than specialized high-end calculators, they were not sure what equipment makers would do with it.

    Sure, the idea of a CPU-on-a-chip looks great on paper, but so did the Itanium.

  • You've had a good run. Here's your gold watch.

  • Only one link about the 4004. Here's another with more details... http://www.intel4004.com/ [intel4004.com]

    My first microprocessor to try and program was the Z80, in a Timex-Sinclair ZX81. The 4004 had some interesting opcodes and a 12-bit address space. One project in college was to try to implement the 4004 in VHDL (the professor gave it as a "challenge"). :)

    JoshK.

    • The PDP-8 was also only a 12-bit machine; it was the first computer on which I did assembly language programming.
      • I did some VAX programming, and I once used a PDP-8 simulator. A lot of the older minis, super-minis and even mainframes had some interesting features and architectures... now gone with the microprocessor -- but hey, that's progress?

        JoshK.

  • " offices that weighed 250 pounds".
    That's pretty light for an office.
  • If not Intel, somebody better qualified with better engineers would have done it and we would today not have such an utterly crappy CPU as the "standard".

    • by shoor ( 33382 )

      I was around back then, and I always felt the problem was that IBM adopted the 8088 for their PC back in 1980 instead of the Motorola 68000. Of course, if the 4040/8080/8088 series had been better in the first place, that would have been a moot point. Generally speaking, I don't fault the first people out of the block for not having the best design. (The Wright Brothers had the first airplane, but it was a canard design and didn't have wing flaps. Still, it was the first plane capable of controlled flig

      • by gweihir ( 88907 )

        Indeed. I have been thinking about that as well. The difference in design quality between x86 and 68k is extreme. And the 68k arch was 32 bit right from the start.

    • Sure, nerd, sure. I bet you still think Beta was 'better' than VHS too, right?
  • In retrospect, if paired with a decent CRTC, it could have been the basis of 1974's Christmas videogame system to die for. The problem back then was, the 4004 was cheap, but you needed a computer the price of a house to do the actual development on.

    The BIG thing that made the 6502 & Z80 so revolutionary was, they were the first cheap platforms with enough power and RAM to run proper, semi-civilized development tools that could be used to write real commercially-viable software. Pre-Apple II/Atari 400/80

  • Digital Equipment Corp. sold PDP-8 minicomputers to labs and offices that weighed 250 pounds.

    Was 250 pounds the minimum or maximum office weight for getting a PDP-8? Or did your office have to hit 250 pounds on the nose?

  • The fire suppressant was Freon. The systems were required to have an abort button, so that if you could not get out you could prevent them from discharging. Really great way of suppressing electrical fires; we used dozens of them in the facility I worked in. Unfortunately, Freon is bad for the environment in the long run, so we have had to remove them. There is no acceptable replacement; we just put in high-sensitivity smoke detectors to get the responders there as fast as possible.
  • It cemented the von Neumann approach to computing, overtaking much better architectures like dataflow machines. Today we pay the price in our unreliable systems that are vulnerable to race conditions and that are so difficult to parallelize.
  • The 4004 chip killed off the slide rule anyway, and the technology would later have a big impact. The 8088 did change the world, but it wasn't the next iteration of the 8086; we still refer to x86 architectures. The 8088 was an 8086 with a downgraded 8-bit data bus, released the following year, 1979. The Motorola 68000 was a better chip than the 8086, but it wasn't available at the scale IBM needed, and the 68008 competitor to the 8088 didn't come out till 1982 because Motorola was making lots of money from the 6800 and 65

  • "He laughed when he was told the TV had a microprocessor in it"

      So what did he think made it possible for the TV to have a segmented digital display or on screen display, a remote with near feather touch buttons instead of a heavy "clicker", and the ability to type in the channel he wanted directly?

    • Back in the 70s and 80s, "computer control" or "microprocessor controlled" was used as a big selling point for TVs, and often silk screened on the TV itself.

  • Raised floor fire extinguishers of that era used Halon 1301 (bromotrifluoromethane), a halogenated hydrocarbon rather than argon, an inert noble gas. Halon is very effective because it disrupts chemical reactions spreading the fire, not merely displacing oxygen and cooling the material, so it works at relatively low (5% to 7%) concentrations. It also does minimal harm to other equipment (the parts that aren't on fire). It is no longer made, being restricted as an ozone-depleting substance.
  • ... silicon, an element as plentiful on earth as sand on a beach.

    Why not just say: "as plentiful on earth as sand"? Or are journalists so precise nowadays that they will then nitpick that there's silicon on earth not in the shape of sand and therefore it should be "more plentiful on earth than sand"...?

    • by ebvwfbw ( 864834 )

      Years ago someone told me they had a PC that was in a fire. When they opened it up only sand remained at the bottom of the steel case.

      I laughed.

"May your future be limited only by your dreams." -- Christa McAuliffe

Working...