Intel

Pentium IV Hits 2 GHz

A number of people wrote in with the news that Intel released the 2 GHz chip. The Tech-Report article points out a couple of interesting meta-ideas - this is Intel's chance to retake the performance crown from AMD, as well as being one of those round numbers that makes people feel warm and fuzzy. I'm sure there's going to be gobs of benchmarks today - post 'em in the comments as you find 'em.
This discussion has been archived. No new comments can be posted.

Pentium IV Hits 2 GHz

  • Anadtech article... (Score:5, Informative)

    by cperciva ( 102828 ) on Monday August 27, 2001 @07:13AM (#2220720) Homepage
    Here [anandtech.com].

    Basic conclusion: 2.0GHz P4 == 1.4GHz K7, but when the 2.2GHz P4.1 comes out in November it will take a clear lead.
    • by jmahler ( 192217 )
      See, the fun thing here is the fact that the public doesn't give a damn about the benchmarks. They'll walk into local computer store "foo" and demand the highest speed they can get, so that they won't go obsolete as quickly. (this is THEIR perception)

      "But sir or ma'am," the salesman will say, "for about 2/3 of the price, you can have this computer, which is arguably better and faster than the Intel Pentium 4."
      "Oh no, we don't want ANYTHING other than Intel," says the mommy or daddy "We KNOW how important reliability is, and we KNOW that the 2 g-H-z (pronounced by letters) is MUCH faster than the A-M-D AthAlon you got there".

      right.
      • to be perfectly honest, i think it's sad that Intel have to rely on big numbers to get sales, and equally as sad that so many people out there remain uninformed about the hardware that they are buying.

        then again it's the same situation with the mac - the G4 is a nippy wee chip, but just doesn't cut it where it counts - that all important clock speed :(
      • See, the fun thing here is the fact that the public doesn't give a damn about the benchmarks. They'll walk into local computer store "foo" and demand the highest speed they can get

        This is almost a contradiction in terms. Processor speed and benchmarks are often held closely together in marketing materials. Having worked alongside the marketing department of one of the top 5 home PC companies, I've seen all manner of spin put on "Why YOU should buy our latest and greatest!".


        However, I'm not so sure that Mom and Pop are all for buying the one with the biggest number anymore. This is no longer the age of the uneducated computer buyer led along by the sales person. Sure, that happens every day, but much of the target market is now on their second or third PC, and are looking for reasons to upgrade that they can relate to - this system has a USB port that will work with that camera, that system has 80 GB hard disk that won't fill up so quickly with Sonny's Paintbrush pictures. More and more families regularly buy PC magazines, and so would realise that Intel isn't the only player. The astute would know about benchmarks too.


        "Oh no, we don't want ANYTHING other than Intel," says the mommy or daddy "We KNOW how important reliability is, and we KNOW that the 2 g-H-z (pronounced by letters) is MUCH faster than the A-M-D AthAlon you got there".
        Not necessarily the case - buyers are more and more clueful in recent years. Having said that, Intel have always been the largest advertiser of all the i86 manufacturers. The famous maxim "No-one ever got fired for buying IBM" may continue to hold for Intel.



        /prak
        *warm and fuzzy with the news of 2-g-h-z*
        • i don't really think there's a contradiction, to be honest. the benchmarks used in advertising are much like the surveys used in politics - you point out whichever one makes you look best.
          More importantly, the "average mom and pop" would not go on the net and look up benchmarks for processors. if they did, then they would not be the "average mom and pop" that they are advertised to be, right? :)

          people ARE more knowledgeable right now, but not nearly enough to overcome the "higher MHz = Better" mentality so rampant right now.

          What do you think would happen if say... cyrix (via) would introduce an 800MHz CPU that not only outperformed a 2GHz chip and everything else out there, but also ran completely stable for half the price? Besides cows flying and hell freezing over, that is... :)

          I think the chip would fail. Why? Only enthusiasts would understand the architecture improvements, and performance gains. They MIGHT be enough to keep the processor alive in the market long enough to survive and proliferate to the other segments, but it's doubtful.

          AMD is gaining on intel for a few reasons, which are highly touted by the media- it's beaten Intel to the GHz barrier, not once, but twice now with the intro of the Duron at 1000 MHz. that's what is driving it up in sales, along with the perception of a serious value processor.

          What were we talking about again, sorry?

      • Yeah, sales people are always trying to get people to spend less money ... on some planet that I've never visited.
      • "But sir or ma'am," the salesman will say, "for about 2/3 of the price, you can have this computer, which is arguably better and faster than the Intel Pentium 4." "Oh no, we don't want ANYTHING other than Intel," says the mommy or daddy "We KNOW how important reliability is, and we KNOW that the 2 g-H-z (pronounced by letters) is MUCH faster than the A-M-D AthAlon you got there".

        I think you're giving Joe Public WAY too much credit. I think the reality is more like:

        Joe Public: "I want to buy a computer". Salesman: "Very well, sir, what would you like?" Joe Public: "Uhm, a computer, you know". Salesman: "What sort of computer?" Joe Public (stares blankly for a while ..): "Uhm .. a Windows computer?" (watches expression on salesmans face to see if that was the right answer). Salesman: "OK, but what type of system would you like? Intel? AMD?" Joe Public: "Uhm ... uh .." Salesman: "Alright, sir, what you want is the Intel system. 2.0 GIGaHERTZ, this baby is the fastest one we have (points to system that has many other crappy components to cut the price)". Joe Public (who recognises that the name "Intel" sounds familiar from some TV ad): "Uhm .. OK .."

        • I think you're giving salesmen too much credit...

          I went into our local Compustore and was peering at a Compaq T-Bird 900, and the salesdroid said "why don't you look at this one. It has an intel P3-700, so it's much better"

          Other choice quotes from the same conversation:
          • "Yes, the Athlon is about level with a Pentium 2, so the Pentium 3 is much faster"
          • "and Packard Bell have a way better reputation than Compaq..."


    • by Lxy ( 80823 )
      2.0GHz P4 == 1.4GHz K7

      I know it's been posted already, but $560 == $107 according to Pricewatch this morning. Explain to me how you can even make an educated comparison on these chips. You're paying an extra $400 for the ability to tell your friends you're running at 2 Ghz, and to heat your home. IMHO Intel is shooting themselves in the foot with their ridiculous pricing. The 1.3 P4's are still more than a 1.4 T-bird, and the T-bird smokes the 1.3 P4 in every test. I have to wonder why anyone without Pointy Hair would consider purchasing one.
      • I have to wonder why anyone without Pointy Hair would consider purchasing one.

        Pop-Tarts. You don't even have to get up to toast 'em. CPU pastries are the 21st century's version of "engine block eggs".

    • but when the 2.2GHz P4.1 comes out in November it will take a clear lead.

      That's getting pretty close to the magic 2.4 GHz number.

      Computers might upset the global microwave oven infrastructure we've already established. Chaos will ensue, as networks of Amana RadarRanges and Panasonic Genius are disrupted. People might have to make a choice between counting with rocks or defrosting TV dinners over a campfire.

      Even worse, there might actually be grounds for newbies calling the CD-ROM tray a "coffee warmer".

      This will also be a new problem for overclockers who are managing to get processors up to the lofty 2.4 GHz range. RF heating of their water cooling systems will have to be addressed.

      Welcome to a brave new world.

  • I use a PIII 500 at home. It is fast enough for everything I need to do, even on those occasions I need to run Windows.

    A few weeks ago I had to buy a "server" for personal use. I went for the lowest of the low, and bought a PIII 800 for considerably less than I thought I would have to pay for the server. Up to that point, a Pentium 100 did the same job, quite successfully.

    2GHz? Unless this gives me a cheaper PIII 800 (which it won't; it'll only drive the low end to higher performance, not lower prices), who cares?

    • I do 95% of my work on a PII 400Mhz laptop, and have an Athlon 550Mhz as my home office server.

      I'm not interested in games, and frankly can't imagine what I would use a 1Ghz cpu for, never mind 2Ghz. In fact, these days I'm more interested in what I can get my Palm m505 to do.

      Strange isn't it. A few years back I always used to shop for the most horsepower I could get for my money. Now I'd be inclined to shop for the 'least' horsepower; secure in the knowledge that it will easily do what I ask of it, and will be cheap to boot.

      The only exception to the rule I can think of at the moment (sticking with home office) would be a Mac. I'm very tempted by the iMacs (I like the package) but am concerned that MacOS X really needs the grunt of a G4 to handle the accelerated screen work well. And of course you can't get one in an iMac yet. I'll reserve judgement until I've seen the newer and faster G3s running the optimised MacOS X 10.1

      Macka
    • Lots of people (Score:2, Informative)

      in video and pro-audio care a lot.

      An extra chunk of processor cycles = more effects plugins, virtual instruments, etc. This is a big deal for folks with native studio setups.

      You're not going to notice a difference in Word but I sure as HELL would notice a difference in Cubase.

      http://www.mp3.com/vanderrohe
      • Eh, problem is that the # of people working with audio/video is 1/100 of the number of people working with Word/Powerpoint.

        Until about 500mhz, there was a subjective (if pointless) improvement in the feel of common desktop apps. No longer. In fact the only reason that corporations don't buy low-power low-speed chips is that Intel/AMD refuse to make them anymore. If the market was truly meeting demand instead of in a MAD arms race, you wouldn't see 2Ghz chips except at a very high cost (see the old MIPS and Alpha chips).

        Bottom line was that Apple was right -- things like the iMac/cube (and Compaq and IBM's attempts at office 'appliance' machines) are the future. CPU speed will be a footnote in the back of the manual for almost every box. People will blow their dollars on fancy flatscreen monitors instead of CPU. Intel might as well print up t-shirts that say "We made a 2Ghz chip and all we got was a lousy $100 bucks", and R+D will be 'adjusted' to suit that market situation.
    • I use a PIII 500 at home. It is fast enough for everything I need to do, even on those occasions I need to run Windows.

      Heck, I run Windows on a PII 333 and have no complaints. Very snappy. Hard drive speed is more of an issue than processor speed in a few cases, like when Internet Explorer starts up. And I should add that this machine gets used for intensive graphics design work and software development. Never has speed been any kind of hindrance.

      Whenever comments like this are made, certain groups come out of the woodwork: "But I need to solve systems of fifty thousand equations!"; "But I need to use a high-end rendering package!"; "But I run a video processing business!" And those people are all in the tiniest of minorities.
  • 1.3 ghz beats them all
  • ...is like a Viper on blocks. 800HP that doesn't go anywhere.
  • 4.77 Ghz? (Score:4, Funny)

    by Anonymous Coward on Monday August 27, 2001 @07:16AM (#2220734)
    I sure hope, for the sake of good ol' times, they'll be manufacturing a 4.77Ghz processor soon...
    • Personal computer and microwave in one! It bakes! It fries! It dices! It comes with a free set of steak knives... well, no, actually, it doesn't do any of those things - but with an appropriately shaped waveguide and a metal-free ceramic mug it could heat your coffee (or my herb tea) directly.

      Cool!

      Er, no, that doesn't sound right, either...

      • I thought microwaves were 2.4 GHz (the resonant frequency of the H2O molecule). That's why that band was still available for RF use... it doesn't penetrate the moisture in the atmosphere, so it's completely useless for radio in the big blue room...
  • but warm for sure...
    • >but warm for sure...
      So maybe we can send P4-based systems up to Alaska, Scandinavia, Siberia, and everywhere else where it gets damn cold in winter. Then, we could kill 3 birds with one stone.
      1. The P4 heats (thus making old fashioned heaters obsolete)
      2. The P4 can be used to fry eggs (no more stoves needed)
      3. They'd have more computing power than they'd ever need

      So, does anyone want to donate P4's to help the freezing ?
  • One thing that you don't see people talking about much is why all these Mhz matter. In other words, is there really a big difference between 1.9 and 2.0 on the software that people use today? And if not, how long will it take before there is a difference?

    I am just remembering that back in the day, you could tell the difference between a 200 and a 233MMX relatively easily. Does that still hold true, say, when playing Counterstrike on a 1.8Ghz vs. a 2.0GHz?
    • There is a clearly different feel on a PIII 800 vs a PIII 550. Not only for HL:CS :)

    • by Mike1024 ( 184871 ) on Monday August 27, 2001 @08:26AM (#2220938)
      Hey,

      is there really a big difference between 1.9 and 2.0 on the software that people use today?

      Well, that would depend on what you are doing. If you were, for example, word processing, you would notice practically no difference, since for the majority of the time, the processor is not being fully utilised anyway. In word processing, bottlenecks are more likely to occur from a program being slow to load (i.e. hard disk speed), or the fact that Microsoft Word sometimes likes to move things around on the page for what seems like no reason at all.

      If, however, you are doing a highly processor-intensive task, like rendering a 3D scene in Caligari trueSpace 4, you would (in theory, at least) notice a reduced render time, if you cared to time it, because the processor is being used extensively in the rendering operation.

      The problem with this, as with many things, is that the ultra-high-end chips are almost always disproportionately expensive. A 2Ghz chip will likely cost more than twice what a 1Ghz chip costs. Furthermore, a second-hand processor takes a big price hit, so staying 'bleeding edge' isn't really an option. If you have enough money to upgrade every time a new chip comes out, you have enough to get a rendering cluster, which will be faster.

      So, where will a 2Ghz chip find a market? Firstly, among 'Power-stupid' people. They will buy it because hey, it's... like... TWO gigahertz, which is twice as fast as a one gigahertz chip. They likely won't actually need the power, but they have more money than they know what to do with, and it will be good to brag about.

      Secondly, when it's cheaper. As the price drops off, if it can beat AMD's best offerings, people looking for high-end systems will like it.

      Thirdly, corporate types who were considering making the switch to AMD because the performance was so much better. If Intel can beat AMD's performance, then AMD will be less attractive because the performance isn't better, and 'Nobody ever got fired for buying Intel'.

      Just my $0.02

      Michael
  • by xargs ( 411592 )
    "Intel's chance to retake the performance crown from AMD" -- Oops! [theinquirer.net] looks like they droppd that ball!
  • But when are they going to make Linux take full advantage of all the P4's features? Right now, to the best of my knowledge, it's almost like never running your Ferrari past second gear. Sure it's fast, but it's not taking full advantage of the potential.

    Anyhow, I've always been a fan of the "more ram and better gfx card" school when it comes to improving performance. Office apps are more than fast enough. The only thing you REALLY need to improve would be the frame rate on Max Payne.
    • More RAM and a better graphics card are nice, but faster processors are good for churning out the polygons. I'm using a Tbird 900 with 256MB of SDRAM and a GeForce 3 and my boyfriend has a Tbird 1.3Ghz with 256MB of DDR SDRAM and a GeForce 3 and there are noticeable improvements on some of the more recent games (Max Payne, Giants).

      Still, it's only tempted me to look into overclocking, not upgrading just yet (not until NForce)

      Anyway, I'm not sure that even Windows (2000, dunno about XP since I don't plan to upgrade) will take advantage of the P4 over the P3. I do know that Linux can be optimized for Athlon processors and that gcc 3 supports optimization flags for Athlon hardware, so I'll stick with what I know works :)
  • Once again, the company with the most resources, experience, and capital has secured their lead. Did you really think AMD was capable of competing?

    I'm afraid it's time for all the AMD fanatics to admit that the story of David and Goliath simply doesn't apply to the corporate technology world. The biggest will always win, if only because they are more capable of supporting their management bloat problem. Look at IBM vs. DEC. DEC has suffered a two decade long slow death, from being the most beloved and innovative minicomputer manufacturer, to being an unwanted subsidiary of a PC manufacturer, to finally being eliminated. AMD is heading for the same chopping block.

    The key to understanding the difference between intel and AMD is to realise that intel doesn't have to run flat out all the time to put the best chip on the market. They're big enough to coast, and take a break sometimes. AMD needs every minute of developer time they can get just to keep up! Sooner or later, AMD just exhausts its resources and slips back into the low-end slot, where it belongs. Also remember, intel can weather a lot more damage to their markets than AMD. AMD doesn't have much of a war chest.

    My advice to everyone here is to realise that there are no prizes given for rooting for the underdog. It doesn't benefit you if AMD "wins", and you shouldn't even be thinking in those terms. While the intel architecture dominates the low-end computing market, intel will be the market leader. Accept it, or mourn it. You can't change it by wishing.
    • For your sake, I hope that was a troll.

      For mine, I hope it was not, given that I'm replying to you.

      Intel made a CPU that churns at a significantly higher clock speed in order to perform almost as well as AMDs "slower" chip. That is a case of "the company with the most resources, experience, and capital [securing] their lead"?

      Wow. The sky must be a bizarre color in your world. The sky in mine appears blue; and AMD still has the performance lead.

    • What are you smoking? :) Clock speed to clock speed, the AMD is giving Intel an extreme beating ... if you take the Athlon4 and pump it up to 2GHz, based on its current performance you probably wouldn't even be able to put the P4 in the same class anymore.

      To me these benchmarks suggest that the p4 is a whole generation behind the Athlon4 .. it seems like comparing a 486 to a pentium .. if you had them at the same clock speed, the pentium would win :) and that's just what's happening here..
      • by Anonymous Coward
        Why are you comparing clock speed to clock speed? The Intel is designed to run at a higher clock speed. Get over it! You couldn't pump an Athlon up to 2GHz! IIRC, they won't scale past 1.8GHz until they move to 0.13 micron. That's the whole point... Smack!
    • Intel can exhaust its resources too -- by making stupid mistakes (like its Rambus chipsets). Losing consumer confidence is a hard obstacle to overcome.
    • Comment removed based on user account deletion
  • First Pentium CPU released at 60Mhz: 1993
    1Ghz Pentium CPU released: 2000
    2Ghz Pentium CPU released: 2001

    Moore who?
    • by Anonymous Coward
      Apparently you don't quite grasp Moore's law. Actually, if I remember correctly, Moore's law states that the density of transistors doubles roughly every 18 months. However when people talk about Moore's law these days, they tend to mean that processor speeds double every 18 months. So, assuming your numbers are correct:

      1993-2001 = 8 years = 96 months = ~5 doublings
      Start: 60
      1. 120
      2. 240
      3. 480
      4. 960
      5. 1920

      Wow! Looks as if you're dead wrong.
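      For anyone who wants to replay the arithmetic, here is a minimal Python sketch of the 18-month-doubling reading of Moore's law used above (the 60 MHz starting point and the doubling period are taken straight from the post; nothing else is assumed):

      ```python
      # Project a clock speed forward, assuming it doubles every 18 months.
      def projected_clock(start_mhz, years, doubling_period_months=18):
          doublings = (years * 12) / doubling_period_months
          return start_mhz * 2 ** doublings

      print(60 * 2 ** (96 // 18))      # 1920 MHz with 5 whole doublings, as above
      print(projected_clock(60, 8))    # ~2419 MHz if you allow fractional doublings
      ```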
  • by TheMidget ( 512188 ) on Monday August 27, 2001 @07:21AM (#2220753)
    ...if each individual instruction takes up to three times as many cycles [tommesani.com] to execute. We've had 667 MHz Pentium IIIs for ages...
  • More... (Score:5, Informative)

    by tcc ( 140386 ) on Monday August 27, 2001 @07:22AM (#2220756) Homepage Journal
    More reviews:

    HardOCP [hardocp.com]
    [hardware-unlimited.com]
    Source Mag [sourcemagazine.com]
    Cpu Review [cpureview.com]
    Acid Hardware [acidhardware.com]

    • Re:More... (Score:1, Offtopic)

      by tcc ( 140386 )
      that came out weird, heh... one link didn't make it completely through (heck, I did preview the post, and Slashdot added stuff that wasn't in the preview ([name.com])). So I saw my little tag error AFTER it was submitted..



      Hardware Unlimited's review [hardware-unlimited.com]

  • I mean, they're talking about how the 1.8Ghz Intel chip was trailing behind AMD's 1.4Ghz chip.. and now they're excited because their *2Ghz* chip might just beat AMD's *1.4Ghz* chip?! Hello? Geez.. sounds like AMD has a MUCH better design going. What will happen when AMD releases a 2Ghz Athlon? Will Intel have to bring out a 3Ghz chip to match it? I can't see how Intel can be too happy with this..
    • Actually, according to Firingsquad, if you're an unreal player, this tells us [gamers.com] that the AMD 1.4ghz is STILL faster than the latest P4 offering! Aside from Quake, the P4 2ghz is only marginally faster. The 2.24ghz (OC'd) does take a bit more of a lead. So, for only $400 extra you can get 10% speed increase on a FEW programs!
      • You can always find a particular benchmark that makes your desired result occur.

        There are benchmarks where 1.2GHz Athlons and P4's beat 1.4GHz Athlons and 1.7GHz P4's.

        A benchmark can't be biased. Either you run the piece of software faster or you don't. But selection of benchmarks can be biased. And other value-determining factors can get pulled into the evaluations that are supposed to be made solely on benchmarks.

        If all you care about is Unreal Tournament, then you've found your answer. But using that to make an overgeneralized statement like "AMD has the better chip" means you're probably lying to everyone else.

        --Blair
        • Blair,

          I didn't make any "overgeneralized" statement in my post - read it. I didn't write the subject's topic "RE: AMD has the better chip", just the "RE:", which indicates that I am replying to that thread (read: your post has the same subject). First, try to understand my point. What I said was, in Unreal Tournament (not to mention Serious Sam, and 3DStudio MAX) the Athlon 1.4ghz is still faster. I didn't say, "because of the Unreal benchmarks AMD makes a better chip". Also, I said that aside from Quake, the P4 is only marginally faster. This is based on a whole gauntlet of benchmarks. Only on a few programs is the increase in speed significant (read: maybe noticeable). My point was, the chip still isn't faster at everything (see also: "mhz myth"), and where it is faster, it's not worth the $400.

          • Okay, my bad, fair enough, looked like you were being bench-selective, I'll take your word that you weren't.

            Is it worth the $400?

            Not if you're an Unreal nut, no.

            But over on Tom's Hardware, almost all the benchmarks other than UT go to the P4.

            There's one from SiS about memory bandwidth that I don't trust that shows every P4 with nearly a 2X advantage on any Athlon, but there it is. Maybe it isolates the CPU and just demonstrates the point behind RDRAM (which is also getting cheaper).

            Is it still worth $400?

            I have been a first-day-of-issue adopter of a CPU or two; when I saw the same system two months later on the second tier and for $400 less, I knew that I'd had the nuts for those two months, and still owned a computer that would be non-obsolete for a year, maybe two.

            $400 ain't that much for that kind of egoboo.

            --Blair
            • Well, obviously it's a personal decision.

              I most definitely was not being bench-selective, in the sense that I strictly pointed out that there were other benchmarks with other results. I just found it funny that it's still not a "cut-and-dried, across the board" faster-performing chip.

              The issue is, although most benchmarks were won by the P4, the $400 or nearly 380% price increase compared to a 2-5% performance increase on the broader benchmarks (read: not Quake) makes the "it wins most benchmarks" point moot. Nevertheless, the P4 is a technology that is specifically designed to meet the marketing demands of the "Ghz" ratings, and performance is secondary. This is disgusting. Now, the new P3 is a good chip - albeit overpriced. I'd rather have one of those than a P4. And since the current P4 architecture is being phased out for the Willamettes, I see no reason to buy Intel right now - just wait, or go AMD. Intel may still come out on top, just not with their current offering.

              • The top-end, newest-model units have always been several hundred dollars more than the next step down. And the 1.4 GHz Athlon is at least three steps down, being comparable to a 1.7 GHz P4, which is now behind the 1.8, 1.9, and 2.0.

                I can see a lot of people finding value in that. Personally (yes) I could do with 4 GHz now, and would gladly pay $1k just for the CPU.

                Porsche stays in business by not worrying what BMW is doing.

                --Blair

                P.S. I think you have it backwards. The current P4 is the Willamette design. The new one is the Northwood. And it's not a phase-out; it's a shrink and a bus upgrade. The Willamette price curve will continue like all of them. Northwood will scale up to 6 GHz, and semi-official hype (it was an Intel guy, in an interview) says 10 GHz. Brookdale is the i845 chipset, which will allow the Northwood to interface to SDRAM and DDR-SDRAM.

                (Go to TomsHardware.com and search on "intel roadmap"; I'd post a link, but the net is totally packed up right now...)
                • You're right - I got the Intel stuff backwards.

                  Well, the 1.4Ghz is actually closer to a 1.8 than a 1.7 performance-wise. My point is that it's NOT a Porsche, because a Porsche is more than 1/10th of a second faster than a BMW on a 0-60. Now, if you wanted to spend money, you could try a dual Xeon 1.7GHz, or a dual Athlon 1.2GHz. In which case Anandtech.com will tell you that they are moving all their Xeons to Athlon MPs because they scale better for half the price.

                  Maybe the Northwood will show AMD who's boss - and if so, I'll go for it when I need to upgrade. Right now, my $95 1.2Ghz Athlon is out-rendering a low-stock $266 1.8Ghz P4 in 3DSMax, and only suffers a 1-2% speed loss in Photoshop.

                  So, if you want to buy the fastest overall chip, then wait and buy the 2.2Ghz P4. If you want to be smart about it, stick with the new P3's or even better the Athlon's.
      • if you're an unreal player, this tells us [gamers.com] that the AMD 1.4ghz is STILL faster than the latest P4 offering!

        And it [gamers.com] also tells you that if you're a Quake player, it isn't.

        Remember, kids, read ALL the fine print...
  • meta-babel (Score:2, Funny)

    by gaj ( 1933 )
    The Tech-Report article points out a couple of interesting meta-ideas - this is Intel's chance to retake the performance crown from AMD, as well as being one of those round numbers that makes people feel warm and fuzzy.

    How in Bob's name are those "meta-ideas"?!

    They are not ideas about ideas, they are simply ideas. Why do people feel the need to adorn their words with unnecessary cruft? I guess the old gearhead saying applies to prose as well: "If it don't go, chrome it".

    <sigh>

    This should be listed as a special case of Rule 17.

  • by gelfling ( 6534 ) on Monday August 27, 2001 @07:27AM (#2220772) Homepage Journal
    Most usability scientists agree that no one can distinguish much of a difference in PC performance less than about 25% over the base value. When PCs ran @ 200MHz it was no big deal to squeeze ~50MHz out of one, since that was simply a quality-control variable in the manufacturing cycle. Now with 1.4-1.9GHz PCs you need to squeeze another ~350-500MHz out before anyone notices the difference between old and improved performance. Just to keep pace with perceived performance you have to add nearly 500MHz - that is, for lower values there is NO perceived benefit. Which translates into people willing to pay roughly ZERO for anything less than a 500MHz improvement. ZERO dollars for an improvement that Intel may have invested billions of dollars to generate. You see, it's kind of like boiling water. Nobody cares how hard it is to raise the water temperature to 211 degrees - it's the ~720x more energy required to actually boil the water once you've got it to that last degree. So it had better be worth it to you to spend the energy doing it, because investing only 600x more energy will not boil the water.
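    A rough back-of-the-envelope check of the boiling-water figures above, using textbook constants (specific heat of water ~4.186 J/g per degree C, latent heat of vaporization ~2257 J/g). Depending on whether the "last degree" is a Celsius or a Fahrenheit degree, the ratio comes out between roughly 540x and 970x, so the ~720x quoted above is at least in the right ballpark:

    ```python
    # Energy to warm 1 g of water by one degree vs. energy to boil that gram off.
    SPECIFIC_HEAT_J_PER_G_C = 4.186     # J per gram per degree Celsius
    LATENT_HEAT_VAPOR_J_PER_G = 2257    # J per gram to turn 100 C water into steam

    per_degree_c = SPECIFIC_HEAT_J_PER_G_C
    per_degree_f = SPECIFIC_HEAT_J_PER_G_C / 1.8

    print(LATENT_HEAT_VAPOR_J_PER_G / per_degree_c)   # ~539x one Celsius degree
    print(LATENT_HEAT_VAPOR_J_PER_G / per_degree_f)   # ~970x one Fahrenheit degree
    ```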
    • by Anonymous Coward
      Water boils @ 100 degrees...
    • Experience would suggest that it will take only another 18 months to add 2,000MHz so why should 500 be such a big deal?

      It's all relative, my friend.
    • Which translates into people willing to pay roughly ZERO for anything less than a 500Mhz improvemen. ZERO dollars for which Intel may have invested billions of dollars to generate.

      What makes you think that a 500MHz increase in CPU speed today is harder to achieve than a 50MHz increase 5 years ago? 5 years ago when the Pentium 200s were hot, another 50MHz would have been as big a deal as 500MHz today. It will take the same amount of time, too. Let me point you to Moore's Law [intel.com], which clearly shows that CPU speeds increase at the same rate.

      Let me tell you also that if I'm running a machine on CPU-bound tasks, even a 5% speed increase is worth buying. Especially if those tasks take large amounts of time to complete (weeks for scientific calculations!).

      • It's a function of how many wafers you can bake within a given tolerance. The difference between 1.4Ghz and 2Ghz is a function of how many wafers you can make that don't melt when you push that many Watts through them, as opposed to any material difference in the design of the chip. It's straight-up manufacturing process quality control. Each stepping represents a higher-yield way of making the same chips. When chips are rated at 1.4Ghz, that represents a given economic value of making at least X chips that can pass that QA test. Certainly SOME of them can be made to go faster, but not enough that you wouldn't have to throw out most of the wafer sheet. When the process gets sufficiently better and the yield surpasses Y chips that can survive a 2Ghz QA test, then you have an officially branded 2Ghz chip.
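        A toy sketch of the speed-binning idea described above; the speed grades, the yield distribution, and the pass counts are all invented purely to show the mechanism, not Intel's actual process data:

        ```python
        import random

        # Hypothetical speed grades a die can be branded at, fastest first.
        SPEED_GRADES_MHZ = [2000, 1900, 1800, 1700, 1600, 1500, 1400]

        def bin_chip(max_stable_mhz):
            """Brand a die at the highest grade it passes, or scrap it."""
            for grade in SPEED_GRADES_MHZ:
                if max_stable_mhz >= grade:
                    return grade
            return None  # fails even the lowest QA test

        # Pretend each die's maximum stable clock is normally distributed; as the
        # process matures the mean creeps up and more dies land in the top bin.
        random.seed(0)
        wafer = [random.gauss(1700, 200) for _ in range(100)]
        branded = [bin_chip(d) for d in wafer]
        print(sum(1 for b in branded if b == 2000), "dies pass the 2GHz QA test")
        ```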
    • Incremental improvement is the name of the game.

      I won't upgrade my two-month-old 1.8GHz platform to 2.0GHz, especially when clock increases are not 1-for-1 with performance increases.

      But the upgrade sweet spot is an 18-30 month cycle.

      Pro: I have 2 y.o. 400MHz iron running on my desk at home. I can upgrade without shame.
      Con: If I buy now, how do I choose between 1.8 or 2.0?* The difference in system price is a few hundred dollars.
      Pro: I'm rich, and have a big ego.

      You do the math.

      --Blair

      * - actually, for other reasons, I have no reason to upgrade that desktop until Xmas or so. Rumor is we may have 4GHz by then. A crazy rumor, yes, but most promising for the 2.4-2.5GHz probabilities. We'll also know if DDR on the i845 chipset is faster or slower than RDRAM on the i850. Those aren't my reasons for delaying, but they're predictable benefits.
    • Two things Intel has going for it:

      People like numbers. Nobody wants a 9 when they can buy a 10 for just a little more. It's the same reason you pay much more for a brand new car than for a 2-month-old one.

      For apps that use lots of CPU, such as a 3-D renderer, increases in clock speed will bring roughly linear benefits (though Amdahl's law still applies; see the sketch at the end of this comment).

      Also, note that it would be cheaper for Intel to manufacture processors at fewer speeds. They introduced the 1.6 Ghz after the 1.8, and it probably cost the same per-chip to produce. This is profitable because they know each speed will hit a certain market segment willing to pay a certain amount of money.

      Having said that, I agree with your basic premise: that a 2 Ghz isn't probably worth the money over a 1.8. So yes, the better you know the system, the better the purchasing decisions you can make; but most people don't.
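      A minimal sketch of the Amdahl's-law point made above: a job that is almost entirely CPU-bound (a 3-D render, say) scales nearly linearly with clock speed, while a mixed desktop workload barely moves. The CPU-bound fractions below are made up purely for illustration:

      ```python
      def amdahl_speedup(cpu_fraction, clock_ratio):
          """Overall speedup when only the CPU-bound fraction benefits from a faster clock."""
          return 1.0 / ((1.0 - cpu_fraction) + cpu_fraction / clock_ratio)

      clock_ratio = 2000 / 1000                  # a 2GHz chip vs. a 1GHz chip
      print(amdahl_speedup(0.95, clock_ratio))   # ~1.90x for a render-type job
      print(amdahl_speedup(0.30, clock_ratio))   # ~1.18x for an I/O-heavy desktop mix
      ```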

  • It's cool that Intel hit the 2GHz mark, but all that clock speed is really going to waste for the moment.

    Right now, you should go for a Thunderbird (AMD Athlon). Later on a Palomino (AMD next-generation Athlon) or the upcoming Northwood (0.13 micron Intel P4) is a better option.

    Am I just saying this? No, take a look at this [aceshardware.com].
  • by Anonymous Coward
    But:

    (1) 1.4 GHz Athlon "MP" will still beat 2 GHz Pee-4;
    (2) No upgrade for Pee-4 (423-pin Mobo soon to be out of date);
    (3) Should have compared Pee-4 with 256 MB RDRAM vs. Athlons with at least 512 MB (or even 1 GB) DDR (on a same-cost basis)--the Athlons will smoke the Pee-4s, at whatever GHz;

    etc.
  • .. for people who like to compare engine sizes. It's like comparing a 1.8L VTEC with a 1.8L in a Hyundai? Hey, the car salesman said they've got the same size engine, I bet they're the same :)

    People who don't know anything about processors obviously rate clock speeds like engine sizes. They don't really understand how a 1.4Ghz chip could == a 2.0Ghz one. It just doesn't make sense to them.

    • WAAAY OFF TOPIC....

      It's a 2.0 in a Tiburon. And don't knock it until you drive it, as it also has a Variable Timing System. I've yet to have any problems keeping up with any other normally aspirated 4cyl, INCLUDING spanking a vaunted VTEC Integra on I-10 on the way home last Friday.

      Jaysyn
  • Lemmings... (Score:2, Insightful)

    by jgrumbles ( 515918 )
    AMD needs to start getting the word out that numbers aren't the only thing that matters. On a side note, no one will ever have the crown permanently. Intel may have it for now, then the Palominos will hit 2GHz, and then Intel will come out with something faster, then AMD, then Cyrix, then....wait a minute, scratch the Cyrix comment.
    • AMD needs to start gettin the word out that numbers aren't the only thing that matters

      Nah. Apple are already spending a few million doing that. And besides, people who buy AMD processors are either doing it because of price, or because they went out and read the benchmarks and reviews and worked out for themselves that there's more to life than clock speed.

  • It doesn't matter (Score:5, Interesting)

    by wiredog ( 43288 ) on Monday August 27, 2001 @07:34AM (#2220797) Journal
    I notice very little difference between my new GHZ machine and the 333 MHz machine it replaced. Compiles run faster, but I spend very little time compiling. I spend most of my time editing, and the processors have been able to keep up with my typing speed since the days of the 486-25. Web surfing? I/O bound. Video output? Also I/O bound. Most everything is I/O bound these days. Bus speed is more important than processor speed today. After all, when was the last time you saw anyone discussing spreadsheet recalculation performance?
    • Re:It doesn't matter (Score:3, Interesting)

      by mmaddox ( 155681 )

      It's funny, and I haven't actually run any times on this, but it doesn't even seem like my compiles have gotten any quicker since upgrading my Athlon 550 to a 1Ghz. I'm sure SOMETHING is faster, but I don't recall ever noticing any real difference in the overall feel of the machine. This lack of perceptible differences has really gotten me off the upgrade bandwagon for a bit. Even my 64MB GeForce 2MX seems more than adequate for the gaming I enjoy.

      Is this just a symptom of the computer finally becoming merely a commodity?

  • by juventasone ( 517959 ) on Monday August 27, 2001 @07:35AM (#2220800)
    detailed review with benchmarks at extremetech.com [extremetech.com]

    I'm curious where power supply requirements are headed. A year or two ago, 230-250W was fine; now I'm seeing Intel and AMD demanding 400W. The heatsink/fan combos that come with these things are now two or three times the size of the socket. With PCs outnumbering vehicles (saw that stat somewhere), I wonder how the power demands and the heat generated will affect global warming and such.

    Sure, it's probably not much more than a few light bulbs right now (in both respects). But like I said, where is it headed?

    • Those recommended numbers of 230-250 and 400W are the required ratings to supply large somewhat instantaneous current for the entire system. This is usually during bootup and reset for short periods of time.

      Remember, a computer isn't an ohmic device; the current varies. The majority of the time, large parts of the system are idle, drawing only 25-50W from the outlet.

      The main concern is CPU power. From reliability to chassis noise from the fans, to cooling costs. This is more important to OEMs than performance.

      For once, environmental and marketing needs are in sync.
    • If you can grab the edge from the competitors by using the extra 50W to grab an extra 100MHz out of the processor, you're going to flow as much juice through that processor as you can.

      Look at the VIA C3 (aka the Cyrix III)...a 700MHz chip that draws so little power that you can almost run it without a heat sink (almost...which says quite a lot compared to these 5 lb. heat sinks on the P4). So? No one's buying it. Even if it had the biggest battleship of an FPU (though it doesn't), the fact that they're running the processor slower in order to save energy is not going to sell it.

      If someone could come up with a power supply that pumps 1000W into the computer case just so that you can get an extra 200-300MHz out of your processor, people would buy it.
  • This [big.or.jp] page is pretty old, so somebody has probably done better by now, but this guy overclocked a 1.33GHz Athlon to 2174 MHz (using extreme techniques I guess).

    Still though, if it's stable at such high speeds when it's supercooled, surely AMD will be able to make it stable at room temperature, no?

    I'm waiting for 4GHz Athlons.
  • by Rura Penthe ( 154319 ) on Monday August 27, 2001 @07:44AM (#2220821)
    Let's see, we have a Firingsquad review [gamers.com]...

    An AnandTech review [anandtech.com].

    And let's not forget ExtremeTech's review [extremetech.com].

    And finally Kyle and the gang at [H]ardOCP did a review [hardocp.com].

    Incidentally, [H] got their P4 to over 2.2GHz, but ran into heat issues at 2.3.
  • I guess I have to stop my Pentium bashing now.
  • by jmichaelg ( 148257 ) on Monday August 27, 2001 @08:12AM (#2220898) Journal
    I've quit upgrading due to noise. The fans needed to cool a 1.2 ghz Athlon are too noisy as it is. I ended up water cooling my machine, not to get it to overclock but just to get it to shut up.

    Maybe when the 4 ghz chips are out, they'll have figured out how to lower the power requirement so that our computers don't sound like small jet turbines.

  • by yoshi_mon ( 172895 ) on Monday August 27, 2001 @08:45AM (#2221000)
    While astute computer users know that raw MHz does not automatically translate to application/game speed, the same cannot be said of the typical user.

    When AMD broke ahead of Intel in the MHz race, their marketing department was quick to capitalize on this with a media blitz that even included some TV commercials.

    However, now that Intel has once again taken the lead in the MHz race, AMD has astutely retreated to marketing to the knowledgeable and computer-savvy.

    Every unbiased hardware review page has said pretty much the same thing: clock cycle for clock cycle, the AMD is still faster. However, the average computer buyer is still tied to the more-is-better idea.

    And honestly, that is something that is hard to refute. More RAM is better, bigger HDs are better, bigger monitors/screens are better, faster modems are better...why don't CPU's follow the same rule?

    The answer is a pretty complicated one, and explaining it would require some basic knowledge that you just can't squeeze into a 30-second commercial. AMD has made noise about a marketing campaign that will educate the public; however, so far it has been just that: noise.
  • by wubboy ( 96276 ) on Monday August 27, 2001 @09:11AM (#2221091)
    it crashes Windows in half the time of my 1GHz...?
  • by Brian Stretch ( 5304 ) on Monday August 27, 2001 @09:35AM (#2221191)
    Let's see, do I buy a 2GHz uniprocessor P4 with its performance-killing 20-stage pipeline, minuscule 8K L1 cache, and high-latency/overpriced RDRAM, or do I buy a dual-processor AthlonMP with 128K L1 cache, DDR SDRAM, and 64-bit PCI slots (Tyan Tiger MP) for LESS MONEY?

    These days, Intel CPUs are for people who don't know any better (or are forced to buy Dell).
    • Actually, most Windows applications will still run faster on the 2Ghz P4, since they don't know how to take advantage of SMP. Now for a _server_, the dual MP is a big win. Not sure how many Linux applications see a performance improvement with multiple CPUs...
  • I would think that the Slashdot community would be the one to harbor some bad vibes towards Intel for their involvement in the 4C project, or whatever the hell the copy-protected drive is/was. Maybe I'm just too political though, I dunno. Whenever I scrounge up enough money to replace the piece of junk I'm on now, there'll be as little M$ and Intel brand crap in it as possible. I know you're impressed but really that last sentence was included just so that I could plug responsibleshopper.org. It's not my site, but as the kids say, it's keen.
  • I am not sure about you, but instead of buying a 2GHz Pentium 4 processor and a new motherboard to go with it (there's no chance it will run in my old one without an upgrade)... I could just as well get a 4-processor 1GHz AMD Athlon SMP machine and most likely even save money. Or, what the heck, let's get 2 or 4 machines and put them in a cluster. I'd still be better off than with this single 2GHz P4 processor, for comparable prices. So where exactly is the benefit?

    Intel owed a great deal of their lead to their SMP capabilities. That's no longer a problem with AMD, and no longer a benefit unique to Intel processors. So I'd say: put the money where the real benefit is, and not just into impressive-sounding numbers.

  • by stevarooski ( 121971 ) on Monday August 27, 2001 @11:49AM (#2221762) Homepage
    Didn't see these posted, so check these out:

    SharkyExtreme [sharkyextreme.com], and pcmag.com. [pcmag.com]

    Naturally, those seeking the zdnet advertising-big money-enhanced (tm) view should choose the latter, while those seeking that of an enthusiast should check out Sharky's. ;o)

    -S
  • by blair1q ( 305137 ) on Monday August 27, 2001 @01:54PM (#2222363) Journal
    Yes. $ for $ the AMD chips win. But you need a computer engineering degree to understand why. Consumers still measure Sony TV's horizontally to determine if they're 27 or 35 inches (try it! Sony makes them that way so they don't have to educate the public).

    However, the 1.4 GHz Athlon with DDR SDRAM was about par on the benchmarks with the 1.7 GHz P4 with RDRAM.

    1. You can't get 1.5 GHz Athlons yet, and the P4 has gone on to 1.8, 1.9, and 2.0 GHz.

    2. Intel and VIA are releasing motherboards that will run DDR SDRAM, reducing memory cost significantly with an unknown but predicted to be very small performance hit vs. RDRAM.

    Ergo, if you want the fastest commercial desktop, you buy the newest P4 platform. And the early adopters, speed queens, and obsolescence anxiety victims have always justified exorbitant price differentials.

    Businesswise, Intel made a bad, bad mistake putting all its chips in the Rambus basket. AMD was also able to leverage some serendipity when Digital went belly-up, leaving a lot of Alpha engineers with nowhere else they could stomach to go. But Intel has been through this before (remember the PowerPC? Apple, Motorola, and IBM combined are about 40x the size of AMD, and they couldn't take Intel...) and has already repositioned itself.

    Intel can be bloodied, but it's never been knocked down, much less knocked out.

    Am I cheerleading? Maybe a little. I own a ton of INTC. But I have always known they make inferior products. 6502, m68k, Alpha, PowerPC, even Intel's own i960 line are superior products to any chip that eats x86 assembly. But if you get prejudiced on the characteristics of a product you will totally fail to understand the value of the company.

    Intel will rule in the end. Start from that premise, and then try to prove otherwise to yourself.

    --Blair
    "It's not an 800 lb gorilla. It's an 800 lb gorilla with a PhD in process technology and 30 Superbowl rings."
  • http://news.cnet.com/news/0-1003-200-6982283.html?tag=mn_hd [cnet.com]

    AMD to slash prices... you can get your cake and eat it too... er... never mind.

    e.
