Pentium IV Non-bus Master PCI Bug Lives

Barbarian writes "This ZDNN article says that a bug in the Pentium IV chipset that caused a recall months ago and causes a slowdown on systems with a second (PCI) video card still exists. A talkback comment points out that this bug affects any non-bus master PCI device." To be fair, the number of people it will likely affect is relatively small, and even if you do want two monitors, most companies are just using the Matrox G450, or one card on the AGP slot and one on the PCI bus.
  • you can only use the AGP slot for video.

    While it is called the Accelerated Graphics Port, I would imagine that any piece of hardware that requires fast access to system memory would fit into this port. I can think of some data-acquisition equipment that might be able to utilize the features AGP provides over PCI.

    And it is quite possible to install video cards into PCI expansion slots if AGP is available. I've done so many times.

  • It's "Pentium 4", not "Pentium IV"--they changed the nomenclature with the new chip. Go to Intel's website [intel.com] and see.
  • For example, anyone with a WinTV/PCI?

    I'll be steering clear of the P4 then...
  • I do.

    I use a Matrox Dual Head G400 Max (yes, it's AGP, so it's not affected by the bug) with two 17" monitors. I use multiple desktops in Linux, and now I can use multiple desktops in Windows as well. Still, what is the big deal, right? This is the big deal. [planetquake.com] Take a look at those UT screenshots on FIVE monitors. Sure, I have the ability to use the dual head with my 17" monitors, and I do (like the shots of Quake 1 and 3 below), but five? Not yet, but you can be damn sure I am thinking about it. I have enough monitors and cards to do it too.

    So, the "big deal" is that this bug prevents me from doing what I would like.
  • have you ever tried dual/multi head? every person i know who has knocked a multi-monitor setup was a changed person after they used a rig with multiple monitors. personally, i run 3 monitors (4 if you count the TV): a 19" samsung 900 IFT with a pair of 17" 700 IFTs flanking it. the tv under windows runs in matrox's 'DVDMax' mode, which grabs any video played and displays it on the tv full screen. a g400 agp and a pair of voodoo3s drive the rig. it's nice to be able to get crap that you're not currently paying attention to, but need to keep an eye on, off the main screen but within a glance (ie: server monitors, xmms, irc). and it totally kicks ass for web dev stuff. :)
  • The closed source driver works just fine, thank you! The problem is that on the second port the refresh rate sucks so much it's almost not worth using.
  • Wouldn't a bus-mastering problem affect SCSI/FireWire cards or any device pushing a lot of I/O across the PCI bus? Or are these devices not using bus mastering (shouldn't they be...)?

  • The very first chipset for the Pentium II had similar problems. My father's computer, a P2-266, cannot even use 100 or 133MHz DIMMs. They have to be the old 66MHz DIMMs, due to a flaw. And they're all clocked at 66MHz.

    So I won't worry too much. Intel has a record of not testing products enough before releasing them. Sometimes you just cannot test everything the way it gets exercised in the real world. Microsoft is a prime example of that, but Linux is just as guilty. (Don't tell me there were never bugs in a stable Linux kernel.)


  • According to the article, the P4 (chipset) bug does apply here. If I read it right, the problem occurs when using a PCI videocard. Hemos' comment does not seem to be correct.
  • On the current P4 chipset you can't use a PCI videocard without major slowdowns and image corruption. So effectively you are stuck to a single AGP card. That's what the article says.
  • - space and noise (and heat/speed) can be fixed by something like "computer in the fridge", which I am sure everyone read about...


    You're forgetting certain laws of physics that basically say "the heat has to go somewhere".

    Refrigerators do a good job of keeping your food cold, at the expense of heating your house. You're still going to have problems with the heat unless you pipe the heat outside somehow (usually, this means using air-conditioning equipment).

    - electricity charges are not that high considering the number of monitors ;) which is actually one per cluster/network or less.


    True, but you're still going to use more electricity with a cluster of low-power machines than with one high-power machine. Not only do you need more power for the additional machines, but you also need to consider things such as cooling, the equipment that's needed to network the cluster together and manage it, and other considerations. Obviously, there are maintenance and administration issues as well.

    If you want to try it, be my guest. :) But don't be surprised if it turns out to be more of a hassle than it's worth for what you want to use it for. Even though clusters and distributed systems have their uses, they're just massive overkill for the average desktop/workstation user's purposes.
  • This might have hit the PCI videocards but think about any other high-bandwidth card you might want to use: Ultra160 SCSI, Gigabit ethernet.

    Hardly "legacy hardware"...
  • I know plenty of people who got it working with a G400 DualHead. You would find them if you searched the newsgroups and, gee, did a bit of research before saying something.
  • [quote]degrade performance when video or other graphical data is processed through a PCI bus[/quote]
    why just video?
    I don't think the PCI bus can detect what kind of data it is transporting, so it must affect other things as well.

  • To my understanding, the problem occurs with every PCI device that isn't a PCI bus master when doing I/O. So I think you are right, and we will see problems with these motherboards in a lot of different installations.

    The funny part is that if Intel keeps producing all these flawed chipsets, I think in the end P4 computers will only be used to run Windows ME et al. in simple home installations, and we won't see them in high-performance configurations. So the problem is solved, and Intel is right..

    [offtopic]
    you seem to be the only /.'er not wanting to go into lengthy and offtopic discussions on the pros and cons of dual head, etc..
    [/offtopic]
  • As they said: Documentation on one monitor (who doesn't need specs or help when they're coding?).

    Code in the middle

    And entertainment to the right (MOVIE). I can't think of a better division, though I use an AGP, a PCI and a TV out (tvtuner piped out) to accomplish it. And yes it's efficient as you don't have to switch screens. To make things really slick, I span the desktop between the two monitors for task switching. One moment I'm coding and the next I'm checking up on the news or something (both monitors, who only reads one website at a time?).

    Perhaps some people are just more co-ordinated than others.
  • Everybody here is considering only multiple video cards, or video + DVD playback. Well, to me this bug is much more serious than that, because it makes the P4 unsuitable for network servers.
    Today any high-end network server has at least 2 network cards, for routing and/or redundancy purposes. Even if you don't want your server connected to two networks, you still want it to be redundant, so it can continue to serve data if one of the cards or cables fails.

    I think that a high-end server, which you would build with such an expensive and fast CPU, would most likely deserve your attention towards redundancy/fallback.

    Also, I wonder if there are any other PCI bus-mastering related bugs in this chipset. I would not risk putting a Gigabit ethernet card in a P4 anytime soon, really.

  • Naah
    Looks pretty standard for most documentation regarding consumer electronics today =)
  • With the P-IV being targeted at high-end systems, this bug becomes a considerably larger issue as a percentage of total users.

    The bug in the chipset forces a user either to accept a system slowdown when they install a PCI video card for their second display, or to use the Matrox card, which is at the end of its market life and is outperformed by most, if not all, other high-end cards currently on the market.

    Many users at my employer who need high-end systems also require multi-monitor support. Due to this bug, we will not be purchasing P-IV's with the current available chip-set. We will either wait for a new chip-set or settle for a P-III (my employer still refuses to accept Athlons, although I personally prefer them).

    Meanwhile, on an issue unrelated to the bug, most home users who prefer multi-monitor support would, I suspect, prefer to wait for the next generation of P-IV, so there actually exists an upgrade path for the system.

    The home power user market will be avoiding the P-IV for now anyways; and the corporate market will have limited use for it at this time due to the chip set bug. Intel's PR department will have to work a lot of overtime to get these things to move off the shelves while they wait for a fixed chip set to come out onto the market.
  • With most other CPU and Chip Sets, you are correct, you can use PCI cards for video. But, had you read the article attached with that link, you would realize that in the 850 chip set used by the P4, the use of a PCI graphics card results in significant system slowdowns and corruption of video information.

    True, technically you could still use the PCI slots for video on the P4 w/ the 850; but with the performance and image quality problems that result from the bug in the chip set, you would not be very productive on any system configured in that way.
  • The thing that scares me is stuff like multi-modem cards, and SCSI cards and that sorta gear..
  • Don't forget all those with PCI DVD accelerators! Accelerated DVD playback on a PentiumIV will be more choppy than on a "slower" PIII system.

    Doesn't sound like a trivial problem to me.
  • I think a large part of the point is that a lot of people won't upgrade to this vintage of the P4 chipset, and will instead go to Athlon. Not that this is a bad thing.
  • We can see why AMD has been able to gain such a large market share: newer technology, faster, more stable, and, in case you didn't know, at a fraction of the cost. Let's see... IA-64 vaporware, Pentium 4 bugs and recalls... I think the choice is obvious, don't you?

  • I use two monitors and yes, I can use both monitors at the same time. I keep an IRC window open on one of the monitors, and if I'm looking at the first one and someone says something in that channel, out of the corner of my eye I'll see all the text move up a line.

  • The first PII chipset was the 440FX, which used 72-pin SIMMs that only come in 66MHz. This wasn't a flaw, but it was a limitation for the future. It also couldn't handle anything beyond a PII 300 (including Celerons) because it couldn't handle the new core voltages.

    I realize that some manufacturers like Dell sold FX-chipset-based servers that supported a 333MHz PII with 66MHz EDO DIMMs. I don't know if there was a revision to the chipset or if they made some workaround. The point is that your father bought a transition motherboard instead of waiting for an LX board with AGP. You can't blame a chipset for not supporting technology that hadn't been released yet. That is a limitation going forward, but how are you going to second-guess the future? Intel tried with Rambus, with less than ideal results.

  • The G200MMS can support dual or quad digital flat panels. Is there some reason you can't use this setup (other than price $750 for quad)?
  • You sound like you could use a nice LCD setup instead of second monitor like LCDproc [omnipotent.net]. If you really want the monitor, then by all means get it.
  • might be more of an issue in the very near future in CA.....back on topic: If Intel keeps this up, they're going to have people avoiding their ".0" releases like Redhat and back in the day, Word Perfect.
  • And it is quite possible to install video cards into PCI expansion slots if AGP is available. I've done so many times.

    Not with the P4 - that's why it is a BUG.
    You cannot use PCI for video on this system. The one in the article....
  • Except on this motherboard (850 based).

    See, the "Except" part is what makes the BUG and prevents you from using a PCI video card like you can in a normal system.... When you start getting exceptions from the normal or expected behavior, that is what makes a bug. Hope I did not use too many big words for you...?
  • Yes, actually, I believe the headline is wrong. It states:
    are just using the Matrox G450, or one on the AGP, one on the PCI.

    And, whereas the Matrox card is a viable solution, placing a second (or even only) card in a PCI slot is not.
  • Does anybody know exactly what the bug really is? I've spent the last couple years of my life designing and verifying PCI interfaces so I'm kinda curious...
  • >but you'll have to alias ls to ls --color=never, else the console is useless.

    Might I suggest you try adding:

    export TERM="linux-m"

    To your /etc/profile? It fixed the same problem on my B/W VGA monitor. Now all colour stuff works perfectly (in black and white). With shades, too. :-)

    If the linux-m definition doesn't come with your termcap, it is part of the slackware distro.
  • It affects me. I have only PCI and two video cards... oh wait... I couldn't stick a P4 in my AMD-based motherboard anyways.

  • cheap, fast to market and available in large quantities

    Like the P3 1.13GHz?

    Intel is losing even that edge. I wonder how much longer they will be considered the top chipmaker

  • The whole development in IT reminds me of the sequence in Fantasia, where Mickey just can't get rid of the ghosts that he summoned.

    Microprocessors (& Software) got so bloated and complex, that it seems inconceivable that we will ever return to the stability we used to have with far simpler systems.

    Of course, those lacked dancing paper clips and a Kiswahili spell checker...

  • I wonder how much longer they will be considered the top chipmaker

    Were they ever?

    Maybe, but only in terms of market share. In terms of technological edge, Intel seems to have lost it years ago. Compare it with an Alpha and they are probably still not where DEC was in the middle of the nineties.

    Now, if DEC had had even a half-witted sales and marketing organization in the first place, they wouldn't be called Compaq today.

  • My understanding was that this glitch caused problems at the processor when it was getting video data over the PCI bus; I could be wrong of course.
  • Call this Off Topic if you like, but reading about yaqub0r's setup made me smile to myself - 3 gfx cards?! Just to run different screens in different resolutions?

    Hell, the Amiga has offered multiple screens at different resolutions and bit-depths for the last 15 years, folks!

    Sure, you need to flip between the screens (simple mouseclick or keypress), but to be totally honest, I do not believe that anyone can productively USE 3 monitors simultaneously to do different things - you will be concentrating on one of the 3 at any one time.

    Just my 2p-worth - interesting to see how technology in the year 2000 goes into overkill to simulate what some of us have been doing since the 1980s

  • Read the article - it affects NON-bus mastering devices. This means:

    1. Most block I/O devices ARE BUS-mastering (all IDE controllers on the market today are, and most SCSI controllers are - unless you are trying to use your cheapo scanner card to run a drive, which isn't recommended anyway).

    2. Some PCI video cards made post-1996 are bus-mastering.

    3. Some NICs are busmastering.

    On older mobos (of the original Pentium era), there were usually 4 PCI slots, 2 of which were shared with ISA, and the first one was the only one that was bus-mastering; this caused conflicts when you had 2 devices that required bus mastering (such as a NIC and a video card). NICs and video cards made during the 486-586 era were bus-mastering because the PCI bus couldn't be shared efficiently. Nowadays, with a bus speed higher than 66MHz and more than 2 independent PCI slots, it isn't necessary to restrict data transfer to one high-bandwidth device at a time.

    It is no longer necessary to stick the bus-mastering card in the first PCI slot anymore.
  • In short, the bug with the i850 chipset only affects people who primarily use legacy hardware.

    "Legacy hardware" is not always old stuff. To me, "legacy" may mean a 64MB video card that I may have purchased a couple of months ago. What if I want to upgrade to a P-IV now, but know that I want a second monitor later? I couldn't even go out and buy a cheap 8MB card (also made by Intel) because it would kill the performance. Does that mean I can't upgrade now? Why should I be forced to go buy a new video card to replace the perfectly good (and very powerful) one I have now? What sense would that make? What I want to put into my computer should not be dictated by a hardware bug that shouldn't even exist, especially if I'm willing to pay that much for it.

  • Please check before you post?
    This is NOT a G400/G450 problem, it's a limitation in Windows 2000! Any other OS can take advantage of the DualHead feature, read more about it here [matrox.com]
  • ...they just fade into obsolescence. Seriously, if someone wants two video cards, they either go AGP and PCI or the Matrox DualHead / NVidia TwinView. Therefore, this bug will be meaningless as long as all P4 systems ship with an AGP slot.
  • I downloaded source (from the matrox website) for the current G450 driver and found it only supported 1280x1024 on the 2nd head. 10 minutes later (+30minutes for a XFree86 build) and I had it working with both heads at 1600x1200 under FreeBSD. This is a good example of Open Source Drivers at work.

    And yes... I did submit the changes back to Matrox
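    For anyone wanting to reproduce that setup: in XFree86 4.x, the second head of a dual-head card shows up as its own Device/Screen pair in XF86Config-4, selected with the "Screen" entry in the Device section. A rough sketch (the identifiers are made up, and the exact mode support depends on your driver revision):

```
Section "Device"
    Identifier "G450-head2"        # hypothetical identifier
    Driver     "mga"
    Screen     1                   # second head of the dual-head card
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "G450-head2"
    Monitor    "Monitor1"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1600x1200"          # the mode the patched driver allows
    EndSubSection
EndSection
```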
  • ... is, what do they mean by 'degraded performance'? Does it mean I lose a few fps in Quake, or do I lose the ability to run any graphics app at any real speed? I refuse to trust this kind of news, at least not without benchmarks. It doesn't really matter for me anyway, because I'm either getting a dual T-bird, or waiting for the P4 DDR boards to come out....
  • Any large company with any sizeable enterprise systems has a huge network and system monitoring setup. If you can't run multiple displays on a single PC you have dozens of PCs stacked up doing nothing more than running an X display or other GUI monitoring window.

    PCs aren't just for Quake any more . . .
  • Just how many people out there NEED a Pentium? You can run lynx just fine with a 386, right? This is not communism, buddy -- you don't care about consumer need, you care about consumer WANT.

    Do yourself a favor -- try a multimonitor display before you make such idiotic claims. Some people don't have $1000 in the bank to lay down on a single investment, and among those that do, most of them would find that fewer pixels spread across more displays are more effective. It allows them to make a REAL mental separation between each region of the screen, and quickly begin to organize and assign special tasks to each one.
  • by Barbarian ( 9467 )
    It appears to affect any non-bus-mastering card that is transferring a large amount of data across the PCI bus ... According to the ZDNN talkback comments, Intel has represented this solely as a second-video-card problem, because video cards happen to transfer a LOT of data. However, sound cards under high-quality playback (i.e. DVD Dolby Digital output) would also be affected, and I bet non-bus-master PCI network cards under heavy load (/.ing, perhaps) would be too.
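    For anyone curious how to tell whether a given card is a bus master: bit 2 of the PCI Command register (offset 0x04 in PCI configuration space) is the Bus Master Enable bit. A minimal sketch of decoding it, assuming you have already read the register value (on Linux, `setpci -s <slot> COMMAND` reports it; the slot address and example value below are made up):

```shell
# Check the Bus Master Enable bit (bit 2) of a PCI Command register value.
# 0x0107 is an example value; on a real box you would read it with
# something like `setpci -s 00:0d.0 COMMAND` (slot address is hypothetical).
cmd=0x0107
if [ $(( cmd & 0x4 )) -ne 0 ]; then
    echo "bus mastering enabled"
else
    echo "bus mastering disabled - this card could trigger the i850 bug"
fi
```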
  • Ooops, thought it was the other way when I submitted the article.
  • All right, dual displays may be "cool" but frankly, is the average home or business user going to need it, let alone pay for it? The percentage of computer users that need dual-display setups are a tiny fraction of the whole computer market, that's to be sure.

    Besides, there's this issue of hogging desk space with multiple monitors even if they ARE TFT flat-panel units.

    Think about it: outside of developers, very high-end gamers and people in stock brokerages, there's no real need for more than one monitor. Especially now with 21" diagonal displays running 1600x1200, more than enough to do even serious desktop publishing work.
  • John,

    While having more than one display is great if you're doing program or web page development, very high-end games or working in a financial brokerage house, that still is only a small fraction of the total computer market out there. For the average computer user out there, you really don't need more than one monitor.

    Think about it: a top-quality 21" Sony, Viewsonic, Eizo NANAO or NEC monitor can display even beyond 1600x1200 32-bit color at 85 Hz vertical refresh rate. I believe some 21" monitors can display 1900x1440 at 85 Hz vertical refresh rate with no problems. And these monitors can be had for around US$1,000 to US$1,200.

    At 1600x1200, you can easily read two 8.5" x 11" pages side by side; this makes it VERY useful for desktop publishing.

    Anyway, most new computer users who buy higher-end systems usually run 1024x768 to 1280x1024 85 Hz with the 19" monitors out there. That's more than enough to see web pages clearly and do fairly decent quality print previews.

    In short, while I do agree there is a place for setups with more than one monitor, that setup is not for the vast majority of computer users out there.
  • John,

    Actually, I myself stay away from the i820, i840 and i850 chipsets because Intel seems to have WAY, WAY too many problems with these chipsets. They make the slight memory slowness of the VIA KT133 chipsets used on AMD Athlon Socket A motherboards seem like a minor problem in comparison. :-/

    But again, we agree to disagree. :-) I personally contend that multimonitor setups are a very niche market that the vast majority of computer users won't use, if only because it'll hog way too much desk space. Especially now with the nice 21" monitors that can display 1600x1200 32-bit color at 85 Hz vertical refresh rate very cleanly.
  • Folks,

    Tell me: just how many people out there NEED a second graphics card?

    That may be necessary for a very small number of games and some CAD programs, but given today's cheap 19" and 21" monitors running 1600x1200 resolution, you can have lots of display area AND still keep the menu commands on the same screen.

    In short, the bug with the i850 chipset only affects people who primarily use legacy hardware. It's not that likely people will put in older graphics hardware into today's P4 systems given how good 3-D graphics cards and their ability to display 1600x1200 32-bit color have become.
  • In the past, when systems became too complex and buggy, someone usually invented tools and abstractions that made it easier to design and build complex hardware/software systems.

    I am concerned about the corporate culture at Intel. They have a long history of shipping products that are buggy, inefficient and inelegant, but cheap, fast to market and available in large quantities.

  • At the moment the only way to get dual-head on a linux box is to use two cards. The matrox driver is closed source and still beta, and simply won't come up with my G400DH. I haven't heard anything about an open source driver.
  • PIVs were designed for multimedia. PIVs run at 1.4GHz+. DVD decoding in software on a PIV should be just as good as any hardware decoder out there, if not better since software is more flexible. If the software uses SSE2, then there shouldn't be much of a problem.

    I think around an Athlon 750 or so, DVD playback in software has been the equal of DVD playback in hardware.

    Of course, if you need to do something else while watching a DVD, it could still be an issue, but not many people do that.
  • This will affect anyone using the i850 chipset and a PCI video card. It does not matter if they are using an AGP card or not. If there is a PCI video card *at all* then it will degrade performance.

    A bug -- or, in chip maker parlance, errata--in the chip set for the Pentium 4 can degrade performance when video or other graphical data is processed through a PCI bus, an internal channel for data, Intel (Nasdaq: INTC) has stated. Because of the bug, consumers may experience slow processing or data corruption if they connect a second monitor or an additional graphics card through one of the PCI expansion slots in a Pentium 4 computer.

    The sad thing is, this is the first paragraph. Isn't this site supposed to be discussion of *the articles*?

  • Actually, an old slot-a based Athlon/500 can play DVDs using windoze software utilizing between 50-80% of the CPU (it usually seems to be around 75%).
  • Of course, nobody NEEDS a second graphics card. Hell, nobody NEEDS a first graphics card.

    It won't let you eat, it won't let you breathe, it won't let you go to the can. No computers are NEEDED. What we NEED is food. Maybe shelter.

    But you know what I *WANT*? I'd like one of those little 9" black and white monitors you sometimes see working with a cash register. Probably real cheap, too! :) If I could find one of those, I'd buy one right away. Then I could hook up my second video card. Then I could have all the goodness of a real console visible at all times.

    Can you say log monitoring? System health? Nethack? :)

    Dave

    Barclay family motto:
    Aut agere aut mori.
    (Either action or death.)
  • I have a few of those mono-vga monsters.. They do 640x480 and (I think) 800x600 at low sync rates, but you'll have to alias ls to ls --color=never, else the console is useless. I was lucky enough to pick up slightly used units from FedEx (the Powership model that used them is being phased out). They sell their return stuff in quantity, 5-10 bucks a pop, untested. I think the smallest lot you can snag is 50, but still.. You could prolly get $15 a pop for whatever you didn't keep on the junk show circuit easily..

    You can get them new for under sixty bucks, if memory serves me. Hit up google with the search 'mono vga 9" POS'.
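    If you do grab one, the colour problem is a two-line fix in your shell profile. A minimal sketch (assumes GNU ls, and that your termcap/terminfo actually has a mono `linux-m` entry - Slackware ships one, per the comment above):

```shell
# ~/.profile additions for a monochrome VGA console
export TERM=linux-m            # mono terminal definition (ships with Slackware)
alias ls='ls --color=never'    # GNU ls: suppress colour escape codes entirely
```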
  • I am still wondering why people run after new/high/top/super-duper tech like the P4... It is expensive, buggy (actually it is not known for sure whether it is buggy or not ;)), it is poorly scalable, etc...

    Instead, one could go the Google way - take as many old computers as you need (486 or something), and use their beauty. Cheap, reliable when plentiful, and even more interesting conceptually (networks, clusters, distributed systems, etc).

    The drawbacks of course are obvious: space, noise and electricity charges. But if you take a closer look, it becomes even more interesting:
    - space and noise (and heat/speed) can be fixed by something like "computer in the fridge", which I am sure everyone read about...
    - electricity charges are not that high considering the number of monitors ;) which is actually one per cluster/network or less.

    As I have read somewhere not long ago, Google now has more than 6K Linux servers, with the most powerful one being a P133 or something... That is interesting ;)

  • Heck, even if you buy an AMD processor (or any other processor with MMX), you are paying Intel. Yes, Intel gets royalties from every processor AMD sells.

    And for any USB product also.
  • But if you read the article, this isn't just limited to video cards. Do you have any unusual or older cards you want to use? A video capture card, maybe? How about an MPEG-2 decoder board (no software decoder, even on a GHz CPU, produces as perfect a picture as a good hardware decoder)? I'm willing to bet that some of these devices, and others, qualify to be affected by this bug. Certainly, I have enough interesting cards in my PCI bus that I'm not about to buy a P4--including an ATI All-in-Wonder 128 for analog video capture, a Voodoo 5 5500 PCI (because I wanted a real 3dfx Glide card for playing N64 games--glide wrappers produce unacceptably inconsistent results--but wanted to save my AGP slot for an NV20), and a hardware DVD decoder. But hey, I'm not upgrading soon anyway. Santa left a little gift under my tree, an Abit KT7-RAID and a GHz AMD monster. With the price of PC-133 falling, I'm happy because I'll finally get to have a box with tons of RAM--768 MB, baby! It'll let me have a big ass RAMdisk for temp files, which speeds things up immeasurably if you do high-res graphics editing with programs that make temp files for undo functions. But, I digress...

    At any rate, the problem *does* affect many people. That being said, the next rev of the chipset will probably fix it. I'd be a hypocrite if I didn't point out that AMD has had similar problems, too, it's not just Intel's fault--anyone remember the bugs in the AMD 750, which was the only chipset available for months after the K7 came out?

  • Not this one. Take my example. Like most programmers I game and program on the same machine (at work of course). I bought an Asus AGP-V6600 so that I could do some serious gaming. I've also found that the joy of programming increases when you add a second monitor. Code on one monitor, documentation, run-time, etc. on the other. Since the best gaming cards out there aren't dual-head, that limits you immediately.

    Add a third monitor and you can do tons of stuff at the same time. Code, troll the net, read docs, IRC, buildworld, watch what packets are coming through your firewall at 3am...etc, and never have to worry about "what window was that again?" Everything is right there in front of you.

    Since I do different things, they are all at different resolutions. My primary/gaming monitor (which is the largest) is in the middle (21", 1280x1024). Then there are (2) 19" monitors on either side. The one I keep things on that I keep referring to, such as docs, is at 1600x1200. The one I usually keep my remote connection windows on is sometimes at 800x600, sometimes at 1024x768, depending on how many machines I am connected to at one time. If I want to play a movie while coding/reading docs, I set that third monitor to 640x480.

    As far as I know, the only way to achieve this setup is with 1 AGP card and 2 PCI cards. I don't think Matrox's dual head, which I have on another machine, can do it, because it assumes that the two monitors are next to each other, with nothing in the middle. Someone else should confirm that, though. The G400 also doesn't support individual monitor resolutions on some OSes. (cough...nt...cough)

    I use all three of my monitors every day, and have found it very enjoyable. To the extent that I won't even put a machine in my home, because I can't afford more than one monitor. When I get on a machine w/ one monitor I feel like I'm in a box. If I had the room on my desk I would totally add a fourth 15" monitor dedicated to watching the logs that come in from the machines on my network. There must be people out there that do it. So, even then, unless you have a quad-head, you would require at least 2 dual-heads.

    The point is...Fix the bug!

  • This makes me wonder if I really want one. Sure, this bug's only with the chipset, not with the chip itself, and it only affects a small percentage of users. I would not be one of them, to be sure. Just the same, this just seems to be one more in a series of errata surrounding this chip and, indeed, this company's products. It seems that poor QA continues within Intel, and they don't even provide best-of-class performance anymore.

    Because of the bug, consumers may experience slow processing or data corruption if they connect a second monitor or an additional graphics card through one of the PCI expansion slots in a Pentium 4 computer.
    Given the QA track record thus far, is there anything else lingering that might cause "slow processing or data corruption" yet is branded minor? I don't think my next purchase is likely to contain this company's products.
  • uh... There are lots of people out there, and lots of companies big and small, who have made substantial investments in PCI video hardware. Some of the boards (like Targa's) may be a few years old but are still worth putting in your shiny new P4 boxen. If you've laid out several thousand dollars for these boards, you will want to be able to continue to use them. The term 'Legacy Hardware' is really kind of a misnomer, considering that what you buy today is bound to be out of date within a few weeks. Many boards that drive additional video information, like tuners, DVD decoders, or dual-display (heck, even multi-display) boards, may be affected by this bug. Considering the popularity of these as add-in boards, I think this may affect many people. Sales of the 'latest' heavy-duty procs are really dependent (initially) on high-end workstation/server businesses (who have the power to buy, or not to buy, in huge numbers), not the at-home gaming crowd (which usually buys after the first round of price drops). It is kind of sad to see Intel's Itanic place them in a situation similar to Netscape's. (Great potential product continuously plagued by setbacks that shouldn't happen)

  • Actually I'm not sure about the G450 but I do know that the G400 does not really behave like two monitors on W2K which really defeats the whole purpose, IMNSHO. The G400 dual head cards combine two display cards into one, which means that the desktop in W2K, for example, stretches across both monitors. Their drivers can fool an app into maximising in the current display or have popups display over the current app but it's a hack in the sense that it completely ignores the new W2K multi-monitor API.

    If you want true dual head, get two cards NOT a G400. And the best implementations of this that I've seen are one AGP and one PCI so this P4 bug doesn't really apply here.

  • Headline says: "or one on the AGP, one on the PCI."
    That's just it, you can not add a video card to any PCI slots, you can only use the AGP slot for video. Since there is only one AGP slot, you can use only one video card!
  • I don't think my next purchase is likely to contain this company's products.

    Well... technically... USB is basically their baby. If your next computer purchase has USB support, some of your money is probably going towards royalties to Intel off of the whole USB deal. :p
    http://www.bootyproject.org [bootyproject.org]
  • Oh... I agree, very few people need a second monitor. Definite niche market. However, lots of people would benefit from one. Also, the niche market is growing. Windows 2000, which supports dual monitors (NT4 didn't), is just starting to catch on in the professional market. And most people who have tried a dual-monitor setup don't want to go back.

    Also, the P4 is supposedly the current top-of-the-line chip you can buy. A lot of P4 users are high-end users, so the percentage of P4 users who need/want dual monitor support is small, but larger than the percentage of general PC users who need/want dual-monitor support.

    I mean, how many people need more than 128MB of ram? Very few. But what if the P4 chipsets didn't support more than 128MB of ram? What if the P4 didn't support defragging your disks on the 5th day of the month? What if your car didn't support sharp left turns above 55mph? What if your toilet didn't support more than five flushes an hour?

    The point is, it's ridiculous to defend an error by saying "oh, most people don't need that anyway." Some people do need or want those features. Obviously this is another bad mark against Intel's name. Plus, as far as I know, every previous Intel chipset supported dual monitors just fine (MS OSes haven't always supported them... but to the chipset, another PCI video card is just another PCI device for the most part). Intel's arrogance and coverups are really ticking people off...


    http://www.bootyproject.org [bootyproject.org]
  • To be fair, the probable amount of people that it will effect is relatively small, and even if you do want two monitors, most companies are just using the Matrox G450

    I can't agree with that. The PIV is marketed (and indeed seems only to be good) for graphics and video. This market segment relies heavily on two-monitor configurations, and the G400/450 isn't always the card we want.

    And for video production, which is the PIV's strong point, I guess the Matrox RT2000 (an excellent low-cost real-time video production & digitizing card) is out. It uses the dual head G400 as the base video card, but you can only do video editing with it in single-monitor mode (plus a TV screen). To edit video with dual screens, which makes it much easier, you MUST get another G400 video card on the PCI bus. And you're out of luck with the PIV.

  • OK... so the problem isn't on the chip itself. That doesn't make it go away. Anyone with a P4 and this setup will face the problem. It doesn't matter very much where the problem lies unless you plan on shelling out for some new hardware.
  • by Anonymous Coward on Thursday December 28, 2000 @03:59AM (#539474)
    states that the number of bugs in a chip design will double every 2 revisions.
  • by marcushnk ( 90744 ) <senectus@nOSPam.gmail.com> on Thursday December 28, 2000 @04:44AM (#539475) Journal
    Like the subject suggests, it's not just vid cards affected... it's ANY card that has a LOT of data flowing through it...
    I feel that makes the bug a little more serious than your standard Intel screwup...
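    As a rough way of seeing which installed cards fall into the affected (non-bus-master) category on Linux, you can scan `lspci -vv` output for devices whose PCI command register shows bus mastering disabled (`BusMaster-`). A sketch with canned sample output so the filter is visible; the device names here are made up, and on a real box you would feed in the output of `lspci -vv` instead:

```python
# Hypothetical filter over `lspci -vv` output: pick out devices whose
# command register shows bus mastering disabled (BusMaster-).
sample = """\
00:0a.0 Multimedia: Brooktree Bt878
        Control: I/O- Mem+ BusMaster- SpecCycle-
01:00.0 VGA: Matrox G400
        Control: I/O+ Mem+ BusMaster+ SpecCycle-
"""

def non_bus_masters(lspci_vv):
    dev, hits = None, []
    for line in lspci_vv.splitlines():
        if line and not line[0].isspace():
            dev = line                      # device header line
        elif "BusMaster-" in line:
            hits.append(dev)                # command register: no mastering
    return hits

print(non_bus_masters(sample))  # ['00:0a.0 Multimedia: Brooktree Bt878']
```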
  • by ClayJar ( 126217 ) on Thursday December 28, 2000 @04:05AM (#539476) Homepage

    CNET's News.com had a story on this as well:
    Minor bug lingers in Pentium 4 chipset [cnet.com]

    Interestingly enough, they originally had it under a very misleading title (it said "Minor bug lingers in Pentium 4 processor" IIRC). They apparently got enough feedback that they retitled it by this morning.

  • Tell me: just how many people out there NEED a second graphics card?

    I'll tell you what... two monitors are the way to go. Anyone who's doing work that requires a lot of screen real estate (programmers, artists, etc.) can benefit from an extra monitor, no matter how big their primary monitor is. Also, a lot of people simply have extra, perfectly good, compatible hardware lying around they'd like to use. Or they can pick it up on eBay or at a computer show...

    but given today's cheap 19" and 21" monitors running 1600x1200 resolution, you can have lots of display area AND still keep the menu commands on the same screen.

    Trust me... in order to get 1600x1200 resolution at a decent clarity and refresh rate that doesn't kill your eyes, you need to buy a pretty nice monitor, NOT a cheap one. The average 19-inch monitor is NOT usable at 1600x1200... cheap ones only do 60 or 70hz at this resolution. Trust me, I did a lot of shopping before I found one and it wasn't cheap. But I love my 19-inch Sony. :)
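    A back-of-the-envelope check of why cheap monitors fall down here: assuming typical blanking overheads (roughly 30% horizontal and 5% vertical; these factors are ballpark assumptions, not exact GTF timings), the scan rates needed for 1600x1200 at 85 Hz come out around the limits of budget tubes.

```python
# Rough scan-rate and dot-clock estimate for a given mode.
# h_blank and v_blank are assumed overhead factors, not exact timings.
def scan_requirements(width, height, refresh_hz, h_blank=1.30, v_blank=1.05):
    total_lines = height * v_blank                 # visible + blanking lines
    h_freq_khz = total_lines * refresh_hz / 1000.0 # horizontal scan rate
    pixel_clock_mhz = width * h_blank * total_lines * refresh_hz / 1e6
    return h_freq_khz, pixel_clock_mhz

h, px = scan_requirements(1600, 1200, 85)
print(round(h), round(px))  # roughly 107 (kHz) and 223 (MHz)
```

A monitor that tops out near 96 kHz horizontal, as many cheap 19-inchers did, simply can't run this mode above about 75 Hz.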

    In short, the bug with the i850 chipset only affects people who primarily use legacy hardware. It's not that likely people will put in older graphics hardware into today's P4 systems given how good 3-D graphics cards and their ability to display 1600x1200 32-bit color have become.

    Wrong! A large number of computer professionals/hobbyists have old PCI video cards and smallish monitors lying around. Come on, what computer junkie DOESN'T have a box full of old hardware? :) It's INCREDIBLY USEFUL AND COST EFFECTIVE to use this old hardware for a secondary display on your shiny new PC.

    And anyway... your post bothers me on a couple of other points too. The "who really needs all that screen real estate" attitude reeks heavily of the infamous "640K should be enough for everyone" quote. Also, it's none of your or Intel's god damn business HOW much screen real estate I need. I pay for hardware; it should work, whether you think I'm using it in a dumb way or not. If I think I need 3 1600x1200 monitors, that's my business. Intel's hardware should simply work the way it's supposed to. If it did, we wouldn't be having this discussion.


    http://www.bootyproject.org [bootyproject.org]
  • by funkman ( 13736 ) on Thursday December 28, 2000 @04:01AM (#539478)
    It is the CHIPSET with the bug. Not the Pentium IV chip itself.
  • by cmowire ( 254489 ) on Thursday December 28, 2000 @05:33AM (#539479) Homepage
    The problem is, if my understanding of the "video and graphical data" they are referring to is correct, this is more than just dual-monitor systems. It also means that the whole raft of high-end video editing systems are going to have problems. And perhaps DVD decoder cards, too.

    I suspect that the major issue here is that Intel doesn't want to do a recall on the boards that have already been made, as with the i820. So they're figuring this isn't a major enough problem and are just going to let it ship.

    It doesn't bother me because I'm not going to buy a P4 of this vintage. If I upgrade, it'll either be to a fast P3, the next version of the P4 and chipset, or an Athlon.

"When the going gets tough, the tough get empirical." -- Jon Carroll
