AMD

AMD on Celeron/Matrox Intros the G450

rubyred writes "AMD is at it again today, adding to their Duron line-up with the release of a 750MHz part, which performed well against the Celerons from Intel. There are reviews on Sharky's and Anand's. Also, Matrox, who we've not heard from in a while, has let loose a new 0.18-micron G450 chip, which is aimed at the corporate world. The 2D performance looks really good, but the gaming... well, it's another Matrox product. Still, their Linux drivers might get better this time around. Again, the same two sites have previews of the card."
This discussion has been archived. No new comments can be posted.

  • by FFFish ( 7567 ) on Tuesday September 05, 2000 @07:15AM (#805117) Homepage
    Colour me odd, but by far most of my computer time is spent staring at text on the screen: word processing, graphics creation, page layout and web browsing.

    My impression of Slashdot folk is that a good number of them spend most of their time coding software or creating web pages.

    In other words, our machines are our tools for creating products, which we get paid for.

    For all of us (with the exception of the 3D modeling folk), shouldn't our primary concern then be to obtain the absolute best *2D* card? The one that produces the sharpest, most stable output?

    By all indications, that'd be either a Matrox or an ATi card... not a card that cares more for 3D performance than 2D quality.

    Just my opinion and, like I said, maybe I'm the odd one out on this sort of product choice...

    --
  • I really agree with you on their drivers. I have a G200 that worked great with NT4, but with W2K I am currently using the MS-supplied drivers, with no problems except less-than-spectacular performance. Matrox has had drivers out for the G200 since February, but they suck on dual-processor machines. After installing the drivers, my system crashes all too often under graphic load. I was looking at buying a newer Matrox card because I like the sharpness and clarity of their analog output, but if I have to watch the screen draw, well, it's time to move to something else.

    Just my 2cents worth.
  • I have very mixed feelings about Matrox hardware. My NT box still runs a 4 MB Mystique and the card is wonderful; I constantly use it at 1280x1024 at 85 Hz with no problems and decent 2D speed. The drivers are rock-solid, and I was able to use the same card in Linux perfectly.

    But

    I recently put together a Windows 2000 box and got a Matrox DualHead G400 for it. The Matrox web site advertised the card as being Windows 2000 compliant. Matrox had beta drivers available. Guess what? They did not work! I tried 3 revisions, and the result was always the same: boot up, 2 seconds of activity, hard lockup. You can't do that! You can't sell a card and say it has support for an OS, and then release drivers so bad they amount to no support at all!

    I never did get the card to work; I gave up on dual monitors and got a TNT2 Ultra instead. Mind you, my machine has not had a single crash since (once I got rid of the Aureal2 sound card, which was hardware-incompatible with my Abit motherboard due to PCI timing issues - another big rant there). The crashes were not Windows 2000's fault, as many people here would tend to assume; it was Matrox's fault, pure and simple. I'll never buy another Matrox again. Nvidia is my current graphics chip supplier of choice.

    Ugh, long rant. Sorry, you just have no idea how much mental pain and anguish that stupid Matrox card cost me :) P.S. The machine specs are Athlon 500/Abit KA7-100/TNT2 Ultra 32 MB/128 MB RAM

    Alex

  • > ...it's built on the same lines with Thunderbird...

    No, the Duron is built by AMD's Austin, Texas fab using aluminum interconnects, while the Thunderbirds are built by the new Dresden, Germany fab using copper interconnects. The Duron can be overclocked so easily because it's a considerably smaller chip (half the onboard cache memory) than the Thunderbird.

    My 800MHz Thunderbird is stable at 900MHz. Under heavy CPU load, it becomes unstable at 950MHz and 1000MHz, but I can completely boot at those speeds. I recommend that if you're going to overclock a Thunderbird, only add another 100MHz, and keep your core voltage as low as possible (1.7v or 1.75v). If I was buying today, I'd spend the extra bucks for the 900MHz part.

  • That is an interesting stat, but a narrow one. All it measures is the OS's ability to switch between open applications. Here's one instance where I'll believe the Elvis fans. I'm going to put my trust in testimonials, trends, and footprints.

    Testimonials say 2000 is all around slower.

    Trends in MS software are toward slower performance. Check your own personal experience and forecast the future.

    If 98 was the most complicated (i.e., bloated) piece of software ever, what's 2000?

    2000 may have an improved GUI over NT and 98. Whoooo hooooo! It better be improved; NT at work is slower than X at home on worse equipment. Molasses that freezes from time to time.

  • by Vicegrip ( 82853 ) on Tuesday September 05, 2000 @05:19AM (#805122) Journal
    One thing that is rarely mentioned is that they have rock-solid video drivers (although I haven't done 3D in Linux with it). Every steroid-pumping card I've used from Diamond or ATI has _always_ had stability problems and/or poor support for non-Win9x platforms.

    Quite frankly, after 30fps it doesn't make much of a difference. I play Half-Life (CS) in 800x600 all the time and never have problems. Not to mention that good games come with few G400 problems: I'm always reading FAQ items from people detailing various 3dfx card problems or weird behaviors, while the FAQ sections for the G400s are always empty.

    Anyways, as long as they keep up the decent 3d performance I'll be happy. The others can keep their 60fps cards and blue screens.
  • I have the same G400 and it worked with Windows 2000. On my machine, Windows 2000 DID freeze up, but it was due to the Aureal sound card driver, not the Matrox one.
  • on the page you kindly linked we find:
    • The tests here warrant attention because complex Adobe PhotoShop 5.0, Adobe Premiere 5.1, Macromedia Director 7.0, Macromedia Dreamweaver 2.0, Netscape Navigator 4.6 and Sonic Foundry Sound Forge scripts are run. Content Creation Winstone 2000 keeps multiple applications open at once and switches among those applications. All of these have 'hot spots', as ZD likes to call them that truly test the system, but the CPU especially. Think of this benchmark as the "Crusher" ZD-equivalent test, which pushes real world programs to their limits (as opposed to Quake II timedemo1) on a system level.

    I've bolded the more informative bits for you so that they stand out from the rest of this marketroid trash. More modifiers than content. All Sharky says this marvel of testing does is switch which window is given focus.

    I'm happy for your Win2k. A whole month - that's not bad for MS stuff. Still, I'm sure that 98 is faster than NT, from working with both. If Win2k does not do much better than 98, I don't even want it for free.

    If you have nice equipment, try out Mandrake (pentium and above). Red Hat and Debian have never bombed out on me, even on supposedly obsolete boxes. RH 6.0 runs my cable gateway, a 486, behind a UPS. I take it down once a year or so, and for extended power outages. Watch out for fancy video cards and winmodems. Every version of Linux I've used has been faster and more stable than NT.

    The number of applications I still run on 98 is directly proportional to how lazy I am. Yep, I still run stuff that comes out of boxes, such as Adaptec's EZ CD Creator. Alternatives are out there, but other things seem more important.

  • I bought a Duron 600 last week for $62. I am currently running it at 1GHz (1000MHz)! It gets up to about 48.7°C under heavy load, which seems to be about average for a 1GHz Athlon.

    I have read recently that they have locked the multiplier so that it can't be unlocked even with the pencil trick, too bad for those who didn't buy one in time. 8(
  • Matrox Forum tech support does kick ass -- those guys know their stuff.

    However, Matrox also is pretty slow at getting driver releases out the door, so consequently those tech support guys take a lot of crap from users like me.

    I have a G400Max, and am very happy with the gorgeous image quality on my 21" monitor. HOWEVER, there are a number of driver-level problems under Windows 2000 (SMP support, TV-out support, DualHead doesn't work right due to MS), so I'd hesitate a little to recommend it to someone else.
  • Sharky was asking for DVI output greater than 1600x1200. I'm sorry, but there doesn't exist a TMDS chip that can do it yet. The best that Silicon Image has been able to produce is a 1600x1200 chip - and even then, there are only two or three cards that have it. The reason is that there really just aren't any monitors out there that support it. There is a ViewSonic monitor that "sort of" does it, but not really: it cheats by taking 1600x1200 data and scaling it to 1280x1024, that monitor's native resolution. Two other manufacturers - Apple and SGI - are both making "large" LCDs, but those aren't quite up to snuff. While standard PCs use 1600x1200, Apple and SGI are both manufacturing digital LCD monitors that use 1600x1024. Yes, there are different applications for this, but they are useless in that there aren't any standard chipsets to support them. *shrug* Sharky can keep wishing for >1600x1200, but there had better be some monitors that can support it.

    ~d
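    A back-of-envelope check of the bandwidth argument above (a sketch, not real VESA timing math - the 1.35 blanking factor is a rough assumption; actual GTF timing for 1600x1200 at 60 Hz works out to about 162 MHz, just under the ~165 MHz single-link TMDS ceiling):

```python
# Rough pixel-clock estimate for a digital (TMDS/DVI) link.
# The blanking factor is an illustrative guess; real timings come from
# GTF/VESA tables, so treat this as a back-of-envelope sketch only.

def pixel_clock_mhz(h_active, v_active, refresh_hz, blanking=1.35):
    """Active pixels per frame times refresh rate, inflated by blanking."""
    return h_active * v_active * refresh_hz * blanking / 1e6

# 1600x1200 at 60 Hz lands right around the single-link TMDS ceiling,
# which is why no shipping transmitter chip went past it at the time.
print(f"{pixel_clock_mhz(1600, 1200, 60):.0f} MHz")
```

    With this crude factor the estimate comes out in the mid-150s of MHz; the exact VESA figure is 162 MHz, but either way it shows why anything beyond 1600x1200 needs a faster TMDS part than anyone was shipping.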
  • On the contrary, software is going to become a greater part of the cost, at least for the people and corporations that haven't yet seen the One True Way of the GNU.

    There are (not yet proven successful, but trying) business models which rely on mostly giving away the hardware at a loss, and then making it back in software and services.

    Witness the iOpener.
  • "Well, it's a Matrox card"?? Never mind the fact that the G400 MAX, at the time of its release, was the fastest 32-bit gaming solution. It wasn't until 6 months later, with the release of the GeForce, that something faster came along.

    This card definitely isn't intended for gaming (although it won't totally suck - you can still buy a voodoo3 for instance), but the upcoming G800 - if the information that MURC [matroxusers.com] and others have come up with is correct - looks like it will be a very sweet card for gaming. Also, the Matrox 3D support under linux is still as good or better than any of the others - frame rates aren't as high as GF, but the drivers are open-source, the specs are open, and the stability is far superior. Also, DualHead is supported under linux; NV's TwinView is conspicuous by its absence, and is even missing some features in windoze. And then there's the issue of 3D image quality...

  • I was doing a setup for a doctor using Windows2k with the G400 Max.

    Windows 2000 is not properly set up for dual monitors, unlike Win98. It is a fact. It has nothing to do with Matrox or the drivers... dual monitors under Win2K are going to suck.

    We had to rewrite our software in order to make Win2k "pretend" that it can work with dual monitors.
    ChozSun [e-mail] [mailto]
  • by Greyfox ( 87712 ) on Tuesday September 05, 2000 @04:16AM (#805131) Homepage Journal
    I've been running a G400 Max for several months. 3D works great with XFree 3.3.6 and Utah GLX (I haven't tried Xfree 4.x yet.) While the 3D may be more sluggish than the newer 3D cards, the drivers are open and Matrox has traditionally been the most friendly video card company to the open source community. That counts for a lot in my book, and I plan to continue voting for them with my wallet for the foreseeable future (I'm currently investigating how well a Rainbow Runner would work with Linux.)
  • I have to agree. I actually run Matrox cards in all my BSD machines, since I do not have a particular need for great 3D graphics but do need the wonderful job Matrox does in the 2D area. However, I also always use Nvidia (currently, GeForce (1)) cards in my Windows machine. Yes, I use Windows for gaming, and I always will until there's a great BSD port of Freespace 2 - go away. :P

    Matt
  • D'Oh! Never mind.

    Canis timidus vehementius latrat quam mordet -- Curtius Rufus

  • Matrox cards seem to support dual monitors better than any other. I don't know if it has added hardware support or they provide better documentation on how to do it...

    I have had three 8MB Millennium 1's in my alpha for a while. I went down to two because the third monitor's location was a pain.

    If I must spend money to upgrade these M1's (one millennium is in a PC), I'll go for a G400 Max, maybe a G450. I really want to have the TV out and dual monitor capabilities. The Millennium 1 is still a great work horse video card. The 6 year old cards can knock a few AGP cards off their too lofty pedestal.

    After using dual monitors, it makes using a single monitor feel very cramped.
  • Here in Britain it is possible to get bundled Athlon/Duron Socket A chips and MB's for about the same price as similar (clockspeed) Intel bundles: e.g. Athlon T'Bird+Asus A7V for £200 (~£300).
  • 1024x768 is good enough for me anyway

    I take it you don't play Quake3, then. In single player, the G400 is barely fast enough to make 800x600x16 playable, but against human opponents? Not a chance. The G400 is a nice all-round card with stunning 2D, but hardly a gamer's choice. It's good enough for a lot of people, but there's no need to get defensive like that.

    A penny for your thoughts.

  • I have gotten 3D acceleration to work properly with the Matrox G400 but some games lock it up or won't run with Xfree86 4.0.1. Unreal Tournament locks it up and so does Heavy Gear 2. Quake 3 runs pretty much rock solid with good frame rates.
  • On the Sharky Duron benchmarks, there was an interesting stat: Content Creation [sharkyextreme.com]. It seems as though Windows 2000 is significantly faster than Windows 98SE. This is contrary to some of the testimonials that I have seen, where people say Win 2000 is slower. FUD goes both ways, you know.
  • It really chaps my tail to see people rip on this card, as it's not really all that much of a new card anyway. It's more of a rebuilt G400 than anything. Matrox released this primarily for OEMs building machines, rather than for the end user. Sure, it's a newly released card, but it's not really fair to compare it to the newest cards on the market. Wait for the G800 in 2-3 months and compare it with the cards out then.

    Of course, while we're on the subject of Matrox, and people ripping on them... How about that dualhead? How about that dualhead in XFree? How about hardware bump mapping in games? All features first put in use by Matrox. Now all the manufacturers are getting their feet wet with it.

    Combine Matrox's ability to innovate new features into their cards, their 2D quality at high refresh rates and resolutions, and the rock-stable drivers they pump out like clockwork, and Matrox is the choice for the beast under my desk. (Wow... that looked like a marketing statement, huh?)
  • As mentioned in this German article from Heise [heise.de], there are already some names (Siemens/Fujitsu, Compaq and IBM) announcing sub-DM 2000 (probably translates to sub-$1000) PCs with this part. Since MHz still sell in the OEM business, this might make some dent in Intel's OEM sales, and that will probably hurt more (in terms of $$, not of loss of face) than the whole 1.13 GHz story.

    Another more detailed article can be found at Anandtech [anandtech.com].
  • by jjr ( 6873 ) on Tuesday September 05, 2000 @04:31AM (#805141) Homepage
    As OSes and software get cheaper every day, Intel had better start worrying about AMD eating their lunch. Since software will soon become a minimal part of the cost of building computers, people will see that AMD is giving more bang for the buck.
  • I don't think it is narrow at all. This duplicates the things that I would normally be doing. Nowhere does it say that it only switches between apps; I am sure that there is real processing done on files as well.

    My testimonial is that Win 2000 is no slower, and tons more stable, than NT 4.0. My current boot at home is over a month old, and my memory has not blown up. I enjoy mucking around with SuSE, but I have had X Windows crash so hard I have had to hit the power switch. As always, your mileage will vary.
  • by Hanno ( 11981 ) on Tuesday September 05, 2000 @06:14AM (#805143) Homepage
    Let me say "me too!".

    I've been *extremely* happy with Matrox cards in the past.

    - They support their products for a *long* time (I still get updated drivers for the old Matrox cards in my other computers!)

    - Their Windows 9x drivers so far have always been rock solid, at least on my rather unusual self-built machine (unlike many other graphics card manufacturers I tried in the past).

    - 2D image quality: Yes, it is indeed as good as other people here have already stated. And this is one of my main concerns when working with computers. Just ask the many computer store clerks I drove insane just by the way I check a computer screen before I buy it... :-)

    - They were among the first to release specs to the XFree folks. The Matrox products used to be the fastest 2D cards on Linux and back then, it was one of the reasons why I switched.

    - The G400 double head support is a *very* good thing. Sure, Macs had it for more than a decade, but you can't appreciate it unless you tried it yourself. It is also nice to play games on a TV set with the G400.

    To sum up: for anyone who works at his machine most of the time and is also a casual gamer (as I am), the Matrox card is *the* one to buy.

    I have played a few 3D games on it (Freespace 1 & 2, Descent 3, Halflife, Homeworld, Dungeon Keeper 2) and wasn't disappointed. That doesn't mean that I wouldn't mind better game support. If Matrox came out with a more Gamer-oriented card that still has all the "office & business" qualities mentioned above, I'd buy it!

    ------------------
  • Does anyone know if the G450 supports hardware YUV-to-RGB conversion and scaling on BOTH the primary and secondary outputs, and if this is supported by the Xv extension under XFree86 4.0?

    Also, does the dual head support appear as two X screens (as reported by xdpyinfo) under XFree?
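    For what it's worth, here's a quick sketch (a hypothetical helper that just parses text) of how you could count the screens xdpyinfo reports, to check whether DualHead shows up as two X screens:

```python
# Count "screen #N:" section headers in xdpyinfo output.
# A DualHead card configured as two separate X screens should show two.
import re

def count_screens(xdpyinfo_text):
    return len(re.findall(r"^screen #\d+:", xdpyinfo_text, re.MULTILINE))

# Abbreviated sample of what xdpyinfo prints for a two-screen setup:
sample = """\
screen #0:
  dimensions:    1280x1024 pixels (361x271 millimeters)
screen #1:
  dimensions:    640x480 pixels (361x271 millimeters)
"""
print(count_screens(sample))  # 2
```

    From a shell, `xdpyinfo | grep -c '^screen #'` does the same thing.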
  • One thing that seems too nebulous to review is the quality of tech support a company offers. As one of the many who have been impressed by the active participation of their knowledgeable tech support people on their forum [matrox.com], my next purchase will be a Matrox product.
  • The only thing that scares me more than a Microsoft monopoly is an Nvidia / Intel / M$ duopoly.

    There's no need to worry about this. A few years ago, 3dfx ruled the roost for graphics cards. Before that, Matrox (the company many people here are writing off) made the graphics cards of choice. Similarly, Intel's dominance in the PC processor market is now being challenged - Intel are feeling the heat from AMD, and if they don't start shipping large numbers of 1GHz PIIIs soon, AMD could end up winning this battle. Who knows, maybe M$'s dominance will face a serious challenge sometime soon. I can remember a time when M$ Word was playing second fiddle to WordPerfect, but where is WordPerfect these days?

    Things change, and in the high-tech business things change quickly. It just takes one company to come up with a great bit of innovation, and the whole status quo is disturbed very rapidly. Don't write off Matrox just yet.

  • by Anonymous Coward
    Here is output from xdpyinfo from my DH G400 with two screens attached. No +xinerama mode because of one screen being 15" and the other 19".

    screen #0:
    dimensions: 1280x1024 pixels (361x271 millimeters)
    resolution: 90x96 dots per inch
    depths (1): 16
    root window id: 0x30
    depth of root window: 16 planes
    number of colormaps: minimum 1, maximum 1
    default colormap: 0x21
    default number of colormap cells: 64
    preallocated pixels: black 0, white 65535
    options: backing-store NO, save-unders NO
    largest cursor: 64x64
    current input event mask: 0x5a20fd
    KeyPressMask ButtonPressMask ButtonReleaseMask
    EnterWindowMask LeaveWindowMask PointerMotionMask
    PointerMotionHintMask ButtonMotionMask StructureNotifyMask
    SubstructureNotifyMask SubstructureRedirectMask PropertyChangeMask
    number of visuals: 4
    default visual id: 0x24
    visual:
    visual id: 0x24
    class: TrueColor
    depth: 16 planes
    available colormap entries: 64 per subfield
    red, green, blue masks: 0xf800, 0x7e0, 0x1f
    significant bits in color specification: 6 bits
    visual:
    visual id: 0x25
    class: TrueColor
    depth: 16 planes
    available colormap entries: 64 per subfield
    red, green, blue masks: 0xf800, 0x7e0, 0x1f
    significant bits in color specification: 6 bits
    visual:
    visual id: 0x26
    class: DirectColor
    depth: 16 planes
    available colormap entries: 64 per subfield
    red, green, blue masks: 0xf800, 0x7e0, 0x1f
    significant bits in color specification: 6 bits
    visual:
    visual id: 0x27
    class: DirectColor
    depth: 16 planes
    available colormap entries: 64 per subfield
    red, green, blue masks: 0xf800, 0x7e0, 0x1f
    significant bits in color specification: 6 bits

    screen #1:
    dimensions: 640x480 pixels (361x271 millimeters)
    resolution: 45x45 dots per inch
    depths (1): 16
    root window id: 0x32
    depth of root window: 16 planes
    number of colormaps: minimum 1, maximum 1
    default colormap: 0x29
    default number of colormap cells: 64
    preallocated pixels: black 0, white 65535
    options: backing-store NO, save-unders NO
    largest cursor: 640x480
    current input event mask: 0xd8001d
    KeyPressMask ButtonPressMask ButtonReleaseMask
    EnterWindowMask SubstructureNotifyMask SubstructureRedirectMask
    PropertyChangeMask ColormapChangeMask
    number of visuals: 2
    default visual id: 0x2b
    visual:
    visual id: 0x2b
    class: TrueColor
    depth: 16 planes
    available colormap entries: 64 per subfield
    red, green, blue masks: 0xf800, 0x7e0, 0x1f
    significant bits in color specification: 6 bits
    visual:
    visual id: 0x2c
    class: TrueColor
    depth: 16 planes
    available colormap entries: 64 per subfield
    red, green, blue masks: 0xf800, 0x7e0, 0x1f
    significant bits in color specification: 6 bits
  • Yeah, but the 2D performance of this new card is worse than the G400 MAX's. Either they are taking a step backwards, or they're doing a Celeron/Duron (hmm, no wonder the two stories were put together).
  • The main problem with AMD right now is that the motherboards cost so much more. The Duron/Athlon chips are cheaper, but when you have to spend $50 more on the motherboard (than if you bought a PIII or a Celly), it makes the price gap very small.

    I just paid a little under $130 for an FIC AZ11. That's not much more than I've paid for reasonably modern (at the time) Socket 7/Super 7/Slot 1 motherboards from reputable manufacturers ("reputable manufacturer" != "PC Chips").

    When we can get Socket A mobos for $70, then Intel will be truly useless.

    You're not going to find a good-quality motherboard for any current processor at that price. You might find older boards for that price or less (the last two boards I bought for personal use were about $20-$25 each, but those were a Supermicro i430TX-based AT m/b and an Asus SiS 5598-based ATX m/b, both of which are way behind the power curve), or maybe some PC Chips^H^H^H^H^H^H^H^Hpiece-of-sh*t "BXPro+Ultra©®" motherboard, but that's about it.

    _/_
    / v \
    (IIGS( Scott Alfter (remove Voyager's hull # to send mail)
    \_^_/

  • Screw it. Some guy is going nuts slapping "redundant" on every user who dares to dream of the ideal 2D/3D chipset alliance. He/she must be stuck with an S3 Savage to be so spiteful of us all =)
  • Yes, Matrox cards rock. Get their Linux beta drivers (with source) from MURC [matroxusers.com]. Who cares about bleeding-edge 3D when it's obsolete in six months and 2D (quality and speed) suffers in the meantime?

    It's like VGA support back in the old school: new cards like the S3 had fast blitters, but only with their Windows 3.11 drivers. Kick down to DOS to run a game, and performance blows (time to get an ET4000 or W32).

    BTW - Matrox has kick-ass VGA _and_ VESA support... although there are too many VESA modes to keep programmed in the BIOS, so they are dropping the lower-res ones...
  • Slightly offtopic, but...

    30fps isn't enough for a smoothly played game. You need about 90fps to cope with the firefights and high speed crap. 30fps will drop down to 15fps in some moments.

    Of course this is from someone using a TNT2 - so it's not like I know shit.

  • I'm disappointed in Matrox. I, and a couple of other people I know, were considering buying a Matrox as an alternative to nVidia or 3dfx. It seems the G450 is closer to my TNT2 than to a GeForce2. Probably could beat up anything 3dfx has for under $400, though.

    The G450 is not meant to be a high-performance card. I haven't seen benchmarks yet, but performance is expected to be worse than the G400 MAX. It is also expected to be less expensive than the G400 MAX. The G450 seems to be intended for OEMs who want an inexpensive card that is not a piece of crap. This card will probably be ideal for business use, where they don't care about gaming performance but they do care about resolution, refresh rate (Hz, not FPS), image quality, and possibly dual-head.

    Personally, I'm trying to hold off until the G800 appears (not sure but I think in 2-3 months). That should have performance in the same ballpark as the GeForce 2. It'll probably be supported by XF86 4.0 very soon after release too (maybe even before :).

  • I'm disappointed in Matrox. I, and a couple of other people I know, were considering buying a Matrox as an alternative to nVidia or 3dfx. It seems the G450 is closer to my TNT2 than to a GeForce2. Probably could beat up anything 3dfx has for under $400, though.
  • I can't understand why we are pushing for faster clock speeds! Don't you understand? The sooner we hit the limits of Moore's law, the sooner the earth is swallowed in a black hole! [slashdot.org]

    Not to mention what all that extra heat from CPUs is doing for the greenhouse effect...

  • by thinthief ( 13759 ) <[gro.demlehwrevo] [ta] [1todhsals]> on Tuesday September 05, 2000 @04:06AM (#805156) Homepage
    The main problem with AMD right now is that the motherboards cost so much more. The Duron/Athlon chips are cheaper, but when you have to spend $50 more on the motherboard (than if you bought a PIII or a Celly), it makes the price gap very small.

    When we can get Socket A mobos for $70, then Intel will be truly useless.

    PS. I don't understand what the title "AMD on Celeron" is supposed to mean. I guess that they are on the celeron's trail? Weird grammar.
  • by linuxonceleron ( 87032 ) on Tuesday September 05, 2000 @04:07AM (#805157) Homepage
    With the demise of Number Nine, Matrox seems to be one of the only companies today that care about 2D quality. My friend was using an S3-based card for a while, which broke and was replaced with a G400; the improvement in picture quality was astonishing. Though he still hasn't been able to get the Linux 3D support working right. But if the G450 is as good as expected, I'd be happy to own one. As far as the Durons go, I've been looking into getting a new CPU, but was going to get a Celeron2 until I found that my motherboard can't use the new 0.18-micron chips. Looks like a new case/mobo/CPU for me...
  • I'm rather bothered by the divergence in chipsets that has been going on for a while. The whole PII, PIII, and Celeron thing (include AMD's clones in this). Why diverge into two classes of processor? Sure, the Celerons are cheaper, but not by that much. The performance compared to the regular P chips is lukewarm, and they require a new socket?? (To increase motherboard sales?) They should be concentrating on creating BETTER chips, not expanding their product line (1.3G Pentium, anyone?). The whole Celeron thing strikes me as just another way to milk publicity and money.

  • I can't understand why we are pushing for faster clock speeds!

    One word: SETI@home [eridani.co.uk].

  • by Anonymous Coward
    Matrox G450: Budget DualHead Graphics [aceshardware.com] Ace's hardware has also a review and has included DVD benchmarks.
  • Weird grammar.

    You misspelled incredibly off-topic

  • I don't know how many of you work in a corporate environment, but as for us, Matrox seems to be making a logical step. Other manufacturers are so far ahead of them on the 3D gaming front that it would require substantial R&D from them to keep up. All of the HP business-class machines that we buy have a Matrox card of some sort in them from the factory. This seems to be a smart move, and should be generating more than enough to offset what they are losing by not keeping up with the 3D Joneses. We buy about a thousand or so machines a year, and I know they have bigger customers.

  • by Anonymous Coward
    Disclaimer: I'm a G400 user - not out of choice, it was bundled with my IBM Intellistation. I've been pleasantly surprised by the card though.

    Other people who posted before you in this same discussion (here [slashdot.org] and here [slashdot.org]) contradict you about Quake3's playability on a G400.

    No-one (not even Matrox) would argue that Matrox cards offer the world's best 3D performance. However, they offer great 2D performance for the money, and are perfectly adequate for playing the odd game now and then. As you say, the G400 is a nice all-round card, and the people who choose Matrox are the people to whom good 2D performance is important. Criticising the G400 because it doesn't have the best 3D is missing the point.

  • is either *very* shrewd, or downright annoying, depending on your point of view.

    The Duron is targeted specifically at the Celeron, and nothing else. Until this announcement, the chip has been clocked at 700MHz, to match the Celeron. But look at the aftermarket, and I've heard reports of overclocking it to 900MHz, which makes sense, considering it's built on the same lines with Thunderbird, which is shipping 1.1GHz parts.

    I presume this 750MHz announcement is meant to sway fence-sitters who look only at MHz and "Intel Inside" stickers, since even the 700MHz mopped the floor with Celeron. Duh gee, the number's bigger, so maybe it'll be faster, even though it's cheaper.

    On the annoying side, AMD is clearly protecting their Thunderbird market by capping the Duron clock speed. I can appreciate this some, because L2 Cache size is a little more difficult to equate to price and performance than MHz. They're also playing the clock-locking games with Duron. Last round, people began tinkering with the lands on the substrate to unlock the clock, but I've heard that the latest production runs can't be tweaked that way. Normally I wouldn't overclock, but a 700MHz part off of a 1.1GHz line seems like it has a bit of headroom, to me.
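    The clock arithmetic behind those numbers is simple (a sketch - Socket A Durons used a 100 MHz front-side bus, and the figures below are illustrative of the headroom described above, not measurements):

```python
# Socket A core clock = FSB speed x multiplier (100 MHz FSB for Duron).
def core_mhz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

stock = core_mhz(100, 7)   # Duron 700 as shipped
oc = core_mhz(100, 9)      # the 900 MHz overclocks people report
headroom = (oc - stock) / stock
print(stock, oc, f"{headroom:.0%}")  # 700 900 29%
```

    Which is why a part binned from a line shipping 1.1 GHz Thunderbirds plausibly has that kind of headroom.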
  • by Anonymous Coward
    I have a G400 MAX.

    3D performance is adequate, but not outstanding. Don't forget that Matrox see themselves as a supplier to the business market who just happen to make cards that can be used for games.

    The thing that really shines, though, is image quality - stick a good-quality monitor & cable onto the back of a Matrox card and you will find a very crisp & rich-looking image compared with a number of other cards at the same price.

    I would rather be using a G400 (or even a G200) than most geforce cards when I am working & staring at the screen all day. Gaming is another matter of course...

  • Why does every new video card just have to have better/faster video drivers? This can get Matrox a lot of deals with companies like Dell... who primarily sell to - wait, no, not - businesses...

    I think it's a smart play by Matrox... sure, your average geek is not impressed, but why chase all this improvement in the 3D realm when the business market is so much bigger... heh, go Matrox. I like their cards and own a G400MAX; the QIII framerates are playable on a PIII 500, but the 2D is awesome on a 19" Trinitron monitor :)

    Jeremy
  • But the $70 Celeron boards are crap compared to the Socket A boards.

    What sort of loon would take a good, fast 750MHz processor and bottleneck it with a junk mobo? I suppose that question answers itself, of course: the sort of loon that'd buy a Celeron in the first place, when he could get a Duron instead...

    Point is, not only are the AMD CPUs a good bit better than the Celerons, the motherboards are better as well. Your AMD+good mobo is going to be a good byte (ooh, a pun!) better than the Intel+cheap mobo.

    (Think I'll mention in an aside here that Abit has a RAID-supporting Socket-A mobo. Sounds awful cool!)


    --
  • by Tet ( 2721 ) <slashdot AT astradyne DOT co DOT uk> on Tuesday September 05, 2000 @05:00AM (#805168) Homepage Journal
    The 2D performance looks really good but the gaming... well it's another Matrox product.

    Your point being? I've been using my G400 for gaming, and haven't run into any framerate problems. I don't personally care if a GeForce2 gets better performance. If I'm not getting slowdown, why should I want a faster card? Admittedly, it doesn't do 1600x1200 at high speeds, but then for gaming, 1024x768 is good enough for me anyway. Furthermore, not only is 2D performance really good, but the image quality is unsurpassed. Matrox just seem to be able to get really clean, crisp images to the screen.
