Graphics Software

Review Of The Matrox 32MB Millennium G400

The Damage Report is currently featuring a review of the new Matrox card. I've been playing with the GeForce 256 lately - very, very pretty stuff, but I'd be interested to see how this compares.
  • by Anonymous Coward
    2000, Linux?
  • The last time I replaced my video card was this past summer. I got an NVidia TNT2, because I expected NVidia to release some decent drivers (they had already released their 'open source' drivers).

    Little did I know that NVidia was merely paying lip service to the open source movement. The code was highly obfuscated, so nobody could really understand what it was doing. Nobody knows how to make NVidia cards work with DMA transfer (something that would really speed up the card, especially in 3D).

    In my opinion, the best available card is the Matrox G400 Max, which allows for Dual-head capability. It can be upgraded to handle TV input by adding an extra Rainbow Runner PCI card. The Rainbow Runner is compatible with the entire Gx00 line. TV output comes standard with the G400 Max (output from one of the heads).

    (Please note that there are a few different kinds of G400. The Millennium G400 comes with either 16 or 32 MB of RAM and with either single- or dual-head. The G400 Max comes with 32MB and dual head, plus I think it has a slightly higher clockspeed (not sure). There is also the G400-TV that has TV input built in.)

    I have gotten very annoyed with my TNT2 because of the lack of support NVidia has given to the DRI project. The other major 3D card manufacturers (3dfx, Matrox, and ATI) have pledged their support, so I don't understand why NVidia wants to be so different.
    --
    Ski-U-Mah!
    Stop the MPAA [opendvd.org]
  • Do you mean AMD 751/VIA 686 motherboard chipset is bad and that KX133 based motherboard is worth waiting for?
  • A lot of us OS/2 users have Matrox graphic cards because they provide excellent OS/2 support.

  • I really wanna buy a 3D card, but I can't decide what to choose. I would buy a GeForce, but the drivers are immature. I don't wanna start a flamewar, but what would you recommend for a Linux system?
  • WHY can't /. allow us to edit our posts? Damn, screwed up all the links in my post!!! Here they are, fixed...

    The other sites are www.cnet.com and www.sysopt.com if you wondered. :)

  • > This is six to eight month old hardware, there
    > are not 3d drivers for linux for it, the article
    > dosen't even have any linux benchmarks.

    The Utah-GLX project has had a full-featured OpenGL driver for the G400 series of cards for a number of months now. Performance (in q3a) is almost equal to the Windows version with TurboGL.

    Unfortunately, there is no DRI driver for XFree 4.0 yet; however, it should be arriving sometime within the next couple of months.

    > I do have a question I havent seen answered
    > sufficently for me. Does the "dualhead" feature
    > work under linux?

    Not really... yet. The last I looked there was alpha support for moving a console onto the second head via fbcon in the 2.3 series of kernels. There is also basic NTSC/PAL out support via the same module. Check it out at:

    http://platan.vc.cvut.cz/~vana/matroxfb.html

    I believe Precision Insight has been contracted to write a driver with full-featured dual head support, which has an ETA of this summer. This will have full Xinerama/TV-out support for the card.

    So a quick summary:

    2D Support: Excellent
    3D Support: Good (q3a, GL screensavers, UT still not perfect but very soon)
    Dual Head: Only for the hardcore (but full support within a few months)

    If anyone has more recent information, please correct me; I haven't really been keeping close track of developments lately.
  • What could be interesting is a comparison of the performance of the G400 running under Win98 and running under Linux with XFree 4.0.

    After all, Quake 3 is available under both operating systems.
    It would be a good test of the performance of the open source driver...

  • I own a Matrox G400 Max. I just upgraded from a Matrox 400 (sold that to a friend.)

    It's a damn nice card, and has full Linux support via the Utah GLX [sourceforge.net] project. It plays Quake pretty well (I bought it for the Linux support specifically), but unfortunately I currently get ~21 FPS in Quake 3 Arena under Linux @ 640x480 (with some special speed optimizations).

    In Windows, on the other hand, with no optimizations I get ~67 FPS @ 640x480 and 25 FPS @ 1024x768. It is a bit depressing, I must say.

    The dual head technology is VERY cool. I have not tried the Xi graphics dual-head addon for Linux yet. I have however tried it under windows, and I can testify to its high coolness factor.

    The reviews you have heard about the bump mapping are all true. You cannot get a good impression of it until you have actually SEEN it in action. By this I mean movement, not a static screenshot.

    It is unfortunate that there are no games for Linux that currently support this feature. :(

    If you are looking for a Linux compatible graphics card I would highly recommend the G400 or G400 Max. It is not as fast as a GeForce (but then in Linux my Voodoo 1 is faster than a GeForce!!!), but it is fast nonetheless. The image quality is nothing short of amazing.

    And hey, the great Carmack is working on the GLX project. That can only mean great things! :)
  • My system:
    Celeron 300a @ 450 Mhz
    Matrox G400 Max (32 Mb ram 'natch.)
    192 MB Ram
    13 GB HDD
    Asus P2B MoBo.
    Sound Blaster Live.
    Mandrake 7.0
    Running Gnome w/ E.

    I followed the directions on the lokigames site for installing the G400. It seemed to work well; my suspicion is that I fscked up with the agpgart.o module though. I say this because:

    [root@localhost flibble]# lsmod
    Module                  Size  Used by
    agpgart                11508   0  (unused)
    tulip                  24932   1
    emu10k1                48076   1
    [root@localhost flibble]#


    This is my best guess, as agpgart.o does not seem to be used. I also tried the DMA settings for my glx.conf and all that seemed to do was kill X. (As per John Carmack's directions on the Utah GLX site.)

    Kinda neat that I can ask this and *STILL* be on topic. (Bet some moderator whacks me anyhow...)
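    For what it's worth, the third column of that lsmod listing is the use count; a module that's loaded but has a count of 0 is what lsmod flags as "(unused)", and if the GLX driver were actually going through agpgart you'd expect a nonzero count there. A minimal sketch that picks out the unused modules from lsmod-style text (module names here are just the ones from the listing above):

    ```shell
    # Read lsmod-style output and print any module whose use count
    # (third column) is 0, i.e. loaded but not bound to anything.
    lsmod_output='Module Size Used by
    agpgart 11508 0 (unused)
    tulip 24932 1
    emu10k1 48076 1'

    # Skip the header line, then test the use-count field.
    echo "$lsmod_output" | awk 'NR > 1 && $3 == 0 { print $1 }'
    # prints: agpgart
    ```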

  • Problem is, Matrox originally distinguished the segments of their product line by name:

    Mystique was the cheap card for home gaming
    Millennium was the workstation card

    Mystique 220 was the beginnings of a 3d card
    Millennium II was a killer 2D card (I have one)

    Problem was, people associated the Mystique with the bad card, so they started releasing all of their cards with the (respected) Millennium name.

    Two exceptions: Matrox's first go at a 3D accelerator (PowerVR PCX2 based, not even a Matrox chip) was the m3D, another M.

    Matrox's All-in-one card, was the Marvel (another M).

    I guess the reasoning is, that distinctiveness is important, so you can begin to build a brand identity.
  • I can't say I'm sure that nVidia will release binary drivers, but keep in mind that this company released obfuscated open source drivers, because basically it was the only way to get into XFree86.

    If XFree 4.0 allows NVidia to release binary-only drivers, it shouldn't therefore be surprising if they take the opportunity.
  • Nope, whom is correct. The subject of the sentence is "you", not the winner.

    If I said "Who will win", then Who would be right.

    I asked, though, "Whom do YOU think"

    Who is for the nominative, and only the nominative.
  • Compare the performance of XFree's fbcon server under the MatroxFB, with the performance under an NVidia Frame Buffer.

    Oh, wait... NVidia cards aren't accelerated under Linux fbcon. So I guess Matrox would whoop NVidia (where it really counts, anyway).

    Neglecting the fbcon for a moment, Whom do you think is going to win under Linux, when Da Man of 3D graphics, John Carmack, is supporting the development of Matrox drivers for Linux?
  • Buy a Matrox.

    NVidia is bad, for releasing deliberately obfuscated source. Source that's impossible to read is impossible to maintain. And source that's impossible to maintain is not much better than no source at all.

    3dfx is bad, for bullying people around, trying to copyright an API, and trying to monopolize the industry with their inferior tech (16 bit color), through attempting to keep Glide proprietary.

    Of course, if your ideology doesn't agree with mine, or your ideology doesn't play a role in the decision making, I'd have to say that now isn't the best time to make the decision, as XFree 4 is still brand new.

    Though, it might interest you that John Carmack is funding a project to make a really nice Matrox MesaGL driver for linux.
  • by Jae ( 14657 )
    i bought one of these at a show a little after they came out - i was really looking forward to having two monitors.

    then i found out that the 2nd head doesn't have great specs and since the XFree guys were moving to NVidia in house, they probably won't write a driver for the 2nd head (i checked this w/ mandrake back in january on irc - and that's what he told me. if i'm wrong or inaccurate about his response, pls don't flame - i make a formal apology now - but that's what i remember him saying).

    so - i moved to a 3dfx card so i could play unreal :)
  • The Geforce chip is Very, Very nice. It pastes 3DfX to a wall with their own spinal fluid and snaps the NVidia in two.

    Hee, hee. Always amusing to see someone who doesn't know what they are talking about.

    The GeForce chip is made by nVidia.
    And yes, it is very nice.
  • I had the previous Asus, the V6400 (TNT based board) with TV out. The support was pretty good, the TV got a good signal. I can only assume that they will keep their quality up with the V6800 also.
  • I disagree with your analysis, but not necessarily your conclusion. In order to compare two different video boards you have to run both on the same platform. You are running each board on a different platform, and therefore your conclusions are suspect.

    One of the things these hardware sites do is compare apples to apples, with no differences other than what they are comparing. I personally like Tom's Hardware [tomshardware.com] for reviews.
  • There will be a driver for BeOS. They are currently reworking the OpenGL implementation of the software so as soon as that is ready they'll release it. I believe R5 (FreeBe) will be out before the GL rewrite but they'll release the GL rewrite and the networking rewrite as soon as they're comfortable with them.

    Sorry that I don't have more information on it, but I hope it helps.
  • That's the dilemma that marketing types face... you can never appeal to everyone all the time.

  • It's a marketing thing. Having a central theme for a product line generates greater brand recognition. Personally I kind of like these telltale signs in the names of products. It increases the likelihood that one of my family members will say one of the key words ("Blaster", "Viper", etc) and get me one step closer to remotely diagnosing their system for them. Don't you just love relatives with computers?

    The histories of some of the themes are fun to think about. Creative uses Blaster because... well, they always have. ATI uses "angry" words because they're pissed they can't make a competitive graphics board anymore. Matrox is going for the alliteration angle. 3Dfx wants you to think it's all voodoo magic. And STB (pre-merger days, at least) just picks random speed-related words (velocity, nitro, etc), throws in "3D" or a random big number (and maybe even the word "PRO!"), and crosses their fingers and hopes that you'll think it really is good hardware!

  • Perhaps this may come as a surprise, but you can actually uncheck the hardware news posts in your settings so they won't ever show up as long as you're logged in.

    Personally, I found this review to be one of the best I've read this year. As others have already said, its focus on image quality and features was a breath of fresh air compared to many other reviews that only care about fill rates, speed, overclocking and so on.

    Many people may just go out and buy themselves a 3Dfx, TNT or GeForce because everyone else does. Those are the brands that first come to people's minds. This review will hopefully make them think of Matrox as an option before deciding.

  • If you don't need those few extra fps in 3D (and you probably don't) which the NVidia gives you, you should go with the Matrox. Better 2D quality, solid 3D perf, excellent TV support, overall a "better balanced" card (all IMHO).

    Of course, you get a sympathy point for buying a product from a company which helps open source by publishing its cards' specs fully.

    ...BUT, if TV & DVD are high priorities, don't forget about ATI. Their cards are nice too. (Don't know about Linux drivers though.)
  • Why does everyone say that nVidia has an obfuscated driver? They had one for XF86 3.3.4, because it was rushed, but by 3.3.5 (which came out quickly after 3.3.4) the driver is a normal, non-obfuscated one. Yet, it seems many people had the original /. story "fixed" in their minds, and think "obfuscated driver" when they read nVidia. :) Oh, well...
  • I have the G400 MAX; it's run really nicely under Debian 2.1 for a while now. XFree 4.0 will support it, as soon as Precision Insight [precisioninsight.com] release their drivers for it RSN. There are Linux clocking utils [freshmeat.net] for these and all the info you need is on matroxusers.com [matroxusers.com] I'm a satisfied user! :)
  • the geforce is superior to the g400 in many ways.. fillrate, clock speed, overclockability, etc... if you're a gamer, and you're willing to put out the extra $100, you will NOT regret the choice. the geforce will give an additional 20 frames per second over the g400 in, say, quake 3 easily. the g400 is good if you want dual monitors but only want one video card.. but other than that, the g400 doesn't stand a chance against a geforce. BUT:

    matrox's next generation (we're up to 5th generation now, wee!) looks very interesting. some specs (these were leaked out 3 days ago):

    G450 (Condor I)
    - 64bit DDR SDRAM
    - 0.18 micron 6 Metal Layer process
    - 16MB frame buffer version : 2 x (2M x 32 DDR)
    - 32MB frame buffer version : 4 x (4M x 16 DDR) or 4 x (2M x 32 DDR)

    The 16MB version of the G450 will use a 166MHz memory clock (in the case of FC DDR, 200MHz is feasible) and a 64bit dual memory bus.

    G800(Condor II)
    - Double fillrate of G450
    - Support for 250MHz DDR FCRAM
    - Support for Hardware T&L
    - Support for DirectX 8.0 Shader and fully functional DX7
    - New DX7, 8 IDC drivers
    - Mass production : Sept. 2000.

    Anyway, to sum things up, the geforce kicks the g400's butt when it comes to frames per second in games. nuff said.

  • I believe that XFree86 also has ATI card support; I think it's the Mach64 driver. And those have been around since before the G100.

    I think ATI cards are the most widely supported.

  • I concur!

    My brother introduced me to Matrox when he had his Pentium 100... many moons ago. As the Linux community likes to say, "it just works."

    I have a G200, and although its performance isn't so hot, I have much better picture quality than any other (non-Matrox) card I've looked at carefully. Mind you, performance isn't BAD.

    I work for a shop that sells many systems. I push to sell Matrox, because we get so few returns on them... quality is excellent. As are the drivers... excellent.

    But then, companies that produce the hardware, the chips, and the software seem to offer better Q/A.

    -sid
  • How come all Hewlett Packard products end with "jet"?

    Deskjet, Laserjet, Scanjet, Officejet, Paintjet, Designjet.... any more?

    And what of Logitech's "man" fetish?

    Mouseman, Wingman, Soundman... anyone remember the original cyberman? I think I still have that lying around...
  • I hate to pick nits, but "who" is correct. The subject of the sentence is "you", but the sentence contains a noun clause acting as a direct object; that puts "who" back into the nominative case. View the source of this for a diagram of just the independent clause in question ("Who do you think is going to win under Linux"), and wonder why /. doesn't allow PRE tags. See also "whom" under Strunk & White's Elements of Style, chap. V, or refer to "noun clauses" in any grammar book.

    --
    Fourth law of programming: Anything that can go wrong wi

  • This is indeed very old! I've had my G400 for over 6 months now and am very happy with it. It runs well under W98 and even under Linux. (Unfortunately Linux doesn't support the Dual-Head feature yet, and 3D acceleration can only be obtained by compiling GLX, drivers, etc., and I'm way too stupid to be able to get that running.) Apart from that the card is pretty good, but as far as performance is concerned it is probably slower than those of the newer generation. (Only the GeForce right now.)

    I've heard Matrox was working on the G800. Can anybody confirm that?

  • Real geeks won't buy the WinTV-D because they know that it only supports low end digital TV (not justly called HDTV). Their next card will support 1080i, HDTV at its max.

    matt

  • I'm just curious, but can *anyone* be sure that the driver for nVidia's chips for 4.0 is going to be binary only? Everyone on Slashdot keeps saying that, but I've never heard nVidia say it, so why is everyone so sure?

    Adam
  • The G400 is by no means new, how does a late review warrant news?

  • I have one of those for my home PC. The deciding factor that made me choose a Matrox card was their good support for Linux. Thanks Matrox! I bought my dual-head G400 (not the MAX card, just the regular one) for my home PC from Dee One Systems [deeonesystems.com] for $178 a while back. I like that vendor; they're good-n-cheap and reliable too, I've been buying stuff from them for years now and they haven't messed up any orders yet.

    In my office we've got a couple of Microstation users who have dual Hitachi 19" monitors, and one of them is using a G400 - looks great. He's running both screens at 1600x1200 in "true color" mode (I think that means 32-bit but I'm not sure). In NT you can drag dialog boxes from one screen to the other, or if you maximize a window it stretches all the way across both screens. There's a landscape-aspect Playboy centerfold scan (Dinah Willis, 12/65) that makes a cool background bitmap for that system.

    I haven't ever hooked my G400 at home up to a TV set yet, but it came with a cable so I can. I'll bet text mode on a big screen TV is real easy on your eyes; combine that with the Logitech wireless keyboard I've got and I could potato out on the couch while hacking away. At last, a useful purpose for that damned TV set!

    Yours WDK - WKiernan@concentric.net

  • Actually, I find this review incredibly timely. I am trying to get a good 3D + TV in + DVD + vidcapt board. The tie seems to be between a Matrox G400-TV and an ASUS V6800. The problem is the ASUS uses a GeForce 256, which is definitely superior to a G400. But the Matrox card is the only card I've seen listed (other than the ATI All-in-Wonder) that has _good_ TV in support.
  • They have some XFree86 benchmarks on the matroxusers.com webpage. I believe it gets 40 fps in the Quake3 demo1 with the detail turned really low and about 38 fps with detail turned up.
  • You're making the right choice. ASUS isn't really known for high-quality video capture, while the Matrox Marvel line is renowned for bringing an on-board hardware MJPEG codec for such a low price. Even on a relatively weak machine the Matrox should be able to rip full screen NTSC or PAL without dropping frames.

    Jay

    -- polish ccs mirror [prawda.pl]
  • I'm in the process of putting together my new Athlon system (praying my mobo arrives today). My previous computer had a Matrox Millennium II and I LOVED it. It wasn't the fastest card, but next to any other card I've ever seen, the output was just plain beautiful. If I spent 90% of my time playing 3D games, I'd probably get an nVidia card, but I don't. I do a LOT of 2D work where I don't need the speed, but I do need the crisp, clear, easy to read display the Matrox card gives me. I'm eagerly awaiting installing my new G400 into my new box and seeing just how much it rocks. :)
  • Exactly. Notice that neither the Marvel nor the M3D, which were both new product lines on boards that never really went anywhere, had the Millennium name on them. It was Matrox's first time with those attempts and they didn't want to corrupt the Millennium name with a product that could prove to be sub-par. Now they offer these same features in a new product line, on a card that has been receiving very good reviews, and it carries the Millennium name again. Very good marketing on Matrox's part.
  • I have one of these cards in my system right now. It is pretty cool to be able to get dual monitors off of one card, really saves on slot usage.

    On the Linux side though, I was disappointed that before XFree 4.0 came out, only Accelerated-X's MultiHead version supported this card.

    But with XFree 4.0, this card in its dualhead mode is SUPPOSED to be supported. I guess I'll find out whenever RedHat decides to release an RPM or I get tired of waiting and try to install it by hand.
  • Ummm, GeForce is by NVidia, and is not part of the Matrox card.

    Are there any dual-head GeForce based cards out there incidentally? Does the GeForce chipset include support for it? I don't remember reading anywhere that it does...

    --
  • OS Support:
    Win9x -> full boat support. (OpenGL, Dx7, etc)
    WinNT/2K -> nearly full support (dual-head monitors are locked in refresh, I think)
    Be -> Full driver set released (haven't touched it)
    XFree 3.4.x -> full accelerated support.
    XFree 4.x -> base support done. DRI work is being done by Precision Insight.
    GLX/OpenGL -> Utah-GLX-DEV team has everything working really well.

    I'd say _THIS_ is the card for the best linux performance.

    -Steve

  • I have to wonder, what made this an interesting article for Hemos to post? It's an old card that competes with the Voodoo3 and TNT2, when the GeForce is out and the Voodoo4/5 is expected soon. In addition, there's nothing in the article that mentions linux.

    Sure, the card might be great, but this isn't the type of thing I've come to expect from Slashdot.

  • Yeah, the 2D of matrox cards is much better than anything. My friend's G400 32MB with a half-decent CTX 17" looks almost as good as my NEC FE700 on a Voodoo Banshee. Back when he had it on his S3 Virge, it was all blurry and whatnot. He plays games occasionally, so the G400's lack of 'amazing' frame rate is ok for him.

  • Windows, maybe Mac, if they put out an updatable flash BIOS.

    Linux has had support for the G400 for quite some time - in fact Penguin Computing has been selling workstations with dual head G400s for quite a while now (nice machines).
  • I spent a good deal of time researching this card before I bought it; unfortunately, most of the research came from a gaming perspective. The majority of my computing time is not spent gaming, so the VCQ, Dual-Head, and good Linux support were the features that appealed to me the most. However, when I do find time to quake, I'm pretty serious about it, so I wanted a good performer. :)

    Although I spend all my time in Linux at home, I am a win32 developer by title, so I naturally keep a partition around for NT and VS6. As soon as I had the card installed under NT I went straight to the ~/timedemo 1. What I saw blew me away: my $300 CDN, 32Meg G400 MAX was a whole single FPS *slower* than my V2 8Meg. ARGGHH!!! Nothing I did would give me the performance I needed.

    With 98, the driver is clean, polished, and FAST! Too bad 98 is useless for anything else but games, and I'm really not interested in wasting resources on multiple win32 OSes for single purposes. Not to mention the serious moral implications I have about leaving an entire CPU idle on my SMP system. =)

    Needless to say: great hardware, atrocious software. I am anxious to see what Precision Insight puts together, and I sincerely hope its functionality and performance are on par with the 98 driver.

    strain2k
    "If every day was a sunny day, what would be a sunny day?"
  • I believe Tom's Hardware reviewed this a while ago. It fared very well on the 32-bit tests, but still did not beat the l33t GeForce. I guess this card would be good for graphic designers using two monitors and millions of colors. Other than that, the regular gamer should get a GeForce from pricewatch or something.
  • R5 will have a 'free demo' version. The BeOS Website [beos.com] says the free downloadable version won't have all the features enabled. Or something like that. Probably just enough to get people to check it out. (and like it)
  • The G400 Linux drivers are still under heavy development, but they're already pretty good. Quake3, Heavy Gear II, and Heretic II work great. The only problem is that the drivers have a nasty habit of locking the machine, especially if the game segfaults and OpenGL isn't shut down nicely. This card has a lot of promise. The 3Dfx drivers for Linux are better, but it's true that the G400's image quality is much better. -Reeves
  • The G400 is apparently to be phased out soon, to be replaced by the G450, and then the G800 towards the end of the year. A great website for Matrox stuff is www.matroxusers.com.

    AL

    Btw, I've got a G400 MAX, and it's great.
  • It seems then that we can only make conclusive deductions about Creative, 3dfx and ATI.

    Creative use the word 'Blaster' like Rollerbladers use 'Extreme'

    3DFX use Voodoo because their marketing team smoke crack at meetings.

    ATI just have a bad attitude about it all.

    But where does that leave us with Matrox? Is it "an unnerving preoccupation with the letter M"?
  • Yes I would shoot you but no doubt our geographical proximity precludes this; perhaps I could borrow a rocket off Boeing (see next story).

    Seriously though, I do see your point; I think that "Millennium", even "Matrox", don't (as terms) really do it for me.
    If someone asked me on the spot to name a major video card make, I would probably blurt out "Voodoo Graphics" even though that is only a chipset: now there is the real power of marketing (brainwashing) :)

    As for English literature classes: as I live in the UK I could not avoid them, and feel robbed of my teenage years by them - even though I had perfected the art of sleeping with my eyes open by age 14 :)
  • I got mine at pricewatch [pricewatch.com]. Just did a search for the G400. I'm very happy with what I got for the price ($139 three months ago).
  • I have checked Amazon's and CompUSA's websites and neither one of them has anything on Matrox. I hope my CompUSA store has it.
  • Windows, maybe Mac, if they put out an updatable flash BIOS.

    Linux? Well, I'm not exactly a device driver writer myself... It'll happen soon enough.
  • Well, I have seen and read a number of reviews for this card. Now I really feel that this card deserves a bit more than it gets. It is kind of put on the back burner because of the GeForce 256 craze. It's like totally forgotten about, and not *everyone* plays games; if you don't play games I seriously would hate to see you pay 300 bucks for a DDR GeForce thinking you're gonna get a better 2D display.

    You won't. The Matrox beats the ever-livin' sh!t outta any NVidia card out there for 2D. AND I think it's also good we support Matrox and pray they have the next awesome 3D card, because they will be releasing specs and we will have some wicked fast drivers in Linux. So don't knock publicity for these people! Who cares if it's old? It's good for discussion too, so bah.

    JA

  • It was a bit sad to read through and only see Windows benchmarks. Granted, they are the easiest to come by, but since I only use Windows on my laptop, it really doesn't matter to me.

    Anyone thinking of doing a follow up for other systems? I'd really like to see how it does in XFree86, as well as if there are any possible forthcoming drivers for BeOS.
  • Better how? If you mean better because it makes smaller files, then I guess it could be considered better. If you mean better image quality, then I find that hard to believe, considering image quality is the main advantage of MJPEG over MPEG2. MJPEG does eat a hard drive like crazy, but image quality is top notch. My brother does some promotional videos for stores and uses a Matrox Marvel G200 for the capture. It works well, but he has outgrown it and wants to go digital. He is considering a Matrox RT2000, which is a professional grade setup with a G400 integrated into the package.

  • Sounds like your CPU itself can't keep up with the game. On my P2-350 w/G400Max, I get around 40-50fps at 1024x768 @ 32bit in Q3 at normal settings in Windows. Mind you, I get about the same at 640x480, but the G400Max scales up VERY well as your CPU goes up - whereas with the others, most of their processing power is on the video card itself. At 1600x1200 @ 32bit I was pushing about 20fps =)
  • ATI sucks ass
  • Have you seen the MPEG2 capture for the 6800 Deluxe? It's better than MJPEG...
  • Be careful of what you say. As a frequenter of said message boards and web site, I am well acquainted with what goes on there. ANYTHING AND EVERYTHING posted there about the G450 and G800 is RUMOR!!! Do NOT take any of it as fact.
  • On a related note: I've just recently been shopping for a new video card, and one of my major concerns is the quality of tv-out. I was seriously considering this card, so it was very considerate of /. to post this article for me.

    I read through the review, and I am very pleased with what I read, especially about the 3D. One concern I had with going with the G400 was the quality of its 3D compared to some of the leaders in that area (although anything would be better than my old, cheap ATI with 4 MB of RAM). But the review didn't go into anything about its TV-out quality. From what I understand, Matrox is VERY good in this area, but I was hoping this review would at least mention it.

    Can anyone help me here? Is the Matrox TV-out all it's cracked up to be? And if I'm looking for a card with good-to-excellent TV-out with good 3D, should I get this? Any type of help would be nice.

    (Oh, btw, Matrox has lowered its price on this card from $249 to $209)

    ==
  • As I posted this, I realized that I gave out slightly incorrect information.

    The G400MAX is selling for $209 now (down from $249) while the G400 (what's reviewed here) is now selling at $179 (down from $199).

    Of course, this is all according to the Matrox web site....

    ==
  • I admit that the GeForce has better performance. But the quality of the GeForce cannot compare to the G400's. Try running Unreal Tournament at 1600x1200x32bit in D3D or OpenGL.
    Luugi
  • The TV-in Card, for a G200/G400, is technically separate. It is a separate PCI card that links onto the G200/G400 with a set of cables. Sure you are stuck with only buying Matrox cards forever as long as you want to use that TV card, but is this bad? (no) Matrox has long supported even old Rainbow Runner G cards, and I don't see why that will stop.

    As a side note, the Happauge card is missing a LOT of features:

    - Filtering/Anti-Aliasing (trust me, watching full screen TV on a BT848 card is a nightmare on a 20" monitor. My head still hurts, and I sold the card 6 months ago! :-)

    - Decent capturing. (The Rainbow Runner cards have a MJPEG hardware compressor built in that works REALLY well, especially at 704x480 @ 30 fps.)

    - Compatibility. (Yes, the BT848 cards are compatible with lots of video cards, but you'd better hope you don't buy S3 products. Matrox TELLS you what their card works with, ON THE BOX.)

    I'd buy a motherboard with integrated sound and video, if the sound was a SoundBlaster Live! and the video was a Matrox G400. It just so happens that motherboard manufacturers are obsessed with cheap, and that is what you get: cheap, nasty video and cheap, nasty sound.
  • I have both the Rainbow Runner G, and the TV-out module for the G200 (which, I believe, is the same as what is "glued" directly to the G400 board).

    While the Rainbow Runner G uses a PCI slot, the TV-out only requires a dummy bracket.

    The TV-out from the Matrox card is simply the best. It beats the crap that comes out of the TNT2, or most any other consumer-level card. I have personally compared the TNT2 card and the Matrox card on a 50" TV. The Matrox card had somewhat readable text at 800x600. The TNT2 card was unreadable at even its lowest resolution (640x480). I remember reading a comparative review of many video-out-supporting cards, but have forgotten where the site was. It doesn't matter much, because I do remember the Matrox card beat all the others, hands down.

    Not only that, but there is more than enough info to get you started with video in/out at www.matroxusers.com. I bet you won't find a similar site for other cards... :-)

    Oh, and BTW: The TV-Out works at even 1024x768 (although this is only good for games and movies...).
  • The GeForce chip is very, very nice. It pastes 3dfx to a wall with their own spinal fluid and snaps the NVidia in two. Plus if, like some people *smug*, you have two monitors... well, it's a very nice feature.

    I just can't wait for the speculation that's going around to pan out and reveal the GeForce 512 and 1024... They should really up the ante. The only thing that may compare is an Alchemy board...

    ChAoS

  • The article says the thing has "Full AGP 2X/4X device with multi-threaded bus mastering and AGP texturing", but never talked about "multi-threaded bus mastering". I had no idea either, and was kinda disappointed that they explained bump mapping with 3 pages and 4 pictures, but didn't say anything about this at all.

    After poking around on Matrox's site, I found only a brief PDF document [matrox.com]. As far as I can tell, it means the G400 can be told to DMA a vertex list, command list, textures, and so on, and the G400 will decide which needs to be fetched next, and how much of it. Sounds useful, but I dunno if it really is.

  • Sure they have the best video quality, but they screw their customers. They don't even try to be competitive. I remember buying a Matrox Millenium way back when. The big selling point was the ability to upgrade. The only problem was that they charged more for the upgrade than for a new card. Their other big problem is they build all their own cards so when new products come out you have to wait six months to get ahold of one or you wind up backordered because they don't do a good job of supplying the market.

    To be completely fair if you have a business workstation Matrox is the only card to use. Dollar for dollar the video quality and clarity is the best. Even old Millenium cards with 4/8 meg of WRAM are better than most of the newer 16 meg video cards.

  • Yeah, it's not a card that was released yesterday. But I think it's a good review of it. As the review mentions, this line of cards was meant to compete with the Voodoo3 and TNT2 series of cards. The Voodoo3 as of right now is still 3dfx's top offering. The only card on the market right now that is a step above the G400, Voodoo3, and TNT2 is the GeForce. And not everyone has damn near $300 to spend on a DDR GeForce ;)

    I'll be interested to see how it performs in X too, and Scott (Damage) does mention in the review that he may add XFree 4.0 info to it at a later time.
  • Personally, I think people should take anything that comes out of Tom Pabst's mouth with a Volkswagen-sized grain of salt. Tom's Hardware used to be a very good site, but he's since become one big arrogant SOB. Now, this isn't saying that everyone out there is totally unbiased when doing a review. That's why it's always a good idea to check with multiple hardware sites when you're looking at buying an expensive piece of hardware. See what a few different people think about it. Never take just one person's opinion; their point of view may be significantly different from yours, and you may not be happy with the outcome.

    Some great sites out there for hardware reviews and news:

    Also, I like to check the user opinions at Cnet [cnet.com] and Sysopt [sysopt.com]. It's always nice to know what the average person thinks of the product after they've bought it and used it.

    just my $0.02

  • Problem is, Matrox originally distinguished the segments of their product line by name:

    Mystique was the cheap card for home gaming
    Millennium was the workstation card

    Mystique 220 was the beginnings of a 3d card
    Millennium II was a killer 2D card (I have one)

    Problem was, people associated the Mystique name with a bad card, so they started releasing all of their cards under the (respected) Millennium name.

    Two exceptions: Matrox's first go at a 3D accelerator (PowerVR PCX2 based, not even a Matrox chip) was the m3D, another M.

    Matrox's All-in-one card, was the Marvel (another M).

    I guess the reasoning is that distinctiveness is important, so you can begin to build a brand identity.
  • I told ya...it's alliteration. If you managed to avoid literature classes growing up (lucky bastard!), it's the intentional reuse of consonant sounds. They wanted names that would tie the company and the product together in the minds of the consumer. Say each of these 10 times:

    "Matrox G400"

    "Matrox Millenium G400"

    Sure, the first would be just as good a functional identifier for the product. But the second "flows" better. It sounds more pleasing to the ear. This, subconsciously, causes the consumer to think of smooth-flowing graphics performance. We also see a hint of corporate philosophy here; Matrox has traditionally been the card for high-end graphics users, not the hardcore gamer. All the other major players in the 3D card industry use terms symbolizing power, speed, explosive nature, etc...Matrox counters that with a name that seems to say "when you're ready to grow up, come to us."

    Good god I'm starting to sound like a marketing guy. Please shoot me.

  • Yup, my original Millenium (~$300 when I got it for my nearly new P-100) was the best card for 2D I ever had in my machines until it got upped to an AGP G200 (it was better than the Mystique 220s by a good shot, too, but had little 3D). I put my RIVA TNT in my NT4 box, and it just can't compare to the 2D speed of the G200 (or really, the old Millen, either). You can really *feel* the difference - I did benchmarks, too (not that I remember them now). Obviously the G200 wasn't the answer for 3D, either, but hey, the G400 (esp the MAX) is looking pretty sweet there.

    I really wish the other manufacturers would try to at least reach the 2D performance achieved by the Mystique and older Milleniums... 3D is certainly the gravy, but I just don't want to see my windows being drawn, refreshed, etc... The built-on-board AGP SiS graphics on so many motherboards choke at resolutions above 800x600 and color depths above 16-bit... why? In 1995 my card could do 32-bit at 1280x1024 pretty darn quick (with 4MB). 2D performance has not improved as much as it could have, due to the 3D madness (which is fun).

    Well over 90% of what people use computers for is 2D (like /.!), so they should try to improve 2D performance as well. Sure, it may not be the biggest selling point for gamers, but it keeps Matrox cards in (most of) my machines... I do like the TNT for 3D games, but I guess the G400 can take care of that 8^)
  • I like the dual monitor feature because of my plan. You see, I've decided that when the time comes to replace my TV, I'm going to buy a second computer monitor instead. I was going to hook the new monitor up to an old 486 I had but now I figure I'll just hook both of them up to my main computer and upgrade my G200 to a G400.

    Also, the G200 supports Linux perfectly, and when I boot into Windows, games like System Shock II look great (System Shock II is the main reason I keep Windows around.)

  • Nope. Few GeForce cards even have Tv-out. The chipset itself is designed to run as a single, independent chip.

    The dualhead feature of the G400 (and its inherited 2D performance) does make it a good card for 2D work in Photoshop and various publishing applications. But 3D-wise, yes, the GeForce does paste it to the wall (then draws a funny little mustache on it!)
  • For TV-in you're better off with a separate TV card.

    Why?

    When was the last time you bought a motherboard with integrated video or integrated sound, two things that often require upgrades (as opposed to IDE and SCSI, which are fairly static)?

    When was the last time you replaced your video card? Get one with a built in tv-tuner and you'll need to buy the same tv hardware all over again. Go for a Hauppauge WinTV-D (or plain WinTV), so you can watch TV on any video card you own, as opposed to only the card you have in at the moment.
  • Actually, I don't really think that's a big deal. If anything, it's probably pretty common for 3D cards; I'm sure the GeForce and TNT2 have it as well. Anybody have any hard data to back this up?

  • Yes, I realize that 3.3.3 contained obfuscated code, at first... I also realize that they rather quickly wrote a patch with non-obfuscated code, which also got included in 3.3.6....

    Basically, I wouldn't be surprised if the drivers are binary-only. Nor would I be surprised if they're not. Bear in mind the alliance they now have with SGI, a company that has very rapidly become much more open-source friendly. Maybe SGI can have a good influence on nVidia :-)

    Ever the optimist,
    Adam K
  • If you're running any reasonably current RH dist, a source install of Xfree4.0 is painless! Snag the source, tar -zxf it, and make World >> stuff.log & tail -f stuff.log!! When it finishes, just su root and make install!

    Be warned, it can eat up a few hundred megs of HD space by the time it finishes. And expect to wait. The dual Celeron 400 took about an hour, the K6-2 took 1.5, and my MediaGX took almost three.

  • Well, it'll run under any OS that has drivers written for it ;)

    But, I expect Matrox will provide drivers for Winxxx, and XFree86 4.0 has support for it, including multi-head and overlays (in 8-bit and 24-bit)

    For information, it also includes support for GeForce and Voodoo3

    [xfree86.org]
    http://www.xfree86.org/4.0/RELNOTES2.html#15 for info

    --
  • Do these cards help out 3D raytracing/rendering programs like POVRAY, or do they just provide the "quick and dirty" raytrace/render operations & don't provide any additional functionality for those raytrace/renderers?
  • For those of you who wanted a Windows/Linux comparison of this graphics card, our lab has been doing some preliminary work on benchmarking this card, and here are some of the results we get.

    First off, the card is a Matrox G400 Max, with 32 megs. The system is a Dell Precision 220 with two 533 MHz PIIIs and 128 megs of memory. We didn't do any overclocking, so these are just raw numbers. Everything was done at 32 bpp, at 1024x768, 76 Hz.

    For Windows 98, using Q3A, we get the following:

    demo001: 1346 frames, 45.2 seconds: 29.8 fps
    demo002: 1399 frames, 48.1 seconds: 29.1 fps

    For Linux 2.3.47, XFree86 3.3.6, and Utah-glx 0.9, we get:

    demo001: 1346 frames, 74.4 seconds: 18.1 fps
    demo002: 1399 frames, 76.2 seconds: 18.4 fps

    Note that this does not use the DRI of Xfree86 4.0, and there are some issues with AGP, since this system uses the i820 chipset. Overall, not bad performance.

    As a side note, I was seeing similar performance with the G400 test as I was with my SGI Octane SI with texture memory, not too shabby...
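    For anyone who wants to double-check those numbers, the fps figures are just total frames divided by elapsed seconds; a quick shell sanity check (awk is used here only for the floating-point division):

```shell
# fps = frames / seconds, rounded to one decimal place
awk 'BEGIN { printf "%.1f\n", 1346 / 45.2 }'   # Win98 demo001: 29.8 fps
awk 'BEGIN { printf "%.1f\n", 1399 / 48.1 }'   # Win98 demo002: 29.1 fps
awk 'BEGIN { printf "%.1f\n", 1346 / 74.4 }'   # Linux demo001: 18.1 fps
awk 'BEGIN { printf "%.1f\n", 1399 / 76.2 }'   # Linux demo002: 18.4 fps
```

    So the Windows drivers are pulling roughly 60% more frames per second on the same hardware.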
  • It's one of the best, and most widely supported, cards out there (hell for linux it beats nvidia into the ground).

    We have just bought a nice workstation with a G400 (16MB) for our lab. I installed Linux on it and tried to configure the OpenGL support, which seemed pretty straightforward (actually, it comes with SuSE 6.3). The only problem is -- it doesn't work. Some GL programs, like ssystem, are slow as a hog and show a bad case of display schizophrenia (those famous double overlapping, black-and-white images) if you change the window size. And I couldn't get the xscreensaver-gl running in fullscreen mode. And the KDE screensavers don't work at all.

    Yes, I know. I am just a stupid biologist who doesn't know anything about Linux: but still, it sucks and is not useful for anything but, maybe, Quake - however, I'd rather play nethack than Quake.

    Regards,

    January

  • I own a Millenium G400 MAX and it is the best video card I have owned. I was worried I'd gotten seriously ripped off shelling out ~250 dollars for the 32MB version of the card.

    I have to agree. I have a 19" Nokia monitor, and I even bought a GeForce256 SDR, and yeah... my Quake was a lil faster, but I stare at this monitor 26 hours a day writing code some of the time, and I will say the Matrox has a much better display than the GeForce. It's very obvious, and I would not trade my card for anything.

    Not to mention this card has the best Linux support out there, IMHO. Voodoo cards are good for Linux support, but I just find this card phenomenal.

    Now let's talk about the dual head display and DVD playback of this thing.
    Dual head display! ROCK ON!! Once you start using it you won't ever stop. I wish a couple of *good* games would come out that totally utilize the dual display. As it is, using it in doze is just too much like fun.

    Okay, DVD playback rocks too... I lose no frames full screen :-) It has all the good audio formats supported, Dolby etc. I have a Monster Sound as well, so overall I'm super happy with this system, but the video card is sorta what makes it for me.

    Hmmn, okay, I'll stop my praise now.

  • If you have a look at http://www.matroxusers.com you will see that production of the G400 stops in Q2 in favor of the G450 and G800, which are far better.
  • Why was this posted? Of the hundreds of hardware reviews put online every day, why did this make the cut to be posted on the main page of Slashdot? This is six-to-eight-month-old hardware, there are no 3D drivers for Linux for it, and the article doesn't even have any Linux benchmarks. What's the point?
    It's really disappointing to see slashdot go down hill with what seems like a very large number of reposts of previously covered topics or very old news.
    I'm beginning to wonder if there is a "quota" of posts to the front page that needs to be filled.
    I do have a question I haven't seen answered sufficiently for me. Does the "dualhead" feature work under Linux? That seems to be the major selling point of this card right now. Especially since this card is very hardware-dependent for performance (needs a high-horsepower chip), and even then the GeForce still wipes the floor with it. Then again, most of the people on this site seem not to be interested in the hardware one-upmanship so prevalent in 3D gaming. The 3dfx cards work far better with 300 MHz-class machines and have Linux drivers!
    I'm a hardware geek and this post just irked me.
    I'm not trolling, I am just disappointed in a website I check religiously and have been reading for over a year.
  • by JamesKPolk ( 13313 ) on Tuesday March 14, 2000 @06:33AM (#1204425) Homepage
    I happened to like this article. It was a good review, discussing more than just a few silly Quake and UT timedemos.

    Yes, Linux numbers would be nice, but not very useful for reviewing the hardware, given the current state of Linux 3D drivers. Slashdot isn't a Linux news site, after all.

    And yes, dualhead should work, under XFree 4. Try checking the hardware compatibility list, or whatever they call it. It's linked on the main xfree.org page.
  • by Microlith ( 54737 ) on Tuesday March 14, 2000 @05:07AM (#1204426)
    As has been stated (repeatedly, as far as I can tell), XFree86 has built-in 2D support for the G400, including support for its dualhead feature. You can get OpenGL support from the GLx project.

    It's one of the best, and most widely supported, cards out there (hell for linux it beats nvidia into the ground).
  • by geirt ( 55254 ) on Tuesday March 14, 2000 @04:51AM (#1204427)
    As always, Matrox has excellent Linux support. The G400 runs fast and steady under the XFree86 SVGA server. You can overclock the board with gMGAclock [sourceforge.net], a GNOME-based [gnome.org] overclocking utility for Matrox G400 cards.
  • by reality-bytes ( 119275 ) on Tuesday March 14, 2000 @04:39AM (#1204428) Homepage
    Have Matrox got stuck at this letter of the alphabet; nay, this product name? The Millenium series seems to have run for a long time now, as did the Mystique; video hardware manufacturers seem to get stuck with only one 'theme' for life:

    Creative: Absolutely everything Blaster

    ATI: Rage, Fury etc; anything to do with getting mad

    Now Matrox: Any card you want so long as it starts with M!

    I'm sure there are many more examples out there, can anybody think of them?
  • by Anonymous Coward on Tuesday March 14, 2000 @04:41AM (#1204429)
    The G400 MAX runs at a 100 MHz faster clock speed than the regular G400 shown in this review. I believe Sharky Extreme did a review of the MAX and showed that with the G400 MAX's (at the time) beta drivers it was a hair slower than the TNT2U.

    And the TNT2 does NOT look better than the G400 in visual quality; I own both (a G400 MAX anyway) and the G400 has a slight edge on the TNT2. (TNT2 on my Windows box, G400 on my Linux box.) Although, you actually have to sit down and take a good look at both before you decide.

    This review was done very poorly. A G400 MAX vs. TNT2U would have been a better selection. They decided to equate it price vs. price instead. So I guess their GeForce DDR review will pin the card up against other $250-$300 cards? Yeah, right.

    Oh well.

  • by HedsSpaz ( 143961 ) on Tuesday March 14, 2000 @04:24AM (#1204430)
    It has full OGL implementations under 9x, 2000 and NT4.0. XFree86 has built in support for all Matrox cards. You can get OGL support via the Glx project here http://utah-glx.sourceforge.net/ [sourceforge.net]
    The one thing I would like to know is why this card is being reviewed now? It's been out for a number of months now, at least since last summer. Not that it's a bad card mind you, in fact, while it may not give you the highest frame rate around, it sure as hell will give you the best image quality around.
    But hey, whatever. It doesn't really matter, it just seems a little strange.
    HedsSpaz
  • by JamesKPolk ( 13313 ) on Tuesday March 14, 2000 @04:40AM (#1204431) Homepage
    Argh! I've loved Matrox cards since I first had a Mystique (not a 220, just the original). For a programmer, I find that 2D image clarity should be FAR FAR FAR more important than an extra few FPS in some game! Thus, I get very irritated at reviews which gloss over visual quality, and only worry about fill rates or whatnot.

    This review was refreshing, and relieving.

    I have a Millennium II now, and I find that the image quality, in 2D, is amazingly clearer than the output of the TNT2 or Voodoo3. I've used Matrox cards for roughly 4 years now, and looking at the displays of other cards now feels like looking through a piece of thick plastic.

    With the disappointing performance of the G200, though, I was worried that Matrox would get run out of business, simply because of the sudden insane focus on 3D speed. This article makes me feel a bit relieved, since it shows that Matrox 3D can keep up with NVidia where it counts: in Quake, run under the #1 gaming platform, Windows 98.

    That, and with Matrox showing far more commitment to open drivers than NVidia (binary driver for XFree 4? Gag!), I'd guess that NVidia will fall behind 3dfx and Matrox in the small, but activist, open source community.
