Technology

Review of the Matrox G450 For Linux

The Evil Dwarf from Hell writes "Hardware sites for the most part concentrate their reviews of new equipment on the Windoze OS. AnandTech has a head-to-head review of the Linux drivers for the GeForce2 MX and the Matrox G450. The GeForce2 MX dominates in the test scores, but the G450 is interesting in its ability to use 2 monitors simultaneously. A single desktop that is 3840x1280 is incredible."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Phokus ( 192971 )
    Dual Monitor for Linux... YUMMY
  • It would be nice if they'd start to open-source their drivers..
    sure, people would figure out what little tricks they've used, but they'd save a lot of money by letting some open-source coders do the work.
  • now that is sweet. immm vmware. what is up evil dwarf...! drq
  • I've run this kind of thing before under Windoze, and it was extremely useful. I'd considered this under linux, but never made a concerted effort to bring it around.

    Has anyone else succeeded in running two desktops with regular video cards? I'm just curious. While I'm flooding the list with questions, how are the laptop people out there handling docking stations and external monitors?

    And the standard - 'Wow, cool, I'm glad the hardware manufacturers are taking notice, blah blah blah' ...

  • by SquadBoy ( 167263 ) on Tuesday September 19, 2000 @05:41AM (#768954) Homepage Journal
    AFAIK, the Matrox drivers have been OSS almost from the start. On the other hand, the EULA from Nvidia says this: "No Reverse Engineering. Customer may not reverse engineer, decompile, or disassemble the SOFTWARE, nor attempt in any other manner to obtain the source code. No Separation of Components. The SOFTWARE is licensed as a single product. Its component parts may not be separated for use on more than one computer, nor otherwise used separately from the other parts. No Rental. Customer may not rent or lease the SOFTWARE to someone else." This is sad because I really like their chipsets and would love to use them, but on those machines where video is important I don't feel I can, because I simply cannot think of any reason for using closed source for mission-critical applications. I *really* wish that Nvidia would open their drivers.
  • Their drivers are open source and the info needed to write your own driver is available too.
    Matrox was perhaps the first major graphics chipset manufacturer to open the specs of their stuff.
  • So, when you start hitting CTRL+PAGE DOWN, will one monitor shut off while the other instantaneously switches into letterbox mode?
  • yep, it was actually easier getting my G200 to work with a Millennium II than that G450 in the article. :)

    Jeff Brubaker
    Linux Tech Editor
    Anandtech
  • Different results, but consider this: if you like the dual-monitor setup, don't mind modifying your /etc/XF86Config file, and either own a 28" FD Trinitron Vega über monitor, or enjoy squinting at your 19" monitor, then Matrox is the way to go. But if you use a lot of OpenGL programs, go NVidia.

    One side note, Diane Vanasse, once at Matrox, now works at NVidia as the PR dominatrix. I wonder if she's anything like Yvette the pyromaniac from The Kids In The Hall. "Hé! Mon feu!"

  • by twilight ( 11986 ) on Tuesday September 19, 2000 @05:46AM (#768959)
    That's not QUITE true.

    Actually, the Matrox drivers are open source, but rely on a closed-source library (hallib) to achieve dual head or TV/DVI out. This is because of copyrighted code (by Macrovision) in the library.

    So, you can get any Matrox card working with OSS drivers, but if you want dual head, you'll have to link with that library (it's distributed in the drivers from their site).

    NVIDIA's drivers, on the other hand, are closed source. I regret not bringing this point up in the article actually.

    Jeff Brubaker
    Linux Tech Writer
    Anandtech
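The split described in the comment above (open drivers plus a binary hallib for dual head) shows up directly in the X server configuration. As a minimal sketch only: the identifiers below are illustrative, the BusID is an example, and the matching Screen/Monitor sections are omitted for brevity; consult the mga driver documentation for the real option names.

```
# Hypothetical XF86Config-4 fragment for a one-card, two-head Matrox setup.
# One Device section per head, same PCI BusID, distinguished by "Screen".
Section "Device"
    Identifier  "G450-head0"
    Driver      "mga"
    BusID       "PCI:1:0:0"     # example bus ID, not from the article
    Screen      0
EndSection

Section "Device"
    Identifier  "G450-head1"
    Driver      "mga"
    BusID       "PCI:1:0:0"
    Screen      1
EndSection

Section "ServerLayout"
    Identifier  "DualHead"
    Screen  0   "Screen0"                       # Screen sections omitted here
    Screen  1   "Screen1"  RightOf  "Screen0"
EndSection
```

Dual head itself still requires linking the driver with Matrox's hallib, as the comment notes; the fragment above only describes the layout to the server.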
  • by Anonymous Coward
    Dont you mean CTRL ALT - ?
  • by Kalvin ( 45585 )
    If 'The Evil Dwarf from Hell' had actually bothered to check out the facts, he would have noticed that the GeForce2 MX supports two monitors:

    http://www.nvidia.com/Products/GeForce2MX.nsf/twinview.html [nvidia.com]

    Now if the linux driver doesn't support dual monitors then that's a whole different matter.
  • they didn't review performance under Linux in any way, shape or form. They say they plan to do this later. The review is entirely Windows-centric.
  • But the shipping MX cards don't seem to implement this feature...
  • "but the G450 is interesting in its ability to use 2 monitors simultaneously."

    NVidia already has this on their Quadro2 and Quadro2 MXR chipsets. I think the TwinView function is still only in the Windows drivers, but it'd probably be a welcome sight in Linux.

  • by twilight ( 11986 ) on Tuesday September 19, 2000 @06:02AM (#768965)
    Actually, I wrote the article. The GeForce2 MX DOES support Twin View (the Windows drivers didn't implement it until recently), and the card I had was a Twin View capable card, but it's not supported under XFree86 yet.

    Jeff Brubaker
    Linux Tech Writer
    Anandtech
  • I think you read the wrong article.

    :)

    Jeff Brubaker
    Linux Tech Writer
    Anandtech
  • It's fairly simple once you get X 4.0 up and running. I've had it work with ATI cards and a G100, and a TNT2 and a G100. In fact I'm using it at the moment.
  • Hmmm...

    I'm looking at the photos of the 2 cards, and to tell the truth I don't see 2 VGA out connectors - only 1 normal VGA and another which I don't know what it is (DVI?)

    So how can you connect 2 SVGA monitors to it?
  • by Icebox ( 153775 )
    Soon I bet Maxtor will sue Matrox for reengineering their name.

  • I've been orphaned by too many companies in the past, and no longer purchase hardware with closed source drivers. When I was upgrading a few months ago, I was interested in both Matrox and Nvidia cards. I did my research, saw that the Matrox drivers were open, and bought the Matrox card despite its slower speeds. I will consider doing business with Nvidia, if they open the source to their drivers.

    I would like to also mention that I dashed a query about Linux support for the Matrox rainbow runner off to the contact address on their web page and never got a reply back. So though I like Matrox in general, they get a thumbs down on their customer service from me. I don't think it's too much to expect a timely (or any) reply to an E-Mail query for information, even if the answer is "We don't know."

  • haha.. good catch.

    Those aren't my pictures actually, I didn't even put them in the article. The "senior editors" added that to break up the text a little bit after I submitted it.

    The card I have is actually a Twin View card with 1 VGA and 1 DVI connector. Anand tried getting it to work with a DVI->VGA adapter and had a lot of trouble. Then again, he only had a few minutes to play with it before I grabbed the thing for the article. :)
  • Yeah, next up is to figure out how to get the Millennium II and the G450 to play nice together to enjoy a three headed display. :)

    BTW - for those interested, the trick is that once you link with Hallib, you throw old-Matrox card support out the window. This includes both the drivers from Matrox's site and the drivers in DRI's CVS (which support Dual Head if you link with hallib).

    I found someone who has recompiled Matrox's drivers and rewritten all the symbols to be mgc. Now he can use one driver with his G400 and one with the Millennium II. Here's a link:

    http://www.xfree86.org/pipermail/xpert/2000-September/001438.html
  • The link to Anandtech in the article is broken, Mr. Taco.
  • Actually, they've been getting MUCH better. I typically get responses on the forums in an hour or so now. (forums on matrox.com) BUT, this is by customer service reps that have pieces of paper describing what they support and what they don't. Still, there's SOME useful information.
  • yes, i did read the wrong article. that's because of the page layout. you might want to consider redoing this.
  • by tjwhaynes ( 114792 ) on Tuesday September 19, 2000 @06:22AM (#768976)

    I first came across a few comments by Rasterman, many months ago, on how he was intending to try and leverage OpenGL acceleration to render windows in Enlightenment. This struck me as a smart way to get true alpha transparency support for the windows/menus/icons without completely stuffing up the CPU, by offloading the processing to the GPU. It also opens the doorway to a whole host of fancy, over-the-top special effects such as spinning, shrinking windows when you iconify them and the fancy transient effects seen in the Mac OS X window manager. These are the first tests I've seen of the actual code, but does anyone know how close the development code is to being an effective OpenGL accelerated window manager?

    Cheers,

    Toby Haynes

  • There was no mention of Linux whatsoever in the G450 review!!! 100% WinNT/2000.
  • Nope, not yet, but I did talk to them yesterday about clocking to make sure I had it right.

    As for Win2k, it works fine. I have a tri boot machine (need MICROS~1 to use the admin stuff at anandtech). In fact, the Win2k drivers are interesting in that unlike typical Windows dual-head, the two screens are COMPLETELY joined. You can even have a mouse cursor in between the two monitors. It did not reverse the outputs for me.

    More interestingly, DRI's CVS stuff linked with Hallib won't even let you change the XF86Config file to reverse the displays back to normal; it's stuck at being backwards. At least, that's my experience at home. With XFree86 4.0.1 and the Matrox drivers (as used in the article), I just gave the primary display Screen 1 and the secondary Screen 0, and that worked fine.

    Jeff Brubaker
    Linux Tech Writer
    AnandTech
  • Duh Slashdot,

    yet again we have provided someone with a rake load of traffic for.....NOTHING.

    The linked article is a cobbled-together review of the G450 for WINDOWS (I haven't looked at the GeForce side) with a cover page discussing Linux. You can see here [anandtech.com] the trail of where this story came from! The review features lovely snapshots of Windows drivers and it doesn't look like the reviewer has been near X.

    I haven't been as happy in a long time as when I saw this story posted (this is essentially the decision I am making in the next fortnight or so, barring the Radeon), and to have actually read a document on what you could get out of these under Linux would have been brilliant. Instead I am another person writing a comment about the quality of the posted story on /.

  • If you don't object to closed source drivers with no sort of guarantee of future support, go NVidia. If you want to use entirely free software, go Matrox.
  • What the fsck? Who cares about _LINUX_ drivers, the thing I want is _XFREE86_ support. Damn linsuxers.

    Please do moderate me down, I'm so terribly wrong about this. Your windowing environment isn't X (what's that?), it's either "KDE" or "Linux".
  • by tjwhaynes ( 114792 ) on Tuesday September 19, 2000 @06:31AM (#768982)

    The linked article is a cobbled-together review of the G450 for WINDOWS (I haven't looked at the GeForce side) with a cover page discussing Linux. You can see here the trail of where this story came from! The review features lovely snapshots of Windows drivers and it doesn't look like the reviewer has been near X.

    Sorry - you are going to have to swallow your pride a little! Scroll down that page to the base where it has a link to XFree86 background [anandtech.com] and you will find the rest of the review. Just because there are links to two Windows reviews of the two cards doesn't mean that that is all! :-)

    Cheers,

    Toby Haynes

  • by twilight ( 11986 ) on Tuesday September 19, 2000 @06:31AM (#768983)
    It's a LONG way off. Evas is 90% done by the looks of it, and Raster even has the first app using it, Etcher. Check out screenshots on http://www.rasterman.com/

    As for EFM and Enlightenment, they're BOTH going to be rewritten and combined. Hopefully enough code will carry over that it won't be THAT major of an effort, but I wouldn't expect to see anything for a while.

    The alpha blended, transparent window thing won't work even with Evas due to X limitations. Check out the Render extension though, http://www.xfree86.org/~keithp/ -- it'll do it, and they almost have it working with normal X servers judging from the mailing list.

    Evas is good for things like actually drawing out the windows. If you've used EFM, you know that it can start slowing down with a lot of icons -- and it should, that's a lot of alpha blending going on. With Evas, every icon will be drawn with hardware acceleration. (evas_test goes from 10fps in Imlib2 software mode to over 100 using OpenGL typically).

    Jeff Brubaker
    Linux Tech Writer
    AnandTech
  • With the release of XF4 I tried to use multiple monitors to get a "unified" desktop using different video cards. In 10 minutes I set up a desktop of 3840x1024 with a Matrox G400 (single head) AGP, an S3 ViRGE PCI and an ATI Rage PCI. It's really easy and the result is worth the space on the desk: it's fun to have several gnome-terminals tail -f'ing log files and monitoring stuff while leaving room to do something else, without needing to change the virtual desktop - only by turning your head. And think about GIMP at a resolution like that!

    Xinerama is a good thing, but the current architecture can't help us. There is only one AGP slot, and available PCI slots become a problem when a sound card and an ethernet card are installed. With the speed of new buses, a well-designed serial bus fast enough to handle video could be used to attach several screens, and the video card could be included inside the screen!

    A standard slot could be installed inside all future screens to move the video card off the main board, giving the user the choice of one video card instead of another... and to let USB (or other future bus) monitors be used in an infinite number on the same computer.
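The multi-card setup described above works in XFree86 4.x by tying each card to a screen via its PCI BusID and joining them with Xinerama. A rough sketch, with made-up identifiers and bus IDs (the matching Device/Monitor/Screen sections are omitted):

```
# Hypothetical XF86Config-4 fragment: three cards, one joined desktop.
# Find your actual BusIDs with lspci; the ones implied here are examples.
Section "ServerLayout"
    Identifier  "ThreeHeads"
    Screen  0   "Matrox"                       # AGP G400
    Screen  1   "Virge"   RightOf  "Matrox"    # PCI S3 ViRGE
    Screen  2   "Rage"    RightOf  "Virge"     # PCI ATI Rage
EndSection

# Then start the server with Xinerama enabled:
#   startx -- +xinerama
```

Without the `+xinerama` flag the same layout comes up as separate :0.0/:0.1/:0.2 screens instead of one unified desktop.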
  • Actually, the drivers SHOULD work with FreeBSD too, so it's not just Linux. As for DRI support, I really don't know.
  • Since everyone seems to be reading the wrong article, there must be something counter-intuitive with AnandTech's interface. So here's the direct link.

    http://www.anandtech.com/showdoc.html?i=1322
  • Oooooops how blind of me
  • the 400 was a disaster; only 1 screen was accel'd.

    to-date, I have built 3 dualhead systems; all with matrox cards. usually 1 agp and 1 pci. and various flavors of cards, even as old as the millennium-1 4meg pci.

    the only time I had X hangs on dualhead/dualcard was with xscreensaver. I think it did evil things to ram when it overwrote buffers ;-( but lately this seems to be fixed. I run production dualhead at work (cannot afford reboots or hangs!) and also at home (same thing: I work at home a lot and need high reliability).

    with an agp/pci combo, you can see the speed of bitblts on the agp screen whereas the pci side is a bit slower. but even on my millennium-1 pci side, opaque window moves at 1600x1200x16bpp are still quite usable.

    given that you can buy older pci/agp matrox cards for well under $50 ea, it's still a good win to use a pair of cards. sucks that I lose a spare pci slot but what the hell - both cards DO run quite fast.

    once you get used to dualhead (and xinerama) you never want to go back..

    --

  • sorry, mixed them two up ...
    just a bit depressed cuz verant changed encryption :p
  • As for 2D, both screens are accelerated. Using the G450 was faster than using my G200/Millennium II combo I used to use.

    As for 3D, forget it, you don't get acceleration under either head unless you forego Xinerama.

    Yes, you still have a software cursor on the second head though.

    You weren't using the kernel's framebuffer drivers were you? That's the old school way of doing things and left the second head completely unaccelerated.

    Jeff Brubaker
    Linux Tech Writer
    AnandTech
  • You weren't using the kernel's framebuffer drivers were you?

    no, just 'regular' old xinerama and xfree 3.9.16 (the last stable, reliable version that seemed to work with any and all matrox cards I owned).

    I've had, at various times, millennium 1 and 2, g100, g200 and even a mystique. my home box has a g200/sgram + millennium-1 and my work box is a g200/sdram + millennium-2.

    when I brought my windows friend to work to show him my dualhead display (and actually dragging windows across phys screens) he yawned 'but win* has had that for a long time'. then I typed 'uptime' and showed him 3 months of uptime (since my last hardware change - wasn't linux's fault). that quieted him down a bit ;-)

    --

  • And when my company, friends and assorted newbies ask me what they should buy, I point them at Matrox, too. Nvidia IS losing more money off me than the $200 or so their card goes for.

    And my karma's already maxed out, so there wouldn't be a hell of a lot of point in whoring for more. Would there?

  • The one big desktop approach that Matrox uses under Win2K is br0ken. You can't have independent resolutions or monitor positions, and your #1 screen is always to the right.

    Apparently, this is due to limitations in Windows, and won't be fixed until "Whistler" at the earliest.
  • You are talking about the G400's Dual Head feature, right? I didn't realize that existed back in 3.9.16. I was under the impression that it was a recently added feature with the latest Matrox releases and that it requires their hallib.a binary only library.

    Interesting, very interesting..

    Anyway, might want to give it another shot if you have that G400 still. It worked great for me. It's the first time that the Millennium II actually felt that much slower.
  • I don't think anybody's bothered writing a DRI kernel driver for FreeBSD yet, let alone OpenBSD.
  • and for those folks who are too lazy to copy and paste, bleh [anandtech.com].
  • Any idea if the GeForce 2 MX supports XvImages (hardware YUV->RGB conversion and scaling) under XFree4.x?
  • You are talking about the G400's Dual Head feature, right?

    sorry, I guess I was a little offtopic. no, I was talking about dual video cards (I did say this in my first posting to this thread).

    the dualhead support in xf86 3.9.x was very slow - probably due to the linux frame buffer support in the kernel. but I've heard that dualhead is still better with a pair of cards, even today.

    --

  • Go again this morning, fool. We're not talking about the G450 article from last week, we're talking about the G450 vs. GeForce 2 MX article from this morning.

    Oy vey...
  • by twilight ( 11986 ) on Tuesday September 19, 2000 @07:34AM (#769000)
    Actually, that's not quite correct. The Matrox drivers released on their site are completely OSS and all changes in them will be incorporated into the next release of XFree86. Hallib, the closed-source library that you mention, is necessary ONLY for Dual Head, DVI and TV out. As distributed with XFree86, the driver will work fine and provide 2D AND 3D acceleration.

    The library only handles:
    1. Setting the card's clock
    2. Initializing screens properly for TV, DVI or Dual Head output.

    This comes straight from a Matrox Linux developer too, by the way.

    Consider the Matrox "released" drivers to be nothing more than the code in DRI's CVS tree linked with Hallib. That's not quite accurate, but it's close to the case.

    Jeff Brubaker
    Linux Tech Writer
    AnandTech
  • .. this may be a little off topic, but not really. I hear a lot about video cards and gamers these days. But what about video editing under Linux? No one has mentioned anything about video editing in a while (if at all) and this topic has hit the linux kernel mailing list a few times. I am interested in setting up a new system to do full screen video in from a video camera.

    What is the best card to do video camera in to computer capturing? The idea is that I can take a video camera around and get some movie clips. Then I can take those video tapes and get them on my computer as avi or mpeg. Next I'd take them and burn them on cdrom. Or make video email from them. This all can be done under windows and Mac. Any idea if any of this is being done under Linux?

    I've tried webcams and they are okay but not as good quality as I am looking for.

    More importantly, if I were going to spend less than $2000 on a new system, what would I need? (MB, CPU, memory, video card, HD, and video camera. I have a cdrom and burner.)

    I am posting here cause slashdot would never post this question (or anything else I have posted) as slashdot hates my posts.

    I don't want a lot, I just want it all!
    Flame away, I have a hose!

  • This is the card that needs linux drivers


    pc world story [pcworld.com]

    gamers need not apply..

  • Is it possible to have different resolutions on the displays? The XFree86 docs only state that the screens must be in the same bit depth, but nothing about the resolution.

    I have a 17" monitor and I'll put another 14" one beside it (with a monitor arm - no table space for 2 17" monitors :( ). If I could run 800x600 on the 14" one and 1280x1024 on the 17" one to have an L-shaped kind of display (I've seen w98 doing that and I have seen a picture of an old Mac doing that) it would be perfect. Without Xinerama (screen :0.0 and :0.1) it works (3.3.x), but could Xinerama cope with that? (What kind of strange output will xwd create from an L-shaped root window?)


    --
  • Yes Xinerama can handle that.

    However, the response from people on the mailing list (can't remember if it was XFree's Xpert list or the DRI list) was that it works much better at the same resolution.

    The same bit depth is a requirement though. That can be tricky because some cards work at 24 bpp while others do 32 bpp, and some run at 15 bpp while others run at 16 bpp.

    Jeff Brubaker
    Linux Tech Writer
    AnandTech
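The mixed-resolution layout asked about above boils down to giving each monitor its own Screen section with its own Modes line, with only the Depth required to agree. A sketch only; every identifier here is hypothetical, and the Device/Monitor sections they reference are omitted:

```
# Hypothetical XF86Config-4 fragment: 17" at 1280x1024 plus 14" at 800x600,
# both at depth 16 (Xinerama requires matching depths, not resolutions).
Section "Screen"
    Identifier   "Big17"
    Device       "Card0"
    Monitor      "Mon17"
    DefaultDepth 16
    SubSection "Display"
        Depth    16
        Modes    "1280x1024"
    EndSubSection
EndSection

Section "Screen"
    Identifier   "Small14"
    Device       "Card1"
    Monitor      "Mon14"
    DefaultDepth 16
    SubSection "Display"
        Depth    16
        Modes    "800x600"
    EndSubSection
EndSection
```

With Xinerama on, the unified root window is the bounding rectangle of both screens, so the area "missing" from the L shape is simply dead space the cursor and windows can still logically occupy.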
  • Twin View is being worked on for the next driver release.
    This is from #nvidia and is definitely not an _official_ statement.
  • /me dons fireproof suit. The AC writes "won't use what works just because you can't look at the source". Of course I would not use it, because I cannot look at the source. It comes down to who controls my business, my box and stuff: me or someone else. If I have the source and I notice a bug or want a feature, I can fix it or add it; I can do it myself, or if not, I can always hire someone to do it. This means I control it. This is why I will not use something that is closed source on any mission-critical box. Now granted, graphics don't fall into that very often, but sometimes they do. And in any case I *really* don't like having to depend on a vendor for fixes. That having been said, I do use a TNT2 that I bought back when it looked like Nvidia would open the drivers, and I do like it very much. But I won't buy another one.
  • I didn't pay a damned thing for slackware 7; I left it online for 1 1/2 days and let it dl what I needed. I never pay for anything. The shit ppl say about it not being free is a load of bullshit. linuxgod.net is proof of that; I downloaded that entire server OS with an old ass Minix 386 SX.

    A 3 year old could install Caldera.

    My box is 1 year old, and NT doesn't even work with my hardware. Linux works fine on it. No problems, nothing missing.

    I'm sorry, but you have to be a fucking idiot to not be able to use the command line. Remember DOS?

    I'm running 2 monitors on this box; please say that it's not possible, because I'll send you the picture that I took with a USB camera on a linuxbox.

    Different from what? win? riiight, I don't like the name win, and I never have any use for a win machine.

    I run Quake, Quake2, Quake3, Unreal, UT, Myth2, Civ Call to Power, DoomGL, SOF, and lots of others on a 4 gig partition.

    I use linux at work, mainly because winsuck can't do crap with perl, php, Java, and XML. We have no use for win machines. They can't do the job.

    cd, dir, fdisk etc... show me....

    BiLLy GaTeZ iZ Dedd.
  • At the risk of being off-topic:

    So our choices for 2d/3d accel under linux are:

    • nVidia GeForce
    • Matrox g400/450
    • 3dLabs Voodoo
    Are there any others? Is the Voodoo even in the running?

    (asking because I'm thinking of upgrading and was hoping for more options to choose from...)

  • 3dfx Voodoo cards (you mentioned 3dLabs; they're supported too, but you got the names mixed up, I think)

    Also, ATI's Rage128-based cards and the Intel i810/815 chipsets (onboard video, crappy, but it works).

    There's a few more too, but I can't remember off the top of my head.

    Jeff
  • Now I know that this has been discussed before, but I was just thinking. Wouldn't it be nice to run two separate X sessions, with Windows (running through Win4Lin or something) on one monitor, and X on the other? I think this is an amazing idea . . . Unfortunately I don't use Windows nearly enough for it to be useful
  • Well, the cameras are kinda pricey, but Firewire/i-Link/IEEE1394 video is pretty cool. The native DV standard is 720x480@29.97fps (NTSC) or 720x576@25fps (PAL).

    I've got a Sony TRV-310 (~US$800 last Christmas) and an ADS Pyro Firewire card (US$70 a couple months ago). The nice thing about the camera is it can play and digitize even old 8mm camcorder tapes.

    See the DVgrab links page [schirmacher.de] for info on exactly what software and hardware are needed/available.

    There's one open-source video editing app (Broadcast2000 [linuxave.net]) and one commercial (MainActor [mainconcept.de]) for Linux that I know of.

    Note that such camcorders store and transmit using the DV standard, which is compressed to ~3.7MB/sec. There are also raw video cameras available, though I don't know if they are supported yet. For scientific work you may need a raw camera, for personal or broadcast work DV is ample.
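The ~3.7 MB/sec figure above is easy to sanity-check: DV carries a fixed 25 Mbit/s video stream, with audio and subcode data pushing the full stream a bit higher. A quick back-of-the-envelope check (the per-minute figure is derived from the post's own rate, not an independent measurement):

```python
# Back-of-the-envelope DV storage math.
# 25 Mbit/s is the DV video rate; 3.7 MB/s is the full-stream rate
# cited in the comment above (video + audio + subcode overhead).
video_rate_bits = 25_000_000
video_rate_mb = video_rate_bits / 8 / 1_000_000
print(video_rate_mb)          # 3.125 MB/s of video data alone

total_rate_mb = 3.7           # full DV stream, as cited above
print(total_rate_mb * 60)     # ~222 MB per minute of footage
```

At that rate a 4 GB disk of the era held roughly 18 minutes of raw DV, which is why hardware compression (or DV's built-in compression) mattered so much for capture.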

  • I am looking into upgrading my outdated Riva128, and am considering both the GeForce2 and the G400 (I can't find the G450?). My question is: how do these cards compare in quality of image and quality of drivers? Matrox has always been praised for their sharpness of image, but I wonder how much a difference this really makes, unless you are talking about seriously high resolutions. How about quality of OpenGL rendering and color matching?
  • Yeah, that would actually work very well. You don't need Xinerama for it either so you'd probably be able to get some 3D hardware acceleration, though I'm not sure about that.

    Further, you could even have separate keyboards and mice if you have USB in your kernel or use serial keyboards/mice. :)

    Personally, I wouldn't want to waste the monitor.

    Jeff
  • No - when running multiple monitors with xinerama, +/- only changes the resolution on the monitor where the mouse happens to be. This appears to be the case when using a dual head card like the G400, or when using multiple video cards.

    ymmv

  • He's probably one of those lemmings whose father bought him an NT machine last X-Mas.
  • Lots of tech problems with this, mostly that the actual child windows and window borders are not controlled by OpenGL. You can draw the desktop and window borders using OpenGL, but the boundaries are going to remain opaque and square (unless you use the shape extension, but then they still lack partial transparency).

    I do think a concerted effort should be made to merge the remaining graphics functionality into OpenGL (add TT anti-aliased unicode fonts, arbitrary clipping regions, and the ability to create and transform windows in the glxContext). Then programs could work exclusively in OpenGL and never think about drawing with X. However, I mentioned this to Brian Paul once and he seemed to think it was a terrible idea.

  • OK, I've swallowed the 'linux for the desktop' line long enough. Don't get me wrong, I *love* linux, I have Tux tattooed on my arm (literally), I use it exclusively (except for a w95 vm for Developer 6)... However, I need to vent :)

    Picture the scene: I hear about this 'xinerama' thing, and think 'excellent, that's just what I need'. Matrox release X4.0 drivers for the G400 dual head, so I go and buy one. I rebuild my box with debian potato, try to learn apt (I'm used to RPM but it sucks), get X4.0 onto the box via binaries, and configure. Woohoo, up comes xinerama, etc etc etc

    OK, here's my bitch: now I want to install Quake 3 Arena; I mean, X4.0 has this DRI thing (which I gather is an implementation of OpenGL). I install Quake 3 Arena, and run it. The left head goes blank (I think Q3 is running fullscreen, sort of) and in the bottom left hand corner is this tiny Q3 screen running at about 3 fps...

    So, I think 'this isn't running hardware accelerated!'. I search the net, trying to find an answer. I finally asked some E guys (from memory), and they tell me you can't run xinerama AND OpenGL! I think, OK, it makes sense - what would happen if you dragged an OpenGL app from one head to the other? But you'd think it would at least support OpenGL confined to a single head....

    So, I switch off xinerama and rerun. Same thing happens. I search around again, and supposedly I need kernel 2.4 test7 or something, with compiled-in AGP support. Now, I'm thinking, I want to compile E from CVS (since it's the only WM I've found that supports xinerama and I love it), and in order to use some of the kernel patches (imon related stuff) I have to be running 2.2.something.... It never ends; I am literally in the linux equivalent of 'dll hell'.

    My main bitch is not that this crap doesn't work (I can live without OpenGL until things calm down) but that this open-source thingo is meant to be rockin', yet it seems to be suffering from a lack of direction...

    I keep thinking 'oh, next version everything will calm down and fall into sync', but every time something nears the level of maturity to allow this, someone gets bored and goes off on a tangent. You have to match kernel/kernel patches/graphics card drivers/X window/gui toolkit/window manager/applications and it is becoming tiresome... Everyone has their own unique idea of what the desktop should be, effort is being duplicated, and the whole thing is a big stinking mess.

    Linux is great for server stuff, but I'm wondering whether the desktop is worth the effort and maybe we should all be running BeOS or something?

    If someone wants to come to my rescue and explain all this junk to me I may change my mind :)

    Please don't flame me; these are genuine observations from 'one of us'...
    Simon
  • I just recently got dualhead running, without a Matrox card I might add - just an AGP Creative TNT 16MB and a PCI Voodoo3 2000 16MB. Didn't take much: a RedHat 6.2 install, then XFree86 4.0.1... done; XF86Setup takes care of the whole thing. Xinerama takes care of the rest; I updated to the latest version of Xinerama-aware Enlightenment, and it's fantastic. Read the Xinerama HOWTO for more info; multiple layouts are possible for starting X with or without dualhead/xinerama. Take one evening and you're set. Your mileage may vary, but you'll never go back to single head.
  • The deal is, right now a lot of technologies are being finalized - X4.0, DRI, agpgart, the framebuffer, GLX, and probably some I'm forgetting. Unfortunately, they don't all finish at the same rate. Right now, the biggest "hold-out" is 2.4, although they shouldn't rush it. Once all that is in place, GL should be a snap (emphasis on should). It's just a matter of time (yes, that's been said before, but it's a matter of a foreseeable amount of time now).
  • Actually, there's no reason you can't run DRI on a 2.2.x kernel. The review I wrote on AnandTech was running 2.2.16 from Red Hat. You just need to have agpgart for the kernel, which is a patch for kernels <= 2.2.17 and will be included in 2.2.18, I believe.

    But it's really not quite fair to complain about DRI and the other recent X technologies not working. In a lot of ways, they're still extremely "in development." Most things on Linux are this way: if you want to settle for technology that has been around for a while, everything works fine, but getting newer things working is a pain in the butt. Think USB, DRI, the bttv drivers (my card JUST got supported in the later 2.3.x kernels), my SCSI card (ugh), ...

    Anyway, point being, distributions will have this sorted out for you in their next revisions. Debian will probably take another two years, but it'll happen. :)
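  • Once the agpgart module mentioned above is loaded, the X-side half of DRI comes down to a couple of XF86Config-4 sections. The following is a sketch of what that typically looks like, not a known-good config for any particular card:

    ```
    Section "Module"
        Load  "glx"      # the GLX extension
        Load  "dri"      # the Direct Rendering Infrastructure
    EndSection

    Section "DRI"
        Mode  0666       # let all local users use direct rendering
    EndSection
    ```

    The 0666 mode is the permissive choice; tighten it (or restrict by group) on a multi-user box.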
  • I'm using a G200 right now because it has sync-on-green, and because there is a console driver in the kernel. This allows me to keep it at 1280x1024 all the time, even on the console, so I can use my surplus Mentor Graphics workstation monitor (which has only 3 BNC connectors, the sync is part of the "green" signal). But I do have two of those monitors. I wonder if I could install a second G200 and do dual-head with that...
  • The GeForce2 will kick the G400's ass all over the place in terms of performance. The G400 (don't buy the G450 if you can afford the G400MAX; the G450 is lower performance) has better image quality, but it really isn't noticeable unless you're running 1280x1024+. Also, unless you've got a really good monitor you won't notice it much at all. However, if you're not going to use 3D, the G400 is the way to go (unless you're running Linux, where the GeForce2 seems to be faster/less flaky). The G400MAX also tends to have better color quality in both normal running and 3D rendering. However, if performance is at all important, buy a GeForce2. You won't be disappointed with rendering quality or image quality. It's not as good as Matrox's, but it's easily a solid second, and the card offers a much better performance/quality ratio than the G400MAX.
  • Don't delude yourself. If you're going to spend good money on a video workstation, use NT. Linux might eventually be great for video, but right now the platform is just too immature and the programs aren't quite there yet. Your experience in NT will be a lot more stable, and you'll have a much more flexible environment (in terms of programs and all). Any CPU in the 450MHz+ range ought to do it, memory should be 256MB at least, and you need a fast hard drive. (A fast Ultra66 drive like IBM or Maxtor will be fine.) You'll need a FireWire card (maybe $150-$200 for a decent one) and around $700-800 for a good camera. (The Canon ZR10s are pretty good. If you can afford it, get a Sony. They're quirky (like 20-minute MiniDiscs as storage and an Ethernet interface instead of FireWire) and they're hideously expensive ($2.5K), but they have unbelievable image quality.)
  • It's a terrible idea because it breaks OpenGL compliance. Also, you don't put a windowing system into a graphics library; you do the reverse. The true solution is this: dump X. Dump the hell out of it. Develop Berlin. You've got your rotating, spinning, OpenGL-accelerated windows.
  • If you use entirely free software, you're an idiot. Stand up for something important. Save the environment, feed the poor. Software is software. Supporting it will just help people who have good jobs anyway, whether or not they get paid for it. If you're going to give up so much performance just for the sake of free software, then OSS is screwed. If OSS doesn't have to be held to the same standards of quality as everyone else, then OSS programs will suck. Use the best hardware in your price range and screw the open/closed-ness of the drivers. Look at it pragmatically, without religious fervor. Linux is an important platform for SGI/NVIDIA. They will support it for the foreseeable future. Given that, the open-ness of the drivers shouldn't matter.
  • I do want a merge. We don't have to call it OpenGL.

    But I definitely want a single "graphics context". It is insane that I have to use totally different code to draw a rectangle in my X window versus my OpenGL window. And for text, OpenGL has a way to specify a position and a color. I think it is entirely illogical that I cannot use these values to control how my text is drawn. Therefore any new way to add text should be added to OpenGL, or we should add *all* of OpenGL to the new "graphics context" that can draw this text.

    I would also like to see a high-speed way to draw an image projected through the current transformation. Using OpenGL textures works, but I think there is too much overhead because it assumes the texture will be reused. In the case of a movie or other CG image that will change, this is a waste.

    My reason for considering the windowing system a graphics operation is based on how NeWS worked: when I create a window I need a shape (path), a position (transformation), and a parent window. All of these are attributes of a "graphics context", and thus I again see no reason not to make "create a window" an operation in the graphics context.
