NVidia Releases Linux Drivers Supporting 4K Stacks

Supermathie writes "NVidia has finally released drivers for their chipsets and the 2.6 kernel that support 4K stacks. That means compatibility with Fedora Core 2 kernels, people! View the README, visit their driver page, or download the package."
  • Real Story... (Score:5, Insightful)

    by ThisNukes4u ( 752508 ) <tcoppi@@@gmail...com> on Saturday July 03, 2004 @12:42AM (#9597894) Homepage
    The real story is when they open the source to the drivers.
    • Re:Real Story... (Score:5, Insightful)

      by el-spectre ( 668104 ) on Saturday July 03, 2004 @12:47AM (#9597920) Journal
      No, that is another story.

      An even better story will be when folks realize that it is OK for the whole world not to agree with them on philosophy. Especially when those philosophies have economic ramifications.

      But I ain't holding my breath.
      • Re:Real Story... (Score:5, Interesting)

        by Anonymous Coward on Saturday July 03, 2004 @01:21AM (#9598054)
        Methinks the only real reason you'd want to keep your drivers closed off is because you're artificially handicapping your hardware to increase differentials between various (actually fairly identical) cards you've got on the market. Conspiratorial? Yes, but nothing that doesn't happen all the time.
        I wholeheartedly agree that closed-source code is appropriate for all manner of enterprises (and philosophically, I tend to look at executable code as an open, gloriously inaccessible book anyway). But closed-source device drivers? Just makes me wonder what they're hiding.
        • Re:Real Story... (Score:5, Interesting)

          by pnatural ( 59329 ) on Saturday July 03, 2004 @02:20AM (#9598244)
          As far as I remember, NVidia has maintained that some of the code in their drivers is licensed from a third party, and that the license does not permit source redistribution.

          Several things:

1. There really isn't a way to verify that the drivers actually ship with the third-party code; NVidia may be using the issue to squelch requests for open drivers.

2. It goes to show how the license of the code you use in your projects can have a detrimental impact on your future goals (or a beneficial one, depending on those goals, of course).

3. I think it's more likely that the driver sources are kept closed because there are some benchmark tricks, or worse, cheats.

        • by steveha ( 103154 ) on Saturday July 03, 2004 @02:43AM (#9598304) Homepage
          Methinks the only real reason you'd want to keep your drivers closed off is because you're artificially handicapping your hardware

          Um, no.

          0) nVidia might not own all the code they compile into their drivers. The license they have the code under might permit binary distribution, but not source.

          1) nVidia's drivers contain large amounts of software that is better than any of their competition. They spent money developing this, and they want to milk the competitive edge it gives them. And that is okay.

          2) nVidia has more control this way. The Firefox guys are holding control over their cool icons, because they don't want the cool icons slapped onto broken code; only Mozilla-official builds of Firefox get the cool icons. nVidia might want to be sure that no one runs with broken drivers, then thinks nVidia cards are all junk, when in reality some guy made a few "improvements" that broke things, and distributed the changed version anyway.

          3) Other reasons are possible. "the only real reason" my left foot.

          Personally, I would much much rather have FOSS drivers. But even more than that, I want drivers that work. I switched from a GeForce 4600 to a Radeon 9600 XT, and even though the Radeon is a much better card, it runs slower under Linux than the older GeForce. It's the drivers. ATI's Linux drivers for the 9600 XT are lame. I actually boot into Windows to play Unreal Tournament 2004, because the performance is so much better under Windows. When I had an nVidia card, my Linux 3D gaming performance was just fine.

If nVidia would make a programmable-shaders card that doesn't double as a space heater, I would probably buy it and replace the Radeon. I know that the Unreal Tournament guys check the server stats, and I want to be "voting" for Linux gaming, so I want them to see me running Linux when they check stats on the servers I have been visiting.

          steveha
          • nVidia might want to be sure that no one runs with broken drivers, then thinks nVidia cards are all junk, when in reality some guy made a few "improvements" that broke things, and distributed the changed version anyway.

NVIDIA could register a trademark for their official Open Source driver build and disallow the use of the trademark on builds which are modified. The Apache project does it that way: modified versions aren't "Apache" anymore.

1) nVidia's drivers contain large amounts of software that is better than any of their competition.
It is a driver. nVidia, ATI, Matrox, etc. act as if releasing the information needed to program their hardware would expose their precious designs to their competitors. Does any new Matrox board even work well in 2D in Linux anymore?

They, and by the same token wireless network chip makers, are a kind of counterpoint to the entire IC industry. Most semiconductor manufacturers freely give away information on how to use their product, even giving away free, non-obfuscated source code!

I really doubt the economic ramifications...
I agree with your main point that there are times when holding on to the source code is to be encouraged. On-line games are one instance where I hope that the company does keep a lid on the code. I don't even blame certain Monopoly Operating Systems for wanting to keep a close rein on their source; it is, after all, how they make money.

NVidia, on the other hand, makes money purely on hardware, and drivers are a sunk cost. They have to be available or their cards won't sell, and they have to be good or their cards will...
      • Re:Real Story... (Score:4, Interesting)

        by bit01 ( 644603 ) on Saturday July 03, 2004 @04:53AM (#9598680)

        I'm getting very sick of astroturfers trying to push their marketing drivel (straight out of South Park: "closed source is gooood") at the start of slashdot replies.

By definition, for the customer (us!), open source must provide at least all the options of closed source. All the grandparent did was highlight what is probably the most beneficial potential change for slashdot'ers. If NVidia had released the source as that poster suggested, the 4K problem probably would've been fixed within hours.

        ---

        It's wrong that an intellectual property creator should not be rewarded for their work.
        It's equally wrong that an IP creator should be rewarded too many times for the one piece of work, for exactly the same reasons.
        Reform IP law and stop the M$/RIAA abuse.

    • Hear, hear. And a big raspberry to whoever modded that legitimate comment a troll.
  • by Thaidog ( 235587 ) <slashdot753@@@nym...hush...com> on Saturday July 03, 2004 @12:43AM (#9597896)
    Ok... wtf is a 4k stack?
    • by Nermal6693 ( 622898 ) on Saturday July 03, 2004 @12:46AM (#9597914)
      K means Kelvin, a measure of temperature.
      4 K is very cold.
      A stack is a collection of pancakes.
      Therefore we're talking about frozen pancakes.

      In other words, I have no idea.
    • by rmull ( 26174 ) on Saturday July 03, 2004 @12:50AM (#9597925) Homepage
    • by Anonymous Coward on Saturday July 03, 2004 @12:51AM (#9597936)
It's an essentially obscure change they made in the 2.6 Linux kernel. The idea was that the smaller stack lets you run more threads and perform better under higher IRQ loads. In reality, since pages are 4KB anyway, and most processors not only swap but also cache memory in 4KB pages, if the stacks don't actually use more than 4KB there's no advantage to artificially limiting them; the other memory doesn't really even need to "exist." It also required rewriting and reworking lots of things, such as these NVidia drivers, that assumed the stack size would be much larger than 4KB.

You can turn off 4KB stacks and go back to the old 8KB behavior by recompiling the kernel with the proper option set, but Linux distros based on 2.6 all use (to the best of my knowledge) 4KB stacks by default.
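
To make that concrete, this is roughly what the switch looks like when you configure a 2.6 kernel yourself. I believe the option lives under "Kernel hacking" in menuconfig on x86, but search for 4KSTACKS to be sure, and double-check the name against your own kernel version:

      # In .config:
      CONFIG_4KSTACKS=y
      #   [*] Use 4Kb for kernel stacks instead of 8Kb   <-- the menuconfig prompt
      # Set it to n and rebuild to get the old 8KB stacks back.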
    • by jejones ( 115979 ) on Saturday July 03, 2004 @01:35AM (#9598116) Journal
OK... If you're a programmer, you know about stacks; they've been almost THE canonical way to allocate space for the broad family of "Algol-like languages" since the classic reference on implementing Algol 60. If you're not a programmer... you've seen those stacks of plates at cafeterias and restaurants, or of cups at the convenience store? The important property they have is LIFO (last in, first out). Think of each plate as a place where you can write some information. When a function runs, it grabs a few plates for the things it needs to remember. When it is finished, it puts the plates back (you can only take an analogy so far, of course; if you put your plates back right away at the cafeteria, you'd gross people out). As long as there are enough plates left, it doesn't care who else called it, or how many callers came before it. All it needs to know is to go to the stack and get the number of plates it needs.

      When you make a system call, it typically executes on its own stack, separate from the one you get for user state. The question is, how big should that stack be? It constrains how deeply nested you can get into function calls while in system state and how much space they can chew up for local variables. Until recently on Linux it's been 8K bytes (think 8192 plates), but they switched over to 4K, only half as much space (or half as many plates).

Some drivers, as written, count on having that whole 8K of space to play with, and they have to be rewritten. Since nvidia provides neither an Open Source driver nor sufficient information to allow anyone else to write one, that means we have to wait until they deign to make that change. Fortunately, they've now gotten around to it.
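
To make the breakage concrete, here is a hypothetical sketch in plain C (not taken from any real driver; the function names are made up) of the kind of code that dies when the kernel stack shrinks from 8K to 4K:

      #include <string.h>

      /* Hypothetical driver routine with a large on-stack buffer. */
      static int example_handler(void)
      {
              char scratch[6000];     /* fits in an 8KB kernel stack, but
                                         overflows a 4KB one before doing
                                         any real work */
              memset(scratch, 0, sizeof(scratch));
              return 0;
      }

      /* In kernel code the usual fix is to move the buffer off the stack:
       *     char *scratch = kmalloc(6000, GFP_KERNEL);
       *     ...
       *     kfree(scratch);
       */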
  • by Crasoum ( 618885 ) on Saturday July 03, 2004 @12:43AM (#9597899) Journal
    I will miss thee.
  • Yippee!!! (Score:5, Funny)

    by CliffH ( 64518 ) <cliff.hairston@g m a i l . com> on Saturday July 03, 2004 @12:44AM (#9597906) Homepage Journal
Now I can get my ass kicked in Enemy Territory under Fedora Core 2. I was missing that, but for some reason I was getting so much more work done. :)
    • Re:Yippee!!! (Score:2, Informative)

      by Doogie5526 ( 737968 )
No such luck, man. I recompiled the kernel myself, so I've been playing ET for a while. The problem is that ET Pro sees the newer glibc in Fedora as a hack, so I get kicked from every server I join (for cheating). It has been fixed in the unstable branch, but even after it reaches stable, each server owner will need to install the update (I don't expect that to happen soon). What a pain in the ass.

I know ET Pro is an add-on for ET, but it seems like every server uses it.

  • The Best Test (Score:5, Informative)

    by DeadBugs ( 546475 ) on Saturday July 03, 2004 @12:46AM (#9597915) Homepage
    For me the best way to test these new drivers is to play Enemy Territory [3dgamers.com]

    One of the best online FPS games and it's free-as-in-beer.

    Keep up the good work NVIDIA.
  • by maizena ( 640458 ) on Saturday July 03, 2004 @12:49AM (#9597923)
It seems that this driver's OpenGL headers are a little buggy, but the solution was given by an NVIDIA employee in this [nvnews.net] thread on the nvnews.net forum.
  • by Thagg ( 9904 ) <thadbeier@gmail.com> on Saturday July 03, 2004 @12:50AM (#9597928) Journal
    I've been testing these drivers under Fedora Core 2 for a while, and they appear to work flawlessly.

    Thad
  • by crow ( 16139 ) on Saturday July 03, 2004 @12:51AM (#9597933) Homepage Journal
    For people who are building home theater PCs for things like MythTV, this is a major step forward. The last release that supported overscan (so that a TV image doesn't have black stripes on the sides) was many releases back (version 4363). This release not only supports Linux 2.6 with 4K stacks, but has overscan and interlace support, making it ideal for TV and HDTV display.
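
If you're building one of those boxes, the knob to look for is a Device-section option along these lines (name and value range as I remember them from the driver's README, so verify against the README for your version before trusting it):

      Option "TVOverScan" "0.8"    # 0.0 = no overscan ... 1.0 = maximum overscan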
  • by Anonymous Coward
Are there any video card manufacturers left who release anything other than binary-only drivers?
    • by Bootsy Collins ( 549938 ) on Saturday July 03, 2004 @02:33AM (#9598281)

Are there any video card manufacturers left who release anything other than binary-only drivers?

Matrox releases open-source drivers for some of their product lines (e.g. the Millennium G series: G400, G450, G550, etc.). The mga driver that comes along with X is the same as Matrox's, for that reason. And 2D performance under the open-sourced Matrox drivers is actually pretty damned good. This all sounds great, doesn't it? Unfortunately, Matrox's Linux support sucks, and support for Matrox from the DRI project is fairly nonexistent right now. So if you do have any problems with the driver, or want to get 3D/DRI/hardware acceleration issues solved, you're gonna have to learn to hack the drivers/kernel modules yourself. Good luck.

  • by Xpilot ( 117961 ) on Saturday July 03, 2004 @12:57AM (#9597965) Homepage
    ...with the latest 2.6 kernels, simply turn off 4K stacks. But hey, now it's not necessary. Yay.

4K stacks are a good thing: a first step toward Linux supporting an insane number of simultaneous processes on the system.

  • by Anonymous Coward
    nVidia has finally released drivers for their chipsets and the 2.6 kernel that support 4K stacks...

    I don't know about you guys, but I think having the source code to recompile it manually would help out immensely.

It's funny when you think about why hardware companies like to keep their source code secret (i.e. you get only the drivers, not the source). If they claim that someone may use it for some unfit purpose, then the question is: if someone has the source code without the hardware, isn't it inherently useless?

    • by Graelin ( 309958 ) on Saturday July 03, 2004 @01:58AM (#9598181)
      I don't know about you guys, but I think having the source code to recompile it manually would help out immensely.

      That's funny, I don't.

First, fixing this stack size problem is not a simple re-compile of the same code. Depending on how the driver is written, it could well be a non-trivial task.

Second, even if you had the source, that does not mean that you could distribute a fixed version. Open source != Free Software.

Third, they may be closed source drivers, but they are miles ahead of the current FOSS drivers. The zealots can run their "pure" systems and suffer graphics glitches and poor 3D performance. I'd rather just use something that works. If that meant sticking with my old kernel a bit longer, then so be it.

they just don't want to fork it over because somehow you may "magically" make the component up yourself out of basement and not have to buy it.

Not you - their competition. ATI has always been plagued by crap drivers. If ATI had a peek at how NVidia does it, you can be sure they'd take something away from it. NVidia would lose a competitive advantage. The GPU war is nasty. The competition is killer; they'll take any advantage they can get.

  • Further Testing (Score:3, Interesting)

    by dangerz ( 540904 ) <<ten.soidutsadlit> <ta> <ffuts>> on Saturday July 03, 2004 @01:29AM (#9598094) Homepage
I think I'll wait for this to be tested for more than 24 hours before I try my hand at it.
I hope ATI [ati.com] can catch up and compete, because their current Linux drivers are terrible. I am disappointed. :(
  • ...is another important thing :) Finally, proper 32bit ioctls and libraries, no more mixing 32/64 bit releases and trying to use indirect rendering.

Btw, that was done for DRI drivers quite a while ago; talk about the usefulness of having access to the source code. And no, they aren't that useless: you can still play UT2004 with them, although it won't look as good (and I didn't notice much difference, except for performance, in ET; btw, for some reason, my FX5200 is _way_ slower while playing on radar/battery...)
  • by This is outrageous! ( 745631 ) on Saturday July 03, 2004 @02:05AM (#9598199)
    Thank you nVidia. Now could you
    P L E A S E
    compile those drivers for us PowerPC owners [petitiononline.com] who also pay for the cards?

    It's not like nobody can do it... [apple.com]

    Thank you.

    • Hear hear - if the PowerPC systems had decent 3D video support for linux, I'd be running linux on mac now...
    • by NanoGator ( 522640 ) on Saturday July 03, 2004 @04:26AM (#9598604) Homepage Journal
      "P L E A S E
      compile those drivers for us PowerPC owners who also pay for the cards?"


      Oooo ooo ooo can I be the first to be modded up for saying "Just another reason to switch to Windows!"..?
    • by Sycraft-fu ( 314770 ) on Saturday July 03, 2004 @04:49AM (#9598669)
Not trying to knock you, but please realise that you are in a very small minority. Most people in the world use x86 systems. Just a fact of life, for better or worse. It's over 90%, in fact. Now when you break down x86 users, you find that, for desktops, it is again severely one-sided, with most people using Windows. Again, we are talking over 90%.

Hence Linux support is kind of thin at this point; it's just a smaller market than Windows. However, some companies, like nVidia, feel that there is enough of a market to warrant writing drivers for, to increase sales. Remember: this is a company, they don't do things for the good of humanity, they do things to make money.

So let's take the Mac now, being the only real PPC platform that would use nVidia cards. What percentage of computers are Macs is something of a dispute, but it's between 3-5%. Then you consider that most Mac users don't run Linux. It's VERY rare, in fact, since one of the reasons most Mac users buy Macs is for MacOS. It is certainly under 5%, and probably under 1%.

So, even using optimistic numbers, you are talking 0.25% of the market, and realistically it's probably more like 0.05% or less.

Now on top of that, second-hand sales of Mac graphics cards are pretty low. Since they are special, and aren't compatible with normal off-the-shelf PC cards, you don't see a lot of them sold. What you buy with a Mac is what you have for the life of that Mac in most cases. That means there isn't a big incentive to get you to switch to nVidia cards: you either got one with the Mac, or you didn't. You aren't likely to change later, so there is no profit motive for nVidia.

So you have a very small percentage of computer users who aren't likely to change cards after purchase, and who use a different processor architecture (and hence require more programming and testing). Not really a ripe market for a driver port.

You have to understand that the x86 Linux market is populated by a high number of DIY computer builders. Those people can be, and are, swayed to certain hardware by the availability of non-suck drivers. Thus it is in nVidia's financial interest to make drivers for them, though they are a small market segment. The PPC Linux market is not capable of DIYing and is less likely to change to a new card because of it. Also, it is a much smaller market. Thus it is NOT in nVidia's financial interest to make a driver for it.

When you deal with corporations, at least ones of any decent size, you always have to remember that it is money they care about, not humanity. They do things because those things make them money, or get them good press, which leads to more money. Not because they are for the good of humanity.
  • ATI (Score:5, Informative)

    by daemonc ( 145175 ) on Saturday July 03, 2004 @02:48AM (#9598328)
    Let's hope ATI follows suit.

It took two third-party patches and a recompile to get their driver to install on Fedora Core 2, and it still crashes WineX.
  • by iwbcman ( 603788 ) on Saturday July 03, 2004 @05:22AM (#9598755) Homepage

I must admit I am a bit surprised that SLASHDOT didn't pick up on it. It might just be a little insignificant thing which doesn't warrant much attention anyway; who knows. Of course everyone is mentioning the support for 4K stacks, and of course this is important. Anyone who has used Andrew Morton's patch set knows what a PITA this issue was. But nvidia did even more than fix the single most blocking issue regarding their drivers and the 2.6.x kernels.
    They also:
    Added support for ACPI
    Fixed problem that prevented 32-bit kernel driver from running on certain AMD64 CPUs.

    Added support for GLSL (OpenGL Shading Language).
along with the new nvidia-settings utility, GPL'ed and written in GTK2...
    and finally they added:
    Added a new Xv adaptor on GeForce4 and GeForce FX which uses the 3D engine to do Xv PutImage requests.
Now I am not an expert on such things; 25 years of experience and I am still left asking more questions than I can answer. But I noticed this little innocuous "Xv" thing and was like WOW, cool. I leave it up to those who know more to shoot me down, but doesn't this little "Xv" thing mean that all those Linux users who use nvidia GeForce4 and FX cards just got a tremendous boost when doing much of anything with video? After all, Xv is what all of the video players under Linux use for good quality full-screen video (mplayer, xine, totem, gxine, helixplayer, etc.).
Now if I understand this correctly, every time a PutImage() request comes along under Xv it is handed over to the 3D engine, automatically. It seems as if this would be a very, very significant reduction in CPU usage, particularly for older generation (PII/PIII) machines which happen to have fairly modern graphics cards. Full-screen DivX under mplayer with the new drivers uses 12% CPU on average on my machine. I unfortunately did not run a benchmark to test this, but if my memory serves me correctly, that is significantly less than what it was with the older drivers.
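If you want to check what the driver is actually exposing, the stock xvinfo utility that ships with X lists every Xv adaptor (the adaptor names vary by card and driver version, so I won't guess at the output here):

      $ xvinfo | grep -i adaptor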
Now the downside to this, at least for the time being, is that some apps don't quite work with these new changes: Xine and its siblings (totem, gxine, kxine, etc.).
But I assume these will be fixed pronto.
Well, where am I going with this train of thought?
Putting this kind of support for Xv in the NVIDIA drivers is really simple for the NVIDIA guys, perhaps even trivial, but it can mean a tremendous improvement for the users of these cards. NVIDIA has always treated Linux like a second class citizen, but hey, who can complain; at least they acknowledge that Linux exists. Compared to the BSDs, Linux support is great... of course, only if you are using x86 CPUs. Now everyone knows that the graphics workstation market has all but disappeared. But what if NVIDIA were to decide to really take advantage of the X11 windowing system and its features?
Imagine if NVIDIA would actually provide good RENDER support. Wow, what a difference that would make for 2D desktop support, particularly under GNOME, which uses RENDER extensively in VTE/PANGO (i.e. why text scrolling in gnome-terminal is so abysmal). I am still stumped by the fact that the open-source X11 "nv" driver supports RENDER far, far better than NVIDIA's own in-house drivers...
Imagine if NVIDIA would really support the libXfixes, libXdamage and libXcomposite extensions which are currently being developed at Xorg-X11. Sun's Looking Glass is already using libXdamage and libXfixes; I got it up and running on my machine yesterday, and yes, it is still pre-alpha, but I have never, ever seen such a fluid desktop environment. This tech is almost *evil*; the promise it presents is simply baffling, making every previous X11 windowing experience look like the stone age. I don't really care that much about Looking Glass: if NVIDIA properly supports these X11 extensions, we will have cairo-enabled desktops inside of the next year which will fundamentally alter the X11 experience for X users.
Ok. So here is the point of this little essay: if NVIDIA would simply...
    • Snipped from the driver's README:

      Option "RenderAccel" "boolean"

      Enable or disable hardware acceleration of the RENDER
      extension. THIS OPTION IS EXPERIMENTAL. ENABLE IT AT YOUR
      OWN RISK. There is no correctness test suite for the
      RENDER extension so NVIDIA can not verify that RENDER
      acceleration works correctly. Default: hardware
      acceleration of the RENDER extension is disabled.

Personally I haven't noticed any difference, but then I've got some AGP issues, so YMMV.
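
For anyone who wants to try it anyway, it's a one-line addition to the Device section of xorg.conf; standard driver-option syntax, but since it's marked experimental, keep a backup of your working config:

      Section "Device"
          Identifier "nvidia0"        # use whatever Identifier you already have
          Driver     "nvidia"
          Option     "RenderAccel" "true"
      EndSection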

  • Not all is perfect (Score:4, Interesting)

    by MasterVidBoi ( 267096 ) on Saturday July 03, 2004 @12:23PM (#9600248)
There are still serious bugs left over from the previous revision, which was six months ago. That's a bit long to wait. While GLSL and 6800 support is nice, an interim bugfix wouldn't be unwelcome...

    A problem that leaves the console framebuffer blank after X is started remains. You need to work around it by adding
    Option "IgnoreDisplayDevices" "TV"
    to your xorg.conf. If you are actually using TV out, this could be a bit annoying.

Even worse, it's been less than 24 hours since I installed them, and these drivers have already hung X twice. When an OpenGL process segfaults, it goes into state D (uninterruptible sleep) and becomes completely unkillable, along with X itself. I haven't figured out how to reboot cleanly once this happens. All I can do is ssh in, sync the disks, and hit the power button.
  • by Velex ( 120469 ) on Saturday July 03, 2004 @05:38PM (#9601936) Journal
I'm fully aware of the licensing issues and the whole PITA that doing that would involve for them, but here's the thing:

    Right now, I have to "dual-boot" my X depending on whether I want good RENDER performance or want to run OpenGL stuff. My webpage has a theme I really like that my boyfriend made. The background is an animated GIF of rain falling. I'll get 100% CPU usage on my Athlon XP 1400+ and my browser will become practically unresponsive using the "nvidia" driver, but when I switch over to the open source "nv" driver, it does maybe 15% CPU usage -- just like in Windows.
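
For anyone wondering what "dual-booting" X amounts to in practice, it's presumably just swapping the Driver line in the Device section of xorg.conf and restarting X (a sketch, assuming a standard single-card setup; the Identifier is whatever your config already uses):

      Section "Device"
          Identifier "card0"
          Driver     "nvidia"    # closed driver: good OpenGL, poor RENDER for me
          # Driver   "nv"        # open driver: good RENDER, no hardware OpenGL
      EndSection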

Mesa is absolutely unacceptable for doing 3D graphics. Even the simple shooter I'm working on, called "Blammo" for the time being, will chug at about 5 fps under "nv."

    Now, if only we could bring the features of the "nv" driver and the "nvidia" driver together.

    I think the main problem with "linux being ready for the desktop" (as though it isn't -- all that linux really lacks is the ability to twist the arms of OEMs) is that if you want to use certain hardware, you can't get optimal drivers. This is, of course, a vicious circle, because NVidia could fix the problem I have in the "nvidia" driver tomorrow if they wanted, but they won't, because the target market is too small to waste their time.

    I might be willing to pay $300 for a brand-spanking-new ubervideocard once the X drivers get fixed, but there are also about 300 other people willing to do the same so long as the Windows drivers stay working.

Perhaps the solution, therefore, is to change the license on the "nv" driver so that NVidia can use the code that's already out there. It makes the authors of the "nv" driver saints, NVidia stays an evil corporation, I get Windows-like performance out of my hardware in X, and everyone's happy.

Say "twenty-three-skiddoo" to logout.

Working...