Fedora's OpenGL Composite Desktop

An anonymous reader writes "First we had Novell's XGL and Compiz technology, which allows for OpenGL-based composite rendering on the Linux desktop. Now Fedora has created the Accelerated Indirect GLX (AIGLX) project, which aims for similar desktop effects but with a simpler implementation. Sure, at the end of the day it's just eye candy, but make no mistake - the Linux desktop is due for a massive shake-up!"
  • by Watson Ladd ( 955755 ) on Wednesday February 22, 2006 @01:33PM (#14777696)
    How does this relate to the ongoing accelerated X11 efforts?
  • Screenshots (Score:2, Interesting)

    by Enigma_Man ( 756516 ) on Wednesday February 22, 2006 @01:37PM (#14777719) Homepage

    How can they talk about graphics advances without screenshots? I believe the term used these days is "TTIWWP".

    -Jesse
  • Not again (Score:4, Interesting)

    by prockcore ( 543967 ) on Wednesday February 22, 2006 @01:40PM (#14777757)
    I'm not liking where this is headed. Now we've got Xgl, AIGLX, and whatever Luminocity used.

    Why couldn't they just standardize on Xgl? It works *today*. AIGLX doesn't even support my nVidia card right now.
  • by dtsazza ( 956120 ) on Wednesday February 22, 2006 @01:45PM (#14777792)
    It'll certainly be interesting to see effects like those of 3dDesk [sourceforge.net] become the norm rather than the exception. Also, anyone else who has played with it may have noticed that it essentially works with lowish-res images of the desktop rather than the windows and icons themselves - you can notice it in some of the modes, where there's a definite switch between the desktop itself and the image of the desktop (in both directions). Having something fully integrated will open up many new possibilities...

    ...and on the same note, it's a challenge to designers to use them in a truly worthwhile way. While I agree that eye candy does make a difference, it can also make a difference in a bad way when clueless designers turn to snazzy effects to make up for a lack of basic competence (viz. many, many webpages). It's the difference between
    "Let's make improvements to X - ooh, that new 3D stuff could help with that"
    and
    "Wow, that 3D stuff sure is snazzy! We'd best think up a way to get it into our next release."
    Now I'm certainly not saying that this is bad news - far from it, in fact - but I can imagine there'll be a temptation to use it at any cost (especially once it starts making its way into competing projects). Hopefully interface designers will embrace the new possibilities open to them and give us some genuinely useful/nice improvements.
  • by Anonymous Coward on Wednesday February 22, 2006 @01:45PM (#14777796)
    I am absolutely irate with respect to this project, and XGL, and even Expose on the Mac. Heaven forbid they let us scale a window and keep it scaled. I don't mean resize, I mean zoom in and out of ONE window. Imagine how useful that would be for tiling window managers: if we can scale windows, we don't need to worry about the app handling resizing right.

    DEVELOPERS PLEASE STOP WORRYING ABOUT THE EYE CANDY AND THINK ABOUT FUNCTIONALITY.
  • What? (Score:0, Interesting)

    by SalsaDoom ( 14830 ) on Wednesday February 22, 2006 @01:59PM (#14777903) Journal

    Hi. :)

    While Linux eye candy is some of the sweetest in the market, IMHO, it's one of the reasons Linux will never be mainstream.

    I don't understand what you mean here - our eye candy rules, so that's why we'll never be mainstream? Huh?

    What we need is a concerted effort from our worldwide developers to create better interoperability with Microsoft's Active Directory structure and better hardware compatibility.

    Re: Active Directory: This is going on with the new Samba work. It's all being worked on as we speak, and in fact it's coming along nicely; I think the Samba team released a preview recently.

    Re: Hardware support: I'm tired of hearing this. We create all the drivers we can for the hardware we have specs for. Better hardware support has to come in the form of vendors helping us with it; very little of this has to do with the Linux kernel team.

    What's also missing is the "zero-user" configurability that Windows has, allowing any user to load and install any application or hardware accessory without needing to be a hardware tech. Linux needs to be engineered to be "smarter" for the casual office user.

    No offense here, but you don't work in the industry, do you? You don't go to a corporate office and see users installing any shit they want; you don't see them swapping out the video cards or whatever. That sort of thing is useful for the home user, but it's no good to the corporate user. Also, hardware autodetection is handled by the distros, and I know Red Hat and Ubuntu do a pretty good job of it where the specs are properly released to the appropriate team. The casual office user doesn't install his own software; he uses what the admin put on there. Windows machines in a corporate environment are locked down hard. In this regard Linux is already set up nicely for a corporate environment and the casual office user.

    Only once we solve the above issues and Linux becomes more mainstream on the corporate desktop should we worry about the eye candy factor.

    If I had a dime for every time I heard someone say something like this. Did it occur to you that different programmers are good at different things? Leave them alone, man. When it comes to interoperability, it's a thousand times harder when you have to reverse engineer something to get it to play, and you know MS doesn't want Linux to have good AD support.

    Linux has problems, but none of them are really the above. Obnoxious, ignorant vendors are the biggest problem; everyone who doesn't program is clamoring for better support, blaming the Linux devs for not properly supporting their pet problem. Better MS Office compatibility? MS's fault. Better hardware support? ATI/nVidia/Broadcom/whoever's fault. Better hardware autodetection? Same as the previous crowd. Linux moves against the grain of the rest of the industry and thus has a harder time of it.

  • by thk ( 142232 ) on Wednesday February 22, 2006 @02:25PM (#14778101) Homepage
    Now we're seeing FOSS' killer application - mix-and-match modularity. Obviously RH needs to respond to Novell's efforts or potentially lose market/mind share. Because the different approaches are built on a truly open platform, you don't have to ditch your current environment from the hardware on up in order to get the solution that is right for you. Competition to fill niches exposed by open APIs works. Anyone can play. (And of course there's also the fact that someone can come along and distill the best of several solutions into a derivative FOSS work.) There's something quite satisfying about that, particularly in relation to much of the rest of the modern world.
  • by 0xABADC0DA ( 867955 ) on Wednesday February 22, 2006 @02:31PM (#14778148)
    I don't mean to troll, but what the heck is wrong with the Fedora-type people that they think incrementally improving the X server is a good idea? I've looked into the source and it's full of 30-year-old code. The 'best practices' for a 0.1 MIPS machine are just cruft on a 1000 MIPS one.

    The Xgl people are actually rewriting the X server from scratch to use OpenGL. That is a much, much better idea, and it shows in what they can *already* do:

    * virtual desktops on a cube
    * popup effect for menus
    * "gummi-bear" window effect when moving; sticks to other windows / side of screen
    * translucency
    * GL screensaver on the root window
    * shadows
    * fading
    * magnification
    * Apple-style Expose (show all windows non-overlapping)
    * accelerated 3D games (Quake) and movies
    * making non-responsive windows go grey
    etc.

    You can see the video at:
    http://lists.freedesktop.org/archives/xorg/2006-January/011922.html [freedesktop.org]
    (click link for the movie)

    This is, I think, using an existing X server to provide an OpenGL window - which can be software OpenGL for unsupported cards - and then running their Xgl server with that as the OpenGL backend until the drivers are ready. Which basically means people will be able to get the eye candy gradually and force nVidia/ATI/Intel to support the server with a driver. Eventually Xgl gets a native OpenGL driver for your hardware and runs as a 'normal' X server (only without all the crap from 30 years of evolution).
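    (A hedged aside for the curious: the key trick shared by the Xgl-style and AIGLX-style approaches is binding a redirected window's pixmap as a GL texture via GLX_EXT_texture_from_pixmap. The sketch below shows roughly what that looks like in C; the helper name bind_window_pixmap_as_texture is invented, error handling is omitted, and a real compositor would cache the fbconfig and keep the texture id around.)

    /* Hedged sketch: wrap a window's off-screen pixmap as a GL texture via
     * GLX_EXT_texture_from_pixmap. Not Xgl's or AIGLX's actual code; the
     * function name below is invented and error handling is skipped. */
    #include <X11/Xlib.h>
    #include <GL/gl.h>
    #include <GL/glx.h>
    #include <GL/glxext.h>

    static PFNGLXBINDTEXIMAGEEXTPROC glXBindTexImageEXTp;

    GLXPixmap bind_window_pixmap_as_texture(Display *dpy, int screen, Pixmap pix)
    {
        if (!glXBindTexImageEXTp)   /* look up the extension entry point once */
            glXBindTexImageEXTp = (PFNGLXBINDTEXIMAGEEXTPROC)
                glXGetProcAddress((const GLubyte *) "glXBindTexImageEXT");

        /* Pick an fbconfig whose pixmaps can be bound as RGBA textures. */
        const int fb_attribs[] = {
            GLX_DRAWABLE_TYPE,               GLX_PIXMAP_BIT,
            GLX_BIND_TO_TEXTURE_RGBA_EXT,    True,
            GLX_BIND_TO_TEXTURE_TARGETS_EXT, GLX_TEXTURE_2D_BIT_EXT,
            GLX_DOUBLEBUFFER,                False,
            None
        };
        int n = 0;
        GLXFBConfig *configs = glXChooseFBConfig(dpy, screen, fb_attribs, &n);

        /* Wrap the X pixmap in a GLX pixmap that can act as a texture. */
        const int pix_attribs[] = {
            GLX_TEXTURE_TARGET_EXT, GLX_TEXTURE_2D_EXT,
            GLX_TEXTURE_FORMAT_EXT, GLX_TEXTURE_FORMAT_RGBA_EXT,
            None
        };
        GLXPixmap glxpix = glXCreatePixmap(dpy, configs[0], pix, pix_attribs);

        /* From here the window contents are just another texture: draw it on
         * a quad, a cube face, scaled, faded, and so on. */
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glXBindTexImageEXTp(dpy, glxpix, GLX_FRONT_LEFT_EXT, NULL);

        XFree(configs);
        return glxpix;
    }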
  • by ravee ( 201020 ) on Wednesday February 22, 2006 @02:39PM (#14778221) Homepage Journal
    Why should Red Hat go and start a separate project to achieve more or less the same effects? Can't they work with the XGL project team and improve on the existing code? What good will reinventing the wheel or duplicating code achieve? Reading about this takes me back to the 80s, when UNIX was severely fractured and applications working on one Unix flavor would not run on another. Even though Red Hat is doing a good thing, it is actually taking a step back by forking the project.
  • by IamTheRealMike ( 537420 ) on Wednesday February 22, 2006 @04:16PM (#14779062)
    Blah. Is that so? How come, then, that the first the world heard of AIGLX was on OSNews, while I've been reading about XGL on the Xorg mailing lists and development forums for literally years? Red Hat may claim they've been luvvy-duvvy community huggers over this, but I've been watching the developments in X very carefully indeed, and XDevConf 2006 (!) is the first mention of it I saw.

    Let's see. The GLX_EXT_texture_from_pixmap extension was developed jointly by David Reveman and some guys from nVidia, according to the credits on the spec. So, not Red Hat. Reveman and Matthias Hopf have been everywhere on the X/Mesa mailing lists developing Xgl. The discussion and debate on the xorg list was all about Xgl and whether it should be the main focus instead of EXA. People who don't seem to be associated with any corp, like David Airlie and Jon Smirl, have been working on Xgl. The plan seemed to be to move the parts of the driver code that deal with initializing the cards into the kernel, and to use EGL as a simple GL interface for Xgl to run on top of, with Xglx as a short-term hack until that work was completed.
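    (As a hedged aside, the "EGL as a simple GL interface" piece looks roughly like the sketch below: get a display, pick a config, create a desktop GL context, make it current. The init_egl helper and the attribute choices are mine, not anything from the Xgl plan itself.)

    /* Hedged sketch: minimal EGL bring-up for a desktop GL context.
     * Values and structure are illustrative only. */
    #include <EGL/egl.h>

    int init_egl(EGLDisplay *out_dpy, EGLContext *out_ctx)
    {
        EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
        EGLint major, minor;
        if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, &major, &minor))
            return -1;

        /* Ask for an RGB config that can back a pbuffer and desktop GL. */
        const EGLint cfg_attribs[] = {
            EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
            EGL_SURFACE_TYPE, EGL_PBUFFER_BIT,
            EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
            EGL_NONE
        };
        EGLConfig cfg;
        EGLint n = 0;
        if (!eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &n) || n == 0)
            return -1;

        /* Desktop OpenGL rather than OpenGL ES. */
        eglBindAPI(EGL_OPENGL_API);

        /* A tiny pbuffer keeps the sketch self-contained; a real server
         * would render to the screen instead. */
        const EGLint pb_attribs[] = { EGL_WIDTH, 1, EGL_HEIGHT, 1, EGL_NONE };
        EGLSurface surf = eglCreatePbufferSurface(dpy, cfg, pb_attribs);

        EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, NULL);
        if (!eglMakeCurrent(dpy, surf, surf, ctx))
            return -1;

        *out_dpy = dpy;
        *out_ctx = ctx;
        return 0;
    }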

    Now Red Hat appear, apparently with the backing of nVidia, saying that actually this plan - which had been discussed for ages - is a bad one, and they have a brilliant new plan. Oh and by the way Evil Novell have been hoarding code and not working with the community.

    So when did this AIGLX work appear in CVS then? I don't recall reading about any such branch. Let's find out [freedesktop.org] shall we? Hmm, looks like it was committed in a massive checkin about a month ago. Did Kristian just magic this out of thin air one afternoon? I rather hope not.

    So anyway, my point is that from my perspective what Red Hat are saying appears to be the exact inverse of the truth. Novell have been far more visible in the X community doing this sort of work than Red Hat have; they've done a lot of the upstream Mesa work necessary for it to be efficient, they've been demoing it at conferences, and so on. And now Red Hat is here trying to claim they went off and did their own thing, with no real evidence to back it up.

    And it's not just Red Hat - somehow Novell went off and created an entirely new window manager to test what Xgl could do, instead of extending an existing one. Oops! Bah. Huge, massive communications failure at best. Blatant NIH at worst.

  • by billybob2 ( 755512 ) on Wednesday February 22, 2006 @04:44PM (#14779298)
    So what happened to the OpenGraphics.org [opengraphics.org] project that aimed to create a graphics card with fully published specs and open source drivers? If they actually get it working, there will be no more debate as to what video card Linux users (and manufacturers of Linux-capable computers) should buy.

    Given that the best open source drivers currently available are those for ATI cards, and that those cards run at only about 15% of their Windows performance under Linux, the Open Graphics [slashdot.org] card will have no competition.
  • Re:Not again (Score:2, Interesting)

    by octopus72 ( 936841 ) on Wednesday February 22, 2006 @05:03PM (#14779470)
    XGL has much more potential than accelerated indirect GL. A compositing manager on AIGLX can do the same as one working on GLX; the main thing here is good 3D driver support with the proper extensions.

    XGL, however, can also do accelerated XRender (using glitz), and it's more similar to Quartz. It is also open to future protocol improvements, e.g. adding some 3D API support to the X server. For the ol' xorg server (with XAA or EXA), that would mean lots of rewriting and either drastically improving the driver model (duplicating the GL API) or using GL and essentially becoming the same thing as XGL. Note that nVidia is already doing a GL-based implementation of the old XFree86 driver model.

    Still, a few important problems need to be addressed in XGL, like direct GL (at least fullscreen), multihead support, screen hotplugging, etc.

    Is Compiz the best approach? I don't really know. Maybe they should've modified Metacity to do what they want (and that is primarily a good plugin system for 3D effects).
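    (Another hedged sketch, for anyone wondering what a compositing manager does on the plain X side regardless of which GL route wins: it redirects top-level windows off-screen with the Composite extension and repaints when the Damage extension says something changed. This is not Compiz or Metacity code, just the general shape of the loop, with error handling and per-window bookkeeping left out; some_window in the final comment is a placeholder.)

    /* Sketch of the X side of a compositing manager: redirect all top-level
     * windows off-screen, then watch Damage events to know when to repaint.
     * This part is the same whether the GL work happens in an external
     * server like Xgl or via accelerated indirect GLX. */
    #include <X11/Xlib.h>
    #include <X11/extensions/Xcomposite.h>
    #include <X11/extensions/Xdamage.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        Window root = DefaultRootWindow(dpy);

        int comp_ev, comp_err, dmg_ev, dmg_err;
        if (!XCompositeQueryExtension(dpy, &comp_ev, &comp_err) ||
            !XDamageQueryExtension(dpy, &dmg_ev, &dmg_err))
            return 1;

        /* Redirect every child of the root so their contents go to
         * off-screen pixmaps instead of straight to the screen. */
        XCompositeRedirectSubwindows(dpy, root, CompositeRedirectManual);

        /* Track damage on the root; a real manager tracks each window. */
        Damage damage = XDamageCreate(dpy, root, XDamageReportNonEmpty);

        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == dmg_ev + XDamageNotify) {
                /* Acknowledge the damage, grab the window pixmap, and hand
                 * it to whatever draws the screen (GL, XRender via glitz, ...). */
                XDamageSubtract(dpy, damage, None, None);
                /* Pixmap pix = XCompositeNameWindowPixmap(dpy, some_window); */
            }
        }
    }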
  • by shaitand ( 626655 ) on Wednesday February 22, 2006 @06:18PM (#14780162) Journal
    We need to nip this in the bud right here. My understanding is that this approach will still allow the same eye candy but will lose the only REAL feature of XGL: a hardware-accelerated desktop. Some of you like the eye candy and transparent windows. That must be nice for you. The rest of us want a snappy and responsive desktop. XGL delivers that by hardware accelerating the entire X server.

    If my understanding is incorrect then by all means, enlighten me. If not, then please stop with the differing standards and approaches and embrace the fully functional system in existence today.

    P.S. nVidia will use what they have to. They support this approach because it requires less work on their part than XGL and therefore costs less money. Therefore, their opinion should be ignored and only the interests of the USERS should be considered.
  • by 0xABADC0DA ( 867955 ) on Wednesday February 22, 2006 @07:28PM (#14780740)
    It's disappointing news that they are not rewriting the whole thing, only the hardware-dependent code. I hope you are on crack saying that the rest is "relatively clean" (maybe relative to the device-dependent X code???).

    Some of the cruft:
    * using the value of a constant in a comment (/* XLFD length is 255 */)
    * form feeds (Ctrl-L) in the code
    * magic macros hiding many lines of code with side effects (BRESINCRPGON, for example)... to avoid slowdowns on CPU-drawn lines and paths. Nobody does this anymore; it's all accelerated, and in any case you'd use static inline (see the Linux kernel).
    * massive argument lists (XRenderCompositeDoublePoly takes 12 arguments)
    * massive #defines of symbols, and thus massive switch statements (not in a table someplace!). Try searching programs/xkbprint/psgeom.c for XK_ISO_Prev_Group_Lock.
    * symbols artificially limited to 32 characters because compilers back then were dumb.
    * stupid implementation decisions; for example, inserting a 'fake' client request should be a one-liner (a la requests.addFirst(fakeRequest)) but instead is 64 lines because it actually puts the data into the stream being read from the client.

    I mean seriously, you can open up *any* file in the X server and it's just crap. I don't mean to diss the developers, because a) it's a somewhat large undertaking, b) they didn't have the advantage of hindsight, and c) they were using slow hardware. Still, I bet the NeWS server was much better despite being made at about the same time. Hopefully the device-dependent part will be done well enough that the X server can be rewritten in something modern (Java, Objective-C, even C++).
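    (To make the "12 arguments" complaint concrete, this is roughly the shape of the call in question; the prototype below is reproduced from memory of libXrender's Xrender.h, so treat the actual header as authoritative.)

    /* The XRender polygon-compositing entry point, argument by argument.
     * Reproduced from memory; consult <X11/extensions/Xrender.h> to be sure. */
    #include <X11/extensions/Xrender.h>

    void XRenderCompositeDoublePoly(Display                   *dpy,        /* 1  */
                                    int                        op,         /* 2  */
                                    Picture                    src,        /* 3  */
                                    Picture                    dst,        /* 4  */
                                    _Xconst XRenderPictFormat *maskFormat, /* 5  */
                                    int                        xSrc,       /* 6  */
                                    int                        ySrc,       /* 7  */
                                    int                        xDst,       /* 8  */
                                    int                        yDst,       /* 9  */
                                    _Xconst XPointDouble      *fpoints,    /* 10 */
                                    int                        npoints,    /* 11 */
                                    int                        winding);   /* 12 */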
  • Re:Don't forget... (Score:3, Interesting)

    by zsau ( 266209 ) <slashdot@thecartographers.net> on Thursday February 23, 2006 @08:17AM (#14783588) Homepage Journal
    Actually, a half-decent serifed capital A will usually have curves around the serifs. You probably need 300 dpi before you can see it, though; it's one reason printed output is so much more of a delight to read than on-screen text.
