Fedora's OpenGL Composite Desktop 392
An anonymous reader writes "First we had Novell's XGL and Compiz technology, which allows for OpenGL-based composite rendering on the Linux desktop. Now Fedora has created the Advanced Indirect GL X project, which aims for similar desktop effects but with a simpler implementation. Sure, at the end of the day it's just eye candy, but make no mistake - the Linux desktop is due for a massive shake-up!"
What about X11 acceleration? (Score:4, Interesting)
Screenshots (Score:2, Interesting)
How can they talk about graphics advances without screenshots? I believe the term used these days is "TTIWWP".
-Jesse
Not again (Score:4, Interesting)
Why couldn't they just standardize on Xgl? It works *today*. AIGLX doesn't even support my nvidia card right now.
Interesting applications (Score:4, Interesting)
How about actually letting us use Scaled Windows (Score:3, Interesting)
DEVELOPERS PLEASE STOP WORRYING ABOUT THE EYE CANDY AND THINK ABOUT FUNCTIONALITY.
What? (Score:0, Interesting)
Hi. :)
While Linux eye candy is some of the sweetest in the market, IMHO, it's one of the reasons Linux will never be mainstream.
I don't understand what you mean here: our eye candy rules, so that's why we'll never be mainstream? Huh?
What we need is a concerted effort from our worldwide developers to create better interoperability with Microsoft's Active Directory structure and better hardware compatibility.
Re: Active Directory: This is happening with the new Samba work. It's all being worked on as we speak, and it's coming along nicely; I think the Samba team released a preview recently.
Re: Hardware support: I'm tired of hearing this. We create all the drivers we can for the hardware we have specs for. Better hardware support has to come in the form of vendors helping us with it. Very little of this has to do with the linux kernel team.
What's also missing is the "zero-user" configurability that Windows has, allowing any user to load and install any application or hardware accessory without needing to be a hardware tech. Linux needs to be engineered to be "smarter" for the casual office user.
No offense, but you don't work in the industry, do you? You don't go to a corporate office and see users installing any shit they want; you don't see them swapping out video cards or whatever. That sort of thing is useful for the home user, but it's no good to the corporate user. Also, hardware autodetection is handled by the distros, and I know Red Hat and Ubuntu do a pretty good job of it where the specs are properly released to the appropriate team. The casual office user doesn't install his own software; he uses what the admin put on there. Windows machines in a corporate environment are locked down hard. In this regard Linux is already set up nicely for a corporate environment and the casual office user.
Only after we solve the above issues and Linux becomes more mainstream on the corporate desktop should we worry about the eye candy factor.
If I had a dime for every time I heard someone say something like this. Did it occur to you that different programmers are good at different things? Leave them alone, man. When it comes to interoperability, it's a thousand times harder when you have to reverse engineer something to get it to play; you know MS doesn't want Linux to have good AD support.
Linux has problems, but none of them are really the above. Obnoxious, ignorant vendors are the biggest problem, everyone who doesn't program is clamoring for better support, blaming the linux devs for not properly supporting their pet problem. Better MS Office? MS's fault. Better hardware support? ATI/nVidia/Broadcom/Whoever's fault. Better hardware autodetection? Same as the previous crowd. Linux moves against the grain of the rest of the industry and thus has a harder time of it.
Competition done right (Score:2, Interesting)
Xgl Already Better than Mac and Vista (Score:4, Interesting)
The Xgl people are actually rewriting the X server from scratch to use OpenGL. That is a much, much better idea, and it shows in what they can *already* do:
* virtual desktops on a cube
* popup effect for menus
* "gummi-bear" window effect when moving, sticks to other windows / side of screen
* translucency
* gl screensaver on root window
* shadows
* fading
* magnification
* Apple-style Exposé (show all windows non-overlapping)
* accelerated 3D games (Quake) and movies
* make non-responsive windows go grey
etc
You can see the video at:
http://lists.freedesktop.org/archives/xorg/2006-J
(click link for the movie)
As I understand it, this uses an existing X server to provide an OpenGL window (which can be software OpenGL for unsupported cards), with their Xgl server using that as the OpenGL backend until the drivers are ready. Which basically means people will be able to get the eye candy gradually and force nvidia/ati/intel to support the server with a driver. Eventually Xgl gets a native OpenGL driver for your hardware and runs as a 'normal' X server (only without all the crap from 30 years of evolution).
Won't this action dilute the whole project ? (Score:2, Interesting)
Re:There go the distros again.. (Score:4, Interesting)
Let's see. The GLX_EXT_texture_from_pixmap extension was developed jointly by David Reveman and some guys from nVidia, according to the credits on the spec. So not Red Hat. Reveman and Matthias Hopf have been everywhere on the X/Mesa mailing lists developing Xgl. The discussion and debate on the xorg list was all about Xgl and whether it should be the main focus instead of Exa. People who don't seem to be associated with any corp like David Airlie and Jon Smirl have been working on Xgl. The plan had seemed to be to move various parts of the driver code to do with initializing the cards into the kernel, use EGL as a simple GL interface that Xgl then ran on top of, with Xglx being a short term hack until that work was completed.
Now Red Hat appear, apparently with the backing of nVidia, saying that actually this plan - which had been discussed for ages - is a bad one, and they have a brilliant new plan. Oh and by the way Evil Novell have been hoarding code and not working with the community.
So when did this AIGLX work appear in CVS then? I don't recall reading about any such branch. Let's find out [freedesktop.org] shall we? Hmm, looks like it was committed in a massive checkin about a month ago. Did Kristian just magic this out of thin air one afternoon? I rather hope not.
So anyway, my point is that from my perspective what Red Hat are saying appears to be the exact inverse of the truth. Novell have been far more visible in the X community doing this sort of work than Red Hat have, they've done a lot of the upstream Mesa work necessary for it to be efficient, they've been demoing it at conferences and so on. And now Red Hat is here trying to claim they went off and did their own thing, with no real evidence to back it up.
And it's not just Red Hat: somehow Novell went off and created an entirely new window manager while testing what Xgl could do, instead of extending an existing one. Oops! Bah. Huge, massive communications failure at best. Blatant NIH at worst.
OpenGraphics.org video card, open source driver (Score:3, Interesting)
Given the fact that the best open source drivers currently available are those for ATI cards, and that those cards are 15% as efficient in Linux as in Windows, the Open Graphics [slashdot.org] card will have no competition.
Re:Not again (Score:2, Interesting)
XGL, however, can also do accelerated XRender (using glitz), and it's more similar to Quartz. It is also open to future protocol improvements, e.g. adding some 3D API support to the X server. For the ol' xorg server (with XAA or EXA), it would mean lots of rewriting and either drastically improving the driver model (duplicating the GL API) or using GL and essentially becoming the same thing as XGL. Note that Nvidia is already doing a GL-based implementation of the old XFree driver model.
Still, a few important problems need to be addressed in XGL, like direct GL (at least fullscreen), multihead support, screen hotplugging, etc.
Is compiz the best approach? I don't really know. Maybe they should've modified metacity to do what they want (and what they want is primarily a good plugin system for 3D effects).
Halt the damn presses!! (Score:3, Interesting)
If my understanding is incorrect then by all means, enlighten me. If not, then please stop with the differing standards and approaches and embrace the fully functional system in existence today.
P.S. Nvidia will use what they have to. They support this approach because it requires less work on their part than XGL and therefore costs less money. Therefore, their opinion should be ignored and only the interests of the USERS should be considered.
Re:Xgl Already Better than Mac and Vista (Score:4, Interesting)
Some of the cruft:
* using the value of a constant in a comment (/* XLFD length is 255 */)
* form feeds (ctrl-l) in the code
* magic macros hiding many lines of code with side effects (BRESINCRPGON, for example), used to avoid slowdowns on CPU-drawn lines and paths. Nobody does this anymore: it's all accelerated, and anyway you'd use static inline (see the Linux kernel).
* massive argument lists (XRenderCompositeDoublePoly takes 12 arguments)
* massive #define of symbols, and thus massive switch statements (not in a table someplace!). try searching programs/xkbprint/psgeom.c for XK_ISO_Prev_Group_Lock.
* symbols artificially limited to 32 characters long because compilers back then were dumb.
* stupid implementation decisions; for example, inserting a 'fake' client request should be a one-liner (a la requests.addFirst(fakeRequest)) but is instead 64 lines because it actually puts the data into the stream being read from the client.
I mean seriously, you just open up *any* file in the Xserver and it's just crap. I don't mean to diss the developers because a) it's a somewhat large undertaking and b) they didn't have the advantages of hindsight and c) they were using slow hardware. Still, I bet the NeWS server was much better despite being made about the same time. Hopefully the device-dependent part will be done well enough that the Xserver can be rewritten in something modern (Java, ObjectiveC, even C++).
Re:Don't forget... (Score:3, Interesting)