Fedora's OpenGL Composite Desktop

An anonymous reader writes "First we had Novell's XGL and Compiz technology, which allows for OpenGL-based composite rendering on the Linux desktop. Now Fedora has created the Accelerated Indirect GLX (AIGLX) project, which aims for similar desktop effects but with a simpler implementation. Sure, at the end of the day it's just eye candy, but make no mistake - the Linux desktop is due for a massive shake-up!"
  • "Just eyecandy" (Score:5, Insightful)

    by Tim C ( 15259 ) on Wednesday February 22, 2006 @01:32PM (#14777693)
    I spend upwards of 10 hours a day staring at a computer screen; what I'm looking at had better be aesthetically pleasing.

    It *does* serve a purpose - it makes my day that little bit more enjoyable. Decorating your house serves no real purpose (unless you're trying to sell it), but most people want something a little nicer than bare walls. People decorate their cubicles and offices - a photo here, a plant there.

    I don't see why a desktop should be any different.
    • by Jordan Catalano ( 915885 ) on Wednesday February 22, 2006 @01:40PM (#14777749) Homepage
      Work's work. If you could dictate the aesthetics of your work environment, I bet you'd have quite a different set of coworkers.
      • True, but that doesn't mean you shouldn't make the best of what you have.
        • So what happened to the Open Graphics [] project that aimed to create a graphics card with fully published specs and open source drivers? If they actually get it working, there will be no more debate as to what video card Linux users (and manufacturers of Linux-capable computers) should buy.

          Given the fact that the best open source drivers currently available are those for ATI cards, and that those cards are 15% as efficient in Linux as in Windows, the Open Graphics [] card will have no competition.
      • Work's work. If you could dictate the aesthetics of your work environment, I bet you'd have quite a different set of coworkers.

        Right, but I don't have to sit in my coworkers' cubicles, nor do I have to use their desktop scheme.

        Every company I've worked for has let its employees decorate their cubicles. Chances are, if you go through any office that has geeks, you'll see Dilbert, Penny Arcade, and various other cartoons printed out, along with posters, toys, plants, and god knows what else.

        This usually makes pro
    • Since you got the first comment, I assume you waste those 10 hours reloading Slashdot
    • Re:"Just eyecandy" (Score:5, Informative)

      by ajs ( 35943 ) <ajs.ajs@com> on Wednesday February 22, 2006 @02:12PM (#14778006) Homepage Journal
      Advantages of an OpenGL desktop:
      • Zooming in on any part of the desktop while still being able to interact with it (for the visually impaired, this is HUGE)
      • Hardware support becomes much simpler.
      • Window manager interaction involves less CPU
      • SVG rendering can go straight to OpenGL for better rendering performance
      • Support for alpha blending (for PNG, anti-aliasing, gimp, etc.) is much lower overhead and easier to support universally
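      The alpha-blending item is the heart of compositing. As a rough illustration (not code from AIGLX or any real compositor), the Porter-Duff "over" operator that a composite manager applies when stacking a translucent window on the desktop looks like this:

```python
# Porter-Duff "over": composite a translucent window pixel onto the desktop.
# Colors are premultiplied-alpha (r, g, b, a) tuples with components in [0, 1].
def over(src, dst):
    """Blend src over dst; both premultiplied RGBA."""
    src_alpha = src[3]
    return tuple(s + d * (1.0 - src_alpha) for s, d in zip(src, dst))

# A 50%-opaque red window pixel over an opaque blue desktop pixel:
window = (0.5, 0.0, 0.0, 0.5)   # red at alpha 0.5, premultiplied
desktop = (0.0, 0.0, 1.0, 1.0)  # opaque blue
print(over(window, desktop))    # -> (0.5, 0.0, 0.5, 1.0)
```

      With premultiplied alpha the blend is one multiply-add per channel, which is exactly the kind of work a GPU does for free on textured quads - hence the low-overhead claim above.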

      • Advantages of an OpenGL desktop:

        * Zooming in on any part of the desktop while still being able to interact with it (for the visually impaired, this is HUGE)

        This doesn't require OpenGL, a 2D compositing desktop does it fine.

        * Hardware support becomes much simpler.

        Simply untrue; witness the dearth of OpenGL drivers out there for Linux, BSD, et al. Manufacturers are unwilling to release 3d specs for their cards, which makes hardware support much more complex.

        * Window manager interaction involves less CPU
        * SVG r
      • Zooming in on any part of the desktop while still being able to interact with it (for the visually impaired, this is HUGE)

        Did anyone else find this punny? =P
      • Don't forget... (Score:3, Insightful)

        by CarpetShark ( 865376 )
        Resolution independence!! It's really time we got past this idea that 1024x768 is an optimal resolution due to web site design or the lowest common denominator or because some people can't see too well. If my new monitor can draw the curve of an "A" at 300dpi, then that's what I want to see it at, dammit. Sticking with 96dpi or similar is just dumb.
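        What resolution independence means in practice: sizes get specified in points and converted to pixels using the monitor's real DPI, so text stays the same physical size on a 96dpi and a 300dpi display. A hedged sketch (hypothetical helper, not from any toolkit):

```python
# Resolution independence: specify sizes in points (1/72 inch) and convert to
# device pixels with the monitor's actual DPI, instead of assuming 96dpi.
def points_to_pixels(points, dpi):
    return points * dpi / 72.0

# A 12pt glyph stays the same physical size on any screen:
print(points_to_pixels(12, 96))   # -> 16.0 pixels on a 96dpi display
print(points_to_pixels(12, 300))  # -> 50.0 pixels on a 300dpi display
```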
    • Not only does it make the desktop easier on the eyes, these rendering enhancements also increase usability, and allow for all sorts of new UI enhancements. Just being able to drag a window without causing a bunch of redraws is long overdue. The ability to do nice transparent, on-screen displays to show a variety of types of information will open up a lot of possibilities. Even shadows, as eye candy as they are, help the eye distinguish between windows in a very simple way. Things like live thumbnails (i
    • If aesthetics were important, then a better place to start might be having standards for the look and feel of X applications and then making Linux distributions adhere to them. Right now my X desktop looks like every application was designed and written by a completely different group of people (which of course they were, but it shouldn't be visible). This doesn't just apply to looks. It'd be nice if there were some commonality between the user interfaces too. This is far more important, to me, than a bunch o
      • Re:"Just eyecandy" (Score:3, Informative)

        by Hes Nikke ( 237581 )
        if you want a consistent look and feel, you're on the wrong platform, buddy. Linux distros will *NEVER* have a unified look. Mac OS X otoh already does have a unified look and feel. (though the trend is in the wrong direction - Brushed Metal, Unified Toolbar, Pro, Smooth Metal, etc)
  • by Watson Ladd ( 955755 ) on Wednesday February 22, 2006 @01:33PM (#14777696)
    How does this relate to the ongoing accelerated X11 efforts?
    • by anandrajan ( 86137 ) on Wednesday February 22, 2006 @01:45PM (#14777790) Homepage
      From the FAQ: How is this different than XGL?

      "XGL is a different X server. This is a more incremental change which is slated to become part of Xorg. We don't believe that replacing the entire X server is the right path, and that improving it incrementally is a better way to modernize it. After talking to people at xdevconf, it felt like much of the upstream Xorg community shares this view. You can search Adam Jackson's notes [] for "large work for Xgl" to get the blow-by-blow or NVidia's presentation from XDevConf 2006 [] on using the existing model.

      We've been working on the AIGLX code for some time with the community, which is in direct contrast with the way that XGL was developed. XGL spent the last few months of its development behind closed doors and was dropped on the community as a finished solution. Unfortunately, it wasn't peer reviewed during its development process, and its architecture doesn't sit well with a lot of people.

      The other question is Wait, can I use compiz? The answer there is a theoretical yes, although no one has actually gotten it to work. We love compiz and we think it's great stuff and is well polished, but it's often confused with the underlying architecture of XGL. Much like the code that we've added to metacity, compiz is a composite manager. With a bit of work, it should be possible to get compiz working on this X server. There's an excellent post from Soren [] on the topic of compiz vs. metacity."

  • OpenGL a big win (Score:5, Insightful)

    by andrewzx1 ( 832134 ) on Wednesday February 22, 2006 @01:35PM (#14777709) Homepage Journal
    Having increased OpenGL support for Linux and gathering development support for advanced graphics toolkits will be a big win for the Linux desktop. Having a sexy and slick interface has helped make OS X very popular. Sexy graphics for Linux will open new possibilities for interfaces, data display, games, and more.

    Let us pay homage to Silicon Graphics, the originators of OpenGL. They may not live out the year.
    • Having increased OpenGL support for Linux and gathering development support for advanced graphics toolkits will be a big win for Linux desktop.

      Yes. Now all we need is a modern graphics card with good open-source drivers.

      I currently own a Radeon 9250, which is about the fastest graphics card that you can get with workable 3d open-source drivers for Linux -- and even those are quirky. If a vendor would just produce one solid set of drivers for an up-to-date product, I'd buy it. Yes, I know about Nvidia's b
  • Screenshots (Score:2, Interesting)

    by Enigma_Man ( 756516 )

    How can they talk about graphics advances without screenshots? I believe the term used these days is "TTIWWP".

  • Not again (Score:4, Interesting)

    by prockcore ( 543967 ) on Wednesday February 22, 2006 @01:40PM (#14777757)
    I'm not liking where this is headed. Now we've got Xgl, Aigl and whatever Luminocity used.

    Why couldn't they just standardize on Xgl? It works *today*. Aigl doesn't even support my nvidia card right now.
    • Re:Not again (Score:5, Informative)

      by Erwos ( 553607 ) on Wednesday February 22, 2006 @01:43PM (#14777776)
      Your nVidia video card doesn't support Aigl, you mean. It's missing an extension that nVidia is adding in the next driver release. This is hardly a show-stopper. Indeed, from the article, nVidia seems to believe this is the way to go.

    • Re:Not again (Score:5, Informative)

      by cortana ( 588495 ) <`ku.gro.stobor' `ta' `mas'> on Wednesday February 22, 2006 @02:21PM (#14778063) Homepage
      Please visit resentations.html [] and check out NVIDIA's presentation, "Using the Existing XFree86/X.Org Loadable Driver Framework to Achieve a Composited X Desktop":
      In this paper, we make the case for using the existing XFree86/X.Org DDX loadable driver framework to achieve a production-quality composited X desktop, as opposed to the X-on-OpenGL model. While the X-on-OpenGL model demonstrates what the graphics hardware is capable of, everything that the X-on-OpenGL model can achieve is equally possible with the current framework. Furthermore, the current framework offers flexibility to driver developers to expose vendor-specific features that may not be possible through the X-on-OpenGL model.
  • video card support? (Score:3, Informative)

    by atarione ( 601740 ) on Wednesday February 22, 2006 @01:42PM (#14777770)
    I'm as excited as the next guy about a new sexy UI for Linux.... and certainly Apple's rather posh desktop has helped them out..... but how excited should we be when clearly only a small number of video adapters currently work?

    Video card status

    Here is the current status as far as we know. We also intend to release driver updates in the yum repository as we get those cards to work. If your card isn't supported, come back later to see if we've added support. Note that this support status only affects new functionality; everything should work as well as it did before with the compositing manager disabled. Success and failure updates to this page are welcome.

    Known working
    * ATI: Radeon 7000 through 9250 (r100 and r200 generations)
    * Intel: i830 through i945

    Occasionally / possibly working
    * Intel: i810. Should work but not tested.
    * 3dfx: Voodoo3 through Voodoo5. Might need NV_texture_rectangle emulation.

    Known to not work
    * ATI: Radeon 9500 through X850 (r300 and r400 generations). Some issues with rectangular textures may be fixed in new DRM CVS; need to verify.
    * ATI: Rage 128. Looks like a driver locking issue.
    * ATI: Mach64. No DRM support in Fedora, still insecure.
    * Matrox: MGA G200 to G550. Needs at least a driver update to fix DRI locking. PCI cards probably have other issues as well.
    * nVidia: Any. No open DRI driver. Closed driver support coming soon, though.
    * 3dfx: Voodoo 1 and 2. No DRI driver.
    * ATI: Radeon 8500 through X850 with the closed fglrx driver. Uses an ancient version of the DRI driver API that can't work with the new driver loader. No ETA on closed driver support.
    * Anything without a free 3D driver.

    Unknown status
    * VIA, S3 Savage, SiS. No intrinsic reason why these wouldn't work, as far as we know, but no one has tested them yet.
  • by dtsazza ( 956120 ) on Wednesday February 22, 2006 @01:45PM (#14777792)
    It'll certainly be interesting to see effects like those of 3dDesk [] become the norm, rather than the exception. Also, if anyone else has played with it, they might have noticed that it essentially works with lowish-res images of the desktop rather than the windows and icons themselves - you can notice it in some of the modes; there's a definite switch between the desktop itself and the image of the desktop (in both directions). Having something fully integrated will open up many new possibilities... ...and on the same note, it's a challenge to designers to use them in a truly worthwhile way. While I agree that eye candy does make a difference, it can also make a difference in a bad way when clueless designers turn to snazzy effects to make up for a lack of basic competence (viz. many many webpages). It's the difference between
    "Let's make improvements to X - ooh, that new 3D stuff could help with that"
    "Wow, that 3D stuff sure is snazzy! We'd best think up a way to get it into our next release."
    Now I'm certainly not saying that this is bad news, far from it in fact, but I can imagine there'll be temptation there to use it at any cost (especially once it starts making its way into competing projects). Hopefully interface designers will embrace the new possibilities open to them and give us some genuinely useful/nice improvements.
  • by Anonymous Coward on Wednesday February 22, 2006 @01:45PM (#14777796)
    I am absolutely irate with respect to this project, and XGL, and even Exposé on the Mac. Heaven forbid they let us scale a window and keep it scaled. I don't mean resize, I mean zoom in and out of ONE window. Imagine how useful that would be for tiled window managers; if we can scale windows, we don't need to worry about the app handling resizing right.

  • hmmm.....would that necessarily be a good thing?
  • The article mentions that XGL made some architectural decisions that some people disagree with, and that AIGLX is a more "incremental" design. Does this mean that AIGLX is Yet Another Extension Bolted On to X?

    Unfortunately, although I've picked apart many XFree86 device drivers, I don't know very much about the architecture of X and X servers. Could someone give a thumbnail sketch of the issues at stake, and the tradeoffs?


  • by TheNetAvenger ( 624455 ) on Wednesday February 22, 2006 @02:05PM (#14777944)
    A good step, but not the end game...

    The project has a good concept model, not to destroy XWindows with a rewrite; however, this will considerably limit any real advancement into a comprehensive environment.

    I see this as more of a test bed and a partial stepping stone; however, there are many issues not being addressed that just need to be ripped apart and rethought, and this CAN be done without destroying the existing environments.

    Part of the problem of bringing any 3D GPU functions to the desktop is the nature of video cards: they are designed to operate in a 2D accelerated mode and a full 3D accelerated mode, with the two modes not normally mixing.

    What this leads to is an environment that mimics the 2D acceleration features in the 3D mode, and turns the video card to 3D mode full time.

    Strangely, what will help this push for full-time 3D utilization, or cross utilization, is work being done at a company people really don't like: Microsoft.

    Microsoft is pushing both ATI and NVidia to move their driver technology to allow for overlapping of the two operational modes, and also adding virtualization of the GPU RAM space - the WDDM/LDDM that will ship with Vista, which will be the first consumer OS with a full-time 3D-accelerated, accessible UI environment active.

    Also, by virtualizing the GPU RAM, Vista drivers (WDDM) are pushing the cards to pull off some interesting tricks, like pushing lower-priority applications' video data to system RAM without out-of-memory considerations - just like virtual memory on the hard drive did years and years ago - leaving a full 3D environment and the 'appearance' of GPU RAM continually available to applications no matter how many remain active.

    Video RAM in the old days basically meant having enough RAM to hold the resolution and depth of the screen you were displaying, but in the 3D world, GPU RAM is filled with textures and the like - so this mixing and virtualization process has been a long time coming, and surprisingly, Microsoft is the company helping NVidia and ATI get it working at the driver level.
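    As a loose illustration of that virtualization idea (the names and numbers here are made up - this is not WDDM code), evicting least-recently-used textures from video RAM to system RAM works just like paging:

```python
from collections import OrderedDict

# Hypothetical sketch of GPU-memory virtualization: when video RAM fills up,
# the least-recently-used texture is evicted to system RAM, much as virtual
# memory pages to disk. Names and sizes are illustrative only.
class VirtualVram:
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.resident = OrderedDict()  # texture name -> size (MB), LRU order
        self.system_ram = {}           # textures evicted to system RAM

    def use(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)      # mark as recently used
            return
        self.system_ram.pop(name, None)          # page back in if evicted
        while sum(self.resident.values()) + size_mb > self.capacity:
            lru_name, lru_size = self.resident.popitem(last=False)
            self.system_ram[lru_name] = lru_size  # evict LRU to system RAM
        self.resident[name] = size_mb

vram = VirtualVram(capacity_mb=64)
vram.use("desktop", 32)
vram.use("quake_textures", 32)
vram.use("video_frame", 16)          # forces eviction of "desktop"
print("desktop" in vram.system_ram)  # -> True
```

    The application still "sees" its texture; it just lives in slower memory until used again, exactly the trick the comment above describes.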

    Now for the good news: Microsoft has been generous to ATI and NVidia in the driver development process, and in doing so has given both companies a lot of information and technology they would not have had access to from the multi-app OS environment viewpoint.

    So all the cool new functions of the WDDM that is being developed for Vista should eventually flow back through both NVidia and ATI and their own driver technologies for supporting these concepts in other OS environments.

    However, as I said at the start and still believe, this technology from the article, and even going to a full OpenGL desktop, is not a complete answer. A full OpenGL desktop will be problematic when you want to run a 'windowed' version of Quake, for example, as the application will be expecting to have full control of the OpenGL/GPU and not expecting first priority to go to the desktop environment.

    So getting to the full OpenGL desktop is going to break a lot of existing 3D applications in the *nix/OpenGL world, or a technology to bridge this is going to have to come about - a technology that perhaps draws on information from ATI, NVidia, and even Microsoft to emulate what Vista is pulling off.
    • "A full OpenGL desktop will be problematic when you want to run a 'windowed' version of Quake, for example, as the application will be expecting to have full control of the OpenGL/GPU and not expecting first priority to go to the desktop environment."

      This is wrong. OpenGL is actually better suited for this than Direct3D, since OpenGL has a client/server architecture. OS X proves this - it is easily possible to play in windowed mode with no slowdowns. So, no 3D apps get broken. D3D needs to be redesigned for this; in short, it needs a similar client/server architecture.
      • This is wrong. OpenGL is actually better suited for this than Direct3D, since OpenGL has a client/server architecture. OSX proves this - it is easily possible to play in windowed mode with no slowdowns. So, no 3D apps get broken. D3D needs to be redesigned for this, in short it needs a similar client/server-architecture. I wouldn't be surprised to see Direct3D 10 heading in this direction.

        Actually OpenGL isn't better suited to this, in fact this is a concept it tends to omit completely.

        Secondly, you are not
  • One nice piece of eye candy, which would be useful too, would be the ability to have different backgrounds for each of my workspaces. Why has this never been implemented? CDE has this! I read about all these efforts to implement complex eye candy, but simply having different backgrounds for each workspace would, I believe, be relatively easy to implement. I am using Gnome here.
  • by skryche ( 26871 ) on Wednesday February 22, 2006 @02:07PM (#14777957) Homepage
    Having watched the movies, I am greatly unimpressed. The reason the Mac UI works so well is that its eyecandy is a method of subtly including information that might otherwise be lost. For instance, when you minimize a window in MacOS (if I remember correctly), it slides down to a nice little parking place on the dock. In the first movie, the minimized document shrinks down in a nifty animation but shows no relationship between it and the button at the top of the screen. The second movie solves this problem (so why even have the first) but is slow (can you imagine minimizing eight windows? What a mess!).

    Similarly, in the third example -- what information is being given to the user by fading the menus? I'm not sure what it is; instead, it just looks messier, and therefore less useful.

    A side note: I knew this whole "No! Vorbis is the format! OGG is just the container" idea would bite me on the ass some day, and it looks like today's the day. I clicked on the movie links only to have my Winamp playlist destroyed. Even worse, Winamp didn't even know how to play the file. Is there a solution to this absurd problem?
    • You are right, it is annoying. I'd have done it differently - .oga for Ogg Audio files (right now, Ogg Vorbis), and .ogv for Ogg Video. Mind you, Audio/Video are not tied to Vorbis/Theora, but these extensions would be much easier to understand. Also, decent DirectShow Theora decoders are still missing :(
    • Many players will accept ".oga" for Ogg audio files, and ".ogv" for Ogg video files. Similarly, the ".wma" and ".wmv" extensions, or ".m4a" and ".m4v", are also commonly supported.
    • This is simply a technology preview. In order to allow for usable and intuitive visual effects, the engine needs to be ready for it. This simply shows that the engine is getting there.

      It is very unlikely that the effects in the video will show up in their current state in a release.
    • The reason the Mac UI works so well is that its eyecandy is a method of subtly including information that might otherwise be lost.

      Your point is well-taken, but I'd suggest you sit down with a copy of OS X 10.0 sometime. The eye-candy was pretty unsubtle back then. The refinement present in Tiger took Apple several years to get right. XGL is not yet a year old. Give it some time to mature.
  • by penrodyn ( 927177 )
    It's amazing that when Vista has new eye candy it's bad, but when Linux has it, it's good!?
    • The difference? (which I assume is confusing you)

      Linux (or even *BSD) doesn't require you to install the eyecandy and gives you choice as to which EyeCandy implementation you can use. I don't think anyone is arguing that EyeCandy is bad. I happen to like it on my desktop and I'm a command line junkie. I just object to having it forced on me if I don't want it but do want the latest in high performance computing under the hood.

      So to respond... You missed the point.
    • by WhiteWolf666 ( 145211 ) <> on Wednesday February 22, 2006 @02:39PM (#14778227) Homepage Journal
      Vista's eye candy has tremendous system requirements. On X, if you can offload window operations to an OpenGL compositor, you save a significant number of CPU cycles.

      Yes, there is a difference. Take a look at your system. Turn off NVIDIA's custom render accel, and watch X's CPU usage while moving windows around, or resizing, or scrolling.

      Install XGL, or this new Fedora thing.

      Play a video on X, run a background compilation process, and then resize your video window. It'll stutter like mad. Try the same thing on XGL; it's fluid. Watch all the fluid animations, and watch what happens to your CPU usage. With any accelerated video card (even an ancient POS like Intel's i810, or a Radeon 7500+, or an older low-end GeForce) you'll see negligible CPU impact.

      Contrast that with Vista's requirements for the full "Aeroglass" experience. You can do the same thing on XGL at a far, far lower cost of system resources.

      One approach makes your computer faster. The other requires a faster computer. Understand?
  • Cool! Now Linux desktops can be as annoying as Windows XP.

    No, wait.

    Cool! Now Linux desktops can compete with Windows XP.

    No, wait... how am I supposed to feel about this again?

  • by thk ( 142232 )
    Now we're seeing FOSS's killer application - mix-and-match modularity. Obviously RH needs to respond to Novell's efforts or potentially lose market/mind share. Because the different approaches are built on a truly open platform, you don't have to ditch your current environment from the hardware on up in order to get the solution that is right for you. Competition to fill niches exposed by open APIs works. Anyone can play. (And of course there's also the fact that someone can come along and distill the best
  • by 0xABADC0DA ( 867955 ) on Wednesday February 22, 2006 @02:31PM (#14778148)
    I don't mean to troll, but what the heck is wrong with the Fedora-type people that they think incrementally improving the X server is a good idea? I've looked into the source and it's full of 30-year-old code. The 'best practices' for a 0.1 MIPS machine are just cruft on a 1000 MIPS one.

    The xgl people are actually rewriting the X server from scratch to use opengl. That is a much, much better idea, and it shows with what they can *already* do:

    * virtual desktops on a cube
    * popup effect for menus
    * "gummi-bear" window effect when moving, sticks to other windows / side of screen
    * translucency
    * gl screensaver on root window
    * shadows
    * fading
    * magnification
    * apple-style expose (show all windows non-overlapping)
    * accelerated 3d games (quake) and movies
    * make non-responsive windows go grey

    You can see the video at: nuary/011922.html []
    (click link for the movie)

    This is, I think, using an existing X server to provide an OpenGL window (which can be software OpenGL for unsupported cards), with their Xgl server using that as the OpenGL backend until the drivers are ready. This basically means people will be able to get the eye candy slowly, and force nvidia/ati/intel to support the server with a driver. Eventually Xgl gets a native OpenGL driver for your hardware and runs as a 'normal' X server (only without all the crap from 30 years of evolution).
    • Xair can already run on bare hardware without another X server (based on 30 year old code) running underneath it.
    • by be-fan ( 61476 ) on Wednesday February 22, 2006 @03:05PM (#14778469)
      XGL isn't a rewrite of the server. It's a rewrite of the DDX (device-dependent) portion. That's probably the best part of the server to rewrite, though, given that the DIX (device-independent) part is relatively clean code. XGL doesn't get rid of X's cruftiest part, though, which is Xlib. XCB is ready to be a replacement, but GNOME won't be able to move to it until 3.x, because Xlib is implicitly a part of the GDK ABI.

      That said, I wouldn't say XGL is better than OS X yet. OS X can do the effects you listed; it just doesn't do a lot of them, for aesthetic reasons. Technically, I'd argue OS X's approach is superior to XGL's, since Quartz 2D Extreme uses a direct-rendering model as opposed to XGL's indirect model. Additionally, the fact that the compositor is separate from the window server in XGL makes synchronization a much bigger PITA than in OS X. On the other hand, the indirect model allows the X server to access the geometry stream, which allows some effects the direct-rendering model doesn't. Technical merits aside, OS X still wins because it's already a stable, mature, and widely used technology. It'll be a while before XGL is as mature as Quartz (especially at the driver layer - DRI is really not ready for XGL yet), and before GNOME/KDE apps use vector graphics as widely as OS X apps do.
      • by 0xABADC0DA ( 867955 ) on Wednesday February 22, 2006 @07:28PM (#14780740)
        That's disappointing news that they are not rewriting the whole thing, but only the hardware-dependent code. I hope you are on crack saying that the rest is "relatively clean" (maybe relative to the device-dependent X code???).

        Some of the cruft:
        * using the value of a constant in a comment (/* XLFG length is 255 */)
        * form feeds (ctrl-l) in the code
        * magic macros with many lines of hidden code side effects (BRESINCRPGON, for example)... to avoid slowdowns on cpu-drawn lines and paths. Nobody does this anymore. It's all accelerated, and anyway they use static inline (see the linux kernel).
        * massive argument lists (XRenderCompositeDoublePoly takes 12 arguments)
        * massive #define of symbols, and thus massive switch statements (not in a table someplace!). try searching programs/xkbprint/psgeom.c for XK_ISO_Prev_Group_Lock.
        * symbols artificially limited to 32 characters long because compilers back then were dumb.
        * stupid implementation decisions; for example, inserting a 'fake' client request should be a one-liner (a la requests.addFirst(fakeRequest)) but instead is 64 lines because it actually puts the data into the stream being read from the client.

        I mean seriously, you just open up *any* file in the Xserver and it's just crap. I don't mean to diss the developers because a) it's a somewhat large undertaking and b) they didn't have the advantages of hindsight and c) they were using slow hardware. Still, I bet the NeWS server was much better despite being made about the same time. Hopefully the device-dependent part will be done well enough that the Xserver can be rewritten in something modern (Java, ObjectiveC, even C++).
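        To illustrate the "massive argument list" complaint, here's a hypothetical before/after refactor (the bodies and names are invented; this is not the real XRender signature): bundling related parameters into one structure keeps call sites readable.

```python
from dataclasses import dataclass

# Before: a long positional argument list, in the style being criticized.
# The body is a stand-in for the real rendering work.
def composite_poly_v1(dpy, op, src, dst, fmt, xsrc, ysrc, xdst, ydst,
                      npoints, points, winding):
    return (op, npoints)

# After: related parameters grouped into one request structure.
@dataclass
class CompositeRequest:
    op: str
    src: str
    dst: str
    points: list
    winding: int = 0

def composite_poly_v2(dpy, req: CompositeRequest):
    return (req.op, len(req.points))

req = CompositeRequest(op="over", src="win", dst="root",
                       points=[(0, 0), (10, 0), (10, 10)])
print(composite_poly_v2("display", req))  # -> ('over', 3)
```

        The struct-style call also lets you add fields later without touching every caller, which is much harder with a fixed 12-slot signature.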
  • Why should Red Hat go and start a separate project to achieve more or less the same effects? Can't they work with the XGL project team and improve on the existing code? What good will re-inventing the wheel or duplicating the code achieve? Reading about this takes me back to the 80s, when the UNIX OS was severely fractured, with applications working on one Unix flavor not running on another flavor of Unix. Even though Red Hat is doing a good thing, it is actually taking a step back by forking the proje
  • Not impressed (Score:3, Insightful)

    by grendelkhan ( 168481 ) <scottricketts&gmail,com> on Wednesday February 22, 2006 @03:46PM (#14778833) Journal
    I've been running Xgl and Compiz on Ubuntu, and I have to say, the Novell guys are way out in front of Fedora on this; Xgl is ready for primetime and runs nearly flawlessly for me. This looks more like sour grapes over Novell holding onto Xgl until nearly the last minute before opening it up to the community. While I don't agree with how Novell developed this, it's hard to argue with the product.
  • One up Novell? (Score:3, Insightful)

    by Ensign Nemo ( 19284 ) on Wednesday February 22, 2006 @05:48PM (#14779853)
    My take on this is Redhat doesn't like that Novell got all the press and kudos for Xgl and is trying to get mindshare back.

    Reasons for my viewpoint:

    1) I prefer Redhat over Suse. (This isn't an ego post about me, so hear me out.) I use both, but of the two I like Redhat better. I've had bad luck with Suse, and Novell seems to be having trouble turning into an open-source/Linux company. We use Groupwise at work, and Evolution, and Suse, and have problems. So given a choice I'll take Redhat, since I've had good luck with them. However, after reading about Novell's Xgl contributions and checking them out, my impression of Novell has greatly improved. I'm definitely much more open-minded now about them than before. Redhat has always had the reputation as the commercial distro that gives back to the community. Now with Novell's contributions, Redhat has contribution competition (if that makes any sense). They are no longer THE company when it comes to good karma in the community. Another company has given back a HUGE contribution, and a VERY visible one at that. Now, if a person who has stated his bias towards Redhat has given second thoughts to Novell, what is a person who has no bias or preference either way likely to think?

    2) They're not contributing to Xgl; rather, they came up with their own way and specifically stated it is different than Xgl.

    3) They make specific points about doing it 'upstream', which resurrects the flame wars on the xorg mailing list about in-house vs inet cvs development.

    4) They specifically mention how their approach is better than Novell's and how Novell's 'doesn't sit well with a lot of people.'

    My humble opinion. Don't get me wrong, I still like Redhat but in this case I think this is more for PR good than community good.
  • by shaitand ( 626655 ) on Wednesday February 22, 2006 @06:18PM (#14780162) Journal
    We need to nip this in the bud right here. My understanding is that this approach will still allow the same eye candy but will lose the only REAL feature of XGL: a hardware-accelerated desktop. Some of you like the eye candy and transparent windows. That must be nice for you. The rest of us want a snappy and responsive desktop. XGL delivers that by hardware-accelerating the entire X server.

    If my understanding is incorrect, then by all means, enlighten me. If not, then please stop with the differing standards and approaches and embrace the fully functional system in existence today.

    P.S. Nvidia will use what they have to. They support this approach because it requires less work on their part than XGL and therefore costs less money. Therefore, their opinion should be ignored and only the interests of the USERS should be considered.
  • by Brain_Recall ( 868040 ) <> on Wednesday February 22, 2006 @06:36PM (#14780343)
    Yes, I'm using XGL, right at this very moment. I'm running an Ubuntu beta release, DapperFlight4, to which Compiz and XGL have been installed. The forum post on how to get it installed is here: 7 []

    It has also been reported to be working under Breezy Badger, but I'm not sure.

    And let me say, it's damn slick. Not everything is working (or at least not enabled by default), such as transparency, and the top and bottom of the desktop cube are simply white. I'll try to figure out if they're broken or disabled. But everything else is working.

    Performance isn't the best. There's some lagginess to DVDs, but only minor, and even less than expected when doing a wobbly-window move.

    As a plug for Ubuntu, this is by far the best distro I have played with. Every other time I have tried to get myself onto Linux, I ran into immovable roadblocks. This thing (a damn BETA release!) boots up first try with all hardware detected and running (even my Dell-supplied Broadcom wireless NIC). Then I go install the nVidia 3D driver and an experimental windower, and stuff works perfectly. Honestly, I don't think it could get much better than this.
