Thinking About Desktop Eyecandy 338

An anonymous reader writes "This article ponders whether the excess eye candy and special effects being incorporated into the desktop are a good trend after all. The author explains why he thinks users are being taken for a ride by the OS companies, which compel them to upgrade their hardware in order to enable these processor-intensive and memory-hungry special effects."
This discussion has been archived. No new comments can be posted.

  • I'm all for it... (Score:5, Insightful)

    by Megaweapon ( 25185 ) on Thursday March 23, 2006 @04:28PM (#14982947) Homepage
    ...as long as everything is configurable. The minute something becomes distracting I should be able to disable it. Forced fancy effects that do nothing but distract you and spin away CPU cycles are a waste.
    • by Bobke ( 653185 )
      Indeed, all I want is the same desktop that I have right now, but with the rendering done by my 6800GT, effectively saving me CPU cycles.
      I have run the "wobbly windows" XGL thing on my machine, and dragging windows in it IS a lot less CPU intensive (from 50% down to about 15% CPU, though I have HT enabled).
    • Disable vs Remove (Score:5, Interesting)

      by TiggertheMad ( 556308 ) on Thursday March 23, 2006 @05:06PM (#14983327) Journal
      ...you made a good point that eye candy is OK, provided that it can be turned off. However, you may not realize that it isn't always that simple.

      Let's take a hypothetical situation. Let's say I write a UI that uses a 3d API to render the desktop. (We will call this supposed UI 'SparrowGrass' so we have a name to work with.)

      Using 'SparrowGrass' I can enable all sorts of 'spiffy-wa' (as my console gamer friend calls them) hardware accelerated effects, such as dynamic shadows, translucency, and such. But because it expects a 3d card with a good T&L chip in the machine, it will run like a dog without one.

      So either because I find such 'spiffy-wa' effects morally offensive when I am trying to remotely reconfigure a DC, or because I lack the latest 3d card, I choose to disable the fancy 3d features of 'SparrowGrass'. However, I am still using a 3d API to render the desktop.

      If you look at one very famous company's 3d API, printing text to the screen involves rendering a couple of polygons and basically texture mapping the text onto them. Even when you have turned off the 'spiffy-wa' features, you are still going to take a hit for using the 3d API in the first place.

      It seems unlikely that there will be two sets of .dlls supported, one providing a 3d based desktop and another providing a 2d desktop.
      • From what I've read, Vista will have three modes: powerful graphics adapter, average/low-powered graphics adapter, and "no 3d adapter." Instead of running like a dog, your interface should just have alternate modes that do not require a 3d T&L card to run properly...

      • They don't need to. The '3d' desktop you talk about is very different from the accelerated 2d desktop that most vendors are trying to sell. Just because a 2d desktop is rendered using OpenGL, it doesn't mean that client applications need to know that anything at all has changed. It's very possible to have a separate 2d API, with a 3d API only for features that require 3d graphics.

        A feature like zooming can be implemented in either 2d or 3d space. It'd be more efficient to run a zoom on a 3d accelerated
    • Forced fancy effects that do nothing but distract you and spin away CPU cycles are a waste.
      You mean things like Clippy [wikipedia.org]?

      *scnr*
  • Well duh. (Score:2, Insightful)

    by beavis88 ( 25983 )
    Do people honestly believe that consumers are the ones who benefit most from a new operating system?
    • Do people honestly believe that consumers are the ones who benefit most from a new operating system?

      Consumers are their own worst enemy. The reason everyone is marketing eye candy is because that is what people want (or think they want). Companies are smart - they try to sell what the consumer is going to pay for. The problem is that consumers don't make smart buying decisions. SUVs sell like crazy in the middle of the city for crying out loud. They buy what looks cool.

      Don't blame the companies - blame
    • Yep. The fact that Debian, as a successful non-commercial distro, has taken a different approach, where you can just update rather than installing this mythical "new operating system" that companies like MS call the upgrade from 95 to 98 or 2000 to XP, is evidence of the BS going on. In Debian, the releases are essentially just a distributable snapshot on physical media, for people who can't stay up to date in other ways.

      That model of having an OS and then upgrading to a "different" OS won't last, I suspect.

    • Not directly, but they do benefit from software written to exploit it.

      No one buys a Mac (or upgraded to OS X 10.3) for the Dock and Expose, they buy it for iLife, which in turn depends on CoreImage, CoreVideo, etc.
  • by Anonymous Coward
    I don't mind home users buying this, but why do companies?

    It bugs the hell out of me that a Select-licensed Windows Server CD comes with eye candy switched on (OK, it's not much, but it's a server!)

    why???
    • Actually, it doesn't come with it switched on. I run Server as my desktop OS, so I know this. You have to start the Themes service and you have to configure the video card to "max acceleration" (it is set to no acceleration on Server SKUs by default). You often have to set things like the image acquisition service and others to automatic if you have scanners and the like. All of these things are off on the server SKU...
  • Fat Eye (Score:2, Funny)

    by Nosklo ( 815041 )
    I think my eye has Diabetes [wikipedia.org], so I will pass on Vista, and take Xgl Light please.
  • No thanks. (Score:2, Interesting)

    by Anonymous Coward
    I don't care about eye candy on the desktop. Here's why:

    I've played WoW for a year now on a Windows 2000 Professional box. As we all know, Windows 2000 is about as bland a desktop as has ever existed. I was getting 90 to 100 FPS in WoW and I was happy with it.

    Recently, I was forced to upgrade to Windows XP because an application bombed out when trying to install on W2K. Now, I get 30-40fps. After turning off all the XP eye candy, I get 40fps steady.

    • Re:No thanks. (Score:3, Informative)

      by jekewa ( 751500 )
      I'd look at the in-your-face, under-the-hood, and behind-the-back services before blaming the eye candy. Unless by eye candy you mean the instant messenger, anti-virus, QuickTime and Office quickstart programs, and the other notifications you have in your icon notification area. Nearly all of them are tied to a running program, stealing cycles from your system. Also look at the services started on your behalf, including the built-in (and arguably nearly useless) firewall and other security checkers.
  • But it sure does help with the overall experience. For example, in what little spare time I have, I like making images with POV-Ray. Whenever I want to try out something new, it sure helps to have the code open in a Mozilla browser window underneath the terminal window that I'm typing into. Also, I don't have much screen real estate.
  • Is negative and eventually detracts from the user experience.

    But there are certain effects that complement the OS and do serve a purpose. In OS X, when a window is minimized and you get the "genie" effect, notice how it minimizes to the point where the minimized window will reside? It leads you back, so that you can remember where it went.

    -- Jim http://www.runfatboy.net/ [runfatboy.net]
  • OS X has loads of eye candy. The obvious benefit is that users get more feedback on their actions. This means fewer calls to tech support, because it is obvious to the user whether they are taking the right actions or not. For example, when a program crashes in OS X there is a spinning beachball, and when a program launches the dock icon bounces.

    The hidden benefit is that much of the eye candy in OS X is very soothing. It makes it easier to get work done when you have a soothing background and your actions o
    • On a cognitive level I think a lot of the OS X effects help the user understand exactly what is happening in computerland.

      An example of this is Expose, how the windows nicely slide and resize, making it obvious what is going on. The animation here is not really necessary (think windows alt-tab) but it certainly helps. Another example is the fast user switching feature. Rather than simply flipping to the other desktop leaving the user wondering where they are and how they got there, it does a nice rotatio
      • The average user would never find Expose... you have to set it up in the control panel before you can use it anyway (and newbies would take one look at that screen, go 'ooo scary' and forget about it). For anyone who really needed that level of feedback it's wasted.

        The fast user switching thing is nice though.

        (personally I just wish they'd spend that amount of care with finder - when you close an OSX app it doesn't close.. you have to right click on the taskbar and select 'close'. The visual feedback for t
    • The hidden benefit is that much of the eye candy in OS X is very soothing. It makes it easier to get work done when you have a soothing background and your actions on the computer generate a continual calming effect.

      Yeah.. tell that to clippy.

    • I agree. OS X is a more comfortable user experience (OS X XP? hehe). But why does Windows have such a problem with its eye candy compared to OS X? I'm currently running an iBook 600MHz with 640MB RAM. If I tried to do the same with XP (and I have recently) on a system of the same specs, it's really sluggish. Default installs for both, all the way. Changing XP's look back to classic really doesn't solve much of the problem.

      I'm no apple fan boy by any stretch of the imagination, but computers are for work - to
      • But why does Windows have such a problem with its eye candy compared to OS X?

        That is a feature of targeting your OS to a particular hardware platform.

        Keep in mind that the eyecandy in OS X has gotten more optimized over the years with Altivec on G4 and G5 machines and now with SSE2 and/or SSE3 with the introduction of the Intel CPUs.

        Windows and Linux have to have drivers, hooks and code for every lowest common denominator CPU and video chipset in the world, so there simply is not much time to code efficient
        • Having background apps pop up dialogs and things in the background is THE BEST FEATURE EVER.

          I really hate it when some background application suddenly decides it should announce something right in front of the Slashdot posting I'm typing. Particularly when I'm right in the middle of hitting enter and so I see the dialog flash and I'm not sure what I agreed to. Never happens on the Mac. :)
    • There's a huge difference between eye candy and visual feedback.

      Visual feedback is not only nice, in many cases it is critical - thinking here of the hourglass that tells you a program is working on something. Eye candy, on the other hand, does nothing more than make the desktop look pretty - thinking here of the WindowsXP menu transition effects feature, or the Vista "glossy chrome" effect that will be on all window borders.

      Some features that could be called eye candy can also be called functions, such

    • Pshaw... having just moved from a Win2K box to a Powerbook, I don't find that the shiny shiny stuff makes a difference in my life.

      Some of it, like the animation that swooshes the dock, just irritates me.

      I personally find the PB keyboard annoying compared to my Logitech, and the mouse button on the trackpad - man, was that designed by a deaf person? CLICK! CLICK!

      I would love to be able to turn off even more of this flashola than I have already. I don't need my windows to swoosh down to an icon, or for every
    • when a program crashes in OS X there is a spinning beachball,

      Because when a program crashes, a beach ball is the first thing I usually think about.

      Still, I suppose it's better than the ever-cryptic pre-OS X "An error of type duuuuude has occurred."

    • by crabpeople ( 720852 ) on Thursday March 23, 2006 @06:23PM (#14983967) Journal
      "it is obvious to the user whether they are taking the right actions or not. For example, when a program crashes in OS X there is a spinning beachball.."

      Of course that means a program crashed; it's like word association. Beach ball - ball park - giant hotdogs - thirst - cold beer - expensive ballpark beer - beer empty - gag at refill price - hotdogs stuck in throat - call ambulance - hospital room visit - wheeled into ER on crash cart

      see, it's completely intuitive

    • by pclminion ( 145572 ) on Thursday March 23, 2006 @06:24PM (#14983972)
      For example, when a program crashes in OS X there is a spinning beachball

      I fucking HATE that. Sometimes Safari loses sanity and I get the dreaded beachball. Guess what -- the system menu is modal to the application, which means I can't select Force Quit. Instead I have to open a terminal and type 'killall Safari'. What the HOLY FUCK?

  • by kannibal_klown ( 531544 ) on Thursday March 23, 2006 @04:34PM (#14983010)
    I'm all for eyecandy in my OS so long as it is in moderation. To me, that means 3 things:
    1) It's not excessive. I don't need 10-second animations to show a window has popped up.
    2) It's not too hardware intensive for the time it's released. Around 3 generations ago for video cards.
    3) You can scale it back if needed.

    For #1, it shouldn't slow things down or cause a distraction. Something cool, but subtle. OSX's dock bar is a nice example.

    For #2, I mean you shouldn't need a current-gen system to render everything. If Vista came out today, I don't want to be required to have an nVidia 6800GT to view the desktop with the defaults on. If you required a Geforce 2 or 3, then fine; they've been out long enough that most should have something as good or better (plus you should be able to turn it down if you don't).

    For #3, you should be able to run an OS in a lighter configuration. This is for people that either don't have recent hardware or just want a light experience for performance (or personal preference).
    • by merreborn ( 853723 ) * on Thursday March 23, 2006 @04:53PM (#14983187) Journal
      People keep making noise about Vista requiring a cutting edge video card to use the Aero UI, but what people rarely mention is that Vista will run just fine on a machine without any 3D card at all. It'll just automatically disable Aero.

      So, if you've got cutting edge hardware, vista will take advantage of it. If you don't, it won't. Where's the problem?
    • For #2, I mean you shouldn't need a current-gen system to render everything. If Vista came out today, I don't want to be required to have an nVidia 6800GT to view the desktop with the defaults on. If you required a Geforce 2 or 3, then fine

      NVidia's current generation of card is the 7900; before that was the 7800, then the 6800s and 6600s, and before that the GeForce FX if memory serves. You then have the GeForce 4s and 4 MXs before finally getting to the 3s and 3 Tis then 2s. At point 2, you say "around 3 g
  • by onion2k ( 203094 ) on Thursday March 23, 2006 @04:35PM (#14983020) Homepage
    But building more special effects in the OS level will rob the extra power and memory from the applications and games which rightfully require them.

    This guy is incredibly clueless. Effects only take up "power" (argh) and memory when they're in use. The likes of OS X automatically scale down the fancy stuff if your system doesn't have the grunt to run it well; I imagine Vista will do the same. Switch off the swishy bits and your system will use no more RAM or CPU time than if they weren't there in the first place. Besides which .. the PC I'm typing this reply on has a 2.6GHz CPU and 1GB of RAM, with a Radeon 9800pro graphics card. That's faaaaar more than my desktop requires. If I didn't "waste" the extra capacity on delightfully shiny effects it'd just go unused. It's not like Firefox would start using it.

    And further to that .. I *like* swooshy effects. I'm a PHP developer. I need cheering up. ;)
    • > The likes of OSX automatically scales down the fancy stuff if your system doesn't have the grunt to run them well, I imagine Vista will do the same.

      You have used Microsoft products before, right?

      Okay, I'm being harsh, but last time I checked, Vista had fairly crazy minimum requirements, and even if it's not taking up CPU/graphics cycles while not running the effects, I'd be bloody amazed if it's not still trying to take up a whole bunch of memory (which, sure, might be swapped out, but that doesn't mean I like it).
    • Here's the problem with your assumptions:

      You're using your baseline computing conditions. Letting the GUI eat up cycles opening up firefox is fine. There are other times when you're doing computationally intensive tasks such as: compiling, ripping, packing files, watching video.

      I don't want the GUI to compound the problem and fight for system resources when I'm just opening a window or browsing my filesystem. I want to rip a CD and use the computer without the GUI screwing things up. The GUI needs to k
      • "There are other times when you're doing computationally intensive tasks such as: compiling, ripping, packing files...."

        In which case offloading effects to the GPU should be no problem.

        "... watching video. I don't want the GUI to compound the problem and fight for system resources when I'm just opening a window..."

        And in which case, you're no longer focusing your attention solely on the video, are you?

        Bottom line is that, in most cases, such effects are visual cues as to what's going on, are off-loa

  • it can be useful (Score:3, Insightful)

    by gEvil (beta) ( 945888 ) on Thursday March 23, 2006 @04:35PM (#14983028)
    Eyecandy incorporated in the proper way can in fact be useful. It can provide extra visual cues to indicate what's going on. It can help new users familiarize themselves with a system. However, for the most part, the ways it's currently being implemented are more of a distraction than a useful feature.
  • Please, someone correct me if I am wrong, but I thought the actual goal of all this desktop restructuring was to move the processing off of the CPU for the rendering and onto the GPU. The eyecandy was just a fringe benefit of the transition. Unless the benefit is more CPU cycles for non-GUI tasks, I would agree that this is a waste of time.
  • by linuxbaby ( 124641 ) * on Thursday March 23, 2006 @04:36PM (#14983033)
    Every time I help a friend set up Windows, it's always the first thing I do:

    Control Panel --> System --> Optimize for Best Performance
    It turns off ALL the fuzzy, fading, stupid stuff, and it surprises them how much better it responds.

    Linux/BSD?
    IceWM [icewm.org] on top, but with KDE libs underneath, so you can run any KDE or Gnome apps, but don't need all that mem-hogging desktop candy just to run KMail or whatever.

  • While desktops can make things more appealing and, in a few cases, easier to do,

    you still can't beat a shell prompt, no matter the OS, for so many tasks.

    rm -f a*.wibble (or del a*.wib if you like) - painful on a desktop, given the ease of a command line.

    So, productivity-wise and eye-candy-wise, the '60s hippies with their teletype consoles and lava lamps were way ahead of us :D.

  • Disagree (Score:5, Insightful)

    by GeorgeMcBay ( 106610 ) on Thursday March 23, 2006 @04:39PM (#14983057)
    Italics are quotes from the article:

    But building more special effects in the OS level will rob the extra power and memory from the applications and games which rightfully require them.

    I generally play games that require a lot of processing power in fullscreen mode, so the OS using fancy features for display will have very little impact (all of the OS's textures will be swapped off the GPU unless I alt-tab or otherwise task switch away from the fullscreen game). And the vast majority of applications I use just aren't going to have any significant negative impact from a bit of eyecandy. Computers are ridiculously fast these days... Word processors and web browsers have more than enough power to spare some for eye candy. There aren't too many applications for which this kind of eyecandy actually hurts performance on modern systems. Even things like, say, movie encoding or other heavy number-crunching apps aren't impacted significantly because almost all of the work in displaying the eye candy is done on the video GPU which would otherwise be unused anyway.


    There are other valid reasons too which prompt me to take the viewpoint that less eye candy is better for the OS. Experience tells me that it is futile to do productive work within a desktop with all the special effects enabled. The last time, I tried it, I was severely distracted and fell short of completing my work. Is it just me or are there others who have been through the same experience ? To do productive work, it always helps to have a fully functional but spartan desktop.


    I disagree here too. "Eyecandy" if used well (see MacOS X for some examples) can give subtle cues that actually make me more productive. This part is clearly subjective so YMMV.


    But the Windows users do not have this luxury. For example, a person using Windows 2000 will be forced to buy a copy of Vista if he needs the added security and extra features like better search. And to install Vista on his computer, he will most certainly have to embark on a spending spree to upgrade his PC to accomodate the extra special effects that are integrated into the OS


    The guy who wrote this should have done some research. You can run Vista without the Aero Glass UI being active, just as Windows XP can be dumbed down to look, feel, act and perform like Windows 2000 (except with much faster booting times).

    If you don't want the eyecandy, shut it off. You CAN do this in Windows XP and Vista, despite what the misinformed article states.

    • Re:Disagree (Score:2, Interesting)

      by nickheart ( 557603 )
      I used to play WoW on Windows XP ... I was getting pissed off because I was getting disconnects every 8-10 hours or so, and only about 35 FPS... Then I installed Windows 2000 (haha, no activation). Now I suddenly only disconnect when I tell WoW to, and holy moly, I'm getting another 10fps, just because I canned XP... Now, is this the fault of eye-candy? I think not. I think it's the other extras that get added to Windows; especially by the time you get to SP2, you have so much crap running just for the GpOS!
      • I am mostly running Windows 2000. It's been almost perfectly stable for me. I have an XP laptop but I don't use it much. The new control panel layout is a bit aggravating because it adds another layer before allowing me to change what I want, so I did disable that.

        I generally turn off every effect and service that I can. Even if I do have a lot more powerful of a computer than I really need, it doesn't matter with time-based effects because a 3 second effect is a three second delay no matter how fast t
      • wait a minute.. you were playing WoW for more than 8-10 hours at a time?
  • by helix_r ( 134185 ) on Thursday March 23, 2006 @04:40PM (#14983063)

    I really disagree with the article. Computer interfaces should look good and be efficient. GUIs will always push the envelope of whatever technologies come around. If OS and software vendors aren't pushing the envelope, then they aren't working hard enough at improvement. Who cares about your lame 486s, anyway?

    The author then makes the claim that nice interfaces rob the computer of processing power. I disagree. Most of the time the computer (especially a desktop) is doing nothing. In any case, if what I understand is true, upcoming MS Windows and some future X implementations will use hardware acceleration for rendering window graphics, so the CPU won't be under any "strain" at all.

    Anyways, I paid my dues with the vt100 era. It is now a pleasure to use a nice interface. I would not have it any other way.

  • by vertinox ( 846076 ) on Thursday March 23, 2006 @04:40PM (#14983066)
    Just like the fact people like to decorate their slave boxes... Err... I mean cubicles, people actually like working with the fancy high-tech OS technology that they see in the movies. The effect might wear off after 2,000 hours of working the same dead-end job day in and day out, but if it feels like you are on the deck of the Enterprise while doing Excel spreadsheets, you might feel better about coming into work on time.
    • but if it feels like you are on the deck of the Enterprise while doing Excel spreadsheets, you might feel better about coming into work on time.

      So basically it'll make you feel like Wesley?

      (Apologies to Wil in advance)
  • Has anyone here ever seen CDE? For the love of God.

    If I'm going to stare at a screen for hours each day, I'd like to have what I'm looking at be easy on the eyes. I'm not a GUI nut either--text mode can be visually pleasing too, depending on who is writing the software (ever logged into VMS? For the love of God!).

    Eye candy is not always necessary, but as long as it's helpful and not distracting, I'm all for it. Good examples are window managers such as fluxbox, windowmaker, and enlightenment. They're pr

  • by RingDev ( 879105 ) on Thursday March 23, 2006 @04:41PM (#14983078) Homepage Journal
    "For example, a person using Windows 2000 will be forced to buy a copy of Vista if he needs the added security and extra features like better search. And to install Vista on his computer, he will most certainly have to embark on a spending spree to upgrade his PC to accomodate the extra special effects that are integrated into the OS."

    Apparently, the author failed to notice that Vista has the option of running the classic interface, the XP interface, or the new Aero (i.e., processor intensive) interface. So while a 2k user may want to buy a copy of Vista for security reasons, they should not have to upgrade their hardware in order to do so.

    -Rick
  • I've been testing the next release of the "unnamed proprietary operating system" in question, and I have to say that a great deal of the eye candy goes a long way to making things easier. Getting a live preview of a window if you hover over its taskbar button or flip between windows is a nice feature, as I constantly have a ton of windows open in the same app. Being able to move a window around without spiking the CPU to 60%+ is another subtle but nice benefit. In my testing of this release I've found th
  • That's why there's fluxbox [sourceforge.net]
  • Like Easter... (Score:2, Informative)

    by dedeman ( 726830 )
    It would appear that eye candy is a necessity, but only with the idea that there are different levels of eye candy, that the eye candy can easily be made to go away or be less sweet, and that it will work well with an average hardware base.

    That last idea would be the difficult one to figure out. However, how much is decided by the user when they see screenshots? What is the coolness factor when icons appear to be crystal/brushed aluminum/iridescent blue/etc? How great is it when windows will shuffle like pages in
  • The graphics for Mine Sweeper haven't been updated in years.
  • by thatguywhoiam ( 524290 ) on Thursday March 23, 2006 @04:47PM (#14983123)
    Article sort of misses the overall point.

    First, let's all just admit that our GPUs are sitting idle 96% of the time. This is not simply a question of CPU cycles anymore, like it used to be.

    Second, let's admit that when you refer to 'eye candy', you are framing the question as a pejorative. It strongly implies that what you are talking about has already been judged as useless decoration.

    Form follows function, as the saying goes. An example of "good" eye candy: the Dock in OS X's genie effect. It's fast, it tells you where your minimized document is living, and it can be turned off (to straight scaling). Nothing wrong with this at all. Where developers go wrong is usually in two areas. One, developers are not designers. Developers write code, and should not attempt serious design, any more than the Photoshop and Illustrator jockeys should attempt C++. Two, picking an appropriate bit of eye candy should always follow an already identified need. This is form-follows-function. Animation always draws the eye, so it should not be misused to redirect your attention where it is not needed. Here's a great example: pull-down menus in Mac OS X vs the same in Windows XP. On the Mac, pulldowns appear instantly and fade away once something is selected; this is correct behaviour, as you asked for a menu - there should be no delay. Fading away is fine because the selection has been made and you have moved on. In XP, the menus fade in and vanish instantly - totally backwards. That is bad eye candy.

    In the end it is always a question of design. Eye candy by itself is nothing; no value judgement can be rendered.. it is all in the application. So the way this article is framed is mostly useless for purposes of deciding when and where to employ such effects.

  • But the Windows users do not have this luxury. For example, a person using Windows 2000 will be forced to buy a copy of Vista if he needs the added security and extra features like better search. And to install Vista on his computer, he will most certainly have to embark on a spending spree to upgrade his PC to accomodate the extra special effects that are integrated into the OS. The alternative being to keep on using the same old OS with reduced features and dwindling security updates.

    No, no, no, and no.

    Yo
  • kids get it (Score:3, Interesting)

    by kisrael ( 134664 ) on Thursday March 23, 2006 @04:51PM (#14983160) Homepage
    I always put my Windows box to "Classic" mode in short order.

    To me, UIs aren't "interesting" so I like to keep them as minimally distracting as possible. The less time it takes for my brain to say "this is a pushbutton" the better off I am.

    I've found that younger people are a bit less conservative about this stuff, and seem to embrace funky looking buttons faster.

    So I'm just turning into an old fogey...

    Some of the effects, though... like making dropdown menus scroll down or fade in, just take time. I understand how a total n00b might be impressed or even appreciate the connection (it's less "jarring" than something just popping up), but it adds up to a lot of cumulative time waiting for menus to open.
  • While I share some of the exasperation of the article's author about the "need for speed" that Vista requires, at the same time I recognize that this is nothing new, nor limited to Microsoft.

    This has been a function of all operating systems that use a GUI. It's been that way since they started. OSS is no less guilty - look at the specs for running Gnome or KDE, and compare the recent releases with the earlier versions. Compare hardware specs between Mac OS versions. Windows versions. In each one

    • My mom's 6-year-old iMac came with OS 9.whatever installed. Upgrading to OSX actually INCREASED system speed a great deal (not to mention vastly improving security and stability).

      If OSX is ramping up system specs, it's doing it at such a slow rate that very few users should realistically be affected. I expect my mom's hard drives to fail before she's forced to upgrade the system to meet OS requirements.
  • On my Pocket PC (iPAQ H2210, Windows Mobile 2003) the defaults for both window-open animation and ClearType are off. Turning on ClearType dramatically increased boot time, but on a 240x320 screen, it's pretty well mandatory. The window-open animation adds maybe a second of startup time to each application, maybe two - but it also lets me know when an application has launched, which is hugely useful since it's Windows (even if it is CE) and it does things on its own schedule. And I haven't disabled the desk

  • This is, I think, part of the industry's attempt to keep things new and, indeed, sell OSes and hardware. Ten years ago, computers were really just getting going in the mainstream and people were buying and upgrading all the time. Great times for the hardware and software manufacturers. Even better with the Y2K panic. It made the dot.com boom. Now, computers in the home are commonplace, and really, for most people who just browse and do e-mail etc. (not hard-core gamers and hobbyists), the computer matured as a p
    • >Even better with the Y2K panic. It made the dot.com boom

      No, the dot.com boom came from the widespread availability of Internet access and the invention and mass marketing of the graphical web browser (Netscape, then IE). Y2K employed a bunch of otherwise unemployable COBOL programmers and pumped money into consulting companies, which had little effect on dot.coms. Most of that was under-the-radar stuff, and much, much more money went to the real fly-by-night dot.coms started by twenty-something dropouts.
  • "This article ponders over whether excess eye candy and special effects being incorporated on the desktop is a good trend after all?"

    A great deal of us have been saying that working, being stable, secure and performing are much more important than a pretty interface for a very, very long time.

    The first thing I do when I install any OS is to turn off all the unnecessary crap.
  • Computers are now consumer items and as such are designed with consumers in mind, not anal-retentive "efficiency is all" types.

    For example, most people care how their cars look first, how they perform second. If you can mix both of these selling points then you have a market winner.

    Same thing applies to computers and other tools - take a stroll through a home improvement store and look at how much industrial design goes into power tools these days. Looks sell, and this author doesn't seem to get it.
  • Desktop eyecandy sells. Sure, UNIX made OS X interesting, but if it wasn't pretty, it would not have gotten the response it did.

    I tend to like my OS to be as unobtrusive as possible. Many times, eye candy effects take the focus away from what I should be doing. Some examples:

    Any Flash site where the site has a million animations and sounds for the menu but is lacking content, or useful content. Next time, don't spend 4 hours tweaking the window-close animations and add a damn site map.

    I really hope i
  • I'm forced to agree with the previous comment. I don't know where the vast majority of people are getting their information, but I'm a big fan of:

    "Tis better to remain silent and appear stupid than to open your mouth and remove all doubt."

    I know; I've read on several sites now that the fanciest of the UI effects will only be available if the machine meets the requirements, and that the effects and general UI look and feel have many, many steps down they can take in the event that the hardware of the current

  • The article's closing point is that users shouldn't be forced to upgrade to a high-end graphics card. This is a moot point; Vista will include a low-frills GUI so that people don't have to upgrade [extremetech.com]:

    ...an old version that works like the current Windows XP GDI+ desktop drawing system exists in Vista only for backwards compatibility with systems that don't have the graphics hardware required for Aero Glass.

    Linux offers choice in GUIs, but so will Vista (as did XP). What would be really slick is a single, consi

  • For example, a person using Windows 2000 will be forced to buy a copy of Vista if he needs the added security and extra features like better search.

    You've got to be kidding, right? The "Search" tool in Windows has been broken since Windows NT 4: it can't find anything in Unicode text files, only in ANSI text files. If you want to search Unicode files you've got to open a command prompt and use "find" or "findstr" - I doubt they'll fix this in Vista.

    Try this:

    • Start notepad and type in "hello world"
    • Us
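    The failure mode described above can be illustrated outside Windows entirely: a byte-oriented (ANSI-only) search can't find ASCII text inside a UTF-16 file, because UTF-16 interleaves NUL bytes between the characters. The following is a hypothetical Python sketch of that behavior, not the actual Windows Search code:

    ```python
    # A byte-oriented search, as an ANSI-only tool would perform it.
    text = "hello world"

    ansi_bytes = text.encode("cp1252")      # ANSI: b'hello world'
    utf16_bytes = text.encode("utf-16-le")  # UTF-16: b'h\x00e\x00l\x00l\x00o\x00 ...'

    needle = b"hello"
    print(needle in ansi_bytes)   # True: the raw bytes line up
    print(needle in utf16_bytes)  # False: the interleaved NUL bytes break the match
    ```

    A tool that decodes the file before searching (or, per the comment above, the command-line "find"/"findstr" utilities) avoids the problem.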
  • by SmallFurryCreature ( 593017 ) on Thursday March 23, 2006 @05:22PM (#14983472) Journal
    Perhaps I have strange eyes, but mine start to hurt if there's a bright, nearly empty white window in front of me with just some black text, often only a single pixel wide. Yes, I am talking about your average web browser/file browser window. Adding a slight tone to soften the whiteness (a bit like tanning for your computer) makes the desktop easier on the eyes and therefore easier to use in general.

    Adding an image, perhaps even an image that tells me something about the contents of the window, could be considered eyecandy OR an extra clue. Was it Gnome that colored the entire desktop red if you ran as root? Eyecandy or vital visual feedback?

    Strictly speaking, everything beyond bare X is eyecandy. Run Solaris on a Xerox printer machine and you will get the bare basics of a window manager, and yes, it does everything it needs to, but gee gods it is hard on the eyes.

    So where do you draw the line?

    Personally I liked Enlightenment but now run XFCE4, which suggests that while I like a pretty picture I don't want it to get in the way of business. KDE 3.* is nice and all, but gee gods it loves the animations. Gnome is too inflexible for me.

    Give me candy but don't slow me down. No animations. INSTANT popups/slides/whatever.

    Then again, I do usually have gkrellm open. Lots of flashy, blinky, shiny thingys. But they don't slow me down, and while they are eyecandy they also tell me something about my computer. Since I am on old hardware which I tend to push beyond what it was designed for, I "use" the gkrellm eyecandy to tell me if I can expect a freeze or when Gentoo's emerge is about to fill its HD space again.

    So, useful eyecandy?

    As for pure eyecandy effects like the holy grail of true transparency: well, my terminals are semi-transparent and I wouldn't have it any other way, as I think (just my opinion) that it is easier on the eyes than a monochrome background. True transparency would perhaps look even nicer, and if it was as smooth as an FPS, all the better.

    Yes, of course it doesn't really matter, and I would hardly use a bad terminal emulator over a good one just for the sake of transparency, BUT if two terms are otherwise equal, isn't the one that lets you choose your type of background better?

    Is the window manager that then allows your terminal emulator to offer you transparency then better for it? And so on, all the way down to the kernel.

    I personally don't like eyecandy that steals window space OR takes time, but I do like eyecandy that makes the desktop less of an endless grey slab of unused space.

    Should the OS/window manager developers care about eyecandy? Well, that is the beauty of OSS, isn't it? Use pure X if you hate all eyecandy, or use any of the window managers if you want more.

    A bit of sugar makes the medicine go down. Yes the medicine still needs to be good but sugar helps.

    Will Windows' new 3D desktop rendering be a good or a bad thing? Well, there was a recent discussion about offloading physics in games onto the GPU; that would help the game run a lot faster. Ages ago, long before GPUs, some video cards started offering Windows acceleration, which supposedly helped offload some of the desktop rendering from the CPU onto the video card.

    It makes sense in a way. If you can save the CPU a boring task, it can spend its cycles on more meaningful things. I do know for a fact that a true dual-CPU machine spends a lot less time waiting for redraws than a single-CPU machine. Would a single-CPU machine with GPU desktop rendering be just as responsive? Surely that can't be bad.

    In a way I don't see the problem the author has with it. Sure, sure, Windows users who want Vista "security" (see a few articles below about IE7 for Vista and how secure it is) need to upgrade and pay for the eyecandy, but that is MS's business model. They've got more money than some countries, so it works. Anyway, I am fairly sure MS allows people to turn off all the candy they don't want.

    Ultimately the candy has little to do with the underlying OS. How a widget is drawn has

  • What is Vista if NOT Eye Candy???

    I mean, it sounds like they're removing most of the features from it. Except for suddenly requiring a new video card/high-end system ... what will Vista give me that XP doesn't?

    The only real features I remember hearing about are the new eye candy and the fact that IE will be separable from the OS. I can't for the life of me imagine why I'd be motivated to upgrade.

    I'm asking for real, not trolling ... what actual features of interest will be in Vista?
  • by Futurepower(R) ( 558542 ) on Thursday March 23, 2006 @05:48PM (#14983701) Homepage
    Microsoft makes a lot of its money selling to computer manufacturers. They want customers to be forced to buy new computers.

    This has NOTHING to do with doing the right thing for customers, in my opinion.
  • by leereyno ( 32197 ) on Thursday March 23, 2006 @08:10PM (#14984600) Homepage Journal
    My idea of a good desktop interface is one that doesn't slow me down. There are two kinds of eye candy, static and dynamic. Static eye candy in the form of a visually appealing interface that is simple, elegant, and ergonomic is good. Dynamic eye candy in the form of visual effects tends to be bad.

    Bad Eye candy:

    Example 1, fading menus: The default configuration of Windows XP features menus that fade in and out. Right click on the desktop of a fresh install of XP and you'll see what I mean. This is bad. Why? Because the rendering that is being done takes TIME. It slows down the user who has to wait for it to render. Admittedly it is only a few tenths of a second, but when you're a grand master hacker (!cracker) a few tenths of a second do make a difference. I always turn this 'feature' off.

    Example 2, window animation: Gnome has a very annoying "feature" where it animates the resizing of windows. Minimize a window and gnome draws a series of progressively smaller outline boxes on the screen tracing the minimization of the window. I'm not sure what use this is supposed to be. I do know that it slows me down. When I do something it should be as instantaneous as possible. KDE has the same "feature" but unlike Gnome it can be disabled. There are problems that I have with Gnome and the inability to turn off the bothersome BS (of which this is but one example) is a big one.

    Good Eye Candy:

    Example 1, Bouncing icons: Recent versions of KDE include what I call icon bouncing. When you double-click on an icon to open a file or start a program, a miniaturized bouncing version of that icon appears next to the mouse pointer. The reason this is not bad is that if I've double-clicked on something, I expect a lag while the program or file opens; the bouncing icon does not slow me down. The reason it is good is that it lets me know the program or file is actually trying to open. There are times when you double-click on something and it doesn't quite register that you've done so. Without the bouncing icon you might sit there for several seconds waiting for something to happen before realizing that it isn't going to. With the bouncing icon you know immediately whether or not the system has registered your request.

    Example 2, Icon highlighting: Both Gnome and KDE feature icon highlighting. Whenever the mouse pointer is over an icon, it changes color. This is not bad because it does not slow you down. It is good because it gives that extra little bit of feedback to the user and creates a more interactive environment.

    In short, user interfaces should be as efficient as they possibly can be. Eye candy that increases efficiency, or that improves aesthetics without reducing efficiency, is good. Eye candy that reduces efficiency is bad, even if it arguably makes the interface more aesthetically pleasing.

    Now I realize that some people demand special effects and other such things. There is no reason why they cannot be accommodated. But at the same time the user MUST be able to turn any and all effects OFF. Furthermore, I would argue that there should be a simple configuration tool that provides both fine-grained control of the effects and a set of general effects-level settings (max, medium, low, off) to allow users to quickly set the level of eye candy they have to endure.

    I understand that Microsoft is adding all sorts of eye candy to Vista, and that this is the primary reason why they recommend you have an Nvidia 12800^e24 super ninja turbo card with vertex dimpling and pixel shader 15 to run it. I have not seen Vista yet, but I suspect that this is a grave mistake and that most experienced users will turn most or all of these newfangled 'features' OFF. I know I will.

    Lee
  • by real gumby ( 11516 ) on Thursday March 23, 2006 @10:41PM (#14985324)
    Sometimes I like it. I just run xterms or emacs all the time, with the windows stretched to take up the whole display. If I'm in the mood for eyecandy I add -F or even --color to my ls arguments....but generally I don't think it's worth the effort.

    YMMV.
