Windows Longhorn to make Graphics Cards more Important

Renegade334 writes "The Inquirer has a story about MS Longhorn and its need for better-than-entry-level graphics cards. This is due to the WGF (Windows Graphics Foundation), which will merge 2D and 3D graphics operations into one, and to 3D menus and interfaces that require at least Shader 2.0-compliant cards. Supposedly this will really affect the performance of the new Microsoft OS." This has been noted before in the system requirements for Longhorn, but it seems the full impact is only slowly being realized.
  • by Oculus Habent ( 562837 ) * <oculus.habent@gm ... Nom minus author> on Thursday January 13, 2005 @10:18PM (#11355534) Journal
    Mac OS X uses the graphics card heavily for much of its interface. All current Macs sport at least a Radeon 9200 (Mobility in the iBook G4), and Apple takes advantage of those cards in plenty of apps... note the multi-person video chat layout and details in iChat AV, or the compositing…

    That's not a knock on Windows - just an aside, really. Consumer PC graphics have been steadily improving, and there's little reason not to make use of that power. The only problem could be low-end motherboards offering cheap integrated video; inevitably, some people are left out in the cold. Time to start moving to nForce or Radeon IGP, PCChips!

    I wonder if they'll have a cool Genie effect for minimizing... ;)
    • by mcc ( 14761 ) <amcclure@purdue.edu> on Thursday January 13, 2005 @10:42PM (#11355774) Homepage
      Making use of the available graphics power just makes sense, and Apple was smart to be the first to realize this. After all, window compositing is something you're going to have to do at some point anyway; why not offload that task onto the part of the hardware that's actually designed to composite things?

      But when you step into the realm of "hey, we've got this power -- let's waste it on something!", you're doing something really bad. Using pixel shaders to draw drop shadows on semitransparent textured menus or somesuch begins to fall into that territory.

      In the first case you're taking the advantages offered by the hardware and leveraging them to improve the consumer experience. In the second case you're taking the advantages offered by your hardware and eliminating them -- removing the power of your 3D hardware (which is technically there for the applications, not the OS, to use) by making sure it is continually tied up running the particle engine floating around the talking paper clip or Enlightenment logo or whatever. This degrades the potential consumer experience, because consumers don't get to use the hardware they paid for; the OS is too busy using it.

      The difference between these two situations may be a little bit subtle and a larger bit subjective, but do you see the distinction here? Because given the curve of resource usage their OSes have followed in the past, I kind of doubt Microsoft does...
      • by fyngyrz ( 762201 ) on Thursday January 13, 2005 @11:07PM (#11355974) Homepage Journal
        Making use of the available graphics power just makes sense, and Apple was smart to be the first to realize this.

        Mmm, no. Commodore was the first to really do this. The original Amiga had native graphics capabilities that still aren't available (like multiple resolutions onscreen) in PC hardware. The OS used them, and used them well. When a more advanced Amiga came along with more graphics capabilities, the OS automatically configured and used those as well. Apple was "me too," much later. :)

        But that's OK. Apple knows how to market -- that more than makes up for arriving expensive, late, and/or weak with a number of things. Plus they provide a really nice end-user experience.

        • Well, Jay Miner stopped at Atari before heading to Amiga. Gotta love the Antic and CTIA/GTIA. If he had had more years, I wonder what else he would have made.
        • by Junks Jerzey ( 54586 ) on Friday January 14, 2005 @12:13AM (#11356862)
          Mmm, no. Commodore was the first to really do this. The original Amiga had native graphics capabilities that still aren't available (like multiple resolutions onscreen) in PC hardware.

          In the interest of historical accuracy, the Atari 400 and 800, first publicly available in 1979 (six years before the Amiga), allowed mixing multiple resolutions on screen. You built a display list of modes and the hardware interpreted them. You could mix text, graphics, and various resolutions of each. You could also trigger interrupts to occur on a specific display list command.
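
          To make the mechanism concrete, here is a sketch of such a display list written as Python bytes for readability (the addresses and the exact mode mix are invented; the byte encoding follows the commonly documented ANTIC conventions, from memory):

            # Hypothetical ANTIC display list mixing text and bitmap rows on one
            # screen. Bit 6 ($40) = LMS (load memory scan), bit 7 ($80) = fire a
            # display list interrupt, $41 = jump and wait for vertical blank.
            TEXT_MODE = 0x02                # 40-column text, 8 scan lines per row
            GFX_MODE = 0x08                 # a low-resolution bitmap mode, 8 scan lines per row
            LMS, DLI, JVB = 0x40, 0x80, 0x41

            SCREEN = 0x4000                 # arbitrary address of the frame data
            DLIST = 0x3F00                  # arbitrary address of this list itself

            display_list = bytes([
                0x70, 0x70, 0x70,           # 3 x 8 blank scan lines of top overscan
                TEXT_MODE | LMS,            # first text row, loads the screen pointer
                SCREEN & 0xFF, SCREEN >> 8,
                *([TEXT_MODE] * 9),         # nine more text rows (80 scan lines so far)
                GFX_MODE | DLI,             # switch modes mid-screen and fire an interrupt
                *([GFX_MODE] * 13),         # 14 bitmap rows total: 80 + 112 = 192 visible lines
                JVB, DLIST & 0xFF, DLIST >> 8,  # loop back for the next frame
            ])
            print(display_list.hex(" "))

          The hardware walks this list every frame, so mixing text and graphics rows costs the CPU essentially nothing.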
          • IIRC, what differentiated the Amigas was that you could mix not only multiple resolutions onscreen, but multiple resolutions with different bit depths, palettes, and even mouse cursors, all drawn by hardware. This is off the top of my head, as I (sadly) never owned an Amiga and only fiddled with friends' machines, but I recall reading about that and being hugely impressed. It was truly a machine ahead of its time.

            Anyone, feel free to correct me.
      • by pipingguy ( 566974 ) on Friday January 14, 2005 @01:10AM (#11357620)

        Hey, I *like* drop shadows and semi-transparency on menus and the like; they provide a "rich" environment and also help to prioritize open windows. Perhaps you are a command-line guru; I work with CAD software a lot, and I appreciate the eye candy as a visual indicator. Then again, if it were up to me we'd toss all the CAD software and hardware and go back to board drafting - less "it's easy to revise because it's on the computer, so let's do it a lot" attitude, and more forethought required when designing.

        "Keeping up to speed" these days has more to do with updating one's computer knowledge quotient and not enough to do with actually doing real-world stuff and improving skills in the disciplines that we use computers to help us with in the first place.
    • Look at x.org [x.org]. Look at what they want to do with switching everything over to OpenGL rendering. I think you'll find quite a few similarities between Longhorn, OS X, and x.org. It's the trend, and I think it's a smart decision.

      So what if you won't be able to use the windowing system unless you have an accelerated graphics card? Nearly all new(er) computers have graphics acceleration capability. It opens up a WHOLE lot more possibilities for what can be done within the windowing environment. PLUS it make…
  • Shocking.. (Score:5, Funny)

    by Gorffy ( 763399 ) on Thursday January 13, 2005 @10:18PM (#11355535) Journal
    Using Windows as a way to sell more hardware!
  • by Anonymous Coward on Thursday January 13, 2005 @10:18PM (#11355539)
    Will be low-end by the time it actually gets released.
    • You are assuming that the spec won't keep pace with the release date. That may not be a wise assumption, since we are talking about Windows2005IMeanSomeTimeBefore2015.
    • > Yeah, but today's high end
      >
      > will be low-end by the time [Longhorn] actually gets released.

      Yeah, but the Open Source and Free Software drivers for video cards will still be stuck at the level of the Radeon 7500 when it comes to 3D acceleration, due to the (unfortunately, for valid competitive-analysis-type business reasons) concerns of video hardware manufacturers (namely ATI vs. nVIDIA) when it comes to disclosing specifications.

      And then Gates and Jobs will both be able to point at a Linu…

  • How silly (Score:5, Interesting)

    by grub ( 11606 ) <slashdot@grub.net> on Thursday January 13, 2005 @10:19PM (#11355546) Homepage Journal

    This is due to the WGF (Windows Graphics Foundation), which will merge 2D and 3D graphics operations into one, and to 3D menus and interfaces that require at least Shader 2.0-compliant cards.

    That's just plain stupid. Grandpa & Grandma want to check their email and pics of the grandkids; why on earth should they require a Radeon MegaXP293823-XtremeSLI+ to do that? I hope there's an option to disable all that cycle-wasting crud, or MS may be shooting itself in the foot: how many offices will spend a few hundred dollars on individual video cards just to upgrade the OS? What about those machines with onboard video (à la Dell)?
    • Re:How silly (Score:5, Interesting)

      by Tasy ( 410890 ) on Thursday January 13, 2005 @10:34PM (#11355712)
      I think something most people don't realize is that by using the GPU to render, you are actually taking load OFF the CPU, not adding to it. Bravo to Microsoft for this.

      Now all we have to do is pray they don't leave some loophole open that lets someone burn out your video card. Can you imagine built-in Windows overclocking?

      *shudder*
      • by mattyrobinson69 ( 751521 ) on Thursday January 13, 2005 @10:56PM (#11355896)
        If they build it into Internet Explorer, how long before somebody finds a bug in the JPEG library that lets a webpage set fire to your graphics card with a simple bit of JavaScript?

        But seriously, rendering a GUI with the GPU is a good thing.
    • call me crazy... (Score:2, Interesting)

      by dAzED1 ( 33635 )
      but who says grandpa and grandma need to move to Longhorn as soon as it comes out, when MS is just now ending support for Windows NT 4.0, as reported recently here on /.?

      Grandpa and grandma will be just fine on 2000 or XP, or... and here's the crazy part... even 98. My father-in-law still uses Win3.freaking-1 on a 486, for Christ's sake. Grandpa and grandma will be just fine.

      • Windows 3.1 is definitely still used, although not much. I know of one customer (out of ~3000) that still uses it (along with a 14.4k modem). It does the job; why should he bother upgrading?
    • That's just plain stupid. Grandpa & Grandma want to check their email and pics of the grandkids; why on earth should they require a Radeon MegaXP293823-XtremeSLI+ to do that?

      I think you've touched on one of the more hilarious parts of the computer industry. It's not about what people NEED; it's about what you can require them to need. Want the new security features of Longhorn? Want to do email faster? You'll need a better graphics card.

    • by Trejkaz ( 615352 ) on Thursday January 13, 2005 @10:43PM (#11355783) Homepage
      Grandpa & Grandma will probably be dead by the time Longhorn comes out.
    • I've seen 98Lite, 2000Lite, and XPLite. Perhaps those guys will make a LonghornLite that will let you use low-end graphics cards. Man, Microsoft should HIRE these guys. No - put them at the TOP of the development team.
    • Re:How silly (Score:2, Insightful)

      by HermanAB ( 661181 )
      and 640kB memory should be enough...

      It is called 'progress', and it is not necessarily bad. You can keep your green-on-black Hercules graphics adaptor, but I'll take modern colour, thanks.

  • by TouchOfRed ( 785130 ) on Thursday January 13, 2005 @10:19PM (#11355553)
    I really fail to see how this will be useful or help productivity. Personally, I don't think an operating system needs to be that fancy. We'll sound just like the console users of today: "back in my day, we had to use 2D interfaces."
    • Unfortunately, 3D interfaces aren't really that useful either. The human mind can far more easily interpret 2D information from 2D data. 3D is only really useful for visualisation of physical objects that are natively 3D anyway.
    • Maybe eventually someone will have a 3D UI that is significantly easier to use than 2D, and that is even genuinely necessary for some apps.

      I mean, who would have thought that graphics would make email so much easier? But it does.

      For now, I have to laugh at the fact that NT people have to reboot to use the "recovery console", which is barely multitasking, if at all!

      So, I don't worry that it will be pointless, or that it will waste cycles. Think about the speed of Firefox vs. the speed of Links. Eventually the s…
    • by LincolnQ ( 648660 ) on Friday January 14, 2005 @02:19AM (#11358320)
      As I've experienced it, having an accelerator render your windows is really very helpful for usability. Rather than having things pop into place, you animate them. You run the animations quickly, so they're not annoying -- but a bit of motion can do several things:
      - Draw your eye toward whatever is moving. Your peripheral vision picks up motion better than a sudden pop.
      - Give you a better sense of what is happening. If I press Minimize and the window just disappears, I sometimes have to go hunting around the screen for where it went. If it visibly shrinks away, that helps your spatial memory find it again. A decent graphics card rendering the shrinking effect makes the transition smooth and pleasant.

      Having a graphics card drive your windowing system also allows for reflection, transparency, and other effects like that. I haven't seen a good use for those effects in a user interface yet, but I think they could turn out to be useful.
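
      For what it's worth, the animation being described boils down to a timed interpolation between two rectangles; a toy sketch (all coordinates and timings are invented):

        # Ease-out minimize animation: interpolate a window's rectangle from
        # its open position to a taskbar slot, fast at first, settling gently.
        def ease_out(t: float) -> float:
            return 1 - (1 - t) ** 3        # cubic ease-out, t in [0, 1]

        def lerp_rect(a, b, t):
            return tuple(av + (bv - av) * t for av, bv in zip(a, b))

        window = (200, 150, 800, 600)      # x, y, w, h while open
        taskbar = (500, 760, 160, 24)      # where the minimized window lands

        frames = 12                        # roughly 200 ms at 60 fps
        for i in range(frames + 1):
            print(lerp_rect(window, taskbar, ease_out(i / frames)))

      The GPU's job is simply to redraw the cached window texture at each interpolated rectangle, which is why the motion stays smooth.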
  • by Anonymous Coward on Thursday January 13, 2005 @10:20PM (#11355562)
    "KDE: Gets 5000% performance out of your graphics card by using our patented 'It Doesn't Use Fucking Pixel Shaders Just To Display A Fucking Menu' technology!"
  • Cool (Score:2, Interesting)

    Finally, a move toward using hardware to speed stuff up.

    I know we'll see a bunch of folks protesting bloat and other FUD - but it'll be cool to see what they come up with for a home UI that strains a video card.
  • No biggie. (Score:5, Insightful)

    by stratjakt ( 596332 ) on Thursday January 13, 2005 @10:21PM (#11355570) Journal
    You can get a card today for ~80 bucks that fits the bill - even PCI models, if you're that far out of the loop. By the time Longhorn is released, they'll be commonplace.

    Frankly, I can't wait to see this. All that GPU power of my 9800 is basically being wasted 99.99999999% of the time right now.
  • But... (Score:3, Funny)

    by rune2 ( 547599 ) on Thursday January 13, 2005 @10:22PM (#11355579) Homepage
    Can it run Longhorn? oh wait....
  • not so much impact (Score:5, Informative)

    by diegocgteleline.es ( 653730 ) on Thursday January 13, 2005 @10:22PM (#11355585)
    IIRC, the Longhorn installer will check your graphics card (if it's slower than X fps, then...) and will enable or disable the 3D functions depending on whether you have a good or a bad card.

    In short, the "3D mode" won't be the only one available. There will be a much lighter desktop as well (somewhat like the current XP look: you'll miss all the 3D stuff, but...).
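
    If that's accurate, the installer's decision would be little more than this sketch (the tier names and thresholds are invented; the article gives no numbers):

      # Benchmark once at install time, then pick a desktop tier instead of
      # refusing to run. All cutoffs here are hypothetical.
      def pick_desktop_tier(fps: float, has_ps20: bool) -> str:
          if has_ps20 and fps >= 30:
              return "full 3D composited desktop"
          if fps >= 15:
              return "reduced effects, no shaders"
          return "classic 2D desktop"

      print(pick_desktop_tier(fps=42.0, has_ps20=True))   # full 3D
      print(pick_desktop_tier(fps=9.0, has_ps20=False))   # classic 2D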
  • Lobby (Score:2, Insightful)

    by FiReaNGeL ( 312636 )
    Honestly, do we NEED a 3D-accelerated interface? I'm sorry, but the "cute" factor vanishes rapidly, and if it's going to cost me a $200 video card, I'll pass. So basically, we will be required to buy a 3D card if we want to upgrade past Windows XP?

    Anyone else think that Nvidia and ATI might have lobbied aggressively for this? I can't justify it... if it were an option, sure, no problem, but a necessity...
    • Re:Lobby (Score:3, Insightful)

      by stratjakt ( 596332 )
      Oh shut the fuck up. 200 dollars my ass. I am seriously sick to fucking hell of "computar linux exparts" spouting such nonsense. Mod me down, call me an MSFT astroturfer or whatever. I absolutely hate intellectual dishonesty.

      A Radeon 9200 is 36 dollars.

      And no, you don't need it. Don't buy longhorn.

      I don't know if you'd noticed, but you can't buy anything BUT a 3D card new these days. By the time Longhorn is out, if you don't have a 3D card with PS2.0 support, your PC would be about 5 years old…
    • Re:Lobby (Score:5, Insightful)

      by jellomizer ( 103300 ) * on Thursday January 13, 2005 @10:44PM (#11355790)
      Guess what: things change. Back in the '80s when the Mac was released, people said the same thing. Why do you need a GUI when we can get everything we need done in text mode? The GUI is only for games and cute apps. Then by the mid-'90s the GUI became necessary for most modern computing needs. Beyond enabling abilities such as WYSIWYG word processing, the windowing interface made it common to have multiple apps open at the same time, where you can see information in one app alongside another. Yes, DESQview could do that too in text mode, but it was difficult to get the data you needed without the resolution. Back then you were paying $200 or more just for a card that could do "ultra high resolution" 640x480 at 16 colors. Shortly after, once all computers needed them, the production price went down to match competition.

      The same will happen with 3D cards after Longhorn is released sometime in the distant future. The prices will go straight down, because there will be more than just two companies making Longhorn-compatible video cards.

      "I can't justify this... if it was an option, sure, no problem, but a necessity..." Nobody is forcing you to upgrade; you will not be put in jail for using your 8088 XT with MS-DOS 2.0, 256k of RAM, and a CGA video card (2D: 4 colors at 320x200, 2 colors at 640x200, 16-color text mode). But honestly, as time goes on, the system requirements for new systems increase. It is the same for most Linux distributions, Mac OS, BSD, Solaris... It happens; deal with it.
  • I frequent several different gaming forums, and I have noticed that there are always people trying to play games on Intel integrated graphics. Since Intel just barely supports the standards, it's no surprise that many games don't run at all, or barely run, on those chips.

    Hopefully this will encourage Intel and the other integrated-graphics makers to build decent video chipsets, or get replaced by demand. On the other hand, Intel might make them just good enough for Longhorn but not for games.
    • there are always people trying to play games on intel integrated graphics

      But... but... the sticker on the front of the case says 'Intel Extreme Graphics'! How can anything beat 'Extreme'?
      But don't worry, by the time Longhorn hits the market, I bet we'll have 'Intel Excessive Graphics' and be all set!
  • by g0dsp33d ( 849253 ) on Thursday January 13, 2005 @10:26PM (#11355623)
    In other news, Microsoft issues critical security warnings about bugs that let hackers run 3D viruses and worms natively in windows.
  • Great, but. (Score:5, Interesting)

    by PenchantToLurk ( 694161 ) on Thursday January 13, 2005 @10:27PM (#11355629)
    I've used Windows since 3.0. I'm a Windows (.NET) developer. And I agree that the gee-whiz factor will be great. Animations, depth to menus... it'll be gorgeous.

    But... it doesn't matter how fast computers get; the Windows Explorer shell always seems to become less snappy, even on fresh installs. XP made the Start menu slower than ever, retrieving nonessential metadata for the shortcuts. Myriad shell extensions, over time, bring the Explorer UI to a crawl.

    Sexy is great, but I have to use it every day. It's just not worth making the UI dog even worse.
    • Re:Great, but. (Score:5, Informative)

      by bogie ( 31020 ) on Thursday January 13, 2005 @11:20PM (#11356071) Journal
      That is the $64 question, isn't it? Can Microsoft learn to make an OS that doesn't slow down massively over time? I just did a fresh install on my one machine that runs XP, and it's night and day. Over time XP just gets slower and slower. Of course the battle cry for MS defenders is "it's the fault of third-party drivers and apps." Well, then make a freaking OS that doesn't let third-party apps run it into the ground. Why do I even need to use an app's uninstaller? Why, by default, doesn't XP know exactly how to remove every last bit of registry crap that got shoved in there in the first place? How come it takes 10 minutes for the Start menu to come up after I've been using the OS for a while? How come many Explorer operations still lock up the OS and stop whatever work you're doing cold? When will MS make an OS that you can actually multitask on no matter what's going on in the background? MS has a lot of work to do, and somehow I get the feeling they haven't learned their lessons yet.
  • ... Really. How much 3D are you going to stuff onto a display that, except in a very few rare cases, isn't able to show more than two dimensions?

    Mac OS X makes use of 3D hardware for slight tricks when the hardware is there (on a G4 or G5 it will use a rotating-cube effect when logging in or switching users; on a G3 it won't), and I'm sure there's some acceleration used in Exposé to move windows around, although that works on all the Macs I've tried it on. But what exactly could they possibly do 3D-wise t…
    • Re:3D Interfaces? (Score:5, Informative)

      by akac ( 571059 ) on Thursday January 13, 2005 @10:32PM (#11355696) Homepage
      No, not 3D interfaces in the way you're thinking. Think of it this way: every window is now a DirectX object. No need for redrawing by the app. Since every window is now a 3D object (one that's only a single pixel deep), you can do simple things like moving all the maintenance of a window's DC from the app itself to the OS.

      That's what Quartz Extreme does on OS X. This is just Quartz Extreme on PC.
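
      A toy model of that redraw-free compositing, with nested lists standing in for per-window textures (purely illustrative; a real compositor keeps these surfaces in video memory and blits them on the GPU):

        # Each window draws once into its own buffer; the compositor rebuilds
        # the screen from the cached buffers. Moving a window never asks the
        # app to repaint.
        class Window:
            def __init__(self, name, x, y, w, h):
                self.name, self.x, self.y, self.w, self.h = name, x, y, w, h
                self.buffer = [[name[0]] * w for _ in range(h)]  # "app-drawn" pixels

        def composite(windows, screen_w, screen_h):
            screen = [["."] * screen_w for _ in range(screen_h)]
            for win in windows:                    # back-to-front paint order
                for row in range(win.h):
                    for col in range(win.w):
                        sy, sx = win.y + row, win.x + col
                        if 0 <= sy < screen_h and 0 <= sx < screen_w:
                            screen[sy][sx] = win.buffer[row][col]
            return "\n".join("".join(r) for r in screen)

        wins = [Window("A", 1, 1, 8, 4), Window("B", 5, 3, 8, 4)]
        print(composite(wins, 20, 8))
        wins[1].x += 3                             # drag window B...
        print(composite(wins, 20, 8))              # ...and just re-composite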
    • It cubes here on a 900MHz G3 iBook. It isn't the CPU that the cube effect depends on, but rather the GPU. It runs smoothly here with a Radeon 7500 (massively old).
  • Back at WinHEC in May (and before, I believe), Microsoft gave out some more specific details about what the graphical requirements for Longhorn would be. Here's a summary [neoseeker.com] of what they were expecting the hardware requirements to look like. There is a more detailed version buried on their site somewhere, but I'm too lazy to dig it up.
  • Am I one of the only ones who prefers usability, stability, and performance... to eye candy?

    I'd rather it work on an old ATI Rage PRO.

    Why?

    Simply because that means good performance for modern computing. If the minimum is "latest and greatest"... Ugh.

    Nor do I like the idea of upgrading hardware around my OS. If anything I want to upgrade because I need it for my job. Not because of some 3D glitter covered start menu.

    Call me crazy... but performance is much more important.

    Why doesn't Microsoft invest…
    • There are these things called video games. They look real sweet on your PC. They need good video 'cards' (get it? video games, video cards, it's all video-related), and if MS wants to give me a pretty desktop as a pseudo-reward for blowing $300+ on an NVideo Deathbringer 5k, then I'm happy about that.

      Call me crazy.
    • I completely agree. Having a desktop that is light on eye candy but usable is far preferable to an eye-candy-filled desktop that is even a bit less usable. I'd much rather see the desktop get more responsive and feel quicker as hardware speed increases than have the look improve while the "snappiness" stays the same or goes down. It is always nice to be able to have an eye-candy-filled desktop, that's for sure, but there needs to be a way to get rid of it when it's not needed. Microsoft did it in XP, a…
    • Re:Is this necessary (Score:3, Informative)

      by PyroMosh ( 287149 )
      If you have PowerPoint installed, check this out [microsoft.com]. It's a fairly in-depth discussion of Longhorn, with emphasis on the new Windows Graphics Foundation.

      If not, I'll summarize. Or you can Google for essentially the same info, but this PowerPoint file is well done.

      One of the goals of Longhorn is to further the requirement for signed drivers, and to offload the complexity of drivers into the new WGF. The idea being that it's better to have MS write the code once, well, than to have lots of third-party vendors…
    • by eV_x ( 180493 ) on Friday January 14, 2005 @01:32AM (#11357873)
      Let us all not forget that many years ago, the video requirements of modern interfaces were substantially different from what they are now. Things must progress and evolve. Interfaces will become heavier on some levels but easier on others, and you can clearly count on advancing technology to OFFLOAD the strain onto new devices and components. With Longhorn doing this, my guess is that my CPU will actually carry less of a load for most things, since the graphics board will do what it does better than a general-purpose CPU.

      You can't stop evolution simply because you can't keep up or you get comfortable.

      I am consistently blown away by people who make comments like this:

      "Am I one of the only ones who prefers usability, stability, and performance... to eye candy?"

      Do you watch TV? Do you look at magazines? Style is here to stay, my good friend. I don't know about you, but I DO care about what my OS looks like. If I wanted my OS to look and feel like a windowless brick room with flickering fluorescent lighting, I'd skin it that way myself.

      Do you even use modern software? Almost all of it is skinnable. Why do you think that's popular? Because people are bored? No, because modern software is generally an extension of your personality. My guess is yours is like vanilla ice cream.

      On top of that, you are CLEARLY in the minority.

      A couple scenarios:
      Do you drive an old beater for a car because it "does the job"?
      Do you live in a tiny room with an integrated flip down bed and sit on the floor to eat because it's a more efficient use of space?
      Do you wear burlap clothes because it seems more practical?

      I'm sure you talk tough on computer crap, but you probably are wasteful in other areas. People like me DO care. I care about my car having the latest features. I care about my house being more than just a few walls with a ceiling. I care about personality and enjoying what I'm working with and where I live.

      "But do I really need to get new hardware... for eye candy?"

      Mr. Vanilla: Do you realize that every game id and Valve release sells new hardware? Oh, that's right, you wouldn't know because you're too busy with your CGA graphics board playing pong so you're not forced to "upgrade".

      Rock on - now excuse me while I go play my 8-Track.
  • Prices (Score:3, Interesting)

    by Sophrosyne ( 630428 ) on Thursday January 13, 2005 @10:30PM (#11355665) Homepage
    Watch those PC prices go up for a little bit... then potentially drop -- but ATI and Nvidia would be smart to cash in on this: maybe bundle Longhorn with video cards and extra RAM.
  • Big Deal. (Score:2, Insightful)

    by huber ( 723453 )
    How is this different from Apple's Quartz Extreme or the soon-to-be-released Core Image? It's not. It's the natural evolution of things. While naysayers will shout "I don't need this" and "it's not productive," when you have several CPU-intensive apps open and running, wouldn't it be nice to know that your otherwise unused GPU is taking care of your windowing?
  • and to 3D menus and interfaces that require at least Shader 2.0-compliant cards

    What about the high end audio cards so my computer can say "DooWeeeeeeeeeeooooooooOOOOO! BOOP!" as the cool 3-D Start Menu pops up when I hit the Start button and then another "BOOP! OOOOOooooooooeeeeeeewwwwDooo..." when I close the Start Menu?

    Dude! That would be so cool!

    DooWeeeeeeeeeeooooooooOOOOO! BOOP!
    BOOP! OOOOOooooooooeeeeeeewwwwDooo...
    DooWeeeeeeeeeeoo o oooooOOOOO! BOOP!
    BOOP! OOOOOooooooooeeeeeeewwwwDooo...

    Why do I suspect…
  • BSOD (Score:4, Funny)

    by datafr0g ( 831498 ) <datafrogNO@SPAMgmail.com> on Thursday January 13, 2005 @10:31PM (#11355681) Homepage
    Wow! A 3D Blue Screen of Death? That would look really cool with Shader 2.0
  • by csoto ( 220540 ) on Thursday January 13, 2005 @10:32PM (#11355691)
    Just plunk down $500 for a Mac mini.

    Quartz Extreme makes good use of the graphics hardware in any Mac, and many applications use it to their advantage.
  • I suspect this will just be another thing I have to turn off in order to use a new computer. The list is already getting pretty long:
    • Start menu --> Classic Mode
    • Screen --> Themes --> Windows Classic
    • Background --> anything simple and non-distracting
    • Appearance --> Effects --> disable transition effects, font smoothing, shadows, and alt-underline hiding (this is the kind of "enhancement" that most likely requires the video extravagance in Longhorn)
    • Screen Saver --> NONE (ain't LCD mon…
  • by Dominic_Mazzoni ( 125164 ) on Thursday January 13, 2005 @10:34PM (#11355707) Homepage
    Is this going to be another case of where Microsoft tries to copy Apple, but misses the point?

    Mac OS X 10.2 introduced "Quartz Extreme", which uses your graphics card to composite your screen. This meant that dragging windows around now required almost no CPU power at all. In 10.3, they introduced several 3-D effects to enhance the interface - most notably a rotating cube when you switch users.

    There are two key points that Microsoft seems to be missing, though:

    * Mac OS X looks exactly the same if you don't have a powerful enough graphics card, and screen redrawing is not too slow. Having a graphics card just makes the system more responsive because the CPU is doing less of the work.

    * The system degrades gracefully - if you don't have a powerful enough graphics card or run out of video RAM, certain 3D transitions may be skipped. But everything will still function, and everything will look the same.

    It's too early to tell, but it is starting to sound like Microsoft may be creating a new interface that requires a super graphics card, leaving those with only cheap integrated video with a completely different interface. To me that sounds like a recipe for tech support hell - novice users won't understand why their screen doesn't look like someone else's.
  • If requiring a graphics accelerator card is an unchangeable part of the operating system, the system is obviously badly designed. Longhorn should separate the graphics modules from the interface: if the kernel doesn't detect an accelerated graphics card, fall back to the 2D system.

    Is that too much to ask? A simple *IF*?
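
    That *IF* could be as simple as this sketch (all names are hypothetical; a real OS probes driver capabilities rather than a dictionary):

      # Try the accelerated desktop; fall back to plain 2D if the card
      # can't do Shader Model 2.0.
      class AcceleratedDesktop:
          def __init__(self, card):
              if card.get("shader_model", 0) < 2:
                  raise RuntimeError("no SM2.0 support")
          def name(self):
              return "composited 3D desktop"

      class ClassicDesktop:
          def name(self):
              return "plain 2D desktop"

      def start_desktop(card):
          try:
              return AcceleratedDesktop(card)   # the *IF*: use the GPU when present...
          except RuntimeError:
              return ClassicDesktop()           # ...and the 2D system when not

      print(start_desktop({"shader_model": 2}).name())
      print(start_desktop({}).name())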
  • Not just eye candy (Score:4, Informative)

    by miyako ( 632510 ) <miyako AT gmail DOT com> on Thursday January 13, 2005 @10:44PM (#11355789) Homepage Journal
    I anticipate that a lot of people are going to bitch and moan about how it's pointless eye candy, but if Microsoft is able to do what Apple has been doing, it could really add to the UI.
    Things like Exposé and translucent windows can come in amazingly handy in OS X (I've never found anything quite as useful as transparent terminal windows in OS X, letting me have code open in one window and documentation in the window behind it, and look through the code window to read the documentation, especially when working with an API you're not familiar with).
    I think that as 3D-accelerated UIs become more common, we'll see even more useful features popping up. It's not like there is any good reason for a new computer to have a video card that won't run this, and the type of person who would upgrade would probably already have a newer video card anyway.
    I just wish this would make it into X, but alas, I suspect it's the sort of thing that will take a while to get properly implemented and supported.
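
    Incidentally, the translucency praised above is plain alpha blending: each output pixel is alpha*src + (1 - alpha)*dst. A one-pixel sketch (the colors are invented):

      # Blend a dark terminal pixel over a light documentation page.
      def blend(src, dst, alpha):
          return tuple(round(alpha * s + (1 - alpha) * d) for s, d in zip(src, dst))

      terminal_bg = (0, 0, 0)          # black terminal window on top
      docs_page = (255, 255, 240)      # documentation window behind it
      print(blend(terminal_bg, docs_page, alpha=0.7))  # docs still faintly visible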
    • by nathanh ( 1214 )

      Things like Exposé and translucent windows can come in amazingly handy in OS X (I've never found anything quite as useful as transparent terminal windows in OS X, letting me have code open in one window and documentation in the window behind it, and look through the code window to read the documentation, especially when working with an API you're not familiar with).

      I just wish this would make it into X, but alas, I suspect it's the sort of thing that will take a while to get properly implemented and…

  • Well... (Score:5, Interesting)

    by i0wnzj005uck4 ( 603384 ) on Thursday January 13, 2005 @10:46PM (#11355816) Homepage
    I'd say that 3D acceleration is a Good Thing. After using Quartz Extreme on multiple Macs, I have to say it makes a massive difference in most apps. It *does* speed up even moderately easy 2D things, like word processing. Where you notice the most difference, though, is when switching between programs: you've already got the images loaded in video RAM, so a lot of stuff is instantaneous. And yeah, iChat AV wouldn't be quite as pretty on Win XP.

    But the real question is: why are pixel shaders needed? Unless you're doing strange reflections, simulating bumps, or playing around with reflectivity in realtime, I can't imagine a use for them. I certainly can't see why you'd need anything more than simple textured quads or triangles - oh, and some sort of alpha support for shadows. All of that sounds like a TNT2-era card, like the one I used to run Quake II.

    What this really feels like is Microsoft pushing hardware adoption again. Ever notice how new motherboards don't come with USB drivers for Windows XP - how you have to upgrade to the latest service pack to get USB support? Partly piracy curbing, and partly, I think, a way to keep their hold by forcing people to use approved hardware.
  • by Transcendent ( 204992 ) on Thursday January 13, 2005 @11:11PM (#11356011)
    An afterthought to an earlier post... did anyone notice we're fretting over an article from The Inquirer???
  • by captaineo ( 87164 ) on Friday January 14, 2005 @12:00AM (#11356690)
    This is a very good thing, if only because it will force developers to think in terms of arbitrary units (like "inches on the screen") as opposed to hard-coding pixel dimensions into their software*. Recent high-resolution monitors have exposed painful problems of hard-coded pixel interfaces - like text that becomes virtually unreadable at 3840x2160.

    As a side benefit, this move towards a more vector-oriented display architecture means anti-aliasing will be easy to perform. Imagine dragging a window around with sub-pixel precision, and having the window contents and edges anti-aliased with a high-quality filter.

    Not to knock Apple, but from what I have heard, Microsoft's implementation goes further in making the graphics API completely resolution-independent.

    * and if you still want to use bitmaps for certain things, go right ahead, just let the graphics card re-size them to the appropriate pixel dimensions with high-quality filtering.
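
    The unit conversion at the heart of that argument is trivial, which is rather the point; a sketch (the DPI values are just typical examples):

      # Specify sizes in physical units and convert per-display, instead of
      # hard-coding pixel dimensions.
      def to_pixels(inches: float, dpi: float) -> int:
          return round(inches * dpi)

      BUTTON_WIDTH_IN = 1.0            # "one inch wide", on any screen
      for dpi in (96, 144, 204):       # typical desktop, high-DPI, ~4K laptop panel
          print(f"{dpi:>3} dpi -> {to_pixels(BUTTON_WIDTH_IN, dpi)} px")

    Once every size flows through a conversion like this, anti-aliased, sub-pixel-accurate rendering comes along almost for free, since nothing in the UI is pinned to integer pixel coordinates.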
  • by Yaa 101 ( 664725 ) on Friday January 14, 2005 @12:25AM (#11357039) Journal
    Wow, we are treated now with 3d buffer overflows... yummy...

    Thank you Bill...
  • Okay (Score:4, Funny)

    by SunFan ( 845761 ) on Friday January 14, 2005 @12:39AM (#11357213)

    So is Longhorn going to have any new useful features or just sit there and look pretty?
  • by BFaucet ( 635036 ) on Friday January 14, 2005 @12:50AM (#11357368) Homepage
    I'm a computer animation/FX guy and I need every little bit of speed out of my GPU... in many cases my GPU ends up holding me back, not my CPU. I don't really need menus and windows taking up video RAM either.

    I wish MS would work to make computers cheaper and more a part of everybody's life instead of trying to make companies spend $1000 to upgrade each system so they can continue to use Office (on top of the already unbelievable MS Office tax.)
  • Apple... (Score:3, Insightful)

    by vought ( 160908 ) on Friday January 14, 2005 @03:20AM (#11358759)
    will sell a ton of Mac Minis in two years. When people realize they can't run the latest and greatest, they will have to buy a new machine to keep up with the Joneses.

    Given the creeping resource requirements of Longhorn, you'll need something relatively powerful to run it, and powerful usually means big and loud. The mini supports Quartz Extreme with its 32MB Radeon, but $500 mass-manufactured PCs definitely don't. Buy a new $500 PC today and you'll get shared-DRAM video memory, unsuitable for Longhorn's graphics model.

    When Longhorn finally ships, you get to spend money and time upgrading your video card and buying more RAM - or you can just buy a new machine that's ready to run, virus-free, and requires only an upfront investment in a keyboard and mouse. Everyone has a TV - and the Mac mini connects to a TV out of the box.

    And do you really think even a midrange PC today will be capable of running any decent video editing app in Longhorn?

    Now remember, these people already have monitors, keyboards, and mice. The mini comes with none of these. Just replace your old, decrepit PC with a Mac mini.

    Apple is introducing this new idea and expression of the home computer now because it gives them time to gradually inform the market, generate buzz, and work up to a condition similar to what we see with the iPod today.

    They will learn from this first good product and make something even better. The iMac was the first example of this thinking; the iPod was the most successful. Start with only the best ideas and build upon them. Kill the bad ideas quickly. Drop the size, drop the cost. Apple is innovating at hyperspeed, catching up for years lost wandering in the wilderness.

    If you're going to spend $500 on a new machine so you can run a new OS, what's to keep you from getting one of these Mac mini things anyway? Especially when you can just hook it to the TV, put it in Simple Finder, and give one to granny for e-mailing pictures of her fancy dog to her friends with fancy dogs?

    Just my two cents. Everyone in the PC business has been secretly afraid Apple would do this for years now. Now they're left to squeeze their margins even further, remaining at the sole mercy of Microsoft - who appear to be displaying an incredible ability to screw up nearly everything they've touched over the past couple of years.
