Graphics State of the Union

Tom's Hardware has put out a nice recap of where computer graphics have been and where they are headed in the near future. While there are some definite shiny toys being displayed in new product releases and on the test beds, the overall problem of power consumption continues to rear its ugly head, demanding attention. From the article: "while all of these things are interesting, exciting and new, the problem remains the same. Getting smaller and faster only makes sense if the design also is less demanding on the wall socket and cooling system. We all want different things when it comes to advancements, but first and foremost we need better power management. The bottom line is simple: graphics makers must take a step back from feature brainstorming until the power issue is resolved."
This discussion has been archived. No new comments can be posted.

  • Power & Physics (Score:3, Interesting)

    by JPFitting ( 990912 ) <communicate@jpfitting.com> on Tuesday July 25, 2006 @02:13PM (#15777595) Homepage Journal
    I do agree that power is becoming quite an annoyance with video cards these days. I would say that to move forward we need to take a step back. I am finding more and more games that are pleasing to the eye but lack the originality, functionality, and creativity of older games. Video card makers focus too much on realism and tend to encourage game makers to do the same. Let's make cards that are functional, less power hungry, well-rounded (physics), and cooler.
    • by powerlord ( 28156 ) on Tuesday July 25, 2006 @02:47PM (#15777894) Journal
      Just out of curiosity, let's look at the current CPU offerings.

      Intel came out with a truly power-hungry CPU.
      AMD came out with a cooler and better CPU.
      Intel came out with an even cooler CPU that outperformed the AMD one. (Core Duo/Core 2 Duo)
      The ball is now in AMD's court.

      In other words, Intel was under pressure to compete in that area in order to stay competitive.

      Perhaps AMD, coming from their battle with Intel, can help focus the ATI division on lower power consumption and heat generation, and perhaps that is what AMD can bring to the table.

      If they even BEGIN to make inroads in this, while maintaining a competitive stance against Nvidia, it will force Nvidia to compete on this point also, which should move GPUs in a cooler direction :)
      • I stated this in the AMD+ATI deal story, but somehow it didn't get modded :)

        AMD had to join up with ATI to get _low power_ systems out. They won't make it on the processor line alone; they need a chipset+CPU solution (which is the most important part anyway). ATI has just that: low power consumption (I believe their GPUs use less power too). Add to this that in their recent presentation they gave a clue [dailytech.com] of what could be done to use less power. Namely the scaling of systems; this would optimise the compute

        • (went back and read your comment :) )

          Yes, I can see where for AMD adding ATI's chipset over Nvidia's makes sense, especially if ATI's is less power hungry.

          Interesting article, and definitely in line with what we were both thinking. Suddenly it looks like Nvidia and Intel might have some competition on their hands.
    • It's called a GeForce FX 5200 *. I have one in both my G5 tower and my secondary PC. It's good enough to play Quake 3 and Need for Speed: Porsche Unleashed, which are realistic enough for my taste, and the card has a tiny heat sink and no fan, so I assume it's not drawing much power. If you want to play the latest games at 1800 x 1400 or whatever, then yes, you will need a giant heat-producing card with a fan that sounds like a jet taking off. I'd comment though that I think recent games don't look THAT muc
      • If you want to play the latest games at 1800 x 1400 or whatever, then yes, you will need a giant heat-producing card with a fan that sounds like a jet taking off.

        FUD, pure and simple.

        For $30 (and less) you can buy a nearly noiseless heatsink that will cool your card better than most stock heatsinks (e.g. Zalman VF(7/9)00, Arctic Cooling Silencer line), and if you don't want to go the DIY route there are more and more cards coming out that have quality quiet heatsinks preinstalled (e.g. ICEQ3 by HIS).
        • I'll take your word for it dude, sometimes I'm glad to be 40, over the hill and a casual gamer. :)
        • I second the recommendation for the Zalman VF series. I've put a VF900 on my GeForce FX 5900XT and even with the fan at the minimum speed, it's cooler than with the stock heatsink+fan.

          The only thing that worries me is the amount of flex the heatsink puts on the PCB when you tighten it down all the way like it says in the manual. It flexed so much that it actually made inserting the card into the AGP slot a bit difficult...
    • I find it odd that when people reminisce about older games, they never consider the fact that most of those games both looked ugly and played badly. At least today, most games have at least the looks category covered.

      The games that people remember tend to be the good/original titles, e.g. Super Mario Bros., Zelda, Contra.

      Does anyone else have thoughts about this notion?
  • Wrong. (Score:5, Insightful)

    by The Living Fractal ( 162153 ) <banantarr@hot m a i l.com> on Tuesday July 25, 2006 @02:17PM (#15777628) Homepage
    The bottom line is simple: graphics makers must take a step back from feature brainstorming until the power issue is resolved.

    Today this is irrelevant. If consumers continue to purchase ever more power-hungry graphics cards, what is to stop the companies from making them? When the market actually changes and people start considering the power requirements of their cards, then I'll believe this statement about the bottom line. Because right now the only thing I hear from people building or buying new computers about the power requirements is "make sure you get a PFC PSU and get lots of watts", not "make sure you get a low-power GPU". For one thing, some people actually enjoy saying they have a 600+ watt PSU. I imagine that at current power costs this trend will continue. Do the math, it's not actually costing a person much more per month to go from a 600 to a 1000 watt PSU, especially since most people don't use their GPU at full power most of the time.
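
    To put a rough number on "do the math", here is a back-of-the-envelope sketch. The electricity rate, gaming hours, and wall-draw figures below are illustrative assumptions, not numbers from the article or this thread:

```python
# Back-of-the-envelope monthly electricity cost for a gaming rig.
# Assumed: ~$0.10/kWh (a typical mid-2000s US rate), 4 hours/day of use.
# A PSU's rating is a ceiling, not the actual draw, so we compare
# plausible real-world draws rather than 600 W vs 1000 W nameplates.

RATE_PER_KWH = 0.10    # dollars per kWh, assumed
HOURS_PER_DAY = 4      # assumed daily gaming time
DAYS_PER_MONTH = 30

def monthly_cost(draw_watts: float) -> float:
    kwh = draw_watts / 1000.0 * HOURS_PER_DAY * DAYS_PER_MONTH
    return kwh * RATE_PER_KWH

for watts in (200, 300, 450):
    print(f"{watts} W at the wall -> ${monthly_cost(watts):.2f}/month")
# 200 W -> $2.40, 300 W -> $3.60, 450 W -> $5.40: a few dollars either way.
```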

    Power requirements take a back seat to overall performance, and will continue to do so until electricity costs are driven up further. It's simple economics. People are willing to pay for the power-hungry cards, and until they're not, power consumption will continue to be less important to the producers than performance. This is analogous to today's vehicles, still being built and shipped with huge fuel-sucking engines. For many people, and I'd wager enough to sustain the market for years to come, the cost of energy (either liquid or electrical) is still low enough that they aren't going to give up their cherished power, be it piston-driven or transistor-driven.

    TLF
    • Re:Wrong. (Score:5, Interesting)

      by xanadu-xtroot.com ( 450073 ) <xanaduNO@SPAMinorbit.com> on Tuesday July 25, 2006 @02:23PM (#15777673) Homepage Journal
      If consumers continue to purchase ever more power-hungry graphics cards, what is to stop the companies from making them?

      And what choice do we currently have? If the companies made watt-friendly cards, I'm willing to bet people would buy them, especially for laptops. But they don't. We have no choice BUT to buy these double-bay, amp-eating juggernauts we have today.
      • Re:Wrong. (Score:5, Informative)

        by The Living Fractal ( 162153 ) <banantarr@hot m a i l.com> on Tuesday July 25, 2006 @02:33PM (#15777741) Homepage
        Well, TBH the companies are beginning to focus on this sector (mobile).

        A lot of the newer mobile GPUs (like the GeForce Go) are capable of greatly reducing their overall consumption when their total demand is low. They ramp up when needed.

        Of course this doesn't address the fact that, when needed, and when ramped up, they consume a lot of power. To which I say: yes, we need more power-efficient cards.

        This is unique to the mobile sector for now, but of course will eventually find its way into the entire realm of graphics computing.

        Unless of course we find a way to produce power more cheaply and abundantly than with hydrocarbons. In which case the only thing we'll care about then is cooling ;) But I suspect that could be a long ways off.

        TLF

        • Re:Wrong. (Score:1, Insightful)

          by Anonymous Coward
          A lot of the newer mobile GPUs (like the GeForce Go) are capable of greatly reducing their overall consumption when their total demand is low. They ramp up when needed.

          The other problem with this solution is that with Vista coming, and making use of the graphics card for its user interface, there will be very little downtime for the video card.

        • Imagine that electric power is nearly free, and 50kW PSUs are common. The problem is, those 50 kilowatts all turn into heat nearly instantly, and you have to do something to drive all that heat away from your house, or be boiled. Can you fancy a CS match sitting next to a furnace pipe that blows hot air away? %)

          Currently a moderate cooling system capable of cooling a 600W gaming rig costs $150-200. With 50kW systems, it's going to be much more expensive, large, and cumbersome; ask people in the neares
          • Look at it this way: Your monster PC will lower your heating oil bills in the winter. :)

            New project idea: how to turn a Windows cluster into an on-demand water heater.
          • Bigger is better. Period.
            In the PlayStation generation, nobody cares about power consumption.

        • A lot of the newer mobile GPUs (like the GeForce Go) are capable of greatly reducing their overall consumption when their total demand is low. They ramp up when needed.


          Mobile? The desktop ATI cards, including the twin-slot monoliths, have done this for years.
      • Re:Wrong. (Score:3, Insightful)

        by samkass ( 174571 )
        Plenty of folks make watt-friendly integrated graphics chipsets for laptops, including Intel. They just get laughed at by gamers. When it comes to graphics cards, the market still prefers performance over power consumption, and that's probably not going to change much anytime soon. Unlike the more complex instruction sets in PCs, I doubt graphics cards have a lot of optimization wiggle room when it comes to eking out more performance per watt. So you're pretty much left with die-shrinking, for which
      • And what choice do we currently have?

        You haven't bothered looking much, have you? Fanless video cards are available for the taking. They're quite prevalent in Home Theater PCs (HTPCs) due to low noise levels, and the lack of a fan pretty much puts a very conservative upper limit on how much juice it can pull. Even given those limitations, you can find cards that'll give you fairly decent performance. Just look at AnandTech's recent HD-DVD/Blu-ray video card comparison.

        The point is this: people do hav
        • You haven't bothered looking much, have you? Fanless video cards are available for the taking. They're quite prevalent in Home Theater PCs (HTPCs) due to low noise levels, and the lack of a fan pretty much puts a very conservative upper limit on how much juice it can pull.

          Depends on your viewpoint. Some of the graphics cards I own don't even need a bloody heat-sink, yet they do desktop things as well as cards made in the third millennium. (Or would have, if video RAM hadn't been so expensive back in 1996

          • Re:Wrong. (Score:3, Insightful)

            Anyway, from that viewpoint, a huge heat-sink signals high energy consumption.

            "High energy consumption" is a very relative term. When compared to the video cards sporting multiple 5000rpm fans, heatpipes, and three pounds of copper-cored heatsink fins, a little one-inch-by-one-inch heatsink covering the GPU generally signals low energy consumption.

            As for cards with no heatsinks at all, I think you'll agree that such animals are becoming very, very scarce these days, and they represent the barest fringe of
      • And what choice do we currently have?

        Buy two when they come out [newsforge.com].

        These soul-sucking proprietary vendors can piss off.
    • True on the supply/demand curve. As long as customers can brag that they have the best machines and the manufacturers make money, the trend will not stop. As with my previous post, it will drive the need for more graphically challenging games rather than games that are themselves challenging.
      • IMO it isn't about the cost of electricity. I don't think the consumer will ever care how many cents it costs to play WoW for an hour. Honest question: which costs more to run, the video card or the 21'' CRT gamers love so much? My guess is that graphics cards would have to eat loads more power to even come close to the power consumption of the monitor...

        If you want the issue to be one of cost, wattage isn't the problem. Consumers will only care when it becomes too expensive to buy both a
      • As long as customers can brag that they have the best machines and the manufacturers make money, the trend will not stop.

        You almost need top-of-the-line stuff to get a constant 60 FPS in games that have been coming out recently.

        Personally I would be nothing but happy if game developers slowed the rate at which they are improving graphics in games. Having to continuously purchase new hardware is really expensive, and the improved visuals rarely add anything of value to the game.

    • No kidding ... I live in a cold area, and during the winter I make extensive use of a 1500 watt electric heater. My electric bill is maybe $120 in the coldest month, including everything else as well. Who cares about buying a 600-1000 watt PC power supply, which isn't even going to be close to full load most of the time?
      • I live in a cold area, and during the winter I make extensive use of a 1500 watt electric heater. My electric bill is maybe $120 in the coldest month, including everything else as well. Who cares about buying a 600-1000 watt PC power supply, which isn't even going to be close to full load most of the time?

        Yeah, but note that using electricity for heating is expensive and wasteful, unless you have no other options for heating. If you had used district heating or something, the figures would have looked diffe

        • That actually strengthens my point ... when a grossly inefficient use of electricity for pure heating isn't enough to deliver an even mildly scary heating bill, who is going to worry about a drastically more efficient computer that probably uses only 1/10th of the power during normal use?
    • For one thing, some people actually enjoy saying they have a 600+ watt PSU
      These are the same people that buy 'Canyoneros' and Hummers -- probably for the same reason (I'll give you a hint -- it's not safety, Stumpy).
    • Re:Wrong. (Score:3, Insightful)

      by MrFlibbs ( 945469 )
      Yes, the market will decide the issue -- much like it did in the CPU market. Intel didn't drop the PIV lightly, but was forced to do so when the costs of pushing the power envelope were hurting them in the market. They fell behind AMD in performance because power limitations were slowing the clock speed increases they needed to keep up. Intel eventually saw the writing on the wall and went with a design where power consumption was a primary consideration.

      Eventually, the market will force GPUs down the sam
    • While it's true that most of the market isn't concerned with power, that doesn't mean they aren't painting themselves into a corner. Power consumption is one factor, but as these cards require huge amounts of power, they also need to dissipate huge amounts of heat.

      Now to play the latest games, the minimum requirement is a card that needs a leaf blower attached to cool it. So let's see here: they've got power-hungry computers that are loud as hell, to power PC games that are starting to struggle w
    • This is analogous to today's vehicles, still being built and shipped with huge fuel-sucking engines. - but many of these vehicles are unnecessarily thirsty! Consider the Ford Focus and the Hyundai Accent, two comparable vehicles in both size and performance. The Accent still beats the Focus in terms of fuel economy by at least 30% (personal experience.)
    • Re:Wrong. (Score:3, Informative)

      by evilviper ( 135110 )
      Because right now the only thing I hear from people building or buying new computers about the power requirements is "make sure you get a PFC PSU and get lots of watts", not "make sure you get a low-power GPU".

      No, you'll never hear "low-power GPU". You will, however, hear "fanless video card" ALL THE TIME, and it's effectively code for the same idea.

      Regular people understand the issues far more than geeks give them credit for.
      • Regular people understand the issues far more than geeks give them credit for.

        Yeah, regular people who don't know why they need a video card since they don't play videos on their machine. Regular people who want to know why you want them to bring in their 'hard drive', when it is the computer that is the problem. Regular people who are using Office 2003 as their operating system. Regular people who don't know why their computers aren't running during a blackout.

        These are the regular people.

        The peop
        • [...] 'hard drive', when it is the computer that is the problem.
          [...] using Office 2003 as their operating system.

          "Regular People" don't know the terminology, they don't know the difference between "RAM" and "Hard Drives", they don't know the difference between "Operating System" and "Programs", they wouldn't recognize a "hard drive" if it fell on them. Still, if you skip most of the terminology, average people are perfectly able to understand.

          Regular people who don't know why their computers aren't runnin

    • Re:Wrong. (Score:3, Informative)

      by asuffield ( 111848 )

      Do the math, it's not actually costing a person much more per month to go from a 600 to a 1000 watt PSU

      A 1000 watt PSU does not use 10/6 times as much power as a 600 watt PSU. There are two reasons for this:

      • The PSU only draws power proportional to the load on it. The rating is the maximum draw, not the minimum. Two PSUs of equal efficiency but different ratings, supplying the same load, should draw the same amount of power.
      • The ratings are lies anyway. The manufacturers add up the numbers in ways that make NO
      • Re:Wrong. (Score:3, Informative)

        by jandrese ( 485 )
        It's actually more complex than that, too. If you're running a 600W PSU near its limit, there's a good chance you could save a fair bit of energy by upgrading to the 1000W PSU, just because the efficiency of a PSU tends to go down as you get closer to its maximum load. Ultimately, the efficiency is what you're concerned about, not how many aggregate watts it can put out across all of the rails.
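
        To illustrate why the bigger unit can win here: wall draw is DC load divided by efficiency. The load and efficiency figures below are illustrative assumptions, not measurements; real curves vary by model.

```python
# Sketch: power drawn from the wall = DC load / PSU efficiency.
# Assumed efficiencies: ~72% for a 600 W unit near its limit vs ~82%
# for a 1000 W unit loafing at about half load. Illustrative only.

def wall_draw(dc_load_watts: float, efficiency: float) -> float:
    return dc_load_watts / efficiency

LOAD = 550  # watts of DC load, assumed for a heavily loaded gaming rig

print(f"600 W PSU  (~92% load, 72% eff): {wall_draw(LOAD, 0.72):.0f} W at the wall")
print(f"1000 W PSU (~55% load, 82% eff): {wall_draw(LOAD, 0.82):.0f} W at the wall")
# ~764 W vs ~671 W: the oversized unit wastes roughly 90 W less as heat.
```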
  • by Rob T Firefly ( 844560 ) on Tuesday July 25, 2006 @02:19PM (#15777647) Homepage Journal
    Thanks to the 30-40 separate power-chomping ads on each page of Tom's Hardware stories, the lights in my office dim whenever I accidentally hover my cursor over the words "graphics," "Microsoft," or "processor." Thanks, Tom!
  • by Bungleman ( 955072 ) on Tuesday July 25, 2006 @02:32PM (#15777733)
    I was thinking about this yesterday... I had downloaded a ROM of Chrono Trigger for the SNES, and I'm having a blast. When all the new games like Battlefield 2, Titan Quest, UT2004, and FEAR get old, I like to go back to the old games. So someone might say... why go back to the old games? They're old and pixelated. But they're FUN! The old classics like Chrono Trigger, Secret of Mana, the original Mario Bros., Zelda: A Link to the Past, Super Metroid... they don't make 'em like that anymore. And there's a generation of "gamers" coming up that have missed out on a lot because of that.

    Nowadays it's all about the graphics, and the gameplay tends to (but not always) suffer. Even the best of the best new games have these problems. FEAR? A pathetic 8-9 hours of gameplay, though it was pretty fun while it lasted. Oblivion? Tons of hours of gameplay, but completely SHALLOW in terms of the overall experience. Even Morrowind had this game beat IMO. Battlefield 2? Awesome graphics, and fun gameplay... oh, but don't try running more than a few bots on your machine unless you want to run at 2fps, and forget about coop play, and don't expect single player with more than 16 player maps (mods notwithstanding).

    It seems to me that the more games focus on graphics, the more they lose in other areas. They either have cut features, performance issues, lack of content, or something... this isn't always the case (think Half Life 2), but unfortunately we're paying for the 'shiny factor' more often and losing out on the content that made the old games fun. Maybe I'm getting too old, or maybe I'm just jaded, but I still miss the old style games.
    • You're right that there's a trade-off in resources involved with high-quality graphics, but you're operating under a false premise when you think that an emphasis on graphics is in any way new.

      Graphical output trumped text. VGA trumped CGA. Etc. Etc. The advance in graphical capabilities and the emphasis on graphics has always been with us. Why was Dragon's Lair wildly successful in 1983? Because it was the most extreme gameplay-for-graphics tradeoff ever.

      What is new is that the vast majority of big p
    • I was thinking about this yesterday... I had downloaded a ROM of Chrono Trigger for the SNES, and I'm having a blast. When all the new games like Battlefield 2, Titan Quest, UT2004, and FEAR get old, I like to go back to the old games. So someone might say... why go back to the old games? They're old and pixelated. But they're FUN! The old classics like Chrono Trigger, Secret of Mana, the original Mario Bros., Zelda: A Link to the Past, Super Metroid... they don't make 'em like that anymore. And there's a generation

  • When the new equipment requires more and more power, I am forced to add more and more fans to my system.

    I wouldn't care a bit about power consumption if it wasn't so closely connected to noise levels.
  • Perhaps it's just me, but I thought that folks started taking note of the power issue when faced with the need to plug their graphics adapters directly into their power supplies...
  • by benzapp ( 464105 ) on Tuesday July 25, 2006 @02:36PM (#15777778)
    I stopped reading the article after it started to suggest 1100 watt power supplies are necessary for this nonsense.

    I'm sorry. No video game is worth that much power.
    • by Hoi Polloi ( 522990 ) on Tuesday July 25, 2006 @03:45PM (#15778524) Journal
      I foresee a coming together of household technology. The CPU will also become the oven and the GPU will also become the water heater.

      Wait until you have to switch your PC from a regular 110V outlet to a round 220V outlet like the ones they use for electric ovens.

      Maybe if you had a little meter next to you that rang up how much you were paying for electricity since you turned on your PC, people would be more conservative. Right now it is a bit of a hidden cost, since it all gets lumped together into a monthly bill along with your AC, fridge, etc.
  • I also agree that the power required by video cards today is quite insane. To run a 7900GT alone requires a minimum of 21A on the 12v rail, and in SLI you'll need at least 28A on the 12v rail (this is just to run the video card without having it auto clock down for lack of power). I currently run one 7900GT and had bought a monster PSU (PC Power & Cooling) to support the future power needs of these gfx cards. Just to back up the craziness of the matter, whenever I turned my computer on it would o
    • To run a 7900GT alone requires a minimum of 21A on the 12v rail, and in SLI you'll need at least 28A on the 12v rail (this is just to run the video card without having it auto clock down for lack of power).

      I love it when people take RECOMMENDATIONS out of context and suddenly start calling them REQUIREMENTS.

      "MINIMUM SYSTEM REQUIREMENTS" are nothing of the sort. They are not the minimum requirements for the device to work, they are simply a VERY general "minimum" value with lots of play built-in.

      For
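
      For scale, converting the quoted rail recommendation into watts shows how much slack it includes. The card-draw figure below is a rough, commonly cited range for that class of card, assumed here rather than measured in this thread:

```python
# "21 A on the 12 V rail" is a whole-system recommendation, not the
# card's own draw. Convert the rail budget to watts to see the headroom.

rail_amps, rail_volts = 21, 12
print(f"Recommended 12 V budget: {rail_amps * rail_volts} W")  # 252 W

card_draw = 80  # watts; rough upper figure for a 7900GT-class card (assumed)
print(f"Approximate card draw:   {card_draw} W")
# The remaining ~170 W of headroom covers the CPU, drives, and fans that
# share the rail -- the "lots of play built-in" mentioned above.
```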
  • With CPU manufacturers chasing higher performance per watt and less heat production, you would think GPU makers won't be far behind, but consider the consumers. Your average XJoexGamerX doesn't care how high his mom's electric bill goes; he needs his 400 fps, goddamnit.
  • I want graphics cards to support things like NURBS more in the future. Instead of having to use polygons, and interpolating polygons from a spline before rendering, it would be nice to draw actual curves. Unfortunately, technology and algorithms haven't made this feasible yet (AFAIK).

    I wish it wasn't just workstation GPUs that got the good 2D and line drawing support (I guess it's not economical for consumer cards). I'm really interested in NPR (non-photorealistic rendering), but all graphics cards are concerned
  • by podperson ( 592944 ) on Tuesday July 25, 2006 @03:09PM (#15778153) Homepage
    "Tom's Hardware has put out a nice recap of where computer graphics have been and where they are headed in the near future."

    No. It's an article more-or-less solely devoted to discussing the issue of power consumption in new and upcoming graphics cards. It doesn't describe the state of the union or even have much to say about any shiny new toys beyond their likely impact on power consumption.

    It's an interesting article, but not the article that goes with its title nor the Slashdot summary.
  • by Hortensia Patel ( 101296 ) on Tuesday July 25, 2006 @03:16PM (#15778216)
    Not disputing anything in TFA, but there's another power-related annoyance that (IMHO) should be easier to address.

    When rendering in double-buffered mode with vsync on, the graphics card driver needs to wait for the display's vertical retrace before it swaps (or blits) the back buffer to the front. Today, all Windows drivers that I know of accomplish this with a spinlock. This means that an animated app grabs ALL available CPU cycles, even if the CPU time actually needed to redraw each frame is trivial, and thus the machine runs much hotter than it ought to for the amount of work being done.

    For a high-end game that stresses the system anyway, this isn't a big deal. For more modest games or non-game applets, it's embarrassing to have a single rotating triangle forcing the machine to run all-out, particularly on battery power.

    Application-level 'fixes' for this problem are very unsatisfactory - mostly trying to guess how long you've got until the next flip, Sleep()ing a bit and hoping you get woken up in time. It's clumsy, imprecise and the wrong place to be solving this. Why can't the driver wait on the flip - the flip it controls, for crying out loud - in some more efficient manner? (Can the new MWAIT instruction in EM64T chips help with situations like this?)
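
    For what it's worth, the usual application-level workaround looks something like the sketch below (plain Python for brevity; a real version would sit in C against the swap call). The refresh rate and the 2 ms slack are assumptions, and this is exactly the guess-and-Sleep() dance complained about above:

```python
import time

REFRESH_HZ = 60.0          # assumed display refresh rate
FRAME = 1.0 / REFRESH_HZ   # ~16.7 ms between vertical retraces
SLACK = 0.002              # wake ~2 ms early; OS sleeps are imprecise

def render_frame():
    pass  # placeholder: a trivial scene, e.g. one rotating triangle

next_flip = time.perf_counter() + FRAME
for _ in range(600):       # ~10 seconds of animation
    render_frame()
    # Sleep through most of the wait instead of spinning the CPU flat out...
    remaining = next_flip - time.perf_counter()
    if remaining > SLACK:
        time.sleep(remaining - SLACK)
    # ...then busy-wait only the last couple of milliseconds, because the
    # scheduler may not wake the thread exactly on time.
    while time.perf_counter() < next_flip:
        pass
    # The buffer swap would go here; schedule the next flip.
    next_flip += FRAME
```

    Even done carefully, this only shrinks the spin window rather than eliminating it, which is why the wait really belongs in the driver.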
  • Simply put, newer fancier faster bigger better prettier technology is sexy and sells both upgrades and new cards.

    "Better power management" gets funny looks.

    Funny look != profits

    Step 2. is NEVER "generate funny looks"
  • Consider:

    1. GPUs have higher transistor counts than modern CPUs.

    2. The development cycle for GPUs is much shorter than that of CPUs.

    3. The shelf life of GPU designs is much shorter than CPU designs (the C2D is a direct descendant of the P6; the Pentium 4 architecture is an evolutionary dead end).

    Given the preceding, it is unlikely that a reduction in power consumption will be the focus of GPU companies in the future; it would be suicide in a market which demands performance above all else. nVidia has shown that there ar

  • by DoctaWatson ( 38667 ) on Tuesday July 25, 2006 @04:47PM (#15779147)
    For all the complaining that Tom's does about the escalation of video card power usage, you don't see them benchmarking peak power consumption on their video card comparisons. It's all framerates and synthetics.

    Why would a PC builder take power usage into consideration if the major review sites don't?
  • Heat is entirely an issue of process tech. The chip designers do not run the fab. They do not invent SOI, and they do not magic up new lithographic techniques. Should the chip designers sit around with their collective thumb up their collective ass while the materials scientists poke silicon around with a nanotube for 6 months?

    Heat is the boundary of performance. The entire chip industry spends every day working out how to make things cooler per unit of performance so that they may increase total perform
  • What choice is there right now?

    I don't want 400fps at max settings, I just want something relatively new that can run 3 year old games decently without turning the room into a sauna. Unfortunately with the things I've heard about Matrox and S3 it doesn't look like I've got much of a choice :/
    • Get an Nvidia 7600GS. Not the GT, the GS. Almost all I've seen have been fanless, and they should run even new games at lower resolutions. Also, EVGA makes a fanless 7600GT if you need the extra oomph.
  • by DrXym ( 126579 ) on Wednesday July 26, 2006 @06:17AM (#15782629)
    Modern PCs consume *horrible* amounts of power. I bet that if power consumption were taxed, consumption could miraculously drop by a third without any loss in performance. Suddenly you would find hardware & software makers flipping on the power-saving functionality by default rather than expecting people to find it, and the Nvidias & ATIs of this world producing desktop GPUs with performance characteristics closer to their laptop versions. If Intel can produce CPUs that consume less power than the last generation, then the GPU makers sure as hell can too. Who knows, it might even lead to cheaper graphics cards, since they won't need so much circuitry, including power connectors and massive fans to keep them cool.
