
PS3 8x More Power Hungry Than PS2

MonsieurCreosote writes "The PlayStation 3 apparently demands eight times as much electricity as the PlayStation 2, and more than twice as much as the Xbox 360. It also consumes much more power than a top-end PC gaming rig. It's not clear what's causing the massive drain, but Sony is now denying reports that the PS3 experienced overheating problems at the Tokyo Game Show last month. From the article: 'While an Intel Core 2 Duo PC with a high-end graphics card chews politely on a 160-watt entrée, the PlayStation 3 gorges itself on 380 watts... The extra power consumption of the PS3 over the PS2 suggests that we're not really getting much better at designing efficient systems, we're just pumping more "fuel" into existing paradigms.' Are modern console hardware designers getting sloppy?"
  • by Umuri ( 897961 ) on Wednesday November 01, 2006 @03:21PM (#16677279)
    Sony is obviously trying to extend from home electronics into the home heating business. Since most kids never move from their gaming consoles, these will remotely heat just the area immediately around them, saving you tons on heating for kids who never use the rest of the house!
    • Well, given that 3 other guys and I never had to turn on the heat in an apartment in Connecticut during the winter, due to the 80 degree temperature inside our apartment, I can see that happening.

      Though in our case it was due to 4 computers running 24/7, 3 laptops, 2 TVs, 2 of each game console and 2 complete stereo systems. Total power consumption if we had everything turned on at once would have been 3 or 4 kilowatts.
  • by Control Group ( 105494 ) on Wednesday November 01, 2006 @03:24PM (#16677337) Homepage
    Included in your $600 is a miniature power plant that runs on burning batteries.
  • You see performance comparisons all the time, and websites dedicated to them, but how often do you get someone comparing the power drain? Or any sort of benchmark using consoles, for that matter.
    • by joshetc ( 955226 )
      I think the reason is that the power drain of consoles is more or less static and isn't a matter of opinion, unlike the other views on the product. Expecting otherwise would be like expecting people to argue over whether this 2.8GHz processor is really 2.8GHz; the BIOS says it is 2800MHz, so that is what it is...
  • The Cell has about 20x the processing power of a Core Duo with a high-end graphics card combined. Add nVidia's RSX and you're looking at a system which has stayed within the budget (even less) for systems of today.

    In the future they'll process-shrink it and cost-reduce it, and the PS3 will end up using less power. But you can't accomplish what they want to do over a 5-year span, without forcing everyone to buy a new PS3 every year, unless you hammer the electricity grid now :)

    • Re: (Score:3, Funny)

      by bigman2003 ( 671309 )
      Why stop at 20X?

      If you're making up numbers, you should go for something bigger, like 100X, or 1,000X. Or maybe even A Gajillion Times Faster.

      • Re: (Score:3, Informative)

        by NekoXP ( 67564 )
        Simple floating point power of the SIMD units is easy enough to benchmark.

        Core Duo has one SIMD unit. Cell has 10 (7 SPE and 3 AltiVec). You can Google for this stuff fairly easily. It's even the same benchmarks.

        In 5 years the Core Duo will be just as fast; and everyone will have bought a new PC (or two) to get it. Sony has to put this chip out now, so that it will still be relevant in the MIDDLE of the console lifecycle.
        • Please learn something about computing and stop taking one unit of an entire system and assuming that's equal to overall performance. It'd be like saying you're 10X smarter than someone else because you can complete those pattern-matching puzzles 10X faster.
        • Simple floating point power of the SIMD units is easy enough to benchmark.

          Core Duo has one SIMD unit. Cell has 10 (7 SPE and 3 AltiVec). You can Google for this stuff fairly easily. It's even the same benchmarks.

          In 5 years the Core Duo will be just as fast; and everyone will have bought a new PC (or two) to get it. Sony has to put this chip out now, so that it will still be relevant in the MIDDLE of the console lifecycle.


          Core Duo has two SIMD units (one per core). Of course the SIMD performance of Cell is exce
    • by ProppaT ( 557551 )
      I'm sure that Kaz is glad that someone's buying into his hype.
    • Re: (Score:3, Insightful)

      The Cell has about 20x the processing power of a Core Duo with a high-end graphics card combined. Add nVidia's RSX and you're looking at a system which has stayed within the budget (even less) for systems of today.


      Wow ... that is either the dumbest or funniest thing I have read in a long time ...

      The simple fact is that the Cell processor is (probably) very similar in performance to most processors that are similar in size and use a similar manufacturing process; the variations in design will allow for certain
    • "The Cell has about 20x the processing power as a Core Duo with a high-end graphics card combined." Where did you get this idea? Maybe you should have read the article here --> http://gprime.net/board/archive/index.php/t-5989.h tml [gprime.net] that was pulled from anandtech.
    • "The Cell has about 20x the processing power as a Core Duo with a high-end graphics card combined."

      Where did you get this idea? Maybe you should have read the article here --> http://gprime.net/board/archive/index.php/t-5989.h tml [gprime.net] [gprime.net] that was pulled from anandtech.
  • by NineNine ( 235196 ) on Wednesday November 01, 2006 @03:26PM (#16677407)
    C'mon guys. This is getting ridiculous. First off, this is just a game console. I don't understand how ANYBODY could feel that strongly one way or another about a game console. It's a plastic and metal box for playing GAMES. Secondly, the quality of Slashdot "reporting" is getting really, really, really, REALLY bad. The ONLY reason I still come here is to interact with like-minded people. The articles (like the constant "PS3 suxors" stuff that this article is) are worse than 50% of the personal blogs out there right now.
    • "The articles (like the constant "PS3 suxors" stuff that this article is) are worse than 50% of the personal blogs out there right now."

      The "PS3 SUXORS" articles are about all the press Sony's getting lately. It's not Slashdot's fault that Sony tripped over their own two feet, fell down a flight of stairs, slipped on a banana peel, and didn't run far enough after lighting a stick of dynamite.

      You don't understand how Slashdot can be so biased against Sony, I don't understand how anybody (even the biggest So
      • "missteps"? Who cares? I'm not a stockholder. I'm not an employee. I'm just a guy who likes to play video games. If the PS3 is as good as it's supposed to be, I'll buy one and enjoy it. I honestly don't care one way or the other about what the press has been saying about what's going on behind the scenes. That's like being interested in the production problems with a hairbrush. It's about as interesting (and relevant to anybody who isn't an owner) as watching grass grow.

        Power consumption? Jesus chr
        • ""missteps"? Who cares? I'm not a stockholder. I'm not an employee. I'm just a guy who likes to play video games."

          You don't care about the $599 price tag, rootkits, or the unfair shutting down of Lik-Sang?

          Ok... That's fine. You, however, don't represent the world.

          "It uses more power. Big deal. I'd wager that there is not a single person on the planet that will not buy a PS3 because of power consumption. Not one."

          That's up to the individuals to decide. I personally don't like the idea of spending $600 on a
        • by Perseid ( 660451 )
          It matters. Indirectly, it matters a lot. The question you have to ask yourself is why this is happening. Does Sony have a good reason for this insane power consumption? Maybe. I can't think of one. Nobody in this thread seems to have been able to think of one. So does that point to bad hardware design? If so, what else is poorly designed?
  • so what (Score:2, Informative)

    by j00r0m4nc3r ( 959816 )
    Are modern console hardware designers getting sloppy?

    I don't think consumers care much about power consumption. If I can design something cheaper and faster, but hotter, and the consumer doesn't care, why wouldn't I do it? Lower costs. Higher profits. Booyah! More Ferraris for my garage.
  • "Are modern console hardware designers getting sloppy?"
    correction
    Is this modern console hardware designer getting sloppy?
  • by _xeno_ ( 155264 ) on Wednesday November 01, 2006 @03:29PM (#16677475) Homepage Journal

    ...that trying to run 8 cores at once might be what's causing the power drain.

    The real question is, of course, are any games going to actually make use of the eight cores? Video games aren't really known for being very parallel-friendly - you might make an excuse for five threads (logic, graphics, sound, controller I/O, and disk I/O), but generally they're fairly serial processes. While updating the game logic, you don't want to draw a frame using half-updated information.

    Ultimately, you have to wonder if Sony's decision to go with the Cell and use Blu-Ray was really that intelligent - most of the cost and production problems can be traced to them, and they provide very little real benefit to the end-user.

    • TFA is wrong... (Score:5, Insightful)

      by ivan256 ( 17499 ) on Wednesday November 01, 2006 @03:36PM (#16677655)
      If you dig down through the four layers of links to the original source, you will see that they came up with the 380 watt number by multiplying the amperage number by the voltage number on the power supply label. That gives you the peak draw that the power supply is capable of, which is probably not even close to average consumption.

      I have a 600 watt power supply in my PC, but even when I'm gaming it drinks in only 250 or so watts of power. The only time it gets even close to the 600 watt mark is for a fraction of a second after power up. I'll bet the PS3 only comes close to 380 watts for about the same amount of time right after powerup.
      • Source (Score:3, Informative)

        by ivan256 ( 17499 )
        Here's the link to the original source before it went through a 4-blog telephone game:

        http://www.jp.playstation.com/support/qa-591.html [playstation.com]
        • and for those who need a translation [google.com] :) ...

          What it seems to imply is that the maximum load it would draw is 380W.

          Gee ... it's good we don't need 400W and 500W power supplies for our desktops now. ... oh wait, a lot of us DO.
      • by G-funk ( 22712 )
        But if they're comparing it to the peak numbers of the PS2, it's still using around 8 times as much power. Just neither is using as much as they say.
      • I went back through CNET, ZDNet, and god knows whatever blogs to try and find a source too, and I'll be damned if someone hadn't done it already. Max power draw means nothing.
    • by Megane ( 129182 )

      The real question is, of course, are any games going to actually make use of the eight cores?

      Well, first, the answer is NO. Why? Because Sony is officially declaring one SPE unusable to increase chip yield. Which means that even two years from now when there are no yield problems, games will still be limited to seven cores. Except even that's not right, because Sony is reserving another core (or two? I can't remember) for use by their libraries, etc.

      So a game can't use more than six cores for its own

      • by ivan256 ( 17499 )
        Except even that's not right, because Sony is reserving another core (or two? I can't remember) for use by their libraries, etc.

        That just means that when the game gets backgrounded (like when you push the system button on the PSP) the game will have to give up two cores while the menu is active. Unfortunately, people like you keep misreporting this functionality after hearing about it third-hand.
        • by Megane ( 129182 )

          Uh, no, actually I heard it first-hand at the Austin game developers conference back in September. One SPE will be unusable for yield purposes, another will be reserved for the OS, and that's Sony's official word to developers.

          Unfortunately, fanbois like you keep denying this lack of functionality no matter how many times you are told about it.

          • by ivan256 ( 17499 )
            I'm not disputing that one will be disabled to increase yields. I'm merely stating that you are misinterpreting the reserved portion.

            If you want to argue about this with a fanboy, you are in the wrong thread. If you want to be a racist you're going to have to talk to somebody else. Even if you're right, there are still 6 cores left for games. You would have to be an (Xbox|Wii) fanboy to consider that a 'lack of functionality'.
    • The real question is, of course, are any games going to actually make use of the eight cores? Video games aren't really known for being very parallel-friendly - you might make an excuse for five threads (logic, graphics, sound, controller I/O, and disk I/O), but generally they're fairly serial processes. While updating the game logic, you don't want to draw a frame using half-updated information.

      Well to be fair, 8 processing elements are typically not available to video game developers, so you have a bit

    • by dnaumov ( 453672 )
      Actually, I believe it will be quite easy to use up all the cores. I recall reading an interview with some game designer (I believe it was someone working on one of the Fable games) and he was asked about features they had to drop before shipping the game: one interesting feature he mentioned was each and every tree growing individually in the game world. With the game world being huge and the amount of trees being rather big as well, they realised that even when optimised in the best possible way, it would
      • I'm no programmer and I don't know much about Fable, but couldn't they take a cool yet ultimately useless feature like that, and only have it run in otherwise idle cycles? It might not be as constant and dynamic as they had originally hoped, but it sounds like there could've been a decent compromise in there. I wouldn't be surprised to hear that other issues (budget, time) factored into their decision, but it could've been cool.
    • Re: (Score:3, Informative)

      by DrXym ( 126579 )
      The Cell was designed to draw 30 watts, which is considerably less than a conventional processor. It emits less heat too. By way of comparison, the 3-core PPC in the 360 supposedly draws anywhere between 80-120 watts, which is normal for that kind of chip.

      This whole story stinks and it is just being bounced around by people less interested in whether it is true or not than putting the PS3 down.

    • Modern video games are EXTREMELY parallel-friendly.

      You clearly are not familiar with the workings of vertex or pixel shaders.

      The Xbox 360 performs automatic shader workload balancing. Let's say I have 10,000 pixels to shade. Each pixel is shaded completely independently of the other 9,999 pixels, so I could very easily shade 5,000 on each core of a chip.

      Additionally, Direct3D queues draw calls. The "Present()" function typically returns instantly as the game logic is permitted to get a frame or two ahead of
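
      A minimal sketch of that embarrassingly parallel split (assuming a hypothetical shade_pixel() work function and POSIX threads; this illustrates the idea only, not how the 360 actually schedules shader work):

      #include <pthread.h>
      #include <stddef.h>

      #define NUM_PIXELS  10000
      #define NUM_WORKERS 2                  /* one worker per core */

      static float framebuffer[NUM_PIXELS];

      /* Stand-in for real shading work; each pixel is independent. */
      static void shade_pixel(size_t i) { framebuffer[i] = (float)i; }

      struct span { size_t begin, end; };

      static void *shade_span(void *arg)
      {
          struct span *s = arg;
          for (size_t i = s->begin; i < s->end; i++)
              shade_pixel(i);                /* disjoint ranges: no locks needed */
          return NULL;
      }

      int main(void)
      {
          pthread_t tid[NUM_WORKERS];
          struct span spans[NUM_WORKERS];
          size_t chunk = NUM_PIXELS / NUM_WORKERS;   /* 5,000 pixels per core */

          for (size_t t = 0; t < NUM_WORKERS; t++) {
              spans[t].begin = t * chunk;
              spans[t].end   = (t == NUM_WORKERS - 1) ? NUM_PIXELS
                                                      : (t + 1) * chunk;
              pthread_create(&tid[t], NULL, shade_span, &spans[t]);
          }
          for (size_t t = 0; t < NUM_WORKERS; t++)
              pthread_join(tid[t], NULL);
          return 0;
      }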
      • by _xeno_ ( 155264 )

        You mean the parts that people generally offload to another processor are easily parallelized? The Cell doesn't do the graphics rendering in the PS3, it doesn't do the pixel-shaders, it doesn't do anything like that. Remember the Slashdot story ages ago about how the PS3 had abysmally low read speeds from graphics memory? The Cell won't be used for anything graphics-related beyond feeding data to the GPU.

        Of course, having a separate GPU from the CPU isn't exactly new.

        Trying to spread game logic among

  • by dextromulous ( 627459 ) on Wednesday November 01, 2006 @03:30PM (#16677513) Homepage
    The extra power consumption of the PS3 over the PS2 suggests that we're not really getting much better at designing efficient systems
    With the PS2 at 6.2 GFLOPS [wikipedia.org] and the PS3 at 2.18 TFLOPS [wikipedia.org] you're looking at about a 350x performance increase (yeah, I know flops aren't exactly meaningful, but it's the only metric I can see right now). In order for the PS3 to be "less efficient" than the PS2 it would need to consume over 15kW!
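
    A quick sanity check of those numbers, taking the summary's "8x the PS2" figure at face value to estimate the PS2's draw (a throwaway sketch, not a benchmark):

    #include <stdio.h>

    int main(void)
    {
        double ps2_gflops = 6.2;            /* Wikipedia figure cited above */
        double ps3_gflops = 2180.0;         /* 2.18 TFLOPS */
        double ps2_watts  = 380.0 / 8.0;    /* summary's "8x the PS2" claim */

        double speedup = ps3_gflops / ps2_gflops;            /* ~352x */
        printf("FLOPS ratio: %.0fx\n", speedup);
        printf("break-even draw: %.1f kW\n", ps2_watts * speedup / 1000.0);
        return 0;                           /* prints roughly 16.7 kW */
    }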
  • No (Score:4, Informative)

    by Troed ( 102527 ) on Wednesday November 01, 2006 @03:31PM (#16677525) Homepage Journal
    Max power rating of the PSU does not equal power used in normal operation

    You've been trolled - most likely by someone paid by Microsoft

    • by Detritus ( 11846 )
      True, but Sony isn't going to spec a power supply that is substantially larger than what is needed by the console.
  • Pointing out that the PS3, given what's inside, is more power hungry than a PS2 is like pointing out that the sun is hot.
  • by AKAImBatman ( 238306 ) * <akaimbatman AT gmail DOT com> on Wednesday November 01, 2006 @03:39PM (#16677747) Homepage Journal
    Are modern console hardware designers getting sloppy?

    Only if you consider a console with more processing power than older Cray Supercomputers for a fraction of the energy cost to be "sloppy". Let me put that in context to explain what I mean.

    One of the things that Digital pioneered with its Alpha chips was clocking CPUs at incredibly high speeds (for the time), easily breaching 200MHz. With the fabrication technology of the time, however, such high speeds were found to have major issues with problems like metastability [wikipedia.org]. By upping the amount of power applied to the chip, they found that they could force the logic to switch faster and thus reduce these issues. This research was the basis for modern chip design: the more power you apply, the faster you can clock the CPU. (With various caveats freely sprinkled in.)
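
    (For reference, the textbook first-order model behind that trade-off: dynamic switching power is roughly P ≈ α · C · V² · f, where α is the activity factor, C the switched capacitance, V the supply voltage, and f the clock frequency. Since hitting a higher clock usually means raising the voltage as well, power grows much faster than linearly with frequency.)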

    Now put yourself in Sony's place. You decide you want to build the most powerful game console EVER; with cost being no barrier. So you go and pick up this super-computer-on-a-chip technology from IBM. (The Cell) You then ask NVidia for their latest GPU technology to combine with that processor. You then take a look at the system, to decide how high you should clock it. You decide to max out the GPU for MAXIMUM PERFORMANCE. (Who wouldn't?) So you're now chewing upwards of 100 watts just on your GPU. Then you decide that a power friendly 1.5GHz isn't going to cut it in this competitive race. (Especially if you've got spies over at Microsoft, who are reporting back 3GHz chips.) So you look at it, and decide to ramp up for MAXIMUM CPU PERFORMANCE. Now you've got 3GHz, but your CPU is also using 100+ watts.

    So it's really no surprise that the PS3 is consuming so much power. The real issue is whether the super-computer-on-a-chip idea was really the way to go. Some people seem to think so. Some even believe that it's a requirement to hit 1080p resolutions. Only time will prove them out, though. In the meantime, Sony is banking on the consumers being taken with an uber-powerful system. If they'll purchase Aibos and HDTVs, they'll purchase a $600 PS3, right?

    Separate Note: Of course, Sony keeps shooting themselves in the foot. This strategy *might* have worked reasonably well if confidence in Sony was still high. But with people boycotting them over everything from rootkits to Lik-Sang, PLUS Sony's extremely poor E3 presentations, PLUS their general arrogance when handling the public, I seriously doubt that they're going far this generation.

    • Only if you consider a console with more processing power than older Cray Supercomputers for a fraction of the energy cost to be "sloppy". Let me put that in context to explain what I mean.

      No, the question is whether this is an efficient use of power today. Comparing the power usage of a PS3 to a Cray is totally irrelevant. If I designed a solar panel that was 5% efficient would I say "No, it's SUPER DUPER efficient compared to the solar panels of 30 years ago"? No, I'd compare its efficiency with today's sola
      • You know you're in trouble when there are more people are arguing over exactly why your console is going to fail than there are people arguing over whether or not it will fail in the first place.
      • No, the question is whether this is an efficient use of power today.

        In context, we are. The Xbox 360 is no power slouch itself (~160 watts). Now if you compare 3 cores + GPU to 8 cores + GPU, it becomes clear that the PS3 is simply going to draw more power. A lot more. Proportionally, it should be drawing ~60% more power. The PS3 PSU is proportional (~57% more capacity) to the Xbox 360's. We don't know how much of a safety margin was built into these machines, so the actual difference in power usag

        • Proportionally, it should be drawing ~60% more power.

          The cores are not similar enough to be compared that way. It's like saying that since a diesel locomotive has 12 cylinders and my Honda has 4, the locomotive ought to use 3 times as much fuel as my Honda.

  • Is it a Republican? Thanks, ladies and gentlemen I'll be here all night. Try the shrimp and remember to tip your wait staff! ;)

    I'm sorry, I know better but I couldn't resist :)
  • - reports of people saying the PS3 is barely better than an Xbox 360, the Xbox 360 already being extremely noisy (at least with Final Fantasy XI)
    - the Wii is 2-3 times more powerful than a Gamecube while requiring half to a third of the power
    - the Nintendo DS can play for hours and hours on a single charge, not really so with the PSP

    More expensive = more heat, more power required, less battery life (if applicable)?

    What good is HD graphics if you have to keep the same quality per pixel as the
  • Pure FUD (Score:5, Interesting)

    by Ender Ryan ( 79406 ) <TOKYO minus city> on Wednesday November 01, 2006 @03:47PM (#16677949) Journal
    This entire story is pure FUD.

    The PS3 has a 380 Watt PSU. There is no info here about what the actual power draw is likely to be at most times.

    For comparison, my gaming PC has a 600 Watt PSU. IIRC, with my hardware, it should be peaking at about 250 Watts while running games.

    • Re: (Score:3, Informative)

      by Sketch ( 2817 )
      However, the PS3 is probably a lot less expandable than your PC, so it doesn't need an overkill power supply. Sony knows how much power it needs, and they aren't going to waste money putting in a bigger PSU than it needs.

      Of course, with 8 cores, chances are it will not spend much time at maximum power usage...
    • my gaming PC has a 600 Watt PSU. IIRC, with my hardware, it should be peaking at about 250 Watts while running games.

      If you're only expecting to pull 250W while doing strenuous computation like playing games, it's a waste of money to have a 600W PSU. You could do exactly as well with a comparable PSU rated for 300W.

      You built your gaming rig; Sony is building millions of them. If they're using such heavy-duty power supplies, it's either because:
      1. they're intentionally buying the most expensive compo
  • That's huge. So more than $50 of the $600 price is the power supply.

    I don't think customers at higher latitudes will complain, not in Canada and not in winter. But not all sockets and power bars will be able to handle that.

    They should add a metal plate on top for metallic coffee mugs. If they use a water cooled system I could reroute the water to my water blanket and go camping with the PS3:

    #include <unistd.h>

    int main(void)
    {
        /* 8 forks double the process count each time: 2^8 = 256 spinning copies */
        for (int x = 0; x < 8; x++)
            fork();
        while (1)
            ;
    }
  • Utter bollocks (Score:5, Informative)

    by DrXym ( 126579 ) on Wednesday November 01, 2006 @05:57PM (#16680533)
    This story is utter bollocks that has been parroted from one blog to another without the slightest thought given to checking whether it might be true. Just follow the links and speculation from blogs all the way down.

    How were these figures calculated? By taking the 127 from the 100-127V range supported by the PSU and multiplying by 3A to get 381. 3 amps is what the FCC label says. But since the PS3 runs in Japan at 100V, the PS3 must demand at most 300 watts. At most. And that's just the PSU's rating; it doesn't mean the device actually draws that much power.

    By way of illustration, the Xbox 360 PSU is rated at 5 amps. 5 times 127V = 635 watts. So why no stories about the Xbox demanding 635 watts? Why no stories saying the PS3 actually uses half the power of the 360? Because the Xbox 360 consumes 160 watts in normal usage. It is entirely misleading to look at what the PSU can deliver to determine what the device actually uses.

    The same will be true of the PS3. Unless some reputable site such as Ars Technica, Tom's Hardware, etc. sticks a probe in the thing and states what power it actually draws, this story should be treated as bollocks. Bollocks swallowed whole by Zonk, as usual.
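
    For anyone who wants to replay the nameplate arithmetic that spawned the headline (pure label math as described above, not measured draw):

    #include <stdio.h>

    int main(void)
    {
        /* The blogs' method: top of the voltage range times label amps. */
        printf("PS3 label:      %g W\n", 127.0 * 3.0);   /* 381 W */
        /* The same method applied to the Xbox 360's 5 A label... */
        printf("Xbox 360 label: %g W\n", 127.0 * 5.0);   /* 635 W */
        /* ...versus the ~160 W the 360 actually consumes in normal use. */
        return 0;
    }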

  • Could any other supposedly technical editor have let a story like this slip through? Another case for the 'zonked' files.
  • by iamghetto ( 450099 ) on Wednesday November 01, 2006 @06:38PM (#16681259) Homepage
    The actual root of this story is someone's blog entry. True story. And now the story has been repeated and repeated and repeated, and it's apparently become a fact without context. The only fact is that the PS3's power supply has a peak rating of 380W. It doesn't draw that much power at all times. Comparing the PS3's max power consumption to the max power of a single Core Duo CPU seems disingenuous at best. Remember, the PS3 is an entire system: Cell processor, video card, and HDD... So it has the components of a computer and it consumes a computer-esque amount of power. Maybe I am the only person who doesn't see this power consumption as relevant. I get that it will increase my power bill by a few dollars every month, maybe even a few more dollars than the Xbox 360, but that's OK.

    And this is an irrelevant fact, but I'd be curious to see the power consumption of a non-core Xbox 360 powering an HDD and also requiring another outlet for its HD-DVD add-on. I'd be surprised if we didn't see that ~200W for a core system creeping up into the 300W+ range as well.

    At any rate, this story seems like a non-story to me.
