PS3 8x More Power Hungry Than PS2
MonsieurCreosote writes "The Playstation 3 apparently demands eight times as much electricity as the Playstation 2, and more than twice as much as the Xbox 360. It also consumes much more power than a top-end PC gaming rig. It's not clear what's causing the massive drain, but Sony is now denying reports that the PS3 experienced overheating problems at the Tokyo Game Show last month. From the article: 'While an Intel Core 2 Duo PC with a high-end graphics card chews politely on a 160 watt entrée, the PlayStation 3 gorges itself on 380 watts... The extra power consumption of the PS3 over the PS2 suggests that we're not really getting much better at designing efficient systems, we're just pumping more 'fuel' into existing paradigms.' Are modern console hardware designers getting sloppy?"
It's the new all-in-one solution. (Score:5, Funny)
Re: (Score:2)
Though in our case it was due to 4 computers running 24/7, 3 laptops, 2 TVs, 2 of each game console and 2 complete stereo systems. Total power consumption if we had everything turned on at once would have been 3 or 4 kilowatts.
Yeah, but it's ok... (Score:5, Funny)
Everyone always ignores this bit. (Score:2)
Re: (Score:2)
Not fair comparison (Score:2)
In the future they'll process-reduce it and cost-reduce it, and the PS3 will end up using less power. But you can't accomplish what they want to do over five years without either forcing everyone to buy a new PS3 every year or hammering the electricity grid now.
Re: (Score:3, Funny)
If you're making up numbers, you should go for something bigger, like 100X, or 1,000X. Or maybe even A Gajillion Times Faster.
Re: (Score:3, Informative)
Core Duo has one SIMD unit. Cell has 10 (7 SPE and 3 AltiVec). You can Google for this stuff fairly easily. It's even the same benchmarks.
In 5 years the Core Duo will be just as fast; and everyone will have bought a new PC (or two) to get it. Sony has to put this chip out now, so that it will still be relevant in the MIDDLE of the console lifecycle.
Re:dumb comparison (Score:2)
Re: (Score:2)
Core Duo has one SIMD unit. Cell has 10 (7 SPE and 3 AltiVec). You can Google for this stuff fairly easily. It's even the same benchmarks. In 5 years the Core Duo will be just as fast; and everyone will have bought a new PC (or two) to get it. Sony has to put this chip out now, so that it will still be relevant in the MIDDLE of the console lifecycle.
Core Duo has two SIMD units (one per core). Of course the SIMD performance of Cell is exce
Re: (Score:2)
Re: (Score:3, Insightful)
Wow
The simple fact is that the Cell processor is (probably) very similar in performance to most processors that are similar in size and use a similar manufacturing process; the variations in design will allow for certain
Re: (Score:2)
Re: (Score:2)
Where did you get this idea? Maybe you should have read the article here --> http://gprime.net/board/archive/index.php/t-5989.
Re: (Score:2)
Cell in the PS3 has 7 SPE units at 3.2GHz, and an AltiVec unit on the PPE. Add in nVidia's RSX, which is about the same as their GeForce 8xxx series is supposed to be.
http://www.netlib.org/utk/people/JackDongarra/PAPERS/cell-linpack-2006.pdf [netlib.org]
There's a good performance white paper on the Cell. Peak performance of all the units put together is something like 250 GFLOPS. Real-world performance is about 100 GFLOPS in the standard BLAS benchmark. Pages 11 and 13 have pretty graphs.
http:// [linuxclust...titute.org]
Re: (Score:2, Informative)
The first paper is about adapting LINPACK/LAPACK to the Cell processor. If you read carefully, they are running single precision for that "amazing" ~100 GFLOPS; double precision drops off quite considerably, to roughly 10 GFLOPS, although the scale of their graph makes it difficult to determine accurately (page 13). It should also be noted that they are tuning all of this i
Another PS3 slam? (Score:3)
Re: (Score:2)
The "PS3 SUXORS" articles are about all the press Sony's getting lately. It's not Slashdot's fault that Sony tripped over their own two feet, fell down a flight of stairs, slipped on a banana peel, and didn't run far enough after lighting a stick of dynamite.
You don't understand how Slashdot can be so biased against Sony; I don't understand how anybody (even the biggest So
Re: (Score:2)
Power consumption? Jesus chr
Re: (Score:2)
You don't care about the $599 price tag, rootkits, or the unfair shutting down of Lik-Sang?
Ok... That's fine. You, however, don't represent the world.
"It uses more power. Big deal. I'd wager that there is not a single person on the planet that will not buy a PS3 because of power consumption. Not one."
That's up to the individuals to decide. I personally don't like the idea of spending $600 on a
Re: (Score:2)
so what (Score:2, Informative)
I don't think consumers care much about power consumption. If I can design something cheaper and faster, but hotter, and the consumer doesn't care, why wouldn't I do it? Lower bottom line. Higher profits. Booyah! More Ferraris for my garage.
correction (Score:2)
correction
Is this modern console hardware designer getting sloppy?
Re: (Score:2)
Re: (Score:2)
I'd hazard a guess... (Score:3, Interesting)
...that trying to run 8 cores at once might be what's causing the power drain.
The real question is, of course, are any games going to actually make use of the eight cores? Video games aren't really known for being very parallel-friendly - you might make an excuse for five threads (logic, graphics, sound, controller I/O, and disk I/O), but generally they're fairly serial processes. While updating the game logic, you don't want to draw a frame using half-updated information.
Ultimately, you have to wonder if Sony's decision to go with the Cell and use Blu-Ray was really that intelligent - most of the cost and production problems can be traced to them, and they provide very little real benefit to the end-user.
TFA is wrong... (Score:5, Insightful)
I have a 600 watt power supply in my PC, but even when I'm gaming it drinks in only 250 or so watts of power. The only time it gets even close to the 600 watt mark is for a fraction of a second after power up. I'll bet the PS3 only comes close to 380 watts for about the same amount of time right after powerup.
Source (Score:3, Informative)
http://www.jp.playstation.com/support/qa-591.html [playstation.com]
Re: (Score:2)
What it seems to imply is that the maximum load it would draw is 380W.
Gee
Re: (Score:2)
MOD PARENT UP (Score:2)
Re: (Score:2)
The real question is, of course, are any games going to actually make use of the eight cores?
Well, first, the answer is NO. Why? Because Sony is officially declaring one SPE unusable to increase chip yield. Which means that even two years from now when there are no yield problems, games will still be limited to seven cores. Except even that's not right, because Sony is reserving another core (or two? I can't remember) for use by their libraries, etc.
So a game can't use more than six cores for its own
Re: (Score:2)
That just means that when the game gets backgrounded (like when you push the system button on the PSP) the game will have to give up two cores while the menu is active. Unfortunately, people like you keep mis-reporting this functionality after hearing about it third-hand.
Re: (Score:2)
Uh, no, actually I heard it first-hand at the Austin game developer's conference back in September. One SPE will be unusable for yield purposes, another will be reserved for the OS, and that's Sony's official word to developers.
Unfortunately, fanbois like you keep denying this lack of functionality no matter how many times you are told about it.
Re: (Score:2)
If you want to argue about this with a fanboy, you are in the wrong thread. If you want to be a racist you're going to have to talk to somebody else. Even if you're right, there are still 6 cores left for games. You would have to be an (Xbox|Wii) fanboy to consider that a 'lack of functionality'.
Re: (Score:2)
Well to be fair, 8 processing elements are typically not available to video game developers, so you have a bit
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Informative)
This whole story stinks and it is just being bounced around by people less interested in whether it is true or not than putting the PS3 down.
Re: (Score:2)
You clearly are not familiar with the workings of vertex or pixel shaders.
The Xbox 360 performs automatic shader workload balancing. Let's say I have 10,000 pixels to shade. Each pixel is shaded completely independently of the other 9,999 pixels, so I could very easily shade 5,000 on each core of a chip.
Additionally, Direct3D queues draw calls. The "Present()" function typically returns instantly as the game logic is permitted to get a frame or two ahead of
Re: (Score:2)
You mean the parts that people generally offload to another processor are easily parallelized? The Cell doesn't do the graphics rendering in the PS3, it doesn't do the pixel-shaders, it doesn't do anything like that. Remember the Slashdot story ages ago about how the PS3 had abysmally low read speeds from graphics memory? The Cell won't be used for anything graphics-related beyond feeding data to the GPU.
Of course, having a separate GPU from the CPU isn't exactly new.
Trying to spread game logic among
More power != Less efficient (Score:3, Insightful)
No (Score:4, Informative)
You've been trolled - most likely by someone paid by Microsoft
Re: (Score:2)
Stating the obvious. (Score:2)
Supercomputing@Home (Score:3, Insightful)
Only if you consider a console with more processing power than older Cray Supercomputers for a fraction of the energy cost to be "sloppy". Let me put that in context to explain what I mean.
One of the things that Digital pioneered with its Alpha chips was the matter of clocking CPUs at incredibly high speeds (for the time); easily breaching 200MHz. With the fabrication technology of the time, however, such high speeds were found to have major issues with problems like metastability [wikipedia.org]. By upping the amount of power applied to the chip, they found that they could force the logic to switch faster and thus reduce these issues. This research was the basis for modern chip design. The more power you apply, the faster you can clock the CPU. (With various caveats freely sprinkled in.)
Now put yourself in Sony's place. You decide you want to build the most powerful game console EVER; with cost being no barrier. So you go and pick up this super-computer-on-a-chip technology from IBM. (The Cell) You then ask NVidia for their latest GPU technology to combine with that processor. You then take a look at the system, to decide how high you should clock it. You decide to max out the GPU for MAXIMUM PERFORMANCE. (Who wouldn't?) So you're now chewing upwards of 100 watts just on your GPU. Then you decide that a power friendly 1.5GHz isn't going to cut it in this competitive race. (Especially if you've got spies over at Microsoft, who are reporting back 3GHz chips.) So you look at it, and decide to ramp up for MAXIMUM CPU PERFORMANCE. Now you've got 3GHz, but your CPU is also using 100+ watts.
So it's really no surprise that the PS3 is consuming so much power. The real issue is whether the super-computer-on-a-chip idea was really the way to go. Some people seem to think so. Some even believe that it's a requirement to hit 1080p resolutions. Only time will prove them out, though. In the meantime, Sony is banking on the consumers being taken with an uber-powerful system. If they'll purchase Aibos and HDTVs, they'll purchase a $600 PS3, right?
Separate Note: Of course, Sony keeps shooting themselves in the foot. This strategy *might* have worked reasonably well if confidence in Sony was still high. But with people boycotting them over everything from rootkits to Lik-Sang, PLUS Sony's extremely poor E3 presentations, PLUS their general arrogance when handling the public, I seriously doubt that they're going far this generation.
Wrong question... (Score:2)
Only if you consider a console with more processing power than older Cray Supercomputers for a fraction of the energy cost to be "sloppy". Let me put that in context to explain what I mean.
No, the question is if this is an efficient use of power today. Comparing the power usage of a PS3 to a Cray is totally irrelevant. If I designed a solar panel that was 5% efficient, would I say "No, it's SUPER DUPER efficient compared to the solar panels of 30 years ago"? No, I'd compare its efficiency with today's sola
Re: (Score:2)
Re: (Score:2)
In context, we are. The XBox 360 is no power slouch itself (~160 watts). Now if you compare 3 cores + GPU to 8 cores + GPU, it becomes clear that the PS3 is simply going to draw more power. A lot more. Proportionally, it should be drawing ~60% more power. The PS3's PSU capacity is proportionally larger (~57% more power) than the XBox 360's. We don't know how much of a safety margin was built into these machines, so the actual difference in power usag
Re: (Score:2)
The cores are not similar enough to be compared that way. It's like saying that since a diesel locomotive has 12 cylinders and my Honda has 4, the locomotive ought to use 3 times as much fuel as my Honda.
Re: (Score:2)
The "majority" doesn't have $600+ to shell out at launch. (If some of the reports are to be believed, all the preorders are going to scalpers, so early adopters will be paying a LOT more.) The "majority" will be saving up and waiting for a price drop. The "early adopters" (i.e. the same ones who shell out $2000 for an HDTV, and use Lik-Sang to import expensive games from acr
Why is the PS3 so power-hungry? (Score:2)
I'm sorry, I know better but I couldn't resist
Yet another win for Nintendo (Score:2)
- the Wii is 2-3 times more powerful than a Gamecube while at the same time requires half or a third of the power
- the Nintendo DS can play for hours and hours on a single charge, not really so with the PSP
More expensive = more heat, more power required, less battery life (if applicable)?
What good is HD graphics if you have to keep the same quality per pixel as the
Pure FUD (Score:5, Interesting)
The PS3 has a 380 Watt PSU. There is no info here about what the actual power draw is likely to be at most times.
For comparison, my gaming PC has a 600 Watt PSU. IIRC, with my hardware, it should be peaking at about 250 Watts while running games.
Re: (Score:3, Informative)
Of course, with 8 cores, chances are it will not spend much time at maximum power usage...
Re: (Score:2)
If you're only expecting to pull 250W while doing strenuous computation like playing games, it's a waste of money to have a 600W PSU. You could do exactly as well with a comparable PSU rated for 300W.
You built your gaming rig; Sony is building millions of them. If they're using such heavy-duty power supplies, it's either because:
1. they're intentionally buying the most expensive compo
Wow 380 Watts! (Score:2)
I don't think customers at higher latitudes will complain, not in Canada and not in winter. But not all sockets and power bars will be able to handle that.
They should add a metal plate on top for metallic coffee mugs. If they use a water cooled system I could reroute the water to my water blanket and go camping with the PS3:
#include <unistd.h>

int main(void)
{
    for (int x = 0; x < 3; x++)  /* 2^3 = 8 spinning processes, one per core */
        fork();
    for (;;)
        ;                        /* busy-wait forever to generate heat */
}
Utter bollocks (Score:5, Informative)
How were these figures calculated? By taking the 127 from the 100-127V range supported by the PSU and multiplying by 3A to get 381. 3 amps is what the FCC label says. But since the PS3 runs in Japan at 100V, the PS3 must demand at most 300 watts there. At most. But that's just the PSU rating. It doesn't mean the device actually draws that power.
By way of illustration, the XBox 360 PSU is rated at 5 amps. 5 times 127V = 635 watts. So why no stories about the XBox demanding 635 watts? Why no stories that say the PS3 actually uses half the power of the 360? Because the XBox 360 consumes 160 watts in normal usage. It is entirely misleading to look at what the PSU can deliver to determine what the device actually uses.
The same will be true of the PS3. Unless some reputable site such as Ars Technica, Tom's Hardware, etc. sticks a probe in the thing and states what power it actually draws, this story should be treated as bollocks. Bollocks swallowed whole by Zonk as usual.
Power up with 'zonked' tag. (Score:2)
An idiot tells an idiot who tells an idiot... (Score:3, Informative)
And this is an irrelevant fact, but I'd be curious to see the power consumption levels of a non-core Xbox 360 powering a HDD, and also requiring another outlet for its HD-DVD add-on. I'd be surprised if we didn't see that 200W for a core system creeping up into the 300W+ range as well.
At any rate, this story seems like a non-story to me.
Why a tax? (Score:3, Insightful)
Re:Green tax (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
Sounds right (Score:3, Funny)
Re: (Score:2)
What if these devices had an internal battery for maintaining the stand-by circuitry - would that make any difference to power usage? (Make it semi-internal - so that it could be removed/replaced).
Re: (Score:2)
You can't ignore the losses of storing energy in a battery, e.g. the OP's "not 100% efficient" comment.
Re: (Score:2, Interesting)
If you believe that, then there should be a tax against the manufacturers of the appliances, not the consumers. After all, it is not the fault of the consumer that the manufacturer decided to create their devices so they are never really off. My DVD player has three levels of power: On, Standby, UNPLUG. There is no
Re: (Score:2, Insightful)
Re: (Score:2)
But those people can simply get a third job in order to pay for the electricity bill that the PS3 that they bought with money from their second job consumes!
Re: (Score:3, Interesting)
I once monitored my power usage - leaving just a single thing on while at work and killing all other circuits (even my refrigerator). I found my biggest power drains were the fridge, air conditioner, incandescent light bulbs, tv, and my stereo.
Re: (Score:2)
I'll give you your answer:
Don't have kids! Or have at most two. If every person on the planet only spawned 2 children, the world population would flatline. I work with people raising 3, 4, or even 5 kids, and it makes me sick to think about how bad that is for the environment. If I have children, I plan on having only 1.
I know that this is Slashdot, and most ppl here
Re: (Score:2)
When you have countries where the population expands explosively, without any effort put into infrastructure, the only reason they're able to continue to do so is the influx of medicine, food, and exter
Re: (Score:2)
Re: (Score:2)
We only have a week of winter, on average, so I don't worry about the heating bills.
As for a $50 bill, you must get cheaper power up there. After the price increases, $50 is nearly unachievable (and requires significant sacrifice - my refrige
Re: (Score:2)
Re: (Score:2)
If you are ever in a situation where you feel this is needed, may I recommend digging a hole? The land is your friend. Sorry, but this is bullshit. "Air conditioning" is always for comfort; you may use it for survival, but you are doing it wrong. Heat waves that kill people in cities are due to how we build our cities, not because they lack a necessity.
Re: (Score:2, Interesting)
> What part of this process are you not understanding?
I think you're missing the bit about sustainable energy consumption, and the need to encourage people to use less energy. It's not that people aren't paying - just that they're not paying enough to cover the damage that's being done, and to make energy consumption sustainable. There'd be nothing wrong with using the current amount of power if it came from wind, solar energy etc, but sadly it's not, becaus
Re: (Score:2)
People are too dumb to know they are buying a power-wasting product. So taxes will fix it!
People are too dumb to buy CFL bulbs instead of old style bulbs. Taxes are the solution! (see a previous slashdot article where someone suggested taxes regarding CFLs. Similar situation.)
In fact according to slashdot comments, taxes can fix any economic problem!
Government: knowing what's better for you since 1933.
Start with you! (Score:2)
This coming from someone posting to
informed choice (Score:2)
An easier, less controversial solution would be to require labelling for all electronic devices that would tell potential buyers how much energy the devices use. Something like "this device uses a maximum of 200 watts when in use, 30 watts when idle, and 10 watts in standby mode". We have labels like these on water heaters and the like; why not smaller devices as well?
The real problem here is that if buyers can't distinguish a good product from a bad product, bad products will dominate the market. See
Re: (Score:2)
Re: (Score:2)
1) they would melt the discs
2) Sony wouldn't be allowed to ship them in consumer electronics (especially something targeted at kids)
Re: (Score:2)
Care to provide any background evidence to suggest that it wouldn't be a possibility?
Re: (Score:3, Funny)
Sorry... I'll get me coat.
Re: (Score:3, Informative)
A BluRay Rewritable drive uses at most 30 watts. That's peak, and probably only used for a few seconds when spinning a CD-ROM up to 52x. [nevada.com.mk]
Re: (Score:2)
Very good. So why not just say that initially instead of pulling the karma-whore card? I wouldn't mind, but all this hostility surrounding Sony stories is fucking obnoxious.
Re: (Score:2)
Re: (Score:2)
Thing is, though, he was asking a question (admittedly his biases were showing), not making a statement.
Re: (Score:2)
wait.. 50GB of data
No way in the slightest is that the case. (Score:3, Informative)
- The GPU
- The Cell processor
- The highly clocked Rambus XDR DRAM
IO devices like the hard drive and the Blu-Ray drive are peanuts compared to those.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
We'll find you. You won't know how, but we'll find you.
The Libyans
Re: (Score:2)
It's still got an Intel Core Duo and a fairly nippy ATi graphics card - it's also got a built-in screen. Okay, it's a 15in MacBook Pro, but still...
Re:Wait a minute...or more... (Score:2)
What do you mean gaming systems? There are graphics cards that draw 75W from the slot, and have two additional 75W power connectors on them. Do the math!
Re: (Score:2)
Ok.
75W + 2 * 75W = 225W
Leaving 175W for the rest of the gaming system to use, if we're talking a 400W budget.
(Not that I don't think 225W is a metric assload of draw for a video card, but you seem to be implying that just the video card eats up 400W...but your numbers clearly don't bear that out)