PS3 Cell Processor 'Broken'?

Posted by Zonk
from the it's-drinking dept.
D-Fly writes "Charlie Demerjian at the Inquirer got a look at some insider specs on the PS3, and says Sony screwed up big time with the Cell processor: the memory read speed on the current devkits is something like three orders of magnitude slower than the write speed, and is unlikely to improve much before the ship date. The slide from Sony pictured in the article is priceless: 'Local Memory Read Speed ~16Mbps, No this isn't a Typo.' Demerjian says that when the PS3 comes out, a full year after the Xbox 360, it's still going to be inferior: 'Someone screwed up so badly it looks like it will relegate the console to second place behind the 360.'" This is the Inquirer, so take it with a grain of salt. Just the same, it doesn't sound too good for Sony or IBM.
  • Go Sony, go! (Score:3, Interesting)

    by timecop (16217) * on Monday June 05, 2006 @07:28AM (#15471410) Homepage
    What is this 'local memory'? On-die cache? How the fuck can you screw that up to make it 16Mbit?

    PS3 is way overkill for a console anyway. What are they thinking? Not everyone needs a console with 1GB of memory, huge HDD, which also doubles as a DVD Player/Entertainment center/Memory stick player (you betcha sony is already adding THAT feature), oh and can also play some games.

    I'm all for Nintendo's new console. It's cheap, it will have amazing games, AND they're not trying to make it the center of your digital home.
    • Re:Go Sony, go! (Score:3, Interesting)

      by Ford Prefect (8777)
      Is it memory local to the graphics subsystem, or something?

      If so, then presumably getting the graphics chip to copy stuff out into main memory for the central processor to read would be the sensible workaround. But still, 16MB/s seems more like a throwback to the age of my old Atari ST. I think that could manage a few megabytes a second...
      • Re:Go Sony, go! (Score:5, Informative)

        by Retric (704075) on Monday June 05, 2006 @08:29AM (#15471757)
        "The "Local Memory" is the RSX graphics memory. The Cell shouldn't need to read this. The PS3 would still work even if the Cell couldn't read this memory at all. This memory is where you store textures and other graphics data."

        IMO it's reasonable to have asynchronous communication with the graphics subsystem. The only stupid thing going on is calling the graphics card's memory "Local Memory". It suggests that the Xbox got it right by having one big chunk of memory that is read by both the CPU and GPU, even if most developers will make the same basic split anyway.
        • Re:Go Sony, go! (Score:5, Informative)

          by Ford Prefect (8777) on Monday June 05, 2006 @08:54AM (#15471920) Homepage
          "The "Local Memory" is the RSX graphics memory. The Cell shouldn't need to read this. The PS3 would still work even if the Cell couldn't read this memory at all. This memory is where you store textures and other graphics data.

          Presumably in the (unlikely?) event you did need the output from the RSX graphics chip for manipulation by the Cell processor gubbins, you could get it to render to main memory, let the processor do the appropriate data-diddling, then have the RSX read it back again?

          The 'local memory' is presumably the RSX's private play area, and thus the RSX gets maximum-stupendous-speed priority, and the Cell gets occasional access at weekends. Which is a bonus, and not even necessary...
    • by the_humeister (922869) on Monday June 05, 2006 @07:41AM (#15471469)
      Exactly. I'd rather have a console that has a 3-core CPU, 512 MiB memory, a 20 GiB hard drive, and ethernet ports. Oh wait...
    • Re:Go Sony, go! (Score:2, Interesting)

      by ClamIAm (926466)
      What is this 'local memory'? On-die cache? How the fuck can you screw that up to make it 16Mbit?

      I'm wondering the same thing. I simply cannot believe that the cache in this processor would be this slow (at least for read ops). I'm betting on this having something to do with the Cell architecture that got lost in translation.

      • by Mr Z (6791) on Monday June 05, 2006 @11:08AM (#15473001) Homepage Journal

        Either that, or a broken benchmark. Each of the Cell's Synergistic Processing Elements (SPEs) [ibm.com] shares its instruction fetch port with its data memory port. The SPE can buffer up to 80 instructions at a time (2.5 fetch words), plus an additional 32 from a branch target. Fetch will stall if the memory system gets saturated with loads and stores. Properly written memory-intensive code includes explicit fetches to keep these buffers full. Incorrectly written code will cause problems. Still, that doesn't explain a three-orders-of-magnitude drop.

        If you look at the slides on the page I linked to above, you'll see the SPEs are not connected into the global address space. They connect to a private single ported memory, and to each other through two unidirectional rings. (The ring structure is not apparent from that diagram, but trust me, it's there.) These rings then connect to a DMA engine.

        If you wade through this paper, [ibm.com] you'll see that the Cell compiler implements a software cache. (The same paper also explains the instruction fetch mechanism mentioned above, BTW.) That is, it emulates a cache in software, using the DMA to actually move memory around. Depending on the nature of the benchmark and how it was written, it could be that the read benchmark spends all its time allocating stuff into this cache and waiting for it to arrive. Writes would be faster because the cache can "write behind" without having to wait for the allocation to happen, if the compiler is smart enough to know that the previous data will be entirely overwritten. So, if the benchmark goofed, then the results are meaningless.
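The read/write asymmetry of such a software cache can be sketched in plain C. This is only a toy illustration of the idea, not the actual IBM compiler's cache (which moves data with DMA engines and tag-status waits); all names here are invented, and memcpy stands in for a DMA transfer:

```c
#include <string.h>

#define LINE_SIZE 128          /* one "cache line" moved per transfer */

/* Toy software cache line: a read must first fetch the line in
 * (slow, synchronous), while a full-line write can skip the fill
 * entirely and just mark the line for write-behind (fast). */
struct sw_cache_line {
    unsigned long tag;
    int valid, dirty;
    unsigned char data[LINE_SIZE];
};

static void dma_fetch(struct sw_cache_line *l, const unsigned char *mem,
                      unsigned long addr) {
    memcpy(l->data, mem + addr, LINE_SIZE);  /* stands in for a DMA get */
    l->tag = addr; l->valid = 1; l->dirty = 0;
}

/* Read: pays for the line fill if the line isn't resident. */
unsigned char cache_read(struct sw_cache_line *l, const unsigned char *mem,
                         unsigned long addr) {
    unsigned long base = addr & ~(unsigned long)(LINE_SIZE - 1);
    if (!l->valid || l->tag != base)
        dma_fetch(l, mem, base);             /* allocate + wait */
    return l->data[addr - base];
}

/* Full-line write: no fill needed, the old data is fully overwritten;
 * the dirty line drains to main memory later ("write behind"). */
void cache_write_line(struct sw_cache_line *l, unsigned long addr,
                      const unsigned char *src) {
    l->tag = addr; l->valid = 1; l->dirty = 1;
    memcpy(l->data, src, LINE_SIZE);
}
```

In this model every cold read stalls for a line fill while writes complete immediately, which is one way a naive read benchmark could come out orders of magnitude slower than a write benchmark.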

        Fact of the matter is that the SPEs are capable of reading 128 bits a cycle each (128 bytes / cycle across the 8 SPEs). Other benchmarks, such as the article recently posted to Slashdot [slashdot.org] about using Cell for scientific computation [berkeley.edu] confirm that this thing hauls--and these are bandwidth-intensive tasks. The quoted paper did run some numbers on real silicon and showed numbers similar to their simulation results.

        With all this in mind, I find it hard to believe that Cell is broken.

        --Joe
    • by Bromskloss (750445) <auxiliary DOT ad ... privacy AT gmail> on Monday June 05, 2006 @07:47AM (#15471509)
      AND they're not trying to make it the center of your digital home.
      My other home is a digital home.
    • Re:Go Sony, go! (Score:5, Informative)

      by adubey (82183) on Monday June 05, 2006 @07:54AM (#15471551)
      On the cell processor [wikipedia.org], local memory is similar to a cache, but is not "transparently" managed by the CPU. Rather, the software must explicitly say what it wants to have in the local memory.
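That explicit management pattern can be sketched in portable C. A hedged illustration only: on real Cell hardware the copies would be mfc_get()/mfc_put() DMA intrinsics with tag-status waits, not memcpy(), and the local buffer would live in the SPU's 256KB local store:

```c
#include <string.h>

#define CHUNK 4096  /* bytes staged into "local memory" per transfer */

/* Stand-in for a DMA get/put; a plain copy in this sketch. */
static void dma_get(void *local, const void *main_mem, size_t n) {
    memcpy(local, main_mem, n);
}

static void dma_put(void *main_mem, const void *local, size_t n) {
    memcpy(main_mem, local, n);
}

/* Process a big main-memory buffer chunk by chunk: the software, not
 * the hardware, decides what lives in local memory and when. */
void scale_buffer(float *main_buf, size_t count, float factor) {
    float local[CHUNK / sizeof(float)];   /* the explicit "local store" */
    for (size_t i = 0; i < count; i += CHUNK / sizeof(float)) {
        size_t n = count - i;
        if (n > CHUNK / sizeof(float)) n = CHUNK / sizeof(float);
        dma_get(local, main_buf + i, n * sizeof(float));   /* stage in  */
        for (size_t j = 0; j < n; j++) local[j] *= factor; /* compute   */
        dma_put(main_buf + i, local, n * sizeof(float));   /* stage out */
    }
}
```

The point is the contrast with a transparent cache: nothing is cached unless the program explicitly stages it in and out.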

      • Re:Go Sony, go! (Score:3, Informative)

        by robosmurf (33876) *
        Informative, yes (and with a good link), but not relevant to this discussion.

        In the slide attached to the article, the "Local Memory" is the memory local to the RSX graphics system, NOT the Cell local memory.
        • by default luser (529332) on Monday June 05, 2006 @10:07AM (#15472392) Journal
          Take a closer look at the linked image. The two top columns are CELL. Not RSX, CELL.

          And the theoretical bandwidth numbers listed for CELL to main memory are those of the direct XDR interface. You'll note that the RSX has much lower numbers because it accesses main memory through a bridge bus (much like a graphics card on PCIe).

          On the Cell, there is only one thing local memory can mean, and that is the local memory of each SPE.

          NOTE: this can be a serious issue, because each SPE MUST read instructions and write results to the local memory. It is up to the main processor to load instructions into this memory from main memory, and to copy results from this local memory to main.
    • Re:Go Sony, go! (Score:5, Interesting)

      by Monkelectric (546685) <slashdot AT monkelectric DOT com> on Monday June 05, 2006 @08:20AM (#15471694)
      I'd just like to remind everyone that there was the *EXACT* same type of rumors about the PS2 when it launched. People were saying it didn't have NEARLY enough texture ram and "experts" were pouring over the specs and shaking their heads ...

      And it turned out to be one fo the most successful consoles ever.

      • Re:Go Sony, go! (Score:4, Insightful)

        by SQLz (564901) on Monday June 05, 2006 @08:22AM (#15471705) Homepage Journal
        And it turned out to be one fo the most successful consoles ever.

        That tends to happen when you're basically the ONLY console. Not discounting Nintendo, but it targeted a very different group of people than the PS3.

      • Re:Go Sony, go! (Score:3, Insightful)

        by mikeisme77 (938209)
        But that was true about the PS2... Just because it was successful, doesn't mean it wasn't underpowered for its "powerhouse" status. I've been playing God of War recently and while it's a fun game and all, its graphics are clearly not as smooth as the graphics on XBox and GC games. If Sony expects me to pay $500-600 they better have either: a) an undeniably awesome first party (as 3rd party exclusives will quickly no longer be exclusive if a system doesn't sell--even Square will jump ship) or b) a demonstrab
      • Re:Go Sony, go! (Score:5, Insightful)

        by /ASCII (86998) on Monday June 05, 2006 @09:15AM (#15472036) Homepage
        Yes, but on the other hand, the PS2 games don't look anywhere near as good as Sony claimed they would. Remember the claims that in PS2 games the individual hairs on a person's head would be modelled? Both the GC and Xbox games generally have better graphics than PS2 games.
      • Re:Go Sony, go! (Score:3, Informative)

        And they were right. PS2 had the muddiest textures among the three consoles.
      • Re:Go Sony, go! (Score:4, Insightful)

        by Blakey Rat (99501) on Monday June 05, 2006 @10:22AM (#15472524)
        First of all, it's "poring over."

        Secondly, the PS2's graphical performance isn't among the reasons it was successful. It was successful because:

        1) When it came out, it had (basically) no competition. The Nintendo 64 was way past its prime, and the Dreamcast was pretty much already dead by that point. PS2's coming out was a death-blow to Dreamcast, and everyone knew it.

        2) Because of backwards-compatibility, it had a huge selection of games at release.

        The PS2's graphics performance *is* disappointing. It barely beats out the Dreamcast, and it can't hold a candle to the Gamecube or Xbox. Has nothing to do with success.
    • Re:Go Sony, go! (Score:3, Informative)

      by Pius II. (525191)
      Sixty comments, and not one of them actually sets this whole thing right.
      Yes, the local memory can be understood as some kind of cache. It's local to the SPUs. Every SPU uses its own local memory, and can meddle around in it as it likes. The local memory is cache for the SPU, not for the CPU.
      There is no reason for the main processor to ever read from an SPU's memory. If you just want to send it more data, use a DMA. If you want to review the results of a computation, have the SPU DMA them to main m
      • Re:Go Sony, go! (Score:5, Informative)

        by robosmurf (33876) * on Monday June 05, 2006 @08:56AM (#15471926)
        Sadly, you are also wrong.

        In the slide, the "Local Memory" refers to the RSX local memory, not the SPU local memory. The article says that the next slide is Sony telling devs to use the RSX to do the transfer instead, which only makes sense if it is talking about the RSX memory.

        Your conclusion is right though, as this also is memory that the Cell doesn't need to read from.
    • Re:Go Sony, go! (Score:2, Informative)

      by operagost (62405)
      Well, it's 16 MEGABYTES per second-- which is still ridiculous but not as ridiculous. No offense to you-- it's yet another obvious typo in the article summary (using a small "b" instead of a large "B").
    • You forgot the UMD player. Except this UMD will not be compatible with the PSP's UMDs, and will require you to install a rootkit on your desktop to get online and update your subscription to ATRAC Unlimited in the Sony music store.
  • PS2 Vs PS3 (Score:5, Informative)

    by eldavojohn (898314) * <eldavojohn@nOsPam.gmail.com> on Monday June 05, 2006 @07:29AM (#15471412) Journal
    Microprocessor Online has an interesting analysis [ibm.com]. Pay attention to page 8, where the PS2 "Emotion Engine" processor is compared to the PS3 Cell processor. This is an analyst report for the microprocessor industry.

    If you really want to dig into the details of the Cell processor, check out Sony's resources [scei.co.jp]. You have to agree to a bunch of things to get to the pdfs but there's a lot of information [scei.co.jp] in them. Another place you can find information is IBM's resource site [ibm.com] which contains a lot of stuff including the programming handbook.
  • dev kits (Score:4, Insightful)

    by whereisaxlrose (898923) on Monday June 05, 2006 @07:30AM (#15471416) Homepage
    there is no point in judgin a dev kit. x360 kits were shitty too.
    • This isn't "judgin (sic) a dev kit". This is reading the specs for a processor that will be used in the console. With a projected street date of a little over five months away there's not enough time for Sony to keep taping out new prototype Cells. They have to get near-to-final dev kits in developers' hands asap.
    • Not even a point in judging Sony when they tell the devs not to read from local memory, and don't mention upcoming steppings to solve this? That seems to tell me that this will be a real issue. If not, it seems to me they'd have been very eager to say it was being worked on.
    • Re:dev kits (Score:4, Interesting)

      by masklinn (823351) <slashdot.org @ m a s k l i nn.net> on Monday June 05, 2006 @08:26AM (#15471736)
      Actually, no. The reports I've seen by devs on dev kits placed the Xbox 360 kits as the best, the Wii dev kits as good, with bonus points for being extremely close to the GC's devkits (which means that adaptation is extremely fast for teams which had previously worked on GC games), and the PS3 devkits as utterly and completely shitty, not helped by the inherently complex architecture of the PS3 (read: the PS3 is already complex enough that devs wouldn't want a devkit making it even harder to dev on it).
  • by Southpaw018 (793465) * on Monday June 05, 2006 @07:31AM (#15471423) Journal
    I'm aware that, in the past, The Inquirer has published questionable articles. However, they've certainly got a revealing picture to back it up here...unless they're outright lying and they photoshopped something, why should we take this story with a grain of salt?
    • by Anonymous Coward
      ... because they might be lying outright and might have photoshopped something!
      • by BenBenBen (249969) on Monday June 05, 2006 @09:14AM (#15472030)
        The Inq does seem to have a somewhat poor reputation on this site and elsewhere; any chance anyone could tell me why? Are there documented cases of the Inq lying, or being deceitful? Of overly shoddy journalism?

        The Register doesn't have this rep, yet they share common DNA and I've seen at least one case [boingboing.net] where they have actually had their integrity called into question.

        As for TFA, we all heard many moons ago that the PS3 was a bitch to program for (the comparison I've seen most often on this very site is to the Saturn, which iirc had 2 cpus), and Sony aren't exactly filling the marketplace with confidence on this one. If the slow speed of this "local memory" to Cell access is irrelevant to any conceivable operation, as most people here seem to be saying, then why is it even mentioned on this slideshow?

        Seems to me there's a good mix of Shooting the Messenger, Ignoring Inconvenient Facts from the TFA and maybe even just a hint of Fanboyism here.

        • The Inq does seem to have a somewhat poor reputation on this site and elsewhere; any chance anyone could tell me why? Are there documented cases of the Inq lying, or being deceitful?

          I think people confuse the Inq with the similar site The Register -- they're both British, both have similar looks, and similar writing styles. Except The Register prints all sorts of garbage, while the Inquirer tends to be right on the money with its rumors.

          Most of their info is not especially interesting chip production deta
        • by MobileTatsu-NJG (946591) on Monday June 05, 2006 @10:33AM (#15472628)
          "The Inq does seem to have a somewhat poor reputation on this site and elsewhere; any chance anyone could tell me why? Are there documented cases of the Inq lying, or being deceitful? Of overly shoddy journalism?"

          I can share with you why I don't go to their site anymore. Check out this page:

          http://www.theinquirer.net/?article=11159 [theinquirer.net]

          This is back in 2003, not long after the Blaster worm hit. The Inquirer requested people send in photos of Windows not working in places such as airports. As a result, they ran this photo with a little story like this:

          WE'RE GRATEFUL to reader Ralph G, who snapped the shot below at Calgary (Alberta) International Airport, and shows that using Internet Explorer on big arrival and departure screens sometimes has its perils.


          My beef with this? It's quite clear from this image that IE is reporting that it cannot find the page. This isn't an IE problem. This is a problem with either the network connection on that computer or the server feeding the page. In other words, neither Mozilla, Netscape, nor Opera would have rectified this difficulty. I sent them an email about it, but it went unanswered. (That wouldn't have surprised me, except that they had responded rather quickly to another enquiry I made that didn't point out their journalistic silliness...)

          I don't know if this is a problem most people would care about. The way I understood it, they were trying to give Microsoft a hard time over serious quality issues of Microsoft's software. That, in and of itself, I don't have a problem with. But this little story basically told me that they weren't serious about being correct about the news they were reporting as long as it fit their agenda. It was then that I stopped bothering to visit their site.

          In the interests of being fair, though, I should point out that this story is three years old, and a lot can happen in that time. It is not my intention to convince you that they are currently behaving this way. Rather I'm just answering your question about their negative rep.
        • by gabebear (251933) on Monday June 05, 2006 @10:46AM (#15472766) Homepage Journal
          They have no credibility because of articles exactly like this.

          They latch on to a fact and twist it. The Cell reads from the graphics card's memory at glacial speeds, so they run the headline "PS3 hardware slow and broken" and fail to point out the fact that you would almost never want to do this in a game.

          A respectable article would have pointed out that this doesn't have any impact on games, but will affect applications. The 256MB of RAM connected to the video card is really only good for vertex data and textures, so you are only left with 256MB to run the executables in. The practical implication of this information is that Linux will only be able to use 256MB of RAM. The RSX (graphics card) can render out of its own local memory or main memory (almost as fast as local mem); anything that needs to be modified by the Cell must stay in main memory because of this bandwidth issue.

          Luckily, games contain a lot of static models and static textures that will easily fill up the 256MB of local mem on the RSX; stuff that the Cell would never read from....
    • by datafr0g (831498) * <datafrog&gmail,com> on Monday June 05, 2006 @07:45AM (#15471503) Homepage
      That picture could be genuine but could also have been an unprotected powerpoint slide show that anyone could have edited - that's the way I would have forged it if I was so inclined and had the chance.

      By the way, I'm not discounting that it could be real - it's got me curious enough to look on the web for the last 10 mins for some documentation to back up the claims in the story.. I couldn't find anything though.

      Anyone got any real documentation or anything to back up the claim?
    • unless they're outright lying and they photoshopped something, why should we take this story with a grain of salt?

      For the same reason Pons and Fleischmann shouldn't have popped champagne corks over cold fusion. A single source is often wrong.

      I'll wait for the equivalent of scientific consensus.

      TW
    • by robosmurf (33876) * on Monday June 05, 2006 @08:06AM (#15471611)
      Because the picture isn't the thing that matters. It's been misinterpreted.

      The picture says that the read speed for the Cell from "Local Memory" is 16MB a second. Assuming it is true (I've got no reason to doubt it), it still doesn't matter.

      The "Local Memory" is the RSX graphics memory. The Cell shouldn't need to read this. The PS3 would still work even if the Cell couldn't read this memory at all. This memory is where you store textures and other graphics data.
      • by gabebear (251933) on Monday June 05, 2006 @08:30AM (#15471767) Homepage Journal
        Ah, I was trying to come up with some way that the picture could make any sense.

        The RSX can read the Cell's RAM at ridiculous speeds, which is all that matters. The RSX can render out of main memory, so you shouldn't ever be using the Cell to read from the RSX's RAM at all. The Cell will probably be manipulating vector data for the RSX, but 256MB for all executable code and vector data is still more than enough. The 256MB attached to the RSX would have been used primarily for textures even if the Cell could read from it at reasonable speeds.
    • by the packrat (721656) on Monday June 05, 2006 @08:28AM (#15471748) Homepage

      This isn't the online IT arm of the National Enquirer, you know.

      The Inq isn't always right, but what they do tend to have is a lot of news-breaking stuff that they (well, Mike) are willing to publish regardless of the consequences when the corporate heads find out there's a leak. That's why Mike got eased out of The Register, when it went more corporate, to form the Inq in the first place.

      Those who have been following it for a while will remember all the appearances of leaked memos from Compaq (ex-DEC) insiders who were willing to leak happily to someone of the old school who was interested in seeing how the whole fiasco was turning out. Compaq/HP even started internal witch hunts looking for the leakers.

      Regardless, the only real problem people might have with the Inq is that they can't distinguish between an opinion piece and direct reporting, or can't accept that while the information as presented might be correct, that doesn't ensure that the interpretive parts also follow.

  • by antifoidulus (807088) on Monday June 05, 2006 @07:36AM (#15471447) Homepage Journal
    That Ken Kutaragi let his loser long-lost baby brother design the PS3 without looking at the thing or its price tag until it was unveiled?
  • So? (Score:2, Funny)

    by Tinfoil (109794) *
    Boy, it's a good thing it's not meant to be a gaming only system!

    Yes, that was sarcasm.
  • by sckeener (137243) on Monday June 05, 2006 @07:39AM (#15471457)
    So what is the difference between the local memory 16MB/s and the main memory 25GB/s 'reading'?

    I assume the local memory is not going to be used much for 'reading' and only main memory is going to be used.
  • Hehe, oops. (Score:3, Insightful)

    by chilledinsanity (906404) on Monday June 05, 2006 @07:43AM (#15471483)
    Ah well, it's nothing a complete recall and price increase can't fix...
  • by August_zero (654282) on Monday June 05, 2006 @07:47AM (#15471510)
    This reminds me, I am certain to be crucified for not remembering this bit of trivia, but the PS3 is looking more and more like that car that Homer designed for his brother....

    What was that called again?
  • Why... (Score:2, Insightful)

    by dosle (794546)
    I can't imagine why Sony would add the text "this is not a typo" underneath the below-average local read speed unless they are planning to release the final public version of the PS3 with much higher read speeds. If you can program a game to run great with the low read barrier, then wouldn't you expect it to run even more efficiently with the gates wide open in a final/public PS3 release? My .02c
    • An MSoft engineer cleverly infiltrated Sony.
    • They got a great deal on 1103 RAM chips from Intel (Their first product).
    • Maybe "Local" memory means "Local to the 16550C UART"?
  • by MartinJW (961693) on Monday June 05, 2006 @07:56AM (#15471561)
    I thought we had all boycotted Sony anyway! Or are we on another bandwagon this week?
  • by lbbros (900904) on Monday June 05, 2006 @07:57AM (#15471566) Homepage
    The subject says it all. It's getting really tedious. Why not just wait for the release and then comment?
  • DevStation? (Score:5, Funny)

    by mustafap (452510) on Monday June 05, 2006 @07:58AM (#15471569) Homepage
    Noticed the logo on the bottom left of the slide. Maybe it should have read

    DeviStation
  • by robosmurf (33876) * on Monday June 05, 2006 @07:59AM (#15471578)
    The "Local Memory" is the memory attached to the RSX.

    That the read performance for the Cell from this memory is dreadful is no surprise. This is exactly the same architecture that has been traditionally used in PCs. Reading graphics memory from the main processor is usually really really slow.

    This memory is where you store textures and other graphics data. The main processor will usually have little need to read from this memory. If it does, then, as apparently Sony says, you just get the RSX to write to main memory instead.

    This is a non-story. People have dealt with this for PC games for a long time.
    • So, just to get this straight, Inq's comment about the consequences of avoiding reading from local...
      This can lead to contention issues for the main memory bus, and all sorts of nightmarish to debug performance problems. Basically, if this Sony presentation to PS3 devs shown to us is correct, it looks like PS3 will be hobbled in a serious way.
      ... doesn't hold any water either?
      • Correct, this should have pretty much zero performance impact. This is how PC graphics cards work (except of course for some of the "integrated graphics" solutions).

        The Inquirer article assumes that this makes "Local Memory" useless. This isn't the case at all, as you use it to store the graphics data that the Cell doesn't need to read.
  • For goodness sake... (Score:5, Informative)

    by hptux06 (879970) on Monday June 05, 2006 @08:00AM (#15471585)
    Does anyone ever bother reading the *IBM* documents for this? Never mind what Sony have managed to do to the cell processor, if you turn to the IBM CBEA developers handbook (page 75), you will see:

    "Load and store operations (LS), 6 Clock cycles Latency". And that's the time it takes for the instruction to complete, not to be issued to memory.

    (3.2GHz / 6 cycles) * 16 bytes != 16MB/s

    Personally, I'm gonna bet on IBM being right, seeing how they're the ones who made the bloody thing. I don't trust the Inquirer anyway, but if those figures are true, the most likely answer is inefficiencies in their benchmarking programs (such as instruction starvation, a nasty side effect of using SPUs).
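As a sanity check on the arithmetic above, using only the figures quoted from the handbook and assuming, pessimistically, that accesses don't pipeline at all:

```c
/* Lower bound on SPE local-store read bandwidth from the handbook
 * figures: one 16-byte (128-bit) access completing every 6 cycles
 * at a 3.2 GHz clock. Real hardware pipelines accesses, so the
 * true figure is higher still. */
double ls_bandwidth_bytes_per_sec(void) {
    double clock_hz = 3.2e9;  /* Cell clock rate */
    double latency  = 6.0;    /* LS load/store latency, in cycles */
    double bytes    = 16.0;   /* one quadword per access */
    return clock_hz / latency * bytes;  /* ~8.5e9 B/s */
}
```

Even this worst case works out to roughly 8.5 GB/s, more than five hundred times the 16MB/s on the slide, which supports the poster's point that the slide cannot be describing local-store reads.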
    • "Local Memory" refers to the RSX memory. The Cell doesn't have direct access to this, which is why it's so slow at reading. This is also irrelevant as the Cell doesn't NEED to read this memory.

      PC graphics cards have worked this way for years. Reading from graphics memory has always been slow as it isn't optimised for this.
  • by TerenceRSN (938882) on Monday June 05, 2006 @08:06AM (#15471610)
    I've been hearing a lot of chatter about how the PS3 is difficult to program for, developers don't like it, Sony isn't providing quality libraries, blah, blah, blah. These exact same things were said about the PS2 when it first came out six years ago, and it still managed to dominate its generation of console gaming. And it certainly wasn't true that developers avoided the PS2 in favor of XBOX or GameCube. As always, the winners and losers of the console wars will be decided by the buying public in the US, Japan, and Europe.

    I think being too connected to the online debates about this stuff can make you lose sight of what the more average public thinks and bases their purchase decisions on. That's why the only real argument for the PS3's failure so far is the high price, not questions about performance or developer issues.
      It was true of the PS2 and still is... it is a bear to work on. There was another console in history that parallels the PS3: the Saturn. It failed miserably due to development issues.

      I'm no fan of the PS3, but I am also no fanboy of any other system in the game, the Xbox360 is expensive and laborious to work on as well. The Wii is the system I am holding out all hopes of becoming a runaway success and causing a major shakedown of the industry. Before this industry becomes any more Hollywood-like and loses
  • It's a dev kit, first off; second off, it's the Inquirer, which was formed from Register rejects and doesn't have BOFH; and third off, I saw a UC Berkeley benchmark with an emulated Cell that would seem to indicate this is a production problem, not a design problem.

    But seriously, WTF should I care? I really don't care which console wins the virtual pissing match in the "ooooh shiny" department, if I was one of the people that did, the PS3 is already into the realm where $500 video card purchases begin to lo
  • by Anonymous Coward on Monday June 05, 2006 @08:13AM (#15471653)
    The Inquirer article is rubbish and that slide is taken out of context. It seems to imply that the Cell can only read "Cell local memory" (whatever that is) at 16MB/s.

    Memory transfer bandwidth between each SPU and its SPU Local Memory is something more like 25GB/s (gigabyte per second); sustained actual bandwidth between all SPUs is greater than 100GB/s; peak theoretical is greater than 200GB/s (assuming all 8 SPUs present for simplicity).

    If you had access to the full version of the presentation (part of the full Sony PS3 SDK and technotes), you'd realise that that slide is part of a presentation about the RSX (the PS3's GPU). As such, when it refers to "Local Memory", it means RSX's Local Memory (eg graphics memory, video memory, VRAM or whatever you call it in fanboy/ps3/360-is-teh-suck websites). To be understood outside that context, the columns would be better labelled "Main System Memory" and "GPU Local Memory".

    The Inquirer article seems to suggest that this figure of 16MB/s (megabyte per second, by the way, what the fuck is it with journalists swapping bits for bytes? why don't they get their shift/capslock keys fixed?) is some kind of show stopper. No it isn't. It simply means that the Cell processor has 16MB/s bandwidth when reading directly from memory-mapped GPU address space. So what? Unless you're planning on calling memcpy() or some shit to bring your data back then it doesn't really matter.

    On RSX-initiated transfers you have 20GB/s of bandwidth to do the same transfer (from RSX local memory to main system memory). Cell read bandwidth from GPU memory might as well be 0MB/s (i.e. no connection at all) and it wouldn't matter a bit.
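To put the poster's numbers in perspective, here is a quick back-of-the-envelope sketch. The 720p, 32-bit framebuffer size is my assumption purely for illustration; the bandwidth figures are the ones quoted above:

```python
# Back-of-the-envelope: cost of reading one 720p frame back from GPU memory.
FRAME_BYTES = 1280 * 720 * 4         # assumed 32-bit pixels -> ~3.7 MB/frame

CPU_READ_BPS = 16e6                  # Cell reading RSX local memory: 16 MB/s
RSX_DMA_BPS = 20e9                   # RSX-initiated transfer: 20 GB/s

t_cpu = FRAME_BYTES / CPU_READ_BPS   # ~0.23 s -- barely 4 frames per second
t_dma = FRAME_BYTES / RSX_DMA_BPS    # ~0.18 ms -- negligible per frame

print(f"CPU read: {t_cpu*1000:.1f} ms/frame, RSX DMA: {t_dma*1000:.3f} ms/frame")
```

In other words, direct CPU readback is hopeless for per-frame work, while the RSX-initiated path costs a fraction of a millisecond, which is exactly why the slow figure is a non-issue.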
    • by be-fan (61476) on Monday June 05, 2006 @12:07PM (#15473452)
      Thank you for that!

      This article takes the statement completely out of context, and the Slashdot reaction to it is just ridiculous.

      Anybody who didn't know that reads from GPU memory are slow, turn in your geek card right now! On a PC, even with a 4GB/sec AGP connection, reading from the framebuffer can be as slow as 75MB/sec. This has been true for a very long time: GPUs don't like anybody else directly touching their framebuffer. That's why Microsoft took direct framebuffer access out of DirectX; it's a performance killer on modern systems. Sony's "workaround" for the situation, using the GPU to handle texture uploads/downloads, isn't news; it's common knowledge to anybody who has done any graphics programming on modern hardware.
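The same arithmetic shows why direct framebuffer access was a performance killer on the PC, too. This sketch assumes a 1024x768, 32-bit framebuffer (my assumption) and the 75MB/s readback figure quoted above:

```python
# How badly CPU framebuffer readback hurts on a PC at 75 MB/s.
FRAME_BYTES = 1024 * 768 * 4    # assumed 32-bit 1024x768 framebuffer, ~3 MB
READ_BPS = 75e6                 # readback bandwidth cited above

readbacks_per_sec = READ_BPS / FRAME_BYTES
print(f"{readbacks_per_sec:.1f} full-frame readbacks/second")  # ~23.8
```

Even with the CPU doing nothing else, reading the whole frame back every frame caps you below 24fps, before a single pixel has been rendered.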
  • Yay! (Score:5, Interesting)

    by rAiNsT0rm (877553) on Monday June 05, 2006 @08:18AM (#15471680) Homepage
    About two years ago I decided to leave my post as a reviewer/tester for Sony. I had close ties with them for over 4 years, and I began to have major misgivings about the direction and quality (or lack thereof) of what was being pumped out. I have been around the gaming industry long enough to know the beginnings of massive problems, and they began a few years back.

    Everyone close to me in the industry said I was crazy, that this would all smooth out, and that Sony would easily retain its market share if not grow it. I wasn't buying it and stuck to my guns, and I've been happy about my decision almost daily since day 1 of E3 this year.

    I was against UMD from the beginning, yet everyone claimed that the sales were stellar. Looks like they weren't, and they are proprietary, expensive, unwieldy little discs that no one wants to deal with. The "Cell" processor was without a doubt my turning point: I have ZERO faith in it or the architecture, and it will not become the ubiquitous, omnipresent processor so many claim; even IBM has major problems with it, and with designing compilers and dev software for its own product. Control schemes have been radically changed from the initial proposals, and too quickly to be properly tested... that is a bomb yet to go off. System price and dev costs are just too high for our current economic situation, as well as for widespread adoption. There are more issues, but top it all off with a new, unproven medium that is also expensive and offers no real consumer advantages, and you have a high risk of a catastrophic failure that could hurt Sony and IBM even more than they are already hurting.

    The best that can happen is that companies finally lose the DRM'd/proprietary/closed nature of their consumer electronics. Stop treating customers as criminals and start offering them affordable, accessible entertainment that is convenient. I'd actually prefer consoles to standardize and become built into consumer electronics so that developers and consumers can really get to work on a stable, long-lasting platform. Imagine the possibilities. There is a lot to be said for standards.
    • by Viol8 (599362)
      "into consumer electronics so that developers and consumers can really get to work on a stable and long lasting platform"

      Standardising software is not a problem: just create a console equivalent of OpenGL and stick to it instead of inventing a new API for each new-gen console. I don't, however, see how you can standardise hardware if you want to keep making progress. You can't say "oh yes, we'll keep this memory/IO/bus model for the next 20 years" because next week something better will come along. Yes ok th
      • Re:Yay! (Score:3, Insightful)

        by rAiNsT0rm (877553)
        Actually this is the #1 topic discussed by almost every manufacturer in consumer electronics circles: a standardized game system, just like a VCR or DVD player. I honestly believe that Sony came close with the PSOne and Nintendo made a strong push with the GC, but both fell short. A couple of DVD players tried to incorporate simple game systems in the hopes of hitting on a winner in years past, with no luck.

        This time around Nintendo is, IMO, in perfect position to nail it. Cost, size, standard/mature development,
    • Re:Yay! (Score:3, Interesting)

      by Sinistar2k (225578)

      I'd actually prefer consoles to standardize and become built into consumer electronics so that developers and consumers can really get to work on a stable and long lasting platform. Imagine the possibilities. There is a lot to be said for standards.

      You might want to look into a little something called "3DO" - it was a standardized console that was licensed out to multiple hardware manufacturers so that possibilities could be imagined.

      It failed miserably.

      • Re:Yay! (Score:3, Informative)

        by rAiNsT0rm (877553)
        No offense, but the 3DO was not a "miserable failure"; in fact it sold very well and had some great titles. The 3DO was also not "open": it was a _franchise_ where manufacturers could use the design specs and pay a royalty for each system sold, with no game licensing restrictions and a royalty of $3 per game.

        Street Fighter 2, NFS, Road Rash, Dragon's Lair, EA Boxing, Gex, and more.

        It was expensive, but offered a high quality arcade-like experience. The lack of licensing also led to a very large library w
  • Typos (Score:2, Informative)

    by daybot (911557) *
    The slide from Sony pictured in the article is priceless: 'Local Memory Read Speed ~16Mbps, No this isn't a Typo.'

    Er, yes it is. The slide says 16MB/s, not 16Mbps, i.e. megabytes, not megabits... 16Mbps would be pretty slow!
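The eight-fold bit/byte difference is worth spelling out with a trivial sanity check:

```python
# Bits vs bytes: the figure on the slide, converted both ways.
slide_MBps = 16                 # what the slide actually says: 16 MB/s
as_Mbps = slide_MBps * 8        # the same figure in megabits per second
print(as_Mbps)                  # 128 -- the correct Mbps figure

misread_MBps = 16 / 8           # what "16Mbps" would really mean in MB/s
print(misread_MBps)             # 2.0 -- eight times slower than the slide
```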

  • This is the Inquirer, so take with a grain of salt.

    Do they make grains of salt big enough for the Inquirer? "Look at me, grain the size of a planet..."

  • A while back someone posted a link about the battle within Sony, where someone who got passed over for promotion was put in charge of the PS3 project.

    I guess this was his revenge. The more I read about the PS3, the more obvious it becomes that the platform is doomed, considering that the Wii will be on the market this quarter.

    It'd be better for Sony if they just scrapped the PS3 at this point.
  • by be-fan (61476) on Monday June 05, 2006 @12:33PM (#15473668)
    It's there for a reason.

    My flame:

    "I'm sure you'll get a lot of these messages, but hell, you deserve it.

    The slow read speed you noted in the slide is for Cell reading from the RSX's local memory. Such accesses are expected to be very slow. If you look at this USENIX article from one of the Linux DRI folks, you can see this quite easily:

    DRI article [usenix.org]

    He shows how painfully slow it is to read from AGP or framebuffer memory (14 and 5 MB/sec, respectively) on a Rage 128 graphics card. For the CPU-to-framebuffer read, which is the equivalent of what we're talking about here, the read speed is 1/40th the write speed. At 16MB/sec read and 4GB/sec write, the PS3 is actually right in line with what can be expected of modern GPU architectures.

    Reading from the framebuffer is just slow unless you have a unified memory architecture. The CPU and the GPU aren't cache-coherent, which means every access to framebuffer memory (or even AGP memory, which is actually a chunk of system memory allocated to the GPU) must be an uncached access. Uncached accesses are just plain slow, on any architecture.

    The way your article is written, it makes it seem like Cell reads its local storage at 16 MB/sec. That is, of course, bollocks, since IBM has shown benchmarks of the Cell local storage achieving 98% efficiency. If you had any journalistic integrity at all, you'd post a retraction on your site, and a clarification of the technical issues involved."
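The asymmetry described in the flame checks out numerically; this is a rough comparison computed from the figures quoted, not a benchmark:

```python
# Read/write asymmetry, from the numbers in the flame above.
rage_write_MBps = 5 * 40     # write speed implied by "reads are 1/40th": 200 MB/s
ps3_ratio = 4e9 / 16e6       # PS3: 4 GB/s write vs 16 MB/s read

print(rage_write_MBps)       # 200
print(ps3_ratio)             # 250.0 -- same order of magnitude as the Rage's 40x
```

Both machines show a readback path dozens to hundreds of times slower than the write path, which is the point: slow uncached readback is normal GPU behaviour, not a Cell defect.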
