The Wii's Brain Exposed 241

Jon Stokes, at the Opposable Thumbs column, discusses a final revelation of the Wii's technical prowess. Though it's been assumed since the early days of the marketing push that the Wii is basically a super-charged GameCube, a post to the Ace's Hardware boards would seem to confirm it. Not, as Mr. Stokes says, that this is a bad thing: "I'm no longer nearly as upset about the implications of this move as I was back in August. In fact, thanks in large part to my DS Lite, I've gone from being disappointed at Wii's underpowered hardware to actually anticipating the new console. I plan to pick one up when they become generally available, and I'm even hoping to hook my (nongamer) wife on it."
This discussion has been archived. No new comments can be posted.

  • by krell ( 896769 ) on Wednesday November 01, 2006 @04:59PM (#16679377) Journal
    Enough about its brain. Show us the Wii's pinky.
  • Supercharged! (Score:4, Insightful)

    by StonedRat ( 837378 ) on Wednesday November 01, 2006 @05:00PM (#16679409) Homepage Journal
    And my PC is just a supercharged 386. So what?
    • Seriously ...

      I don't care much about what a system's floating point performance is; I only care what it can do. There has been far too much discussion of how powerful the Wii is, with very little focus on what that means for games.
    • by Gropo ( 445879 )
      On the geek-hormones end of the issue, given the fact that the other 2 major players in the market are pushing 'radical' new CPU architectures--'the Cell' and the 'IBM (9xx-based?) Core Three Trio'--Nintendo's offering seems tame and low-testosterone.

      On the rational end of the issue... As long as it gets the job done who the hell cares.
    • Re:Supercharged! (Score:5, Interesting)

      by Chris Burke ( 6130 ) on Wednesday November 01, 2006 @05:22PM (#16679865) Homepage
      And my PC is just a supercharged 386. So what?

      Well, not really. The ISA may be the same, but the microarchitecture is completely different. Your PC's CPU looks nothing like a 386; it just happens to speak the same language (and it has certainly picked up some new instructions, if not entire operating modes like 64-bit, besides).

      The point of the article is that the Wii's CPU is microarchitecturally very similar to the Gekko, down to the number of FP pipelines and such, and is basically a 90nm shrink of the old chip with higher clock speeds.

      Now personally I find it hard to believe that IBM would go to the trouble of shrinking the chip to 90nm (which isn't as easy as just applying a scaling factor to your old mask) without tweaking the architecture, even if there were no major changes planned. I guarantee there were improvements that they either wanted to add to the Gekko but didn't have time/resources for, or flaws in the Gekko that they discovered after it was produced that they would like to fix. The shrink to 90nm is the perfect time to get some of those changes in, so I'm betting they did.

      Which brings me back to your point, which was: So what? Indeed, so what? So it's the same chip, only at a much higher frequency and probably with a small percentage boost in IPC besides. How is that bad? It isn't. It just isn't a brand new, highly experimental chip that requires new (or, going back to mainframes with slews of I/O controllers, old) programming methods. So for anyone who was hoping Nintendo would have some incredible hardware specs for them to drool over, disappointment may ensue. Oh well, there's still a good chance it will be good enough.

      Look at the last generation: the Xbox and GC were fighting for best graphics (the Xbox winning mostly, but the GC showing some astounding performances from time to time), and also fighting for 2nd place. 1st place went to the console with the worst graphics, but they were good enough to be part of that generation, and it had the games. The Wii will certainly be representative of this generation of graphics, even if it will be the worst in that regard. Personally I, like anyone who favors the PS2, just hope it has lots of fun games.
      • by xero314 ( 722674 )

        1st place went to the console with the worst graphics...

        Sorry to rehash old arguments, but your statements are not entirely true. The issue here is how one rates graphics, since we are talking about a very subjective topic. To say that the PS2 had the worst graphics is deceptive at best (and flat out wrong at worst). The PS2's top games had the highest resolution (only the PS2 had any 1080i games, such as GT4), the highest polygon counts for characters (I believe, and could be wrong, Jak 3 had the highes

        • Sorry to rehash old arguments, but your statements are not entirely true. The issue here is how one rates graphics, since we are talking about a very subjective topic.

          Yep, and subjectively the PS2 had by far the worst graphics of the three. No cross-platform game I played looked as good on the PS2 as on the GC or Xbox, and the showcase graphics games for the PS2 didn't look as impressive as the Xbox or GC showcase games. For cross-platform, the Xbox seemed to eke out the GC, but the showcases were very comparable in qual
        • (only the PS2 had any 1080i games, such as GT4)

          Um, no. Dragon's Lair 3D and Enter The Matrix were both playable (well, as "playable" as lousy games can be) at 1080i on the Xbox. Further, there were a bunch more games (mainly sports titles, perhaps most notably a couple of Tony Hawk games) that were playable at 720p.

          Further, trying to argue that the PS2's graphics were comparable to those on the Xbox based on a very few PS2 games that pushed the console hard is disingenuous at best. For example, Konami

      • Changing the microarchitecture would have implications for backwards compatibility with Gamecube software. My personal opinion (as someone who has programmed for Gamecube in the past and is working on a next-gen game) is that they will have made no changes. The new part is just a die-shrunk up-clocked Gekko and there's nothing wrong with that.

        To back your more general point up: although people seem to have a low opinion of what the Gamecube hardware was capable of, it's unwarranted. It's true that many games
        • Changing the microarchitecture would have implications for backwards compatibility with Gamecube software. My personal opinion (as someone who has programmed for Gamecube in the past and is working on a next-gen game) is that they will have made no changes. The new part is just a die-shrunk up-clocked Gekko and there's nothing wrong with that.

          No it wouldn't. Backward compatibility is an ISA (as in the interface to the chip) feature, not a microarchitectural feature. This is why a modern x86 CPU can run th
          • I am well aware that ISA-level compatibility is enough for application-level code, where timing and other performance dependencies are weak to nonexistent. This is not the case in embedded systems, especially once you start communicating with other devices on the system bus (e.g. a GPU via a scatter-gather pipe). Single-cycle differences or similar can and do break code in non-trivial ways. I have seen this between revisions of development hardware.
            • Then even just the change in clock speed is going to break compatibility, unless the rest of the system is still identical and has an identical increase in clock speed -- which strikes me as unlikely. Especially if there's a synchronizer between clock domains anywhere in the system.

              I'll admit I'm confused by your usage of 'embedded'. I don't think of 'embedded' as something you stick a CD loaded with software into. And I'm used to games that hit a trouble spot dropping in frame rate rather than choking,
              • The PS2 has pretty much an entire PSone on a chip. It's called the IOP. When it's not doing PSone emulation it's running the I/O subsystem. Even then some games don't work correctly. As they've cost-reduced the PS2 and changed that area of the hardware the list of games that don't work has grown (it includes Tekken 5 on the most recent consumer hardware rev, even). I don't know how the PS3 is doing it, but the Xbox 360 uses software emulation, and look at the mess of that (because yes, cycle emulation is incredibly har
                • The PS2 has pretty much an entire PSone on a chip. It's called the IOP.

                  Heh, right, I forgot about that. Pretty snazzy solution if you can hack it. :)

                  I generally consider consoles to be embedded because they do one thing, namely play games, and one piece of software gets complete access to the hardware, down to the metal, while it's fulfilling that role.

                  Good enough for me. Though hacking on my 286 was "down to the metal", even if I did lean on DOS sometimes to handle disk access for me.

                  This distinction is b
                  • Ohh, the WiiConnect24 stuff is a good question - I have no idea how they'll handle that while playing Gamecube games. Maybe they'll just act as if the console is not connected to the network and cache messages at the server. Interesting point though.

                    The locked cache existed in the Gamecube, but the bigger point about allowing general improvements to be turned on via a register is a good one that I hadn't thought of. I should've remembered it because I used exactly that feature during Pin2000! MediaGX's buil
    • And my PC is just a supercharged 386. So what?

      That's more than slightly disingenuous. Maintaining the x86 instruction set does not, in any way, even remotely imply that the processor you're working on right now is just a 386, but faster. There have been fundamental, major evolutions in CPU technology between that 386 and your current CPU, which make them completely different animals that just happen to look (sort of) the same from the OS' point of view.

      This isn't true for the Wii hardware vs the GC hardware
      • by pilkul ( 667659 )
        What with MMX, SSE, etc., it's not even true that it has the same registers and instruction set. Even assuming you could outfit a 386 with enough RAM, I doubt you could run much contemporary software on it.
      • Call me when I can turn off in-order writes, and they provide barrier instructions so I can control the ordering from software, so hyperthreading becomes more than something the P4 engineers thought of as a "compiler problem" -- without understanding that you don't *ever* run a single compiler-optimized instruction stream to completion without a context switch in a modern OS. You can optimize the non-interrupt code paths in the OS itself, but for apps running *on top* of the OS, there is no such thing as a "
        • by pilkul ( 667659 )
          I see, modern PC CPUs don't have your pet feature so they're no better than 386s.

          Nice conspiracy theory there, also. I don't know much about this topic but I suspect there is a real reason why your suggestion hasn't been implemented, not just Microsoft convincing Intel and AMD to kill performance because they don't feel like improving their kernel.
          • It's not a single pet feature, it's an example.

            If you want a laundry list, I can provide one, but we can start with this small list of things, which were also true of the i386, making the current CPUs hopped-up 386s:

            o Too few general-purpose registers (this one's glaringly obvious, and compared to dumping another 2MB of cache onto a chip, it's relatively easy to fix, but it's only been partially fixed in the 64-bit implementations, and there it was more or less a matter of maintaining binary compatib
            • by pilkul ( 667659 )
              Hmm, interesting list, thanks.

              I'll note, though, that with the exception of more registers, all the features you want seem oriented towards highly demanding applications like real-time systems and scientific computation. That's not really what the x86 is about. Modern Intel/AMD CPUs aren't necessarily primitive just because the development effort was put into bread-and-butter things like branch prediction instead of features that are elegant but ultimately not so useful for mass-market consumers.
        • Call me when I can turn off in-order writes

          That's a memory consistency model issue, so of course it is the same. Besides, weaker consistency models require stronger memory barrier instructions. I think the IA32 "processor consistency" model is a good tradeoff, personally. Hyperthreading was just a bad idea, as was the whole Netburst architecture.

          Most of the things you list in your follow-on post are just features you wish x86 processors had -- even though some of them do have those features, demonstratin
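
          To that point, x86 did grow explicit barrier instructions (SFENCE with SSE, LFENCE/MFENCE with SSE2), so software can already enforce ordering around weakly-ordered stores such as the non-temporal ones. A minimal sketch of a store-then-flag handoff -- this assumes x86 and GCC-style inline asm, and is illustrative rather than a claim about any particular CPU's reordering:

            #include <stdio.h>

            static volatile int data;   /* payload */
            static volatile int ready;  /* handoff flag */

            /* Drain prior stores before later ones become visible.
               Plain x86 stores are already ordered ("processor
               consistency"); the fence matters for non-temporal or
               write-combining stores. */
            static void store_fence(void) {
                __asm__ __volatile__("sfence" ::: "memory");
            }

            void producer(void) {
                data = 42;
                store_fence();  /* 'data' is visible before 'ready' */
                ready = 1;
            }

            int main(void) {
                producer();
                if (ready)
                    printf("data = %d\n", data);
                return 0;
            }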
    • You don't supercharge something and get something that is 1000x faster than without the "supercharging". Unless you mean that you have a 486 PC.
    • by lawpoop ( 604919 )
      A 386!? Why, *my* computer is a turing machine!!
      • by Knara ( 9377 )

        Your comment reminded me of the Tachikomas from Ghost in the Shell, when they're mischievously investigating a sniper computer and are astonished that it's "a sub-Turing machine!"

        (yeah, I know they're referring to the Turing Test for AI and not the Turing machine you're talking about, but I can't always control the way my brain makes these connections!)

  • It doesn't support HDTV. I think this will be their fatal flaw for the next gen console rave. With a lifespan of 5 years, it will start showing its age in 2 or 3 years as HDTV becomes the norm. I really think this was a bad decision on Nintendo's part; what would it cost to upgrade the video chip and corresponding bandwidth to support 720p? Even the PS2 supported 1080i output, as seen with Gran Turismo.
    • Re: (Score:3, Informative)

      by eln ( 21727 ) *
      I think in 2 or 3 years a significant portion of the population may have HD-capable TVs, but only as their primary television set. It will be quite a while after that before you see secondary TVs switching up to HD. A system billing itself as a "media center" needs to have HD capabilities because it is likely to be on the main TV in the house. A pure game machine is just as likely to be in a kid's bedroom or some other secondary TV location, where HD will take longer to arrive.
      • by miyako ( 632510 )
        The thing of it is, very few people who have HDTVs have CRT HDTVs. Most have LCD or plasma. I'm not sure about plasma, but an LCD HDTV looks really crappy showing things in standard definition. It would be nice if the Wii supported some form of HDTV output, even if it's just "render at 480i, upscale the image to 720p and blur it a bit," so it doesn't look as ugly for those of us using LCD TVs.
        • Well, at least it is 480p, not 480i. Not that most people will hook up component cables. I certainly will... to my 720p display.
        • Re: (Score:3, Interesting)

          by BenjyD ( 316700 )
          Gamecube games look great on my 32" Samsung LCD HDTV (1366 x 768, RGB SCART connection). I think a lot of modern LCDs do a good job of displaying SD content.
    • Maybe I'm just poor (I live in New York state; we get taxed to death here), but I don't think HDTV will be "the norm" in 4-5 years. They're still too expensive, and unless people have a compelling reason to buy one (lots of disposable income, TV breaks), chances are they're going to keep their older sets around. They work just fine for watching TV and movies. Why spend an extra $1,000+ for HDTV when my cheap $100 box lets me watch things just fine?
      • You can buy pretty nice HDTVs for $500 now. Heck, the Dell 24" LCD monitor can be used as an HDTV. HD video looks fantastic on just about every display I've seen it on. Even on an XGA projector it looks a lot nicer than any DVD can possibly look.
        • You can buy pretty nice HDTVs for $500 now

          And I can find a pretty nice lo-def TV that I already own for $000, and it comes pre-installed.

          I won't deny that HD source video can and usually does look noticeably better than standard def, but I will deny that the average person cares enough about that improvement to spend several hundred dollars on an aesthetic upgrade.
          • by radish ( 98371 )
            Have you ever been to Best Buy on a Saturday afternoon? The lines of people loading huge brown boxes containing TVs into their cars would suggest plenty of "normal" people are buying new TVs every day. And over time, an increasingly large number of them will be HD.
            • You don't even realize you killed your own argument. You just stated that huge numbers of people are buying lo-def TVs today, every hour. So a few more buy HD, but that still means those people who just bought a lo-def TV won't be upgrading it again in 5 years.

              10 years from now, when the HD and DRM idiots finally settle down, HDTV will be more commonplace. People who spent $2000 for an HD set 2-3 years ago are going to have to upgrade to the newest standards or else they are stuck with the same lo-def si
              • by radish ( 98371 )
                You just stated that huge numbers of people are buying lo-def TVs today, every hour.
                No, I didn't. I stated that lots of people are buying TVs; I made no mention of what type, other than that the proportion which are HD will obviously rise going forward. Your assertion that it's currently a very low percentage is not backed up by my own experience, or by what I am reading. As an example, this article [usatoday.com] says that 1 in 6 homes already have at least 1 HDTV. And this one [contactmusic.com] says that HDTV sales are expected to exceed SDT
    • by SetupWeasel ( 54062 ) on Wednesday November 01, 2006 @05:14PM (#16679693) Homepage
      I think this will be their fatal flaw for the next gen console rave.

      I don't know. The PS3 has a very low ecstasy to glow stick ratio.
    • by Shados ( 741919 )
      This is the GBA and Nintendo DS strategy rehashed. Both these consoles have (albeit limited, in the case of the GBA) the ability to do 3D, but Nintendo basically -on purpose- omitted to make an API for it to be easily done. They want (even if it has to be done against their will) to stop developers from spending too much time on graphics and technicalities, to push them to work more on gameplay. That's why a lot of 2D SNES/GBA games are better than their 3D counterparts. Or at least that's what Nintendo and
      • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Wednesday November 01, 2006 @05:30PM (#16679999) Homepage Journal
        This is the GBA and Nintendo DS strategy rehashed. Both these consoles have (albeit limited, in the case of the GBA) the ability to do 3D, but Nintendo basically -on purpose- omitted to make an API for it to be easily done.

        True of the GBA, but Nintendo DS uses a subset of OpenGL, similar to the "GX" API used by the GameCube.

        • The GBA's "3D" support wasn't all that 3D, anyway. More like 2.5D. It was actually pretty close to the SNES "Mode 7" capabilities, which used scaling and skewing of 2D bitmaps to give a feeling of 3D-ness.
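
          For the curious, the whole trick reduces to an affine transform evaluated per scanline, with the scale factor shrinking toward the horizon to fake perspective. A rough floor-renderer sketch in C -- illustrative only, since the real hardware does this with fixed-point matrix registers, and every name and constant here is made up:

            #include <stdint.h>

            #define SCREEN_W 240
            #define TEX_SIZE 256  /* power of two, so coordinates wrap cheaply */

            /* Fill one scanline of a Mode-7-style textured floor. Rows far
               below the horizon are close (texture magnified); rows near
               the horizon are far (texture shrunk). */
            void floor_scanline(uint8_t dst[SCREEN_W],
                                const uint8_t tex[TEX_SIZE][TEX_SIZE],
                                int y, int horizon,
                                float cam_x, float cam_y, float cam_height)
            {
                if (y <= horizon)
                    return;  /* above the horizon: sky, not floor */

                /* perspective divide: depth of the floor row this scanline hits */
                float depth = cam_height / (float)(y - horizon);

                for (int x = 0; x < SCREEN_W; x++) {
                    /* scale (and, with a rotation folded in, skew) the flat bitmap */
                    int tx = (int)(cam_x + (x - SCREEN_W / 2) * depth) & (TEX_SIZE - 1);
                    int ty = (int)(cam_y + 64.0f * depth)              & (TEX_SIZE - 1);
                    dst[x] = tex[ty][tx];
                }
            }

          The 64.0f is an arbitrary focal-length scale; the SNES and GBA expose roughly the same math as per-scanline fixed-point matrix parameters.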
      • Ok, I'm finally gonna bite on the Wii subject.

        Here's the deal: I love old arcade games. I have a PSP, and two of my favorite game UMDs are the Midway and Namco classic collections (you can call me a chump for not doing homebrew if you must). I also enjoy side scrollers, top scrollers and, though I've never encountered one, I'd probably like a bottom scroller too.

        But I love 3D games as well. I love eye candy. I love the immersive experience of a Far Cry or NFS title.

        Am I wrong to love both? Should I on
        • by Shados ( 741919 )
          You're not a troll. And I understand exactly how you feel: I'm the exact same way. I recently got Megaman ZX, and I drooled looking at it. I saw the trailers for White Knight Story and FFXIII, and I drooled looking at them too.

          Here's the deal: like most members of the human species, game developers are single-minded bozos. If you give them a console like the PS3, the ONLY thing they'll do is fancy 3D games. Or almost. So the point is, because of the way they think, you can't really have both on the same co
          • Re: (Score:3, Insightful)

            by Total_Wimp ( 564548 )
            I have no problem with Nintendo's choice, nor with yours. For what it's worth, I had no problem with GameCubes either, and often witnessed my daughter's friends shuttling them from house to house to play games like Super Smash Bros (it appears Nintendo knew what they were doing when they built in the handle).

            What I have the problem with is the people who appear to be insisting that no one "needs" the better graphics hardware and, ironically, that we do "need" the interesting controller hardware of the Wii.
            • by Shados ( 741919 )
              You are correct. Well, for what it's worth, I think Nintendo simply did not expect their potential success. Remember, originally the PS3 was supposed to be "everything", so Nintendo's ONLY hope of making a place for itself in the market was to avoid competing with the PS3 like the plague. So they made SURE their console wouldn't appeal to the same people. One way to do that was to purposely take OUT all of the big next gen stuff. You are right, it is a 2.5 gen console, that much is obvious, and it was made that w
        • Re: (Score:3, Interesting)

          by LKM ( 227954 )

          but the fact is that the kind of fun it will be capable of producing is going to be limited by its graphics engine

          No, it's going to be limited by the controller. The Wii's controller is the most fun.

          Or maybe not.

          My statement is only slightly less absurd than yours. "Fun" is most certainly not going to be limited by graphics. Is "Super Mario Bros" less fun than "Charlie's Angels" simply because its graphics are very modest compared to the more recent 3D game?

          The ability to create a Wii-like experi

    • Re: (Score:2, Insightful)

      Nintendo will be criticized for this, but ultimately it will matter as little as the criticism concerning online gaming.

      Just look at the numbers. Many people were screaming about how Live and other online services were going to be the bread and butter of consoles last generation. All three systems launched with promises about their online support. Only one of the three consoles delivered, and yet it was the one that was arguably the worst at online that won out.

      Today we've finally reached a point w
    • HDTV is only an issue to the technoliterati. Period. This is a huge mistake people make. Unless you're sitting less than twice the screen's diagonal away from it, HD is not an issue. This is one of the things that really slows functional HD adoption. The vast majority of purchases are still of relatively small screens (20-35") and they're getting placed at a 6-10 ft viewing distance, at which point there is very little functional difference between HD and EDTV.

      This is also a social issue. U
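
      The rule of thumb is easy to sanity-check: 20/20 vision resolves roughly one arcminute, so past the distance where a single scanline subtends less than that, extra lines buy you nothing. A back-of-the-envelope sketch (C; the one-arcminute figure and 16:9 geometry are the only assumptions):

        #include <math.h>
        #include <stdio.h>

        /* Farthest distance (inches) at which one scanline of an n-line
           picture on a 16:9 set of the given diagonal still subtends one
           arcminute -- the usual 20/20 acuity figure. */
        static double max_useful_distance(double diag_in, int lines)
        {
            double height = diag_in * 9.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0);
            double line_height = height / lines;
            double one_arcmin = (1.0 / 60.0) * (M_PI / 180.0); /* radians */
            return line_height / one_arcmin; /* small-angle approximation */
        }

        int main(void)
        {
            double diag = 32.0; /* a typical living-room set */
            printf("480 lines resolvable within %.1f ft\n",
                   max_useful_distance(diag, 480) / 12.0); /* ~9.4 ft */
            printf("720 lines resolvable within %.1f ft\n",
                   max_useful_distance(diag, 720) / 12.0); /* ~6.2 ft */
            return 0;
        }

      On a 32" set at the 6-10 ft distances above, 720 lines of detail are already at or below what the eye can resolve -- which is roughly where the "twice the diagonal" figure comes from.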

    • There's no reason they can't release a souped-up Wii for HDTV down the road, keeping it all compatible. (PC games have supported multiple resolutions for ages.)
    • The Wii does support widescreen though. I'm happy about that. I was more worried that it would be stretched by my TV set.

    • by Bastian ( 66383 )
      With a lifespan of 5 years, it will start showing its age in 2 or 3 years as HDTV becomes the norm.

      I think that's a rather optimistic prediction of how quickly HDTV is going to take over. Given current HDTV prices, HDTV has a *long* way to go before most anyone I know will be willing to buy one. I wouldn't be surprised if HDTV becomes the norm among people with media-and-technophilic tendencies within 3 years, but I'm also not so sure that that's the Wii's target market. Nintendo hasn't really been after pe
    • by DrXym ( 126579 )
      It doesn't support HDTV. I think this will be their fatal flaw for the next gen console rave. With a lifespan of 5 years, it will start showing its age in 2 or 3 years as HDTV becomes the norm. I really think this was a bad decision on Nintendo's part; what would it cost to upgrade the video chip and corresponding bandwidth to support 720p? Even the PS2 supported 1080i output, as seen with Gran Turismo.

      You forget how Nintendo makes its money. Most of their handhelds have very obvious shortcomings which they con

    • by grumbel ( 592662 )

      I think this will be their fatal flaw for the next gen console rave

      I very much doubt it. HD-TV adoption is still very low and will stay there for quite a while. If HD-TV ever becomes a must-have in the lifetime of the Wii, Nintendo could just release an improved and fully compatible Wii (just like the GBA SP or DS Lite) with a faster GPU that could do HD-TV, and the problem would be easily solved.

      While I doubt that HD-TV is an issue, I do think the lack of plain CPU and GPU power will become a problem very q

  • The Gamecube could produce some stunning visuals and the Wii is even better, but both machines are optimized for 480p. There is no justification or reason to push performance when Nintendo is still firmly in the SD resolution department. Once Nintendo commits to HD, when or if that ever happens, well, obviously they're going to have to make a more powerful machine.

    While I absolutely love the visuals from the 360 and PS3, given the still relatively paltry penetration of HDTV sets in North America, the new ma
    • People seem to forget that this is an SD console, not HD. Isn't it supposed to be twice as powerful as the original Xbox? Why would you need any more power than that for SD games? It's probably going to look incredible on SD TVs.
    • by grumbel ( 592662 )

      What would you have said if Nintendo had done an improved N64 instead of the Gamecube? If they had simply added a new motion controller and left the rest mostly as-is? I mean, the N64 could do pretty graphics as well, for its time. Today however the N64 is horribly outdated, and no high-end motion controller would make me want to play that thing when I can play Shadow of the Colossus or Katamari on a PS2.

      The Wii graphics don't look that bad currently, since most people are still left with Gamecube, XBox1 and PS2, so

  • ...and I'll say it again. The Wii doesn't need to be much faster to look good.

    The difference in required processing power to properly render the larger textures and more detailed models at 1080p, versus what the Wii needs to do at 480p, is huge. All that processing power that Microsoft and Sony will throw at 1920*1080 = 2073600 pixels is going to be much more than Nintendo has to worry about at 640*480 = 307200.

    2073600/307200 = 6.75. Sony and Microsoft need to be 6.75 times as powerful as Nintendo's console to ma
    • Nintendo is focusing on what they've always focused on: the family. Microsoft and Sony are targeting the power users, who are easily blinded by the "ooh and ahh" factor.

      The power users are the ones who need the bragging rights of "more power" and are the ones who put emphasis on flash over function.

      The family only cares about having fun together as a family. My daughter and I still have fun with Diddy Kong Racing on our oh-so-dreadfully-inferior {/SARCASM} Nintendo 64. We don't care that it doesn't look
    • by Andy_R ( 114137 )
      Sticking with your wildly inaccurate measures...

      Wii: 1-core G3 @ 729MHz
      Xbox 360: 3-core G3 @ 3200MHz

      Roughly 13 times the CPU power, meaning about twice the instructions per pixel. Hmmm.
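
      Both back-of-the-envelope figures in this subthread do check out, for whatever wildly inaccurate measures are worth. A quick sketch (C; the clock and resolution numbers are the ones quoted above, not official specs):

        #include <stdio.h>

        int main(void)
        {
            /* figures as quoted upthread */
            double hd_pixels = 1920.0 * 1080; /* 1080p frame */
            double sd_pixels =  640.0 * 480;  /* the Wii's 480p frame */
            double wii_hz    = 1 * 729e6;     /* 1 core @ 729 MHz */
            double x360_hz   = 3 * 3200e6;    /* 3 cores @ 3.2 GHz */

            double pixel_ratio = hd_pixels / sd_pixels; /* 6.75 */
            double cpu_ratio   = x360_hz / wii_hz;      /* ~13.2 */

            printf("pixels: %.2fx  cycles: %.2fx  cycles per pixel: %.2fx\n",
                   pixel_ratio, cpu_ratio, cpu_ratio / pixel_ratio); /* ~1.95x */
            return 0;
        }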
    • Oh, don't you worry, Microsoft figured out a trick for that one.

      Instead of rendering at 1280x720 or 1300x768 (whatever the overscan res is?), they render some of their games in the 1024x600 range (hi, PGR3) and then upsample it, claiming it's HD.

      Fun all round guys, fun all round.

  • by Ant P. ( 974313 ) on Wednesday November 01, 2006 @05:17PM (#16679753)
    They're taking one chip design and making it smaller, faster and lower power. Somewhere in the article it mentions that the 90nm version of this CPU takes about 2W at GC speeds. For reference, the DS is rated at 1.6W. You can probably predict where this is going.
    • Yeah... basically my Athlon XP 3200+ is just a smaller, faster version of the original Athlon (which was in the 600 MHz range, iirc)... oh noes! That means my processor isn't any better than the 600 MHz one that was released in 2000! Also keep in mind that Nintendo is realistic with their performance numbers. When they released the number of polygons/sec that the GC was capable of, that was in a *real game*, with full hardware lights and 8 texture layers (basically what the hardware is capable of in one clo
      • by Ant P. ( 974313 )
        Uh, what the hell has this graphics drivel got to do with anything? I was implying they were going to make a portable GC.
  • by Anthony Boyd ( 242971 ) on Wednesday November 01, 2006 @05:17PM (#16679757) Homepage
    Aw, now that's just plain mean to put this right after an article about babies' brain stems. The opportunity for misinterpretation is just too high. Sickos.
    • by Minwee ( 522556 )
      The first article was "Wee Brains Exposed", this one is "Wii's Brain Exposed". I don't see any possibility for confusion.
  • I'd be a lot more (excited about the XBox360 or PS3)/(upset about the Wii design) if I believed power were the thing holding back games.

    But I think to make that argument would take some serious rhetorical gymnastics. The problems with gaming clearly lie in the ideas, the general difficulty of executing complex ideas (programming complicated things, gaming or otherwise, is hard), the overemphasis on 3D graphics, and the stereotyping of controls.

    All of these interrelate; in particular the emphasis on 3D graph
    • But I just don't see "power" as our big problem right now; we've got so much to spare that we can make grass wave realistically and make water sparkle and all kinds of other things that are nowhere near as important as the amount of development time they consume would seem to indicate.

      Personally, I see that "Power" could be the biggest problem in the upcoming generation, but in the complete opposite way that some people predict. In order to get the "Next Generation Graphics" (that the PS3 and XBox 360 offer
  • Here's the thing I don't get.

    Ok, so we know Nintendo produces some fun 1st-party games. And that's a great thing. If you love Nintendo's games, then you know what you want: definitely a Wii.

    The Wii will not be particularly powerful hardware-wise. Some fanboys say it's not next-gen, or whatever buzzword you like. Nintendo fans say that doesn't matter, because art direction trumps graphical muscle, and gameplay and plot trump flashy graphics and nice physics. It's a fair argument.

    The thing I wonder about is, t

    • I'm sure Nintendo would have loved to use an ultrasonic positioning method for the Wiimote if they could. The whole sensor bar thing sounds like it does actually work quite well, but it is, well, messy. I assume the reason is cost. How much can you get one of those ultrasonic controllers for? Can you hook up four of them without interference problems? What about 16 of them? I'm serious, it sounds like a much better tech for this application, so there must be something wrong with it that Nintendo choos
  • Go Wii (Score:2, Insightful)

    by Anonymous Coward
    Wii will prove again that gameplay trumps flash. Just like the DS is bitch slapping the PSP.
  • But will it heat my house too? [slashdot.org] I like multi-in-one gizmos.
