How the Wii Was Born

saintory writes "Ars Technica has an article up looking at how the Wii was born. It's a nice overview of how Nintendo's culture came up with the 'new-gen' system." More from the article: "'Diverging from the road map takes a fair amount of courage,' [Engineer Shiota] said, 'especially when we didn't have a clear image of what we were going to do with this hardware.' However, once he saw the power level reduction (from one-third to as little as one-fourth that of current hardware) he was very excited. Instead of competing on 'how many more times the CPU is going to be faster, how much more memory is going to be on the machine, and how many more polygons can be rendered' he saw Nintendo as being able to do something different and unique."
This discussion has been archived. No new comments can be posted.

  • Maybe it's just me, but since I live on Long Island with a company like LIPA, I am actually glad that the Wii is going for lower power consumption. LIPA likes to raise rates every couple of months. I also don't mind the remote. It obviously has to be good for Sony to have added it to their controller.
    • with a company like LIPA, I am actually glad that the Wii is going for lower power consumption.

      I'm not from Long Island (not even in the same hemisphere), so I can only assume that LIPA is the Long Island Power Authority or something similar. If a game console is going to make a hit on your power bill then you really do have problems there on Long Island :)

      • If a game console is going to make a hit on your power bill then you really do have problems there on Long Island :)

        This is wandering off-topic, but yes, there certainly are problems on Long Island - at least there were 10 years ago when I lived there. At the time, Long Island had the distinction of the most expensive electric rates in the country. The reason for this is that they built a nuclear power plant in Shoreham that they were never allowed to turn on. (The reason being they decided there was no pr
        • Yup, now I think the rates are back up to the highest in the country. LIPA has been raising the rates every couple of months. The thing is that LIPA was supposed to have lowered our rates; instead it did the opposite and made them higher. Having a computer, a 360 or PS3, and a CRT TV that all consume a lot of power really would put a lot on the power bill. So if I get the Wii and not one of the other consoles, that could save me money in the long run, especially since it will consume less power than my GameCube.
    • Here in IL, ComEd will be raising our rates by as much as 50% come January 1st.
  • It might not be as powerful as the other two offerings, but it sure looks better: very sleek and clean.
    • What bothers me about the Wii is not that it may not be as powerful. My concern is that it's lacking a hard disk - a device which let the Xbox handle games like Halo, a title which would have required long load times on the PS2. I'm going to wait and see how well the Wii handles loading games before I invest in one.
      • by devmage ( 685080 )
        Is load time really that important? Look at the PS1, whose load times were absolutely horrible, and they sold how many? Since the Wii is basically GameCube x2, I'm not worried about load times. There were only a few games that had bad load times, though I can't even recall the name of one. Of course, the GC did use different media. Even still, Nintendo has always been conscious of load times, so I'm sure they are fine. I am very much looking forward to the Wii and playing Zelda soon :)
      • Re:Looks Good (Score:5, Interesting)

        by The Warlock ( 701535 ) on Tuesday October 03, 2006 @09:16AM (#16290533)
        Nintendo is extremely careful about load times. I mean, hell, they went with cartridges instead of CDs for the N64 because of load times. GameCube devkits have deliberately limited transfer rates from the dev hard drive so that the devs need to deal with load times. I'm sure it won't be a big deal here.

        Plus, it does have an internal Flash drive, although I think that's mostly for downloaded stuff.
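        To make that devkit trick concrete, here's a minimal sketch of rate-capped reading (in Python purely for illustration; the 2 MB/s cap and the helper name are assumptions, not the GameCube's actual figures):

        import time

        def throttled_read(f, total_bytes, bytes_per_sec=2_000_000, chunk=65536):
            # Read total_bytes from file object f without ever exceeding
            # bytes_per_sec, emulating a slow optical drive on fast dev hardware.
            data = bytearray()
            start = time.monotonic()
            while len(data) < total_bytes:
                block = f.read(min(chunk, total_bytes - len(data)))
                if not block:
                    break
                data += block
                expected = len(data) / bytes_per_sec   # time the read *should* take
                elapsed = time.monotonic() - start
                if expected > elapsed:
                    time.sleep(expected - elapsed)     # stall to honor the cap
            return bytes(data)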
      • Almost all GC games by Nintendo had awesome load times. Smash Bros. Melee is a particularly awesome example. There may be things to worry about with the Wii; load times ain't one of 'em :)
      • by tepples ( 727027 )

        My concern is that it's lacking a hard disk

        The Wii has a 512-million-byte (512 MB) internal flash drive and a slot for SD cards.

        a device which let the Xbox handle games like Halo, a title which would have required long load times on the PS2.

        DVD read speeds [osta.org] have increased since the PS2 came out.

        I'm going to wait and see how well the Wii handles loading games before I invest in one.

        Nintendo's GameCube games didn't need to spend too much time loading. Neither should games on an overclocked GameCube with a remote.

      • by hords ( 619030 )
        If the hard drive were that important, you would think every version of the Xbox 360 would have come with it. I bought the Pro version, which comes with one, but the games don't utilize it since not everyone has one. Prey certainly has long load times on the 360, which perhaps could be better if content were shifted to the hard drive. I doubt it's as long on the PC, but I can't say for sure since I haven't played that version.
      • My concern is that it's lacking a hard disk - a device which let the Xbox handle games like Halo, a title which would have required long load times on the PS2.

        I wonder if there might be room for some imaginative use of the USB ports one day. Not for an external hard drive, I mean, and not specifically with regard to the Wii -- but flash drives are getting cheaper and cheaper. I wonder if someday it might become economically possible for them to, if not supplant, then at least complement, the DVD as the

  • Power to the Wii (Score:4, Interesting)

    by GORby_ ( 101822 ) on Tuesday October 03, 2006 @08:10AM (#16289953) Homepage
    Well, while the Wii certainly doesn't look like a system providing next-gen graphics, the article makes an interesting point: with development costs of modern games going through the roof, it might make perfect sense to design a simple system (from a hardware perspective) like their new console.

    Also, does next-gen necessarily have to mean next-gen graphics? Or does good-enough-graphics with a fresh look on gameplay suffice?
    • Re: (Score:3, Interesting)

      by Aladrin ( 926209 )
      Yeah, I'm afraid next-gen does indeed have more to do with graphics and processing power than anything else. In a sane world, creating a new (?) interface for gaming would be easily classed as 'next gen' because it evolved. Of course it isn't truly new, as motion sensing has been done before. Just not from the ground up.

      Nintendo's 'next gen' consoles and handhelds do indeed seem lacking when compared to their competition, but they really -are- fun. And that's what gaming was supposed to be about. These
    • by XxtraLarGe ( 551297 ) on Tuesday October 03, 2006 @09:09AM (#16290443) Journal
      IMHO, gameplay always trumps graphics. Case in point: Civ 2 vs. Civ 3 (and possibly Civ 4). To be honest, none of the next-gen consoles offer anything compelling enough to make me want to buy them, at least not at the current price point. I don't have an HD television, and I don't have tons of money to spend on games. If somebody asked me what they should buy for a game console today, I'd tell them PS2. You're getting a console that plays DVDs and has a huge collection of great games for $20, plus free online play. If I had to pick a next-gen console, though, it would be the Wii, based on price point and game selection.
    • by Yvan256 ( 722131 )

      Well, while the Wii certainly doesn't look like a system providing next-gen graphics[...]

      Well, the Wii is able to display 480p graphics, so it's as good as a DVD and appropriate for non-HD TV sets (which still make up the biggest installed base). You have to see Metroid Prime 2 in 480p (via the component cables and a progressive-capable TV) to appreciate the graphics of the GameCube.

      I'm OK with the Wii having 2-3 times better graphics than the GameCube. In fact, I really like the idea: don't increase the re

    • Initially, graphics > gameplay. The mass market is just too easily fooled at first.

      After about 2 or 3 years, though, you see a big drop in trust. Any video game fan knows what I'm talking about. After the "it'll get better when the second and third gen games come out" and the "those defects are just a fluke, the next batches will be perfect," the market just gets fed up. (And no, I'm not just picking on Sony. The GameCube was criticized for its lack of good/notable second gen games (remember Mario Sunshine

    • by Kuvter ( 882697 )
      I'll take gameplay over graphics any day
    • by JonPhi ( 1009675 )
      The purpose of games is the gameplay, not the graphics, although acceptable graphics can aid gameplay.
  • Power Consumption (Score:2, Interesting)

    by Iwanowitch ( 993961 )

    it was important that the machine stay powered on all the time, so it was designed to operate in a low-power mode that would turn off the fan when it was not being used to play games.

    I always wondered... If this thing is going to be plugged in always, and running always, doesn't it consume enormous amounts of power? I've often heard people say that it's better to unplug your TV, stereo, etc. when not in use for 'longer periods' (say, overnight) because even the smallest of control lights still uses power for

    • by chrismcdirty ( 677039 ) on Tuesday October 03, 2006 @08:27AM (#16290063) Homepage
      I'm by no means an expert, but I believe the reason people say it's better to unplug everything is the minuscule power drain from being plugged in. It won't cost you more than a few dollars a year. But when the entire world is leaving devices plugged in, it ends up being a huge amount of power devoted to doing nothing.
      • by gEvil (beta) ( 945888 ) on Tuesday October 03, 2006 @09:17AM (#16290535)
        But when the entire world is leaving devices plugged in, it ends up being a huge amount of power devoted to doing nothing.

        Not true! How else will I know that it's eternally twelve o'clock in my apartment?
        • OK, the device isn't completely dormant, but this isn't a power 'leak'. It's a feature that makes powering on devices much quicker and keeps things like the IR receiver for your remote control active, so when you hit power on the remote, it works (and quickly).

          For the minimal amount of power that pulls per year, you are far better off letting it be rather than wasting time constantly plugging/unplugging your devices, or wiring up a master kill switch, so you aren't trading off minimal power savings for produc

        • In my apartment it's only intermittently 12:00. Like right... now! Not now! Now! Not now!
      • by LKM ( 227954 )

        Exactly. About 10% (or more, depending on who you ask) of all household energy is used by devices in standby mode. It's not a lot of money for any single household, but it is a lot of energy if you add it up.

        As an example, if you use your Xbox for two hours a week and keep it in standby for the rest of the week, it uses more energy in standby than during the two hours you play it - and the only thing you gain from that is that you can turn it on using the remote.

        So if you do have devices which keep their
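        As a rough sanity check of that Xbox claim, here's the arithmetic as a sketch (the wattages are assumptions - roughly 70 W while playing and 2 W in standby - not figures from this thread):

        # Weekly energy: 2 hours of play vs. the remaining 166 hours in standby.
        ACTIVE_WATTS = 70    # assumed draw while playing
        STANDBY_WATTS = 2    # assumed draw in standby
        play_hours = 2
        standby_hours = 7 * 24 - play_hours

        play_wh = ACTIVE_WATTS * play_hours          # 140 Wh
        standby_wh = STANDBY_WATTS * standby_hours   # 332 Wh
        print(play_wh, standby_wh)
        # Under these assumptions, standby uses well over twice the energy
        # that actual play does, matching the comment above.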

      • For the past few generations, DVD players, VCRs, stereos and televisions have been notorious power drains when turned "off". With the exception of high-end items, most of these appliances are not engineered with being green in mind. Instead they're designed with instant gratification in mind and keep things charged up so as to be "instant on" when the remote gets hit.
      • A lot of EU countries are already concerned [slashdot.org] about this waste. IIRC, Europe even has tougher requirements on computer power supplies to help prevent waste.

        They certainly don't look on it as a minor issue. And they're right. Even a little electricity adds up fast when you start talking millions of users, each using dozens of pieces of electronic equipment in "standby" mode.

        -Eric

      • But when the entire world is leaving devices plugged in, it ends up being a huge amount of power devoted to doing nothing.

        But the power plants are making that electricity anyway, and selling it on the cheap because the demand for electricity overnight is much less than the demand during waking hours.

        In fact, if a device uses more power to cold-boot than to wake up from standby, it could actually be MORE wasteful to unplug the device when not in use.
        • by lkeagle ( 519176 )
          That would be completely dependent on the frequency of power cycling the device.

          I challenge you to present a device that you turn on and off so often that leaving it in standby actually saves energy overall.

          Now on the other hand, many people leave computers/electronics/lights on because the mechanical stresses of turning them on and off frequently can cause physical damage to the device. If you factor in the cost of replacing/repairing the device due to physical damage, then you may have an argument for a very small s
    • Re: (Score:3, Informative)

      I'm sure a power consumption analysis will be done... it's been done for the current consoles [dxgaming.com]
      • by Ultra64 ( 318705 )
        ^ The grammar in the above post is incorrect for the explicit purpose of pissing you off.

        I think you mean the express purpose.
    • Re: (Score:3, Interesting)

      by Wdomburg ( 141264 )
      If this thing is going to be plugged in always, and running always, doesn't it consume enormous amounts of power?

      The article states the power consumption is "from one-third to as little as one-fourth that of current hardware". Since the GameCube drew only about 20 W, that comes to 5-7 W. That would make a full day's consumption about the same as having an Xbox 360 on for an hour. A year's worth of power, assuming $0.14/kWh, comes to a whopping $8.58. And that's assuming it draws full power the entire time. T
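      That annual figure checks out as straightforward rate arithmetic (a sketch assuming a constant 7 W draw and $0.14/kWh, the numbers used above):

      # Annual cost of a constant 7 W draw at $0.14/kWh.
      watts = 7.0
      kwh_per_year = watts * 24 * 365 / 1000   # about 61.3 kWh
      cost_per_year = kwh_per_year * 0.14
      print(round(kwh_per_year, 1), round(cost_per_year, 2))   # 61.3 -> 8.58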
    • IIRC, Iwata at the E3 2006 presentation stated that, while the Wii is in standby mode, it would draw about as much power as a small LED light. Don't know if that has changed since then.
    • http://en.wikipedia.org/wiki/Obsessive_Compulsive_Disorder [wikipedia.org] . . . perhaps you've heard of it. . .

      In all seriousness, though, I've read somewhere that in standby mode it uses less than 10 watts; for comparison, that nightlight you and/or your kids (may have) used as a child would use somewhere between 20 and 60. Hell, I bet running your digital alarm clock all night would use about the same as the Wii, and how often do you disable that at night?
      • A 60-watt nightlight? Maybe for a blind person. The wasteful, hot, old-style incandescent nightlight in my bathroom draws 4 watts.
        • Okay, so I always slept with the light on; my nightlight had to be extra powerful so it would show up over the ambient light, you insensitive clod.
    • Re: (Score:3, Informative)

      by wolrahnaes ( 632574 )
      I have a Kill-A-Watt and actually have a GameCube attached to it right now. Here's what I've seen:

      Standby (power off): 0 W, no draw
      Idle (power on, no disc): 21-22 W
      Game (3" disc): 22-23 W
      Game (5" disc): 23-24 W

      The various gadgetry on my desk (PC, LCD, IP phone, wireless router, Xbox 360, various chargers, and alarm clock) pulls more power as a whole in standby mode than the GameCube does when playing a game off discs too large to even fit in a stock console.
  • Deja Vu (Score:1, Troll)

    Is there anything in TFA that was not already mentioned in NOA's translation [nintendo.com] of Iwata's interviews [nintendo.co.jp]?
  • by mgblst ( 80109 ) on Tuesday October 03, 2006 @08:28AM (#16290069) Homepage
    for Nintendo. They opted away from the childish design of the GameCube for something sleeker. I can't see this as a bad thing.

    Always-on could be good too, but could backfire in our green world.

    But a weak CPU? I am pretty sure that developers will always push for a better CPU. One way of measuring a console is to compare the games that run on all three - and this could make Nintendo look bad, very bad. Risky.
    • Re: (Score:3, Interesting)

      Then again, developers may be happy that they don't have to spend upwards of $50 million just to get a game out the door, since Nintendo isn't forcing them to make sure it works in SD, 480p, 720p, 1080i, and maybe 1080p. Part of Nintendo's strategy is to reduce the amount of money it takes to make new games by reducing the importance of the high-cost items (CPU, GPU, HDD). And from what I hear, the DevKit is very similar to the GCN, since the architecture is merely an upgrade, as opposed to a total overhaul,
      • And as you said, it doesn't have to do HD, so in the broadest terms possible it only needs about 1/4 of the power under the hood to render the same scene vs. an HD rendering (I know I'm going to suffer for making such a horrible generalisation...).
        • And as you said, it doesn't have to do HD, so in the broadest terms possible it only needs about 1/4 of the power under the hood to render the same scene vs. an HD rendering (I know I'm going to suffer for making such a horrible generalisation...).

          Not suffer, so much as have something gently pointed out :)

          Standard Definition has a resolution of 320x200 pixels (through an RCA cable), or 64,000 dots. S-Video can carry up to 800x600 pixels, or 480,000 dots. 1080p carries a resolution of 1920x1080 pixels, or 2,073,600 p

          • "Standard Definition has a resolution of 320x200 pixels (through an RCA cable), or 64,000 dots. S-Video can carry up to 800x600 pixel, or 480,000 dots. 1080p carries a resolution of 1920x1080 pixels, or 2,073,600 pixels." Well - your idea that SD takes less to render than HD is correct - but your NTSC stats are WAY off. NTSC is 720x486 pixels. No more - no less. 800x600 is a computer resolution - NOT a video one. DVDs (and MiniDV) are actually 720x480. But also - this is moot - because RCA and SVIDEO are
            • by arose ( 644256 )
              This is the first time I've heard 486 instead of 480; any more info on that?
              • 720x486 is standard NTSC full res. When I load uncompressed video into an Avid, the video res is 720x486 - standard NTSC. It has been this way since the dawn of digital video editing in the early 90s.

                720x480 is the standard res for DV or DVD. They lost the 6 lines for compression reasons, but you'll only notice the difference if you load it into an editing system - you'll never see it on a TV.

                Trivia: The N64 (and PS1) only output 320x240, and it was upscaled to full NTSC.
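            Putting rough numbers on the SD-versus-HD gap discussed in this sub-thread (a sketch; pixel count is only a first-order proxy for rendering cost, since fill rate is just one of several bottlenecks):

            # Compare pixel counts across common output resolutions.
            resolutions = {
                "SD (720x480)": 720 * 480,
                "720p (1280x720)": 1280 * 720,
                "1080p (1920x1080)": 1920 * 1080,
            }
            sd = resolutions["SD (720x480)"]
            for name, pixels in resolutions.items():
                print(name, pixels, round(pixels / sd, 2))
            # 720p pushes about 2.7x the pixels of SD and 1080p about 6x, so
            # "a quarter of the power for the same scene" is the right order
            # of magnitude against 720p, optimistic against 1080p.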
      • by hal2814 ( 725639 ) on Tuesday October 03, 2006 @09:07AM (#16290417)
        "Then again, developers may be happy that they don't have to spend upwards of $50 million just to get a game out the door because Nintendo is forcing them to make sure it works in SD, 480p, 720p, 1080i, and maybe 1080p."

        But they're spending that money anyways to get the game to also run on the 360 and PS3. Honestly, the best thing I can see about the Wii is that the new controller will force 3rd party developers to actually think about the port to Wii instead of it just being an afterthought. For example, EA is already talking about what features they can put into the Wii Madden 07. They're not just dumping what they have to console X like they do for every other console. They're actually thinking about how to make the game better using the Wiimote.
        • by Raenex ( 947668 )
          But they're spending that money anyway to get the game to also run on the 360 and PS3.

          Could be that there will be a lot of games made exclusively for the Wii, especially by smaller, creative studios that don't have a gazillion dollars to spend.

        • by LKM ( 227954 )
          But they're spending that money anyway to get the game to also run on the 360 and PS3

          Interestingly, this doesn't seem to be the case so far. A lot of the Wii games are Wii exclusive, and even most of the franchises that are ported to the Wii seem to be totally different games from their 360/PS3 counterparts.

    • Is it really that risky? Does the success of a console necessarily depend that much on the number of operations per second that can be performed by the CPU? If so, how do you explain the fact that the DS is (if you go by console sales estimates) more popular than the PSP (which has a more powerful CPU)?

      Also, +1 to chrismcdirty; cost really is an important factor -- both for developers and consumers. I'll probably buy a PS3, but not before the price drops to something near US$200. (And I was planning on
      • I'll probably buy a PS3, but not before the price drops to something near US$200.

        You may be waiting a long, long time then.

        There are a rare few examples I can think of where a game console eventually sells new for less than half of its launch price. The GameCube is one: launched at $200, now on sale at around $100. Sega's Nomad portable went from $180 to $80 during its brief lifetime, and the Atari 2600, originally $200, eventually went for $50 (but it took a full decade to get there).

        I don't expect the P
    • Re: (Score:1, Interesting)

      by Anonymous Coward
      But a weak CPU? I am pretty sure that developers will always push for a better CPU. One way of measuring a console is to compare the games that run on all three - and this could make Nintendo look bad, very bad. Risky.

      I'm not too sure about this ...

      First off, developers and publishers are like any type of business and produce games where they believe the reward from producing the game is greater than the risk involved. The main factors which affect risk are game complexity, game cost, the
      • by Senzei ( 791599 )

        On a side note, I need to ask: to what extent will the extra power of the PS3/Xbox 360 be used to produce better games and not just better graphics? It seems to me that most AI is still scripted and doesn't take all that much processing power, and few games would have much use for realistic physics on more than a handful of objects at the same time.

        My understanding was that the processor configurations used for the PS3 or 360 (or both) were also pretty bad for doing AI. I don't remember the details, but it

  • For those of you who can't get your interview translations fast enough: it seems that Nintendo Europe has been updating theirs [nintendo-europe.com] faster than NOA.
  • Old News (Score:4, Interesting)

    by Nalgas D. Lemur ( 105785 ) on Tuesday October 03, 2006 @08:44AM (#16290191)
    The interview itself has been posted in pieces over the past few weeks, starting with this segment [nintendo.com]. It's been ongoing, and it's pretty interesting stuff, if you're into that sort of thing. There was a story posted on Slashdot a week or two ago that mentioned it, even, but it hardly had any replies, so I don't know if that's a sign that no one read it, or what.

    Anyway, as I posted on that story back then [slashdot.org], among other things, the interview mentions some things that I've seen people here talking about, like the possibility of distributing independent games via the Virtual Console system. They seem to be considering it and possibly in favor of it as high up as Iwata. It goes into a whole lot more detail than the Ars Technica summary does, and the more recent segments talk about some of the software design elements, not just the hardware side of things. Interesting reading.
  • I don't get how anyone could think going with slower hardware is a GOOD thing. Also, excusing the Wii's slower hardware using "game development costs" is ridiculous; the cost to develop games will always be changing as game companies look for cheaper ways to make the latest and greatest games. The fact is, if you provide the developers hardware, *they will find ways to use it for something*, even if that is not graphics!! Extra processing power does not always have to be about graphics... I'm getting a li
    • by hal2814 ( 725639 )
      "Exta processing power does not always have to be about graphics... I'm getting a litle tired of the "so the graphicss aren't as good, who cares?""

      It doesn't have to be, but historically that's where the extra oomph has been used. Games still have generally simplistic AIs, and I haven't seen a major physics upgrade to a game in a while. The biggest non-graphics jump I've seen recently is the sheer number of simultaneous objects Call of Duty 2 keeps track of on the 360. I imagine if you cut HD out of the e
    • by Raenex ( 947668 )
      I don't get how anyone could think going with slower hardware is a GOOD thing

      But it is a good thing. I own a 360, and for all its extra power, it's hardly a huge improvement over the original Xbox. On top of that, it is more expensive, noisier, and has heat issues. The article mentions the law of diminishing returns, and they're completely right. Nintendo saw this and took a step in a different direction, refusing to play the spec game.

    • I don't get how anyone could think going with slower hardware is a GOOD thing

      Slower hardware isn't a good thing. It's also not a bad thing. It has positive and negative effects. The main negative effect is, of course, that the games won't look as good as on other consoles. Positive effects include cheaper games and more room for smaller games and smaller publishers/developers.

      As with everything in life, it's a trade-off.

      the cost to develop games will always be changing

      Yep, and with the Wii, they're

    • Re: (Score:1, Interesting)

      by Anonymous Coward
      I don't get how anyone could think going with slower hardware is a GOOD thing. Also, excusing the Wii's slower hardware using "game development costs" is ridiculous; the cost to develop games will always be changing as game companies look for cheaper ways to make the latest and greatest games.

      Developers do try to find the cheapest ways to make the latest and greatest games they can, and quite often this means they produce their game on the Game Boy rather than on a PS2. The reality is that most devel
      • To be fair, the PS3 will not require a memory card either (both versions have some sort of hard drive).
        • OOPS

          Mod me "-1 STUPID"... I swear I read PS3, not PS2 :-P
          • I'm pretty sure he meant PS3. Not that this makes much sense, because the stripped PS3 is $150 more than the Wii, not $50 - but it makes more sense than PS2, because if he's paying $300 for a PS2, he's an idiot.

            So I'd say your point stands. Though it's still $150 more. :)
    • by nasch ( 598556 )
      The fact is if the PS2 and Xbox 360 are within $50 of the Wii at the Wii's launch you definitely know an extra $50 is not much of a stretch.
      Sure, and if the new Porsche coupe were within $1000 of a Civic, that would be a good deal too. Have you heard anybody knowledgeable suggesting that the 360 or PS3 could be within $50 of a Wii?
    • Slower hardware is a good thing only when it comes to cost. Cost is always a concern in everything everyone ever does. EVER.

      But you're really complaining about the wrong system. The Xenon & Cell CPUs were made to be incredibly fast with high clock speeds. They were also designed to be monsters at graphics processing. However, to do this, they chose a chip architecture that is particularly bad at physics and even worse at AI. In an effort to maximize graphics, they sacrificed everything else. L

  • by neelm ( 691182 )
    Reposting Ars Articles for over 6×10⁻² centuries
  • Just read the original source material [nintendo.com]. It's more interesting and detailed.
  • And here it was I thought it had something to do with, "...and this little piggy went wii wii wii all the way home."
  • Well, the li'l GameCube could access the frame buffer directly with the processor, and could do multiply-add pretty fast.

    So... why the heck didn't they use it to just scan the entire screen once with a dot-product calc to do normal mapping?

    Is that hard?
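    For anyone puzzled by the suggestion, here's a minimal sketch of per-pixel diffuse lighting via a dot product, the core of the idea described above (numpy and the uniform normal map are illustrative assumptions, not how the GameCube hardware actually works):

    import numpy as np

    H, W = 480, 640
    # Hypothetical per-pixel surface normals (here: all facing the camera).
    normals = np.zeros((H, W, 3))
    normals[..., 2] = 1.0

    # A single directional light, normalized.
    light_dir = np.array([0.3, 0.5, 0.8])
    light_dir /= np.linalg.norm(light_dir)

    # One pass over the screen: N . L per pixel, clamped to [0, 1].
    intensity = np.clip(normals @ light_dir, 0.0, 1.0)
    print(intensity.shape)   # (480, 640)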

"When the going gets tough, the tough get empirical." -- Jon Carroll
