Comment Re:Fibonacci (Score 1) 311
You are right, it starts with 0 and 1.
Actually, no, the Wii SDK is the cheapest of the three. They even support Flash, so you can start prototyping your game without the SDK. The requirement to have an actual company is just a way of saying "you have to take this seriously"; not a big deal if you really want to make a career of it. Most people who complain about needing a company have no idea what goes into making a game. Nobody wants to play the Tetris clone you derived from a tutorial on gamedev.net.
You missed "1.89210568 × 10^20" in between "is" and "millimeters".
I own a Wii, but not a PS3 (because I'm cheap). The difference is not just in the graphics, which are much, much better on the PS3.
The Wii is not that bad; check out Monster Hunter Tri. It is just that Nintendo doesn't give developers incentives to focus on graphics before gameplay.
It's also that the PS3 includes a hard drive which allows for a lot of downloadable content!
The Wii can do that too. Officially, through SD cards (you can keep all your WiiWare games and unused saves on the card), and unofficially through USB loaders and NAND emulators. The latter voids the warranty, of course, and there is a high chance of bricking or damaging the Wii if you don't know what you're doing. Still, I have all my Wii games backed up to an external HDD where I play them, and they load almost twice as fast as from the disc. I only use the disc drive once for each new game I buy. Nintendo should really release an official backup loading channel; it makes life so much more convenient.
The Wiimote also has rumble and audio (yes, it's two-way); do the Sony controllers provide this?
Well, at least the PS3 controllers can jiggle virtual boobs.
My biggest complaint about the Wiimote (besides its imprecision) is the wire between the Wiimote and the nunchuk -- it's shorter than the distance between my hands. I would have preferred two separate wireless devices rather than one device tethered to another; it's just awkward.
There are plenty of wireless nunchuks on the market. I myself don't have a problem with it: I don't tend to keep my arms up and apart while playing, and I don't like the idea of having to charge the nunchuks in addition to the Wiimotes.
Let me guess: by giving total control to corporations (especially in the old-school entertainment industry).
To be more precise, Colbert always says they are the same show, split in two half-an-hour segments. Jon Stewart is the executive producer of The Colbert Report. While Stewart's character is actually Jon Stewart, Colbert's character is Colbert (with a silent "t"), the opposite of the artist, and almost all of his lines are full of sarcasm. If you agree with Colbert (silent "t") the joke is on you.
Oh, you mean the world that also resides outside the paid-for politicians? The officials don't really pay much attention unless it's election time.
Then what if we had elections every 2 months?
There is not a 1:1 correspondence today, but with bit-patterned media there might be. With all physical bits being data bits, we could gain up to 100% more data bits in the same area.
For the uninformed: with today's technology, a 1:1 correspondence between data bits and magnetic "bits" is nearly impossible. We have to interleave data bits with clock bits so we are able to count runs of equal bits. The data bits are encoded into this interleaved stream of data and clock/sync bits before it is actually stored on the physical medium. If the bit-patterned layout doubles as a clock/sync mechanism, we can store only the data bits (with error-correcting codes too, of course).
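A toy sketch of the idea, in the style of FM encoding where every data bit is preceded by a clock bit (the function names are made up for illustration):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Toy FM-style encoding: each data bit is preceded by a clock bit (always 1),
// so the read head never sees a long run without a transition and stays in sync.
std::vector<int> fm_encode(const std::vector<int>& data_bits) {
    std::vector<int> physical;
    physical.reserve(data_bits.size() * 2);   // 100% overhead: one clock bit per data bit
    for (int b : data_bits) {
        physical.push_back(1);  // clock/sync bit
        physical.push_back(b);  // actual data bit
    }
    return physical;
}

// Decoding skips every clock bit and keeps only the data bits.
std::vector<int> fm_decode(const std::vector<int>& physical) {
    std::vector<int> data;
    for (std::size_t i = 1; i < physical.size(); i += 2)
        data.push_back(physical[i]);
    return data;
}
```

In this toy scheme half the physical bits are clock bits; if the patterned medium itself provided the sync, those could all become data bits -- which is where the "up to 100% more" figure comes from.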
When a change to the program can break a piece of code that the compiler conveniently wrote for me, yes, of course it's a language problem. Given the number of articles, web pages and C++ books that prominently mention workarounds for this issue, I'm clearly not alone in considering this to be a trap.
Do you realize that in almost every language you can break the whole program with a small change? How is this different from, say, creating an infinite loop by adding a ";" after a while (...) expression?
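A minimal illustration of the semicolon trap being referred to (the function name is made up):

```cpp
#include <cassert>

// One stray ';' changes the meaning completely, because the semicolon itself
// becomes the (empty) loop body.
int sum_below(int n) {
    int total = 0;
    int i = 0;
    // while (i < n);          // with a ';' here the body is empty and i never
    //     { total += i++; }   // advances: an infinite loop, braces run after it
    while (i < n)              // without the ';' the block below is the body
        { total += i++; }
    return total;
}
```

The buggy variant is left commented out since it would hang forever; the point is that a one-character edit silently breaks the program in C, Java, and most C-family languages alike.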
Overloading numeric types is a nice strawman, and conveniently lets you ignore the stream operator issue that I mentioned. Well done.
It is not a strawman. Operators are overloaded all the time in mathematics. Words are overloaded in human language. Why is overloading in a programming language so hard to accept?
Again ignoring the issue I brought up. I'll make it a little more explicit. Take a reference to an element of a vector. Add on to the vector until the vector is reallocated. The "reference" now points to garbage. No temporary objects involved. I can guarantee you that anyone familiar with other OO languages would be quite surprised by that behavior.
It is a characteristic of std::vector then, not of references. The same happens if you hold a pointer or an iterator to an element of the container. The standard clearly states that references and iterators to elements of a std::vector may be invalidated after reallocation. Use std::list and this problem goes away. There is no data structure without downsides; it is not C++'s fault.
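A small sketch of the contrast being drawn: std::list nodes never move, so a pointer taken before many insertions stays valid, while the same pattern on a std::vector is undefined behavior once it reallocates (function names are made up):

```cpp
#include <cassert>
#include <cstddef>
#include <list>
#include <vector>

// std::list guarantees element stability: insertions never invalidate
// pointers/references/iterators to existing elements.
bool list_reference_survives_growth() {
    std::list<int> l{42};
    int* p = &l.front();               // pointer into the list
    for (int i = 0; i < 1000; ++i)
        l.push_back(i);                // nodes never move, so p stays valid
    return *p == 42 && p == &l.front();
}

// For std::vector, dereferencing an old pointer after reallocation is
// undefined behavior. The safe alternative is to keep an index instead:
int element_via_index_after_growth() {
    std::vector<int> v{42};
    std::size_t idx = 0;               // indexes survive reallocation
    for (int i = 0; i < 1000; ++i)
        v.push_back(i);                // may reallocate many times
    return v[idx];                     // still the original element
}
```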
Default assignment operator: All you need to do is add a pointer to your class and suddenly code that you don't see causes a bug. Yes, IF you know about this you can work around it. That's true of anything.
You mean you changed the class definition by adding pointers, without worrying about maintaining the class invariant (which is to protect those pointers) and blame the language? You might want to learn a bit more about OO programming.
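The invariant in question can be sketched like this: once a class owns a raw pointer, the compiler-generated copy operations copy the pointer, not the buffer, so you must write the copy operations yourself (the classic rule of three). The class below is an illustrative example, not anyone's real code:

```cpp
#include <cassert>
#include <cstring>
#include <string>

// A class owning a raw buffer. The hand-written copy constructor, copy
// assignment, and destructor maintain the invariant that buf_ is uniquely
// owned; the compiler-generated versions would share it and double-delete it.
class Name {
public:
    explicit Name(const char* s) : buf_(new char[std::strlen(s) + 1]) {
        std::strcpy(buf_, s);
    }
    Name(const Name& other) : buf_(new char[std::strlen(other.buf_) + 1]) {
        std::strcpy(buf_, other.buf_);        // deep copy, not pointer copy
    }
    Name& operator=(const Name& other) {
        if (this != &other) {
            char* fresh = new char[std::strlen(other.buf_) + 1];
            std::strcpy(fresh, other.buf_);   // allocate first for exception safety
            delete[] buf_;
            buf_ = fresh;
        }
        return *this;
    }
    ~Name() { delete[] buf_; }
    const char* c_str() const { return buf_; }
private:
    char* buf_;   // owned: the members above exist to protect this pointer
};
```

In modern C++ you would simply hold a std::string or std::vector member and let the defaults do the right thing; the point is that adding a raw owning pointer without updating the copy operations breaks the invariant, not the language.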
Well, yes, when people see an operator, they "think" they know what it's doing. It's interesting to me that in this very first case of overloading, Stroustrup ran into this fundamental problem, and had to choose a somewhat obscure operator to get around it.
I'm sure you enjoy doing str.append("."); in your favorite language with no operator overloading at all. Even funnier must be 4 + 5 and 4.2
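The contrast being mocked, side by side (the function names are invented for the example):

```cpp
#include <cassert>
#include <string>

// With overloading: '+' on std::string does exactly what a reader expects.
std::string with_overloading(const std::string& base) {
    return base + ".";           // overloaded operator+
}

// Without overloading: the same operation, with more ceremony.
std::string without_overloading(std::string base) {
    base.append(".");            // what you write when '+' is unavailable
    return base;
}
```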
References: references aren't what most people think of as references.
Most people I know are aware that references are not smart pointers. Why would anyone think that? They are just like pointers that can't be changed. The only unusual usage is when you use them to keep temporary objects alive. Remember that C99 had to introduce a new keyword, restrict, to mitigate the aliasing problems that always hurt the optimizer; by using references instead of pointers you solve almost all of these problems.
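A minimal sketch of the "pointer that can't be changed" semantics (the function name is made up):

```cpp
#include <cassert>

// A pointer can be redirected at any time; a reference binds once, forever.
int reference_vs_pointer() {
    int a = 1, b = 2;

    int* p = &a;
    p = &b;          // the pointer is redirected: it now aliases b
    *p = 20;         // writes to b, not a

    int& r = a;      // the reference binds to a and can never be reseated
    r = 5;           // this is NOT a rebind: it writes 5 into a

    return a * 100 + b;   // a == 5, b == 20
}
```

Because a reference's referent is fixed for its whole lifetime, the compiler has more aliasing information to work with than it does for an arbitrary, reassignable pointer.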
Remember, one of the definitions of cross platform is that it still works after a system restart.
Somehow I doubt the LCD could stand the amount of pressure a typical controller button receives. And who would be able to play without feeling the button? I don't want to have to look at the controller only to make sure my finger is over the correct button.
Not as low-hanging as you seem to think. They would have to buy those mod-chips, do some reverse engineering, and test the updates to make sure they don't break any revision of the Wii hardware; and still, most mod-chips seem to be upgradeable anyway, and it's not like buying a new mod-chip costs more than a Wii game.
In short, it's too risky, will cost too much, and will be mostly ineffective (everyone that bought one mod-chip won't mind buying a second one that is resistant to said mod-chip-killer update.)
The funny thing is that the homebrew community does much more to fight piracy than Nintendo does. They ban any app that might even remotely be used to facilitate piracy. And still Nintendo goes after homebrew.
For God's sake, stop researching for a while and begin to think!