I'm very curious about your sig:
If PC gaming is dying, HTPC gaming can revive it.
Considering the HTPC itself doesn't seem to be gaining much traction these past couple years, and consoles have been encroaching (albeit very slowly) on the HTPC space, I'm interested to hear what your view on the topic is.
There are two kinds of real-time multiplayer video games. Some games require one machine and screen for each player; these are historically associated with personal computers controlled by a keyboard and mouse, connected over either a local-area network or the Internet. Other games allow multiple players to share a screen. Incidentally, this can be done without splitting the screen, as seen in Midway's Gauntlet, Konami's Bomberman series and Teenage Mutant Ninja Turtles (arcade), and Nintendo's Super Smash Bros. series. These traditionally run on arcade cabinets or on video game consoles with multiple gamepads. The historical reasons for this platform divide include the difficulty of connecting multiple gamepads to a PC and the difficulty of fitting four players around one 14- to 17-inch monitor.
But in the late 1990s, the line began to blur. At first, only consoles had hubs called "multitaps" to connect four gamepads to one machine, but with the popularization of USB in 1998, the PC gained hubs that accept multiple gamepads as well. In the early 2000s, more and more PCs began to include composite and S-video outputs for a standard-definition television, and high-definition televisions began to include VGA-style video inputs, solving the screen problem. The rise of home theater PCs has led to demand for multiplayer games designed to fit an HTPC.
Yet even in 2008, this demand has not been met, and the stigma of one PC per player remains. A minority of PC titles, such as Serious Sam, Lego Star Wars, and Midway Arcade Treasures, allow two players on one screen, but rarely more. Even cross-platform games whose console versions support multiple gamepads tend to need one PC per player. The landscape of HTPC gaming is so barren that some people have recommended loading up an HTPC with emulators to run unauthorized copies of console game ROMs.
Much innovation in software comes from microISVs: small businesses that develop software and distribute it on the Internet. These are often home-based businesses, and in some cases are run more as a hobby or moonlighting enterprise than as a profit-seeking day job. Some microISVs make their money by developing proprietary software, distributing a trial version at no charge, and selling copies of a version with more features. Others, especially developers of free software, rely on donations and advertising. But the console makers have consistently excluded microISVs from the market. For example, from Nintendo's developer qualifications for Wii and WiiWare: "In addition, an Authorized Developer will have a stable business organization with secure office facilities separate from a personal residence ( Home offices do not meet this requirement )".
Imagine that the head of a microISV has written a design document for a video game intended for two to six people in one room looking at one screen. His team has developed a playable prototype that runs on Windows. For which platform should he and the rest of his team develop and market the final version?