says the guy who bought a 980 just before Christmas. Yeah... hypocrisy much.
However, be aware that minimum specs for games are in a bit of a state of flux at the moment. In some senses, it's not before time; they've only risen very slowly for many years, as development of most games was targeted first and foremost at the Xbox 360 and PS3, with PC versions usually not receiving much more than a few cosmetic upgrades. For quite a few years now, a reasonably recent i3/middle-aged i5 (or AMD equivalent) and a sensible Nvidia 400-series (or AMD equivalent) would have done you fine.
Since the summer of 2014, we've seen a rise in the number of games developed primarily for the PS4 and Xbox One and then scaled up for PC, or indeed, developed for PC and then scaled down for the consoles (Alien: Isolation being a fairly clear example of the latter). And as this has happened, there's been a trend for rapidly rising specs.
Shadow of Mordor, Call of Duty: Advanced Warfare, Far Cry 4 and Dragon Age: Inquisition have all needed substantially higher specs to run sensibly than was the norm a year ago. CPU, GPU, RAM and, frankly, even hard drive speed have all been pushed quite hard by the games above - you're now talking about wanting at least a recent i5 and a 780 if you like 1080p max settings. It might be that things will level out again soon, or specs might keep rising for a while yet. They will level out eventually, once developers find a sweet spot that makes it easy to cross-develop between current-gen consoles and PC. But it might be worth waiting for performance analysis of The Witcher 3's final build before committing to a hardware upgrade - that's looking like the most technically demanding game on the horizon.
That was certainly true for the Wii and Wii-U, but I'm not sure it holds up for Nintendo's other consoles. The Gamecube hardware was, by all accounts, good. Better than the PS2's and not far short of the Xbox's. It's still slightly amazing that the PS2 did as well as it did, given it was both underpowered and a complete dog to develop for.
The N64 was more complicated; most of its hardware was pretty decent, but the decision to stick with cartridges rather than move to a CD format for games doomed it in the race with the PlayStation. That was probably the most significant point in console history (I'd rank it above even the Atari crash, which was strictly a US phenomenon) - the moment Nintendo decided, on the basis of piracy fears, to part ways with almost all of its significant third party developers (and also to massively annoy Sony, who had done a load of development work in partnership with Nintendo on CD-based console technology). If the N64 had used CDs, chances are the industry would look completely different today.
In the early days of the 360, MS spent a lot of time and money love-bombing Japanese developers to get them to make games primarily for the Japanese market (though many of them got exported to the West). Blue Dragon and Lost Odyssey - the two best Japanese RPGs of the first few years of the last generation - were funded by MS, developed in Japan with Japanese as the primary language and English translations provided later. So language was no issue for those. Similarly, MS pumped a lot of money into Cave, making sure that the 360 got ports of a lot of their most notable arcade machines.
All of which did next to nothing. I'm tempted to say MS did absolutely everything it reasonably could to break into Japan. It still didn't work. I wasn't surprised therefore that they've barely even bothered to try this time around with the Xbox One. The Japanese home console market is in a bad way anyway, so it probably doesn't matter anything like as much as it did a decade ago.
Going off this, it seems to have managed 8 million sales in 2 months. That's certainly got to be a contender for "fastest selling over 2 months". The PS2, Wii and PS4 all might have been able to manage faster, as might some of Apple's portable devices, if they hadn't been constrained by supply shortages.
Of course, Kinect sales flatlined after the first few months, nobody's disputing that. But there is certainly a defined period over which it seems to be "fastest selling".
That whooshing sound you hear is the irony rushing right over your head...
Wow, bitter much...
Kinda guessing you're not a fan of the Xbox. Possibly even that you're a bit of a fan of one of its rivals? Remember that blind brand loyalty (or blind hatred of a brand) is self-defeating on the part of the consumer.
Microsoft does not love you and does not have your best interests at heart.
Sony does not love you and does not have your best interests at heart.
Nintendo does not love you and does not have your best interests at heart.
Valve does not love you and does not have your best interests at heart.
The fanboy-arguments between the various sides in the console war are more bitter this time around than I've ever seen them before. Which is ironic, really, given that the actual practical differences between the PS4 and Xbox One are vanishingly small and only really apparent to hardcore enthusiasts.
What you say is technically correct for a very narrow span of time, but also one of the most pernicious myths about the finances of the gaming industry.
The article you link is from when the 360 first went on sale in 2005. The 360 remained MS's "main" console until late 2013. Production costs fall wildly over that time. Indeed, in the traditional MS/Sony model of selling consoles, you sell at a loss for about the first 12-18 months, then as unit cost reductions and economies of scale start to work in your favour, you keep the console selling at a more or less neutral level for the rest of its life-span, reducing the retail price as costs fall further.
Where do they make the money from? Xbox Live subscriptions, first party games etc are a small part of it, but only a small part. Most of the money - and it is a lot of money - comes from third party game fees.
See, when you buy a console game as "new" (rather than pre-owned), a large chunk of the sale price goes directly to Microsoft, Sony or Nintendo. On a full-priced game, this tends to be in the $10-15 range. Historically, this has explained the price differential between console and PC games - though with Valve now taking a similar cut of most PC game sales, who knows how long that will last.
The platform owner has spent next to nothing on those third party games; in most cases, it only gets involved at the certification stage. So it is, for the most part, "free money". And with series like Call of Duty, FIFA, Madden etc racking up the sales they do, it is a lot of free money.
So the trick is attracting third parties to the console. To do this, you need to have either a large current installed base, or the promise of a large installed base to come. This is why console manufacturers are happy to sell at a loss for the first year and often to take a loss (or at least a risk) on funding first party or platform-exclusive third party games - the Halos, Gears of War, Killzones and Gran Turismos of the world. Those are the bait to lure in the early adopters to get the installed base growing to get the third party developers on board.
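The economics described above can be sketched with some back-of-envelope arithmetic. All figures below are hypothetical illustrations chosen to match the rough ranges mentioned (a $10-15 fee per new copy), not real financials:

```python
# Back-of-envelope sketch of the platform holder's licensing model.
# All numbers are hypothetical, just to illustrate the scale involved.

def platform_fee_revenue(units_sold, fee_per_unit):
    """Licensing income the platform holder collects on third-party game sales."""
    return units_sold * fee_per_unit

# Suppose a single blockbuster third-party title sells 10 million copies
# on one console, with a $12 licensing fee on each new copy sold.
revenue = platform_fee_revenue(10_000_000, 12)
print(f"${revenue:,}")  # $120,000,000 - at near-zero cost to the platform holder

# And the launch-era hardware loss is recouped fairly quickly: a $100
# loss per console across 5 million early units...
hardware_loss = 5_000_000 * 100
# ...is covered by the fees on roughly 42 million third-party game sales.
games_to_break_even = hardware_loss / 12
print(f"{games_to_break_even:,.0f} game sales to cover the hardware loss")
```

Which is why installed base, and that base's appetite for buying games, matters far more than the per-unit hardware margin.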
The other business model is the one that was previously (but not currently) used by Nintendo. In the SNES, N64, Gamecube and Wii generations, as well as with its handhelds up to and including the DS, Nintendo sold platforms at a profit from day 1 and focussed much more on first party games development. This actually worked pretty well for a long time; they made megabucks on the SNES (which also had a lot of third party support, so win-win there) and even when the Gamecube ended up with poor sales, they were still able to turn a profit on it.
But around 5 years ago, this model started to break. The Wii was essentially dead by 2010; console sales were slowing to a trickle (after a few phenomenal years) and despite the huge installed base, most Wii owners (a different demographic to that on other platforms) did not buy many games, so third party developers abandoned it. Then came the 3DS launch.
The 3DS is doing ok now. Well in Japan, so-so in the US and Europe. It's on course to be a kind of PSP-level success, which is ok (the PSP actually did much better than is generally realised, largely on the strength of Japan). But the 3DS's launch was actually a bit of a disaster. For months after launch, the damned thing just wouldn't sell - and price was a big part of it. So Nintendo reversed historic policy and slashed the price; for the first time in its history, selling console hardware at a loss. It didn't remain at a loss for long; only 6 months or so until it got onto a neutral footing - but it was enough to bury Nintendo's historic strategy. Console sales improved, third parties moved in (particularly Japanese developers, many of whom shy away from the high cost of developing for home consoles) and Nintendo's losses (the first in the company's history) were reduced. When the Wii-U was launched, it was launched with a traditional Sony/MS style pricing strategy; sold at a loss at first, before moving to neutral pricing after a year or so. In the Wii-U's case, for a variety of reasons, that failed to get it a good installed base and Nintendo now has an outright disaster on its hands - the hardware isn't profitable, third parties have left and the only business left is selling first party games to a relatively small user base.
The fun thing about this cycle is that following the poor launch of the 3DS and the disaster-launches of the Vita and Wii-U, some developers bet against the PS4 and Xbox One succeeding. 2k, in particular, committed itself to a strategy of ignoring the new consoles, while focussing development on 360, PS3 and PC. That's cost them a lot of money, with their sales significantly down on a few years ago. Borderlands: The Pre-Sequel in particular has been a bit of a sales disaster, with a belated port to the new consoles just announced.
What 2k (and others) forgot is that installed base is important, but so is the propensity of that installed base to buy games; and early adopters of new hardware tend to buy a lot of games.
But yeah, in the big picture, installed base is critical and the fact that console manufacturers take a loss for the first 12 months or so isn't particularly relevant.
Not bullshit at all. Kinect's first couple of months on sale were extremely successful. In fact, MS made a very nice slug of money from it; unusually for the console business, there was a hefty chunk of profit margin on each unit sold. And it sold a lot of units very fast, because it was never supply constrained; unlike many new console launches, if you wanted one, you could walk into a shop and buy one (supply shortages have limited early sales of the PS2, Wii and PS4 to a large extent, early sales of other consoles to a lesser extent).
Of course, the Kinect basically went on to traverse (on a slightly smaller scale) the same kind of curve as the Wii. Lots and lots of early sales, but faltering when people started to realise that the only games you could practically play on it were short-lived party games. So after the first few months on sale, sales fell off a cliff and game releases dried up. But MS had a lot of sales and made a lot of money in the window before that.
And in what the hell sense is the Xbox brand a dismal failure? Ok, it's never taken off in Japan (basically because Japanese consumers are highly protectionist), but it's generally been a surprising success. The original Xbox managed just over 24 million sales. That's a long way behind the PS2's 150+ million, but ahead of Nintendo's 22 million, despite Nintendo being an established brand at the time and essentially being able to sell in 3 major markets (US, EU, JP) rather than Microsoft's 2 (US, EU).
The Xbox 360 had managed 83 million sales by the point where MS stopped reporting figures (the unit is actually still selling). By comparison, the PS3 managed 80 million and the Wii just over 100 million (though the Wii got most of those early in the cycle - both console and game sales dried up in the second half).
And this time around - despite the "disaster for MS" narrative, the Xbox One isn't doing too badly. Sales data is a little hard to compare at the moment, but it looks like the PS4 managed 20 million in a year on sale, the Xbox One 10 million in the same time and the Wii-U around 8 million over two years. The Xbox One is in second place, but set against previous generations, it has sold fast in its first year (remember that console sales tend to accelerate in their second and third years, as prices come down and more games become available).
So MS has a successful console brand on its hands. What it doesn't have is the kind of "single device living room dominator" that Ballmer hoped the Xbox One would be. The new management seems content to settle for "successful games console", though there's a real question as to whether MS will want to be in that space in the long term.
Most 3d games use most of the controls on a standard controller, though L3/R3 (which as you say are awkward) are generally avoided where possible. The Cube controller was missing enough buttons that games needed serious redesign. The Classic Controller was closer to being fully-featured, but was an optional peripheral anyway.
In the early part of the last decade, I was housemates for a while with a guy who worked at a middle-budget developer whose niche was putting out reasonably good (but not exceptional) games based on other people's licenses across the major platforms - at the time, PS2, Xbox, Gamecube and sometimes PC. His commentary on the state of cross-platform development at the time was interesting.
The Xbox was a delight to develop for; nice simple architecture and reasonable power. The PS2 was horribly tricky and all kinds of compromises had to be made, but its installed base was so huge that you couldn't commercially afford not to release for it. What was inside the Cube was perfectly nice to design for, but the controller limitations meant that entire sections of their game had to be redesigned for the Cube version, and features sometimes cut. So some movement abilities would have to be automated, or combat simplified, which meant difficulty had to be retuned and significant additional QA testing was needed. Towards the end of the cycle, when the Xbox notably overtook the Cube on installed base (having more or less level-pegged until then), they dropped Cube development; redesigning games to fit the controller was costing more than the money was justifying.
The "share" button also needs to be changed into something a bit more general-purpose. I know that the whole game-streaming thing is big right now, but the simple fact is that the majority of gamers - myself very much included - will never actually record gameplay footage interesting enough to be worth sharing with others. By all means, have some kind of option in the OS to enable recording and uploading of footage, but you do not need a controller button set aside for it. That's just pandering to narcissists.
The range of functions available on console controllers is actually massively significant. It's every bit as important as the hardware inside the box in determining how difficult it is for a developer to produce a game that works across a range of platforms. If you change the functions available on the controller, you will require changes to gameplay for a large number of games. This is one of the reasons why third party support for the Gamecube was so poor, despite it having a similar installed base to the original Xbox and fairly easy hardware to develop for; its little controller had fewer buttons than the Dualshock 2 or the various iterations of the Xbox controller, so games would have had to be redesigned to fit on it.
The PS4 controller isn't a total disaster; as the touchpad also doubles as a button (which is all most games use it as) you can still have functional equivalents to the "start" and "select" buttons. But it's still an unhelpful step back in a world that had been moving towards controller standardisation.
And just for fairness's sake - the Xbox One controller's layout is fine, but its build and materials feel cheap and nasty compared to the old 360 controller. And the Wii-U gamepad does at least have the right number of buttons and sticks (unprecedented for a modern Nintendo controller), but is even larger, heavier and more uncomfortable than the first-gen Xbox controller and has an awful battery life.
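The cross-platform pain described above can be made concrete with a toy sketch. The button names and actions here are hypothetical, just to show why a pad with fewer inputs forces gameplay redesign rather than a simple remap:

```python
# Toy illustration (hypothetical actions and button names) of why a
# controller with fewer inputs forces redesign in cross-platform ports.

# The game is designed around a full-featured pad...
DUALSHOCK2 = {"jump": "X", "attack": "Square", "dodge": "Circle",
              "crouch": "R3", "lean": "L3", "map": "Select"}

# ...but a pad with fewer buttons has nowhere to put some of those actions.
GAMECUBE_PAD = {"jump": "A", "attack": "B", "dodge": "X", "map": "Z"}

def unmappable_actions(design, pad):
    """Actions in the game design with no physical input on this pad."""
    return [action for action in design if action not in pad]

# These can't just be remapped - they must be cut, automated, or merged,
# which means retuned difficulty and extra QA on that platform's version.
print(unmappable_actions(DUALSHOCK2, GAMECUBE_PAD))  # ['crouch', 'lean']
```

At which point the port stops being a compile-and-test job and starts costing real design time, exactly the cost my old housemate's studio eventually refused to keep paying.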
It should be a priority - because if it isn't, it will start hitting revenue. I'd gone years without using adblocking software, on the grounds that I knew a lot of sites I liked depended on advertising income.
When Yahoo! ads started redirecting to ransomware-pushers a couple of months ago, I reversed my policy fast.
Can't you just nuke the recovery partition with dban or something similar? I've removed Dell recovery partitions that way in the past.
Agreed it's moronic. But this is Nintendo we're talking about. Region locking isn't about the money; it's about a combination of their messed-up corporate structure (the various international companies are only loosely integrated) and nasty control-freakery. They have a long history of liking to say "title X does not fit with our irrationally conceived stereotype of region Y, so we won't release it there, or will cut it to hell first". Region locking is one of the tools they use for that.
The whole "region locking for differential pricing" thing at least had a simple motive behind it ("more money"), but it doesn't work all that well (markets where you need to sell cheap tend to have too much piracy to be worth it anyway). Most people who region lock for that reason are moving away from it now (Sony and MS have ditched it entirely).