Jumping genres for a moment...
A decade ago, in my early/mid 20s (while I was a post-grad student), I was a fairly high level Counter-Strike player. Not one of the greats, but certainly good enough to pull my weight in a team which managed to take home the occasional bit of prize money in tournaments. However, three things happened which meant that I moved on from that phase.
First, I finished studying and got a job. While the hours I was working were probably only slightly longer than the hours I'd been studying (postgrad can be harsh), I now had much less flexibility over which specific hours I worked. I also had a commute that ate up another couple of hours every day.
Second, I started to really dislike the online gaming scene. I got tired of the foul-mouthed kiddies on the public servers and the up-their-own-backside sponsor-obsessed "pro" players. As well as being a player, I was also a league admin and organiser, so I spent a lot of time dealing with this and the bigger "pro" gaming got, the more toxic the high end community got.
But most importantly for the subject at hand, I realised that I'd hit a plateau in terms of how well I was able to play the game. My aim and reactions were probably good enough to allow me to progress further. Not to the very top tiers, but certainly to a higher level than I was playing at. But my judgement and temperament weren't suited for it and resulted in a lot of mistakes of the kind that you can't afford at that level of play. So while I never went cold turkey, over the 6 months after starting a new job, I basically scaled down from being a hardcore competitive player to being an occasional dabbler in public servers. And then over the next few years, I basically gave up on competitive multiplayer entirely (continuing to play a lot of singleplayer and co-op games).
And then, last summer, for a brief window, I got into the Counter-Strike re-release.
Somewhat to my surprise, I was still very good at the game. However, when I recorded some replays and then went back and watched them, it was clear that in my mid-30s, I was good at it in a very different way to how I'd been a decade earlier. My aim was still ok, but my reactions were lethargic compared to how they'd been in the past. I had, however, gotten a lot more patient and a lot sneakier. The kiddies hopping around the levels at full speed could no doubt have picked me apart in a face-to-face fight, but I was making sure they never got the chance.
So yeah... I suspect that as one set of skills fades with age, some players will develop other traits and skills that offset that. A decline in clicks-per-minute with no corresponding decline in match results in Starcraft 2 would seem to fit that pattern.
I lived for a few years around the New Cross/Bermondsey area (south of the river, but similar in demographic to the areas in TFA) and there were always a few electronics shops whose existence seemed fundamentally implausible if their business was founded on anything other than handling stolen goods. I avoided them like the plague, but they were generally pretty resilient businesses - and if one closed down, another would spring up a few streets away. I'm not saying that any business which looks a bit grungy is dishonest. I've made some good purchases at backstreet computer stores which get good prices on the back of low overheads and connections with legitimate suppliers (though such places are rare these days since the online boom). But there's a certain type of business which is offering games consoles or other commodity goods at the kind of prices that just make you go "hmm".
Hell, even going back well before that, I can remember independent video game stores "Ooop North" (from the tail end of the period before the big chains drove most of them to the wall, around the early PS1/N64 era) which were well known among my teenaged peers for staying in business on a combination of modchipping and fencing stolen goods. In fact, I remember one very close to my school being raided by police and shut down (presumably after crossing some nebulous line into their visible spectrum). It provided a fascinating distraction in the middle of an otherwise dull day at school.
As the whole modchipping thing implies, these have never been businesses run by people without a degree of tech-savvy. It's no surprise that they've moved on to circumventing mobile phone protections. And I bet you'd find similar businesses in, at the very least, Manchester, Birmingham, Liverpool, Newcastle and Glasgow.
There have even been suggestions - though I offer no comment as to their veracity - that a well-known red-logoed chain of second hand electronics stores with a presence in almost every town in the UK might sometimes be less than choosy about checking the provenance of the goods it accepts.
There's been a "karma" system for Xbox Live accounts pretty much since the launch of the 360. You look at somebody's gamer card and see their star rating out of 5. The change here is that, for the first time, they're making it have actual consequences.
A lot of the posts in this thread so far are about the potential for abuse. I've played on Xbox Live on and off since the days of the original Xbox and have seen the old "consequence free" system in operation for a while. By and large, my experience so far has been that it tends to average out reasonably well over time. I'm sat on a reputation of around 4.7/5.0 and most people on my friends list are in similar positions. The only guy who is significantly lower (just under 3.5) plays a lot of Call of Duty. My experience is that spending any significant amount of time playing the big spunkgargleweewee games is a good way to get karma-bombed even if you are the most charming player in the world, due to the general level of anger and immaturity in the communities for those games.
You've summarised the hardware purist argument pretty well. However, Sony and MS both had good reasons for pitching their technology at the level they did.
First, they'd waited more than long enough already to replace their old hardware. The 360/PS3 generation was the longest console generation on record and almost certainly ran longer than was good for either Sony's or MS's business. It gave PC gaming (remember when that was dying?) a shot in the arm to the point where it started eating the consoles' lunch, and it resulted in sales fatigue for games that did a lot of commercial harm to a lot of developers. The story of the last 18 months of the 360 and PS3 was "new title launches, sales massively underperform previous game from that developer/previous game in the series". If they'd waited another year or two, home console gaming might actually have died - or at least, MS and Sony might have lost their place in it (Nintendo are functionally irrelevant now anyway).
Second, they have to think about hardware unit price. Push the spec too high and your unit price rises to the point where consumers lose interest. Sony have been burned before with the PS3 on launching with a high price tag and taking too long to get sales momentum as a result.
Third, you have to think about what games developers are actually capable of producing. The jump to the 720p average on the 360/PS3 was horribly difficult for most developers and the increase in costs wiped many out. The jump to the XB-One/PS4 hardware will be hard enough for developers. Few, if any, are in a position to finance games that would make good use of 4k resolutions.
The advantage that this has over the Oculus Rift is that, by the time it ships, it will work on a "plug in and play" basis with a mass-market games console which may quite reasonably have an installed base of 20 million+ by then. Sony basically "won" the BD vs HD-DVD battle by turning every PS3 into a BD player - this has some potential (though as I'll come onto, it's not guaranteed) to manage a similar victory over the Oculus Rift.
The big problem, of course, is that optional peripherals for consoles have a poor track record. Most of them vanish without trace. A few - the Wii Balance Board and Kinect - sell strongly at first but lose relevance quickly due to a lack of third party support. Developers target large installed bases and a "platform within a platform" can be a risky proposition.
I'm not 100% convinced that exact specifications like resolution and framerate will matter all that much. Terms like 1080p60 get bandied around a lot on forums, but I do wonder whether the average gamer really notices.
I got my first CD-RW drive in 1999. Some of the discs I wrote on it still work perfectly. Others are completely unreadable. There's no pattern to it - no particular manufacturer's media has fared better than another's. I have cheapo 20-for-a-dollar discs that still work and expensive ones that don't - and vice versa. I also have discs written much more recently which have become unreadable. For all I know, the discrepancies are as much down to which disc was stored on the top of the spindle or in the outer-most pockets in the wallet as to anything in their manufacture.
Which means that as a long-term archival solution, optical discs are just too erratic.
Ten years ago, I had a pretty large DVD collection. I still do, I guess, though it's archived in big folders now rather than the original cases, for space reasons. I was in no way unusual in that; almost everybody else I knew at the time had a DVD collection.
Today, I actually have a relatively large blu-ray collection. But nobody else I know does. In my case, I have the large blu-ray collection because I watch a lot of anime and support for that on streaming services is patchy (Crunchyroll isn't bad, but older shows do vanish from it with no notice sometimes). But if I wasn't interested in niche stuff, there'd be no practical (as opposed to philosophical) reason to continue to collect physical media.
With a large section of the movie-buying public having looked at blu-ray and gone "meh", I think the challenge of trying to sell movies on a new generation of optical media is probably insurmountable.
And the other uses of optical media?
The newly launched games consoles have blu-ray drives - but I suspect they're the last generation to support optical discs. More and more sales are shifting online and that proportion will only grow as broadband speeds improve. Even for online-only refuseniks, Vita-style memory-card distribution may prove more convenient in the long run. I honestly cannot remember the last PC game I bought via a physical copy. Probably the Wrath of the Lich King expansion for World of Warcraft - because I guessed that Blizzard's download servers would die on launch day.
And for data archival? My experience of writable CDs, DVDs and BDs is that they're time-consuming to write to, physically fragile, space-inefficient and unreliable over time. If I want a local backup these days, I pick up an HDD, fill it up and then store it away.
So yeah, this all feels a bit like nugatory effort...
TFA is, I'm sorry to say, complete drivel. It ignores two key considerations.
First, Valve's platforms - Steam-on-PC/Mac and the forthcoming Steambox console - are home platforms. Where the pay-to-win model has achieved some success (and even there, the successes are outweighed 100-to-1 by the failures) is on the mobile platforms, where people play for snatches of a few minutes here and there. PC and home-console gaming remains dominated by more substantial offerings, with more significant development budgets and (frankly) a more discerning audience.
And the second point is just that: games cost money to develop. Quite a lot of money, these days. We're already seeing an increase in the RRP for games on the new consoles, which, irritating though it is on one level, is probably something the industry has needed to do for a while now. Long story short - nobody is going to be rushing to give these games away for free. If Valve wants a console, retailing at a per-unit profit, whose selling point is a mass of free titles (and I don't believe for a second that it does) then it will need to throw a massive, unprecedented subsidy at game developers. And that's just not going to happen. We've seen what happens when you try to launch a console whose selling point is the kind of games you actually can give away for free or near-free. It's called the Ouya.
Which, as we all know, is doing just splendidly. Or not.
What Valve's move does unlock the possibility of is smarter and more responsive pricing for games. And this is where there's real potential for the industry to do better.
Historically, we've sold games as though they were movies. There's basically one price point when they're new and another for when they get a budget re-release. Ok, indies and the like have always played around outside that system, but the commercially relevant developers have had very fixed price structures. What Steam has moved towards - and seems set to move further towards - is pricing that more accurately reflects the value games offer, their review scores and their week 1 sales.
Bricks-and-mortar retail stores sometimes try this, but the way in which they purchase stock and are insured on those purchases makes it a last resort for them. The ability to flex prices rapidly at the publisher level is much more useful. If you have an Elder Scrolls style RPG with a huge development budget and hundreds of hours of game-time, then go in at $80. If you have an average sized shooter, perhaps the $60-70 range. If you have a 2d platformer or shmup, then perhaps you should be thinking more about $20-30 for your first release.
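As a toy illustration of that kind of banding, here's a sketch of a price-suggestion function. All the thresholds and dollar figures are my own invented examples drawn from the ranges above - not anything Steam or any publisher actually uses:

```python
# Hypothetical illustration only: a toy pricing function reflecting the
# banding suggested above. Thresholds are invented for the example.

def suggested_launch_price(dev_budget_musd, gametime_hours):
    """Rough launch-price band from development budget (in $M) and
    expected hours of game-time."""
    if dev_budget_musd >= 50 and gametime_hours >= 100:
        return 80   # huge open-world RPG tier
    if dev_budget_musd >= 20:
        return 65   # average big-studio shooter tier
    return 25       # 2d platformer / shmup tier

print(suggested_launch_price(60, 200))  # → 80
print(suggested_launch_price(30, 10))   # → 65
print(suggested_launch_price(2, 8))     # → 25
```

In practice, of course, you'd want review scores and week 1 sales feeding back into the price too, rather than just a static banding at launch.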
Nintendo, in particular, desperately need to learn this lesson. My theory on the unnoticed reason behind the Wii-U's continuing disaster is that it's just too obvious that Nintendo's pricing is vastly out of whack with the value their games offer. Ok, the $60 price-point might be ok for something like Super Mario 3d World, but is it really appropriate for 2d platformers (Donkey Kong, New Super Mario) or HD remakes which sell for $30 on other platforms (Zelda: Wind Waker)?
No long slashdot post would be complete without a car analogy, so I'll say that game pricing needs to be less like movie pricing and more like car pricing. It should have a much wider range and be more responsive to factors like production costs, quality, features, brand and image.
Yes, WoW has evolved substantially since its launch. To all intents and purposes, Burning Crusade was "WoW2", Lich King was "WoW3" and so on. The game changes more through expansions (and even through some of the larger patches) than... say... Call of Duty changes between entire games.
Before WoW, levelling up was almost the whole point of MMOs and end-game content was something only a small proportion of players ever saw. In Final Fantasy XI or Everquest, many players still wouldn't have reached the level cap after playing for a year or more.
WoW's great innovation - and one of the big reasons for its success - was to cut down the length of the level grind and make end-game play (which tends to involve more skill and more social interaction) available to a much larger pool of players. Vanilla WoW was a huge shift in that direction compared to older MMOs and Blizzard have shifted the game even further that way with every subsequent expansion.
Almost every MMO launched since WoW has tried to duplicate that formula, but failed to add enough of a distinctive twist to lure people away in the long term. If the re-launched Final Fantasy XIV - which is very impressive indeed and the most successful MMO launch since WoW - has one really killer feature, it is that it shifts the tone and nature of both levelling and end-game content substantially away from the WoW model (without ignoring WoW's evolutions of the genre entirely).
Pay-to-win isn't - quite - what's on offer here. Blizzard haven't yet gone that far.
If you've played WoW for any time, you'll know that the game only really "begins" once you hit the level cap. Certainly, there isn't much point in comparing yourself to other players until you hit the maximum level. What Blizzard are selling here is the opportunity to skip the extended tutorial/storyline hybrid that comes before the game starts in earnest.
Genuine pay-to-win would be the sale of any kind of advantage, be it gear, increased access to instances (such as a waiver on weekly lock-outs) or any kind of character power-boost or income-boost once at the level cap. So far, Blizzard have not gone in that direction (though many other MMOs do). I think Blizzard still understand that would be a step too far for the player-base they've built up and would likely kill their cash-cow. MMOs that do use that model tend to have relatively short lifespans, while WoW is still going strong after the better part of a decade on the basis of a subscription model.
In fact, the pure subscription-model is by no means as dead as many people seem to think. There was a real worry, after the disaster of the initial Old Republic launch, that the model was no longer viable in a world of free-to-play-pay-to-win. But the re-launch of Final Fantasy XIV late last year was extremely successful (and remains successful several months after launch) on the basis of a subscription model with no microtransactions at all.
Seriously, it's nothing like 100 hours of game time to get a character to max level, and it's much less if it's not your first character. There are a few factors that affect how long it will take (marathon play sessions will take longer than playing in bursts with rested state), but I'd estimate no more than 60 hours for a first character, as of Mists of Pandaria. And a good chunk of that will be on the final 5 levels, which (last time I checked) hadn't yet been accelerated in the same way as the pre-Pandaria content.
I've done alts in under 40 hours of playtime, through a combination of rested state and heirlooms. Combine those and the little xp-progress bar absolutely shoots across the bottom of the screen. Plus levelling an alt is actually kinda fun, particularly with the group-finder making low-level dungeon runs a much better way to level up.
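To put rough numbers on how those bonuses stack, here's a back-of-the-envelope sketch. The multipliers are assumptions for illustration: rested state doubling kill xp is as described above, but the flat +45% I'm using for heirlooms is an invented figure, and in reality neither bonus applies uniformly to every xp source.

```python
# Back-of-the-envelope estimate of alt levelling time with stacked
# xp bonuses. All multiplier values here are illustrative assumptions.

BASE_HOURS_FIRST_CHAR = 60.0  # rough first-character estimate from above

def alt_levelling_hours(rested=True, heirloom_bonus=0.45):
    """Estimate hours to level an alt, given stacked xp multipliers."""
    multiplier = 1.0
    if rested:
        multiplier *= 2.0                # rested state: double xp from kills
    multiplier *= 1.0 + heirloom_bonus   # hypothetical flat heirloom bonus
    return BASE_HOURS_FIRST_CHAR / multiplier

print(round(alt_levelling_hours(), 1))  # → 20.7, comfortably under 40 hours
```

Crude as it is, it shows why stacking even two bonuses more than halves the levelling time, which matches my under-40-hours experience with room to spare.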
I played World of Warcraft on and off for a few years. I was a pretty hardcore player from the launch of Burning Crusade through to near the end of Lich King and came back casual for a while for late Cataclysm and early Pandaria. I know the game pretty well and have friends who still play it.
So I can say with confidence that you would be absolutely mad to pay for a boost up to level 90 with prices like that (and if you are a new player, mad to pay at all).
There are two types of people now who might be starting out at level 1; new (or returning-after-a-gap-of-years) players starting their first characters, or veterans levelling an "alt" (a secondary - or indeed tertiary or beyond - character).
If you are a new player, then going through the level-up process is important and you should not skip it. First of all, this is where you learn how to play your character. Most end-game content involves group-play, and a brand new player at the level cap staring at a hotbar full of unfamiliar abilities will take a long time to become competent enough to play alongside others. The level-up process, during which you are introduced to abilities one or two at a time, takes you at least part of the way along that learning curve for your character. It also exposes you to a lot of the game's lore, if that's your bag (I always found WoW's lore a bit boring and juvenile, but some people like it).
And if you're a veteran player, then there are lots and lots of things you can do to accelerate the level-up process for an alt without handing over real-money. I levelled up three alts while never taking them out of "rested" state (meaning they were getting double xp from kills). Heirlooms allow you to boost the rate of xp gain even faster, to the point where 1-80, by the launch of Pandaria, was just stupidly fast. I doubt even a brand new character takes over 100 hours of game time (or indeed, anything like it). Alts certainly take much less.
So yeah, I can't imagine Blizzard would have too many takers for this. Or at least, I hope they won't.
The point about France was regarding the fact that the country has an official regulator for its language. A regulator which has quasi-legal (though thankfully no longer legal) powers to prohibit the use of languages other than French in public communications in France.
The blasphemy laws point has been an active point of debate in many EU countries over the last few years - ever since the Mohammed cartoons controversy. There was a major debate in the UK around the Racial and Religious Hatred Act 2006, which, in its original form, would effectively have criminalised any speech that offended somebody on religious grounds. Happily, the bill was amended (against the Government's wishes) as it went through Parliament and ended up somewhat diluted; though it still arguably has a chilling effect. There are still active campaigns by religious groups (primarily though not exclusively Islamic) for legislation that would duplicate the intention of the original bill.
The last time Germany had the presidency of the EU, in 2007, it used that power to ensure the Commission (a terrifyingly unaccountable organisation) began the process of introducing legislation that would have effectively made Germany's censorship of video-game content mandatory Europe-wide. Happily, the clock ran out on it and the Portuguese presidency which followed was, depending on who you listen to, either more liberal-minded or more distracted by the looming economic crisis, so the whole thing was dropped. Germany doesn't get another Presidency until 2020, but the smart money would be on them trying again - or leaning on another country to try again. Particularly if the Eurozone financial crisis does blow over, allowing social issues like this to return to the prominence they had in the middle part of the last decade.
I'm extremely familiar with the workings of EU institutions and, indeed, have spent time working in Brussels. They do have some positives and produce the occasional outbreak of common sense, but if you wish to delude yourself that they are perfect - or even more good than bad - then that's your mistake.