There is no GamerGate. Not really. There are several amorphous bunches of people who hang around on some forums. There's a similarly amorphous bunch of people who use a twitter tag, often for wildly differing reasons. There are a handful of internet petitions. There's a fictitious mass-movement that some juvenile "supporters" have imagined up because it makes them feel good. And there's a fictitious shadowy, sinister organisation that a bunch of equally juvenile opponents have imagined up because it gives them a convenient straw man to attack. Really, nobody comes out of this well.
During my postgrad years (so going back to the start of the last decade now), I had two online roles back to back which gave me a fair amount of visibility to a lot of angry people online. First, I was the head admin of a major European Counter-Strike league. Then I was an oper (and one of the public-facing ones rather than the backroom code-tweakers) on a very large European-based IRC network. Both of those roles involved telling angry people things they didn't want to hear. Things like "somebody in your clan was caught cheating, you are now banned" or "no, I will not gline somebody just so you can have your usual nick back".
And so, on a pretty much daily basis, people threatened to kill me. They were going to find out where I lived and kill me, or maybe track down my parents and kill them, or maybe rape my grandmother or my sister (I don't have a sister, but hey), or whatever, or something, or they just hoped I got CANCER or AIDS and would go away and DIE. To be honest, I suspect anybody who's done a similar role (or worked in customer service in certain types of company) gets similar treatment on a daily basis.
Were the people making those threats good people? Hell no. Even the fact that they were angry doesn't excuse behaviour like that. But what did I do about it? In the Counter-Strike role, if the clan wasn't already banned from the league, then it sure was the moment they made death threats. In the IRC role, it takes only a few seconds to apply a gline and suspend accounts with network services, but the warm fuzzy feeling that follows can last an hour or more. Did I ever actually feel in danger? Did I ever feel I needed to call the police? Hell no. Talking shit online is, unfortunately, pretty much as old as the internet itself and I had no particular political axe to grind.
So yeah, immature idiots on one side and professional grievance-mongers trying to inflate trash talk out of all proportion on the other. Nobody comes well out of this, for the most part.
Actually, the one thing that did strike me about this was how much the whole thing was a product of the indie gaming community. Almost every AAA publisher or developer out there either stayed silent or distanced from it as quickly as possible (which was the only sensible course of action). 2014 is really feeling like the year AAA gaming got smart and indie gaming got dumb.
Region locks are a vile practice. It's infuriating to see them creeping into PC gaming (historically a region-free platform) at a time when two of the three console manufacturers have ditched them and the third (Nintendo) is considering dropping them. That said, it's worth reflecting on why they exist. Historically, there are two reasons behind them.
The first is plain old-fashioned cultural stereotyping (which somebody being less diplomatic might call "racism"). This is the classic Nintendo reason. Big paternalist companies like Nintendo (they're not alone in this, but are the worst offenders) have this weird outlook that says that they should function as some kind of moral arbiter of what should and should not be available in each territory. Hence certain games are "not a good cultural fit for some regions" (usually a view based on offensive broad-brush stereotypes... or racism, if you prefer the more honest term) or "require alterations to be culturally appropriate" (meaning "we're going to cut the game to hell on release in some territories, because REASONS"). Happily, this particular driver behind region locking is on the decline. Sony used to buy into it every bit as much as Nintendo, but have completely washed their hands of it. Even Nintendo are considering getting out of this game. I should add that a few territories (a handful of religious-wacko countries, plus Germany and Australia - what good company they find themselves in) set up their own barriers that require these kind of locks on occasion. In those cases, the blame rests with the Governments of those countries, not the platform owners/publishers.
The second reason is more complex and comes down to differential pricing. Not every currency is of the same strength or stability. The last few days have made that pretty clear, if it wasn't already. And by and large, a lot of the countries which have weak and/or unstable currencies also tend to have very high piracy rates. A lot of companies (Microsoft used to be particularly bad in this respect, but have been stepping back lately) operate under the delusion that if they sell their products really, really cheap in those territories, they can get people to buy legitimately rather than pirating (all the evidence to date shows this doesn't work). Problem is, when you do that, you create a huge reverse-import problem: why would a US or European consumer pay the going rate in their territory for a locally-bought copy, when they could import a Brazilian or Russian or Vietnamese copy for a fraction of the price (which probably has English-language support anyway)?
Now, in a pure free market, one of two things would happen. Either the company selling the product would have to drop its price globally, or else it would have to accept that customers in those marginal economies just couldn't, for the most part, afford its products. But we live in a world where producers are allowed to circumvent the free market at will - via region locks. So first-world consumers get to subsidise producers' (usually fruitless) speculation in developing-world markets.
There's a curious mirror image of this around one particular market; Japan. See, Japanese consumers are willing to pay massively over the odds for media (movies, games, TV series both live action and animated), particularly when said media is domestically produced. Seriously, you think UK or Australian consumers pay over the odds? It's nothing to what they'll pay in Japan. And because Japan has a large media industry which has grown accustomed to being able to milk this unquestioningly loyal (and seemingly happy to be exploited) domestic market, a good chunk of it is desperate to keep said market behind a walled garden, with reverse importing from the rest of the world locked off.
So yeah... region locking... a few reasons for it, none of them good for the consumer. Truly sad to see it come to Steam (though it's been creeping in at the margins for a while now). The only alternative? Fix all regions' price to the dollar (allowing for differences in local sales taxes, which is the major difference, for instance, between US and UK prices). But then a good chunk of the world wouldn't be able to afford to buy anything like as many games.
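To put a rough number on the sales-tax point (the sticker price here is purely illustrative; the UK VAT rate of 20% is real), a quick sketch of how much of the apparent US/UK price gap is just tax:

```python
# Illustrative only: US sticker prices exclude sales tax, UK prices
# include VAT, so even at dollar parity the displayed numbers differ.
US_STICKER = 59.99   # USD, displayed excluding sales tax (hypothetical)
UK_VAT = 0.20        # UK VAT rate (20%)

# The same pre-tax price with VAT added on top:
uk_price_inc_vat = US_STICKER * (1 + UK_VAT)
print(f"US sticker (ex tax):  ${US_STICKER:.2f}")
print(f"UK price (inc VAT):   ${uk_price_inc_vat:.2f}")
```

In other words, a fair chunk of the headline gap between a $59.99 US price and its UK equivalent is the tax treatment, before any regional pricing games even start.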
Shouldn't be the customer's problem. A big part of the root cause of the Kickstarter backlash is that it turns it into the customer's problem.
I know people who have taken entry-level development positions with large games developers. That's certainly one way to get experience. Yes, the pay, working hours and culture will probably suck, but that's the price you pay for wanting to work in an over-subscribed field. Any large developer will always be carrying a lot of people with little to no experience of games development - the things that suck which I just mentioned mean that this is a field with high turn-over. What matters is that there are experienced people in the right places.
In fact, given that the main skill that seems to be lacking in failed Kickstarter campaigns is project management, you could argue that relevant experience doesn't have to come from games development. Delivering (or at least making a major contribution to) any complex technical project within fixed time and budget constraints is good experience, regardless of field.
Launching a Kickstarter or Early Access for an implausible game design and taking people's money for a project that you then mismanage horribly and fail to deliver any product is not a viable business model. Unfortunately, a lot of people seem to think it is.
You're right that "backers" need to realise that Kickstarter is not a pre-order mechanism. But developers also need to realise that turning to crowdfunding means, by necessity, a different kind of development model to a "traditional" game.
If this game was - as is more usual - being funded by a big publisher and Frontier decided that the offline mode wasn't working out, then that would be the cue for them to begin a negotiation with the publisher. The publisher might be fine with the change. It might not be. The publisher might want to change its funding commitment. It might even want to walk away and leave the project looking for a new publisher. But at the end of the day, it's a commercial negotiation.
Now generally, when a game Kickstarter goes horribly wrong, the root cause is that the developer was a "two men and a dog" team with little to no experience of games development. That's not the case here; Frontier are an established studio with a long track record of delivering games (even if most of those games for the last decade-and-a-bit have been low-profile franchise tie-ins). But they're attempting to behave here as though the absence of a traditional publisher means that they have licence to do what they want without the usual accountability to backers. There's no possible world in which that is reasonable.
So it's no wonder backers are upset.
So let's give Ubisoft the benefit of the doubt for a moment. I'm not going to slate them for the fact that you need a top-end graphics card to get good performance with all the bells and whistles. I actually quite like to see developers showing a bit of ambition when it comes to pushing the envelope on PC graphics. Let's even assume that something went badly wrong in the AMD optimisation. It's not completely unknown for things to go wrong with a GPU manufacturer at the last moment - the PC version of Rage was a hideous mess on PCs with Nvidia cards when it released, because a driver update that was anticipated between the game going golden-master and hitting the shelves turned out not to be what the developer was expecting.
But even allowing for that, how does it explain the console versions being such a mess? There are detailed performance analysis reports out there showing frankly shocking levels of performance on both of the console platforms (Playstation 4 and Xbox One - no last-gen releases for this game). Both platforms fail to hold even a consistent 30 fps, with the Playstation 4 version (which in theory should be the better of the two, as the console does have a little bit more horsepower) having some truly shocking moments where the framerate dips into the teens.
If you're used to playing games on a PC, this might not sound too shocking. After all, unless you have a particularly old PC, you can almost always salvage a playable framerate by dropping your graphics quality. But that option isn't there on a console. For action-oriented games on a console, a locked 60 fps is the "gold standard" and is becoming almost mandatory for twitch-shooters, precision driving games and other genres that rely on rapid response times. The popularity of the Call of Duty series, generally inexplicable to PC gamers, has largely been driven by the fact that the series has long adhered to the 60 fps standard on consoles, meaning that it has felt tighter and more precise than its competitors.
But if you can't manage a locked 60 framerate, then the general consensus is that a locked 30 framerate is an acceptable fallback. It won't feel as precise, but it at least eliminates the disconcerting impact of framerate fluctuations (particularly unpleasant when you're playing on a controller). For a console action-game to fail to manage even a locked 30 fps is pretty shocking these days. For it to be dipping into the teens suggests either misguided design choices or terrible optimisation (or both).
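For anyone who thinks in fps rather than frame times, a quick sketch of the arithmetic behind those targets (nothing here but 1000 / fps):

```python
# Frame-time budgets for the console targets discussed above. A
# "locked" rate means every single frame must finish inside its
# budget; one slow frame shows up as a visible stutter.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given rate."""
    return 1000.0 / fps

for fps in (60, 30, 15):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# At a locked 60 fps the renderer has ~16.7 ms per frame; "dipping
# into the teens" means individual frames are taking 60-80+ ms.
```

That's why fluctuation is so jarring: a dip from 30 fps to 15 fps doubles the time between frames, not just some abstract percentage.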
Plus, yeah, the whole "falling through the floor" thing is happening on consoles as well as PC. The game's broken and it's not (entirely or chiefly) down to a particular brand of graphics card.
Agree with you, until your final sentence. EA makes some utter crap. They also make some fantastic games. EA published Dragon Age: Origins, Mass Effect 2 and Dead Space, which were some of the finest games in recent memory. Dead Space, in particular, was a huge commercial risk - the kind of gamble that only a company with pockets deep enough to experiment would dare to take.
They also put out some utter crap, as well as their lazy annualised franchises and unfinished spunkgargleweewee like Battlefield 4. They also, at times, behave like complete shits in their attitude to their workforce and their willingness to push the boundaries on issues such as DRM and pay-to-win mechanics (though the latter seems to be dying now, thank god).
But they're a big company; big enough and containing enough people that trying to tar the whole thing with a single brush is a bit futile.
I think of EA as being a bit like the National Lottery we have here in the UK. Almost all of its players are from the lowest rungs of our social and educational ladders. And a fair proportion of the money it raises is used to subsidise the kind of "high art" (opera, theatre, galleries) that struggles to be commercially viable on its own. So it's basically a tax on stupidity that funds some pretty great stuff as a by-product. Yeah... that's EA.
In fairness, while they'll probably get away with it this time, recent history suggests that with major franchises, you can fool people once, but you pay the price on the next game. Some examples here:
Final Fantasy XIII: sold extremely well on the basis of hype and the brand. Was a terrible game in almost every respect. Final Fantasy XIII-2 is a rather better game. Lightning Returns (the third installment) is actually a very good game. Both sold terribly, due to reputational damage from their predecessor.
Resident Evil 6: near-universally panned. Sold pretty well on the basis of a massive marketing campaign. Resident Evil releases since then have had a much better critical reception, but much lower sales.
Call of Duty: Ghosts: Its predecessor, Black Ops 2, was actually a pretty interesting game, integrating RTS elements and branching storylines. Ghosts was a lazy, by the numbers pile of spunkgargleweewee. Its sales weren't fantastic by Call of Duty standards, but were still insane. The latest installment, Advanced Warfare, is much better, but is the slowest selling installment in the franchise in years.
So if Ubisoft put out another Assassin's Creed next year, expect it to tank in sales terms, no matter how good it is.
There's certainly plenty of evidence by now to suggest that games with review embargoes tend to be poor, or at least not as good as the hype suggested. Aliens: Colonial Marines was the big example from last year - review embargo until launch, then reviews mostly in the 4/10 to 6/10 range (with a fair few even lower). More recently, Destiny (critical consensus: "fairly good, but not even close to justifying the hype") and Driveclub (barely works, and underwhelming even when it does) have been good examples.
By contrast, when a game is sent for review well in advance of release, the reception is usually much more positive. Recent examples include Bayonetta 2 (reviews 3 weeks early in some cases, near-universal praise), Alien: Isolation (America hates it, rest of the world loves it) and Dragon Age: Inquisition (not actually released yet, but reviews near universal in their praise).
The lack of pre-release reviews is generally a very strong indication in its own right that a game is not going to be good.
The two are hardly competing for the same market. Waitrose is aiming for the aspirational middle classes. Asda is... not.
If you have the kind of household budget which means you shop at Asda, then making the switch to Waitrose is probably not a realistic option.
Though on the few occasions I've eaten Asda food, their meat has had this weird texture, like it's already been digested once.
That's how London Underground, as well as other highly congested services in London (Overground, DLR and, increasingly, some of the short-distance "heavy rail" commuter trains) are configured. Crowding levels during the morning peak are intense and removing seats is a way to cram more people on.
By and large, the way it works is that if you are commuting from one of the outer zones (5 or 6) into the centre, your train won't be as busy when you get on it and you should be able to get one of those seats - which is lucky, because the Tube is a full-stopping service and you are in for a long journey. If you're commuting from one of the more central zones (2 or 3), you are much more likely to have to stand, but on the other hand you have a shorter journey.
Obviously, it works better on some parts of the network than others. And it's a fairly brutal environment to commute in, particularly if you have a particular reason (disability, pregnancy) that makes standing uncomfortable - somebody might offer you a seat, but that's the exception rather than the norm at rush hour. Personally, I think people who live in north London and commute via the Tube are mad. I'm south of the river in Zone 5 and get a seat on a nice, non-stop "heavy rail" train that gets me to the centre in 20 minutes or so every morning.
London also has driverless trains on its (more recent) Docklands Light Railway.
The reason it's news when driverless trains head to the Tube is nothing to do with technology and everything to do with industrial relations. London's Tube drivers are extremely militant - it's normal to have a couple of strikes per year (sometimes over "normal" industrial disputes like pay, sometimes because, I suspect, they just want to remind people they can do it).
The current Mayor, who has been in post for around 6 years now and who is, to put it mildly, no friend of the unions, has been making threats about automation on and off ever since he was first elected. It's a dangerous game to play, because even the mention of automation is sometimes enough to trigger strikes - you can get rid of the drivers eventually (though probably keeping - lower paid - train attendants), but they can cause you a hell of a lot of pain during the transition.
Because personal freedoms, including the freedom to pursue "wants" as well as "needs", are kind of a cornerstone of Western civilisation.
From an intellectual standpoint, I agree with you.
From a real-world standpoint, the problem of the political response in terms of adaptations and mitigations isn't going anywhere, which means that almost nobody will do what you suggest. You may not care about the politics, but in practical terms, they are probably the most important thing. With a range of responses in the public debate from "do nothing" at one extreme to "throw away Western civilisation, start living in organic yurts, spending our evenings knitting underwear out of hemp" at the other, there's a lot of emotion and political capital invested in this debate. It's only made worse by the number of people who have latched onto the issue as a means to push almost-entirely-unrelated political agendas, mostly far-left, but a few far-right as well.
So in practical terms, this report provides a touch of ammunition to the "do nothing" camp and has the potential to slide opinion slightly in their direction. But, as you say, this time tomorrow, the position may well be reversed and the "organic yurtists" may hold the advantage.
And the last thing either side is going to display is a touch of humility. Useful though that might be.