Unfortunately, it's still very much alive and out there. My parents' PC contracts it regularly (my dad has appalling security and browsing habits). A friend of mine (who I generally regard as more IT literate than I am) just spent a weekend cleaning an infection of it off his (fully-updated, McAfee-protected) Windows machine.
And now for a gratuitous side-rant:
The source of my friend's infection was apparently a minor video-hosting site carrying game-walkthroughs. On balance, I believe him on this, because I'd had warnings from AVG about such sites myself in the past.
The trend over the last few years has been for game-walkthroughs to shift from text-format to long sequences of videos. Personally, I hate, loathe and despise this trend from a convenience point of view (try searching 30 videos for how to find that pesky item you're missing, compared to doing a quick search on a text file). But it's had some other unpleasant side effects.
See, by default, these videos go on YouTube. Thing is, however, game publishers sometimes object to complete video walkthroughs of their games being hosted there and do DMCA takedowns. So the videos then crop up on less notable video-hosting sites. Many of which appear to be malware-infested hellholes.
So the moral of my (horribly off-topic) side rant: video walkthroughs suck. They're difficult to search, they're inevitably narrated by some idiot called "Tad" who feels the need to say how stoned he is roughly every 30 seconds, and they're turning into a really horrible malware vector.
You're right, of course.
But there's another, related lesson in all of this; one that's more for businesses than consumers. The irony is, it's not a new lesson; it's one that has been well known for decades (centuries?), but which seems to have been forgotten recently in a good chunk of the gaming industry.
That lesson is: "Your brands matter. Protect their value."
I'm sure that on one level, EA understands this. In fact, I suspect a few parts of the company (mainly those who handle its cash-cow sports titles, which remain well-received and commercially successful) understand it very well. It spends a fortune on advertising. It's known to throw its weight around when major releases get lackluster reviews. But at the same time, it has worked very hard in recent years to take some of the most potent brand names in gaming and drag them through the mud. And then set fire to them. And then take a dump on the remains.
A few examples: their acquisition of Bioware looked at the time like a bit of reputation management. Their name was in the crapper, so they tried to associate themselves with the halo surrounding one of the most highly regarded developers in the business. However, with Dragon Age 2 and Mass Effect 3, that brand very quickly ended up tarnished. Now, views will vary on EA's responsibility for that (personal view: extensive in the case of Dragon Age 2, but Mass Effect 3's failings felt mostly inflicted by people within Bioware). It wasn't just the disappointing games either; the day-1 DLC, starting from the original Dragon Age onwards (and I'd never accuse that game of being disappointing), did a lot to erode consumer goodwill and cement a reputation for nickel-and-diming customers who had bought already-expensive games. In fact, many of the post-launch DLC packs for EA/Bioware games have been good value, but the reputational damage is done by the day-1 stuff.
Or take Command & Conquer; one of the absolutely core franchises in the history of PC gaming. Actually, EA's history here is more complicated than it might appear. Westwood had itself done all it possibly could to tarnish this brand, with C&C2 and Red Alert 2, both of which felt years behind the curve at the point of release. EA's first move on acquiring the franchise was a bit odd and bewildering - sticking the name onto Generals - a title that clearly had little to do with Command & Conquer (which isn't to say that it was bad, just that it didn't look or feel like a C&C game). However, EA then seemed to buck its ideas up; C&C3 and Red Alert 3 were both, in their own ways, high quality titles and felt like a return to grace for the series. So what a pity that the usual EA self-destructive tendencies were allowed to take over; C&C4 was clearly rushed to release and was crippled by barely-functional always-online DRM. Since then, all we've seen has been some craptacular gestures towards the pay-to-win market.
And then there's SimCity. I won't dwell on this at length; the discussion is live across many, many gaming sites at the moment. But again, EA has taken a loved and respected franchise and smeared it in excrement. In fact, in this case, EA's reputation was already bad enough that I didn't make the mistake of buying this title.
The result of this? At one point, Bioware games - and games with the C&C or SimCity name on them - would have been guaranteed purchases for me (and, I suspect, for a lot of other people). As of now, though, I would sniff carefully around the reviews of a Bioware game, and wouldn't touch a C&C or a SimCity with a barge pole. The brand value has been substantially diminished or outright destroyed. There are other examples too; I loved the old (early 2000s) Medal of Honor games - but the first of EA's recent reboots was grim enough that I didn't touch its sequel and they've now canned the franchise again because a lot of other people clearly felt the same.
Funny thing is, EA aren't (quite) the worst in the industry at this. Dire though they are, I don't think they can quite match Square-Enix in terms of sheer self-abuse.
At one point during the PS2 generation, some estimates put Final Fantasy as the most profitable franchise in all of gaming. But as that generation went on, Squenix's focus shifted from producing relatively infrequent but high-quality games to churning out handheld and mobile titles of variable quality. Over the course of the current generation, we've seen the underwhelming FF13, the dismal FF14 (currently suspended pending a relaunch) and a large number of handheld and social games whose quality ranges from "reasonable" to "completely shit". Indeed, with the likes of "Final Fantasy: All the Bravest" (which has some hilarious reviews out there), Squenix have made clear that what they want from the franchise is basically a series of low-cost pay-to-win exploitation machines.
What too many companies have done in recent years is hitch their biggest and most important franchises onto every obnoxious new trend as it comes along. By and large, gamers hate day-1 DLC. They hate always-online requirements for games that aren't proper MMOs. They are mistrustful of the annual release cycle for formerly "high quality" franchises. Outside of Japan, they are gradually losing interest in dedicated gaming handhelds. And while they may be taken in from time to time by pay-to-win games, they do tend to wise up to them sooner or later. And once a franchise has been dragged through one or more of those particular cesspits, its value is gone and will be very, very hard to recover.
Perhaps one day we'll have Star Trek style holodecks. And that will be great. Until the point - roughly 10 minutes after the first trial - when people realise that if they're really bad at running around doing athletic stuff in real life, they're also going to be really bad at it on a holodeck like that.
I think controllers which try to make games more immersive by having them mimic real life activities are (with a few exceptions I'll touch on later) missing the point.
That isn't to say that games shouldn't try to be immersive and that controllers don't have a role to play in immersion. However, given that in most games, the player is doing things he wouldn't be able to do in real life, simply trying to translate real-life controls into the game isn't going to work. In most genres, the best thing the controls can do is let the player forget that they are there at all. They need to be the most efficient means possible of translating the player's will into the behaviour of his on-screen avatar.
Every time a player dies (or otherwise fails, depending on genre) in game due to control issues, the immersion is broken. I can think of some really awful examples here, going back decades. Remember Ultima VIII, as it was at launch? Those jumps across the moving platforms, where a mis-step meant death? Remember how you could see precisely what you needed to do to get across, but how the atrocious point and click control inputs made each and every jump an exercise in trial, error and sheer luck? And remember how much it broke the immersion every time you failed - reminded you that you weren't the Avatar exploring a strange land, but a player wrestling with a cumbersome interface and control system? That one was bad enough that they eventually patched it (turning it from "atrocious" to "just about tolerable").
Or more recently, take the Super Mario Galaxy games. I enjoyed both of these immensely - until the point at which it became necessary to use the spin-jump to make certain jumps. See, "spin jump" was mapped to "waggle the Wii-mote". And "waggle" is not, on a Wii-mote, a precise input. There's actually a good bit of variation in just how much and how hard you need to waggle before the game will accept that, yes, you have waggled (and I can't believe I've just typed that sentence). So all of a sudden you have a precision platformer which is dependent upon a non-precision input. And even though it's only for one single input, each time you rack up an unnecessary death due to that input going wrong, the immersion is broken.
Or sometimes a game uses a "normal" input device, but because the game adapts itself to that device badly, it still ends up feeling broken. Resident Evil 6 is a case in point here. I've played this on the 360 and the PC and found the 360 version effectively unplayable, due to control issues. I don't normally object to playing shooters on a console controller (though I'd prefer mouse and keyboard), but the shooters in question need to make concessions to the fact that they're being played on a device less suited to precise aim. Actually, many console shooters these days do that well; snap-to aim, relatively generous hitboxes and slow-moving enemies may not always make for the most exciting game mechanics, but they do take a lot of the pain out of playing a shooter on a console controller. Resident Evil 6 makes no such concessions; in a game where only headshots do appreciable damage to enemies, aiming at these tiny, fast bobbing targets on a console controller is nigh impossible and the abiding impression I took away from my 360 version was that my in-game character actually had worse accuracy with a gun than I myself would in real life (which is saying something). After that, playing with mouse and keyboard on the PC was a complete revelation - while the game itself still has flaws, it was an order of magnitude better than the console version. By contrast, the recent Tomb Raider reboot makes such good concessions to aiming on a controller that I played it on PC using a 360 controller-for-Windows, as the platforming felt more natural that way.
To be immersive, a controller needs to be three things.
It needs to be ergonomic, so that the player can access all of its buttons and functions quickly without physical discomfort. Modern controllers have made a lot of progress here, though some issues remain to be sorted (finding a convenient way of doing L3/R3 in particular; clicking down on the analogue sticks really doesn't work, as I think a lot of developers would acknowledge).
It needs to be precise. Or at least, it needs to be precise enough to keep up with the mechanics of the game it's being used for. See above points about Resident Evil 6 and Tomb Raider.
And it needs to be consistent. The player needs confidence that when he makes a particular input, it will be translated into the expected action from his on-screen avatar. This was a big deal-breaker for me with Zelda: Skyward Sword. I got heartily sick of sitting there shouting "no, I did a vertical slash you stupid thing, don't try to tell me it was diagonal". By contrast, Dark Souls (a game with very, very similar block/dodge-and-counter combat mechanics) did have that feeling of consistency in its inputs, which made the game far more satisfying and immersive.
I look at input systems like the ones in TFA and I'm not convinced they satisfy any of the three requirements above particularly well. I can't imagine that trying to play any existing FPS on them - let alone a competitive online FPS - would be anything other than pure frustration once the initial "oooh cool" factor wore off.
Is there any role for stuff like this? Yes, possibly. There are subsets of games out there where the whole point is the player's physical activity: exercise software.
Exercise is boring. Really, really boring. It's one of the reasons we have an obesity crisis; in an era where average calorie burn from work-related activity has fallen through the floor from where it was a few decades ago, most people just don't have the willpower and tolerance for boredom to do equivalent exercise outside of work hours. Sure, there are plenty of ways of trying to make exercise less boring; the obvious one is to turn it into "sports". But then, even there you run into a couple of problems; first, a lot of people (like me!) find sports even more boring than just staring at a wall while they exercise and second, a lot of sports don't actually involve very much calorie burn.
I think this is why exercise games have taken off so much in the last few years; they're another route to making exercise more interesting. Titles like Wii Fit and Your Shape: Fitness Evolved are pretty poor considered just as games, but for many people (like me!) they do just about enough to make sustained exercise tolerable. They're certainly the only thing I've ever made really substantial use of my Wii or my Kinect for.
So something like the tech in TFA might have a role here; if we accept that the purpose is to make exercise more interesting by giving the player the chance to zap aliens while he jogs or whatever. The games that make the best use of this technology will never be the same as - and never, considered on an equal footing, be as good as - games which are designed to be played on traditional controllers. But there might be a niche to be exploited there.
There's also the point that, in my opinion at least, IE has closed the gap with the other browsers quite a bit in recent years. I'd been using Firefox since 2004, but had grown increasingly irritated with a number of its quirks and foibles.
Got a new PC a couple of weeks ago and decided that was a good spur to check around and see how other browsers measured up. Having done so (and slightly through gritted teeth), I actually settled on IE.
Five years ago, somebody who was using IE was either ignorant or browsing from an office PC where they had no choice. I just don't think that's the case any more.
I wonder if the controller issue is driven by a desire for regional standardisation. There's a general consensus (wonder if it's actually true?) that Japanese gamers prefer a smaller controller and US gamers prefer a larger one (though obviously not one as large as the original Xbox controller). As a Japanese company, Sony will always be more exposed to feedback from its home market.
But yes, while I'm generally positive about the PS4 reveal, the controller does stand out as a bit of a sore point.
I'd agree with you for the most part, but...
The 360 offering is substantially less attractive than it was two years ago. The new "third gen" dashboard UI is a big step back from the previous one. It's not just the sheer quantity of advertising, but also the irritation and number of navigation steps involved in trying to get to actual game content. Bizarrely, it's also a worse UI to navigate using Kinect gesture/voice controls than the old "second gen" dashboard was.
The other issue, of course, is that while many frustrations remain around the PS3, Sony have raised their game in some respects. The PS Store is much better now than it used to be (admittedly that's a low bar) and PS Plus is actually a genuinely good service for people who don't have a massive amount to spend on games and don't care about always having the latest titles available, but just want a steady stream of games to play.
The controller issue, of course, is very real. The Sixaxis was awful and while the DualShock 3 is better, it still has big drawbacks next to the 360 controller. It's too small for many people (including me), it offers poor grip and the shoulder buttons lack the precise analogue sensitivity of their 360 equivalents.
And don't even get me started on mandatory game installs, patches and goddam firmware updates. At least Sony have realised that particular situation cannot continue on the PS4.
I think the other point that dropped out of the discussion in this particular case (though plenty of people have brought it up elsewhere) is that people don't so much fear always-online requirements because they're worried their net connection might blip out (though that's a perfectly fair concern), but rather because they can see the thin end of the wedge approaching and recognise always-online as a direct underpinning for blocks on used games and rentals.
MS may be getting a lot of pressure from game developers to implement those blocks, but to do so would be absolutely suicidal given customers have a choice to jump to an unrestricted PS4 instead.
I'd have to disagree with you on the gap between console and top-tier PC visuals. Bioshock Infinite isn't as PC-optimised as some games out there, but I found the contrast between the PC version and the 360 version (which seems to be running on loop in my local game store) pretty huge. For something like Crysis 3, it's hard to believe that the PC version is even the same game (though it is an absolute system killer). In that case, I'd say the difference is not far off "Quake 2 software vs Quake 2 OpenGL".
"Gamers and developers aside" is an absolutely huge qualification. If you're leaving that aside, you may well be missing a big chunk of the reason for the decline in "home" PC sales.
Until somewhere around the PS2/Xbox/Gamecube generation, PC and console game development existed, for the most part, in separate worlds. Games which appeared in both worlds were the exception rather than the norm. But sometime in the middle of the last decade, rising development costs meant that cross-platform development increasingly became the norm.
Until that point, PC game system requirements had progressed on a kind of steady evolutionary curve; you really needed an upgrade every 2 years or so (maybe 3 at a pinch, with some interim upgrades) just to be able to run the latest titles at all. Hell, you can more or less track the history of the PC upgrade curve by looking at a few key titles; Wing Commander, Strike Commander, Wing Commander 3, Quake, Quake 2, Quake 3 (for example).
With PC game development linked at the hip to console game development, the hardware cycle doesn't work like that any more. The PS2/Xbox/Gamecube cycle wasn't too bad; it was a pretty short cycle and by the time PC development really started to get locked in with it, people were already talking about the successors.
But what we're in now is pretty much the longest console cycle we've seen (and it won't truly end until the PS4 launch at the end of this year - the Wii-U is definitely not next-gen in hardware terms). It's absolutely no coincidence that, until very recently, the iconic question about whether a PC would cut the mustard in gaming terms was "will it run Crysis". The original Crysis - a rare game not locked to the console cycle - was released in late 2007. Until very recently, it was still the most demanding PC game around, if you wanted to run it on max settings. A Crysis-capable PC in 2007 cost a lot of money; but by around 2009 or so, you could get the equivalent in the sub-$1000 range. With that PC, you could run pretty much anything released on max settings. System requirements did a small amount of very gradual upward drift, as developers learned the console hardware and were able to do slightly more ambitious things on it; but compared to the preceding decade, it was negligible.
This has started to change a bit recently; as we get into the very late stages of this console cycle, developers do start to push the PC versions of their games significantly beyond what the consoles can do - largely because they want the practice for next-gen development. I think I first noticed it with Bulletstorm - it wasn't huge on that, but it was clear that this was a game whose PC visuals were being optimised beyond what we'd become accustomed to. Battlefield 3 went quite a lot further (it is depressing that spunkgargleweewee tends to be the go-to genre for pushing system specs these days). More recently, the PC versions of Crysis 3 and (to a slightly lesser extent) Tomb Raider have felt like what we should expect to see from next-gen console games - and if you want to run them in max detail, then they do have much more demanding system requirements than what we've become accustomed to.
I'd expect to see PC sales (to home users) rise again over the next couple of years, as we get an upward lurch in system requirements to fit with next-gen console game development.
Christ almighty. You really are utterly fucking thick.
You must have missed the part where I said I lived in London and had a perfectly good net connection myself.
Judging by the stats you quote, I'm guessing you also missed the part where I said that I was primarily talking about parts of the world other than the US and Western Europe.
But don't let that get in the way of a good angry rant.
£60/month covers broadband, phone (including all calls except international) and TV package. Not cheap, not extortionate either. If you live in the right area, the UK's actually very good for broadband. The problem is that most of the country doesn't count as "the right area" yet.
I'm not in the US, so maybe I'm missing some context here, but...
How on earth are either of the links you've just posted examples of hate speech? The first is a line on the abortion debate that we've seen many times over the years. I'm not going to pick sides in that one; but if you approach the debate (as some people do) from the starting point that "life begins at conception", then abortion is infanticide. I think a lot of the lack of civility around that particular debate stems from the fact that neither side recognises just how high the stakes feel for the other side in it.
The second link is a fairly silly take on the gun control debate that somehow slides into an odd reductio ad absurdum take on the gay marriage debate. But again, incoherent though it is, is it really hate speech?
If somebody says "All members of (ethnic group x/social group y) are scum! Let's (kill them/throw them out of our country/deprive them of their property rights)" then that feels like hate speech. That's a hell of a long way from either of the examples you link to.
As a test, let's take an example from a left-wing perspective of somebody linking a (generally supported - the UK public consistently backs a tougher line on welfare in polls) Government policy to murder. In this case, it's the murder of the disabled rather than infanticide, but I think that's still pretty emotive. So: from the UK's Guardian newspaper. Is that hate speech?
If you answer "yes", at least you're consistent. If your answer is no, then it looks more like you're just demonstrating totalitarian instincts to suppress speech that goes against your own values.