Comment Re:Awwww thats so cute (Score 1) 302

I actually still have one of the BT/yahoo email accounts I mentioned in my GP post (even though BT haven't been my ISP for a decade now, I pay a small fee to keep the email account because of the faff associated with changing all the accounts linked to it). I can confirm that it does indeed have advertising if you aren't running an adblocker, though for the time being at least, it raises no objections to me using Adblock Plus when I log in.

There was a fair old bit of fuss some years back, when BT migrated its users from the bespoke email service it had been providing onto the reskinned Yahoo accounts. Unfortunately, as is so often the case, BT and Yahoo just decided to tolerate that "bit of fuss" and carried on regardless.

Comment Re:Awwww thats so cute (Score 4, Informative) 302

Yahoo provides email services for quite a number of big ISPs. Certainly, the email services for BT (which is still, I think, the UK's largest ISP) are provided by Yahoo and just given a light BT-specific reskinning.

So there might be quite a lot more people out there using Yahoo mail accounts than you would suspect. Some of them probably don't realise it themselves.

Comment Re:Is Windows10 a thing? (Score 1) 191

Usual caveats apply, but the latest stats I can find have Windows 10 on around 8% market share. Its rise seems to have been accompanied by a significant fall in Windows 8 market share, which I'm guessing indicates that a lot of the people who bought a PC that came bundled with Windows 8 have made the jump.

I've made the switch myself on two machines; one updated from Windows 7 and one new-build which I stuck straight onto Windows 10. There is a bit of faff required to turn off the telemetry nastiness, but once that's out of the way, I generally prefer it to Windows 7. There are some good UI improvements, plus there will be DirectX 12 support down the line. That said, I've noticed occasional odd behaviour on the updated machine (and disappointing results from fast boot); it seems that clean installs really are the way to go and, if I cared enough, I would do that.

That said, when my parents tried to update their aging laptop from Win7 to Win10, it locked the machine in an infinite reboot cycle, requiring me to make a 400 mile round-trip to fix it and (eventually) get it back to Win7. It turns out that Win10 doesn't like some old laptop integrated graphics setups. It would be nice if the compatibility checker tool had picked this up; it doesn't seem to do much compatibility checking, but rather just pushes people towards the update.

And no, people under 30 aren't just using phones and tablets. A cousin's daughter tried to go the "iPad only" route when she started university last year and gave up and bought a PC after a couple of weeks. Tablets and phones are pretty toys and are fine for web-browsing and watching YouTube, but you can't yet turn them into a credible substitute for a laptop or desktop.

Comment In all honesty... (Score 1) 373

Go with "which platform has more of your friends using it" and "which controller do you prefer".

The irony with this generation of the console wars is that the fanboy bitterness is greater than ever, but the differences between the rival platforms have never been smaller. Yes, there are some minor spec differences, but for the most part, you need a magnifying glass to spot them. And single-platform exclusives are in decline; the cost of making games these days means that most games are multiplatform.

My first preference would be to recommend "just get a gaming PC", but I'm guessing that's not an option for one reason or another, or that you already have one, but want a console to go with it.

If you want to split hairs, then the PS4 tends to be better than the XB1 for running multiplatform games, but the XB1 currently has a very slightly better lineup of exclusives. The only real "stand out" exclusive on the PS4 right now is Bloodborne, which, judging from your post, isn't really what you're after (though it is an excellent game). That said, Japanese developers favour Sony platforms and if you like Japanese RPGs and the like, the PS4 is already slightly ahead in that category and its lead will grow. But the XB1 has clearly had a better holiday season for exclusives; Forza 6, Halo 5, Rise of the Tomb Raider and Gears of War Remastered, vs just the Uncharted Collection on the PS4.

Funny thing is, the exclusives tend to more or less balance each other out anyway. Gears of War vs Killzone. Halo vs Uncharted. Forza vs Gran Turismo. Though I will say that Forza has been much, much better than Gran Turismo for several iterations now.

As for non-gaming functions, the features are more or less parallel now. Both consoles have been patched with DLNA support, though the PS4's interface for it is rather better. That said, the XB1 can act as a pass-through device for your cable box, if you are short of HDMI ports on your TV. So it's kind of even there as well.

tl;dr version - it doesn't really matter. Neither is quite as good as a decent gaming PC, but both are fine in their own right. Which platform gives you a larger friends-list and which controller you prefer are the biggest factors.

Comment Games from discs (Score 2, Interesting) 150

The big unanswered question is whether Sony will allow users to play PS2 games from their original discs. On the basis of what we've seen so far, there would appear to be no reason why this isn't feasible.

The worry, however, is that Sony wants to restrict the system to online purchases made via a PS4, so that people who want to play PS2 games on a PS4 need to purchase the titles again, even if they own the original discs (and with probably only a tiny portion of the PS2's library being available for purchase).


The War On Campus Sexual Assault Goes Digital 399

According to a recent study of 27 schools, about one-quarter of female undergraduates said they had experienced nonconsensual sex or touching since entering college, but most of the students said they did not report it to school officials or support services. Now Natasha Singer reports at the NYT that in an effort to give students additional options — and to provide schools with more concrete data — a nonprofit software start-up in San Francisco called Sexual Health Innovations has developed an online reporting system for campus sexual violence, called Callisto. One of the most interesting features of Callisto is a matching system, in which a student can ask the site to store information about an assault in escrow and forward it to the school only if someone else reports another attack identifying the same assailant. The point is not just to discover possible repeat offenders. In college communities, where many survivors of sexual assault know their assailants, the idea of the information escrow is to reduce students' fears that the first person to make an accusation could face undue repercussions.

"It's this last option that makes Callisto unique," writes Olga Khazan. "Most rapes are committed by repeat offenders, yet most victims know their attackers. Some victims are reluctant to report assaults because they aren't sure whether a crime occurred, or they write it off as a one-time incident. Knowing about other victims might be the final straw that puts an end to their hesitation—or their benefit of the doubt. Callisto's creators claim that if they could stop perpetrators after their second victim, 60 percent of campus rapes could be prevented." This kind of system is based partly on a Michigan Law Review article about "information escrows," or systems that allow for the transmitting of sensitive information in ways that reduce "first-mover disadvantage" also known to economists as the "hungry penguin problem". As game theorist Michael Chwe points out, the fact that each person creates her report independently makes it less likely they'll later be accused of submitting copycat reports, if there are similarities between the incidents.

Comment Not touching this one (Score 4, Insightful) 126

It's a multiplayer-only game (ok, ok, you can play against bots, but that doesn't count), selling for a price that is, if anything, slightly higher than the average, where the developers have been quite open about quickly dividing the community between those who are willing to pay a large extra sum on top of that for the DLC/season pass, and those peasants who just want to pay for the basic game.

In a game with a proper single-player campaign and a season pass for multiplayer content (e.g. Tomb Raider, Call of Duty), I can happily ignore the season pass. In a game with a proper single-player campaign and a season pass for single-player content (e.g. Fallout 4, The Witcher 3), I can make a situational call on whether to pay extra for the additional content, knowing that the original game isn't diluted if I choose not to splash out. But in a multiplayer-only game, I know that if I don't spend extra on the season pass (or buy each piece of DLC piecemeal), I'm going to get rapidly shunted into an online ghetto.

Battlefront's season pass is a particularly expensive one.

The game is pretty but looks like a rip-off. The better reviews have all highlighted that while fun for a short period of time, there is little depth to the gameplay and it gets old very fast. There have been a huge number of quality releases in the last few weeks that I have barely scratched the surface of (StarCraft 2: Legacy of the Void, Rise of the Tomb Raider, Disgaea 5, the expansions for Witcher 3 and Bloodborne and, flawed though it is, Fallout 4). On that basis, I am happy to pass up this particular rip-off.

Comment Re:Back in the old days (Score 2) 393

I'd say that in the UK at least, there's a twist on this.

Our graduate numbers have soared since the early 1990s, as a result of the policies of successive Governments (most notably the closure/rebranding of the vocationally-focussed polytechnics and the Blairite policy of sending 50% of teens to university). Alongside that, the average quality of graduate employment and the size of the average "graduate premium" on salaries have fallen sharply.

That said, when you look at the detail, the situation has changed a lot less than you might think. We have a very distinct hierarchy of universities here in the UK, with, I would say, even sharper differences than the whole Ivy League thing in the US produces. At the top of the pyramid you have Oxford and Cambridge, the elite of the elite. Then you have a small number of other "elite" universities (including several of the London ones). Then you have the traditional "red brick" universities; good, but not elite. Then you have the "new" universities and former polytechnics.

It's also worth noting that the difficulty of degrees varies massively between those institutions. A BA from Oxford or Cambridge may require 15-20 hours of lectures or supervisions per week, a similar quantity of reading time, plus, depending on subject, an essay, piece of translation work, small coding task or other such exercise per week, as well as a more substantial piece of project work over the course of a term or a year. Fail the exams at the end of any year and they generally throw you out, unless you were seriously ill. At the "new" universities, on the other hand, there may be a couple of lectures per week, an essay or two per term and virtually unlimited resits on offer for exams.

When an employer looks at a CV, the expectation these days is indeed "degree by default". So the first thing said employer looks at is which university awarded the degree (the hierarchy of universities is pretty much imprinted in the minds of the British professional classes). The second thing is the subject the degree is in (anything with "Studies" in the name is taken as a bad sign).

Once you apply those filters, you can see that the employment prospects and salary increments of those going to university today who would also have gone under the standards of 30 years ago have not actually fallen all that much (if at all, at the top level). The only difference is that with more people going to university, the grants system is effectively dead, so they are graduating with a lot of debt they wouldn't have had 30 years ago. That said, student loan repayment terms in the UK are exceptionally generous. Meanwhile, those who would not have been able to go to university 30 years ago, and who go to the "new" universities, in most cases gain next to no benefit to their employability (and in some cases an outright disadvantage, as they are competing with peers who didn't go to university and have been in the workforce gaining experience for three years), but are saddled with a similar level of debt.
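To illustrate what "generous" means here, a rough sketch using the approximate post-2012 English "Plan 2" parameters (repayments of 9% of income above a £21,000 threshold, with the remaining balance written off after 30 years); interest is ignored and the figures are simplified:

```python
# Rough illustration of income-contingent UK student loan repayments
# (approximate post-2012 "Plan 2" terms: 9% of income above £21,000,
# balance written off after 30 years). Interest is ignored here; the
# point is that repayment tracks income, not the size of the debt.

THRESHOLD = 21_000  # annual repayment threshold in GBP (circa 2015)
RATE = 0.09         # fraction of income above the threshold

def annual_repayment(salary: float) -> float:
    """Repayment depends only on income, not on the size of the loan."""
    return max(0.0, (salary - THRESHOLD) * RATE)

for salary in (18_000, 25_000, 35_000, 50_000):
    print(f"£{salary:,} salary -> £{annual_repayment(salary):,.0f} per year")
```

So a graduate on £18,000 repays nothing at all, while one on £25,000 repays £360 a year, however large the nominal debt; the burden falls mainly on those whose degrees actually delivered a salary premium.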

Comment Enjoying anger (Score 4, Insightful) 119

I think a lot of this is driven by the fact that many people have realised, consciously or otherwise, that they enjoy being angry. That they get some sort of validation or self-worth from it.

A few months back, I dropped out of participation in a TV/movies forum I'd been a member of for years, largely due to a growing trend of "hate watching". This is where people would pick a show they hated, sometimes for artistic reasons but more commonly for political ones, watch it all the way through and post in great acerbic detail about everything they hated about it. This, of course, led to people who liked that show jumping in to defend it and launching their own retaliatory "hate watches", meaning that more or less every thread broke down into a flamewar.

Previously, people had just not watched shows they didn't like beyond the first episode or two. Everything was a lot more live-and-let-live. Problem was, of course, the forum's moderators realised that the hate watch flamewars were producing masses and masses of page-views and therefore advertising views. So instead of trying to dampen things down, they did everything they could to encourage it.

This is part of the problem; the current financial model for most of the web (and social media in particular) is based around ad-views. As anger and outrage lead to lots of page-views, the financial incentive is to keep people in a state of perpetual quivering outrage.

That's just part of the explanation, of course. I'd look to colleges for most of the rest.

Comment Re:Game chat (Score 4, Interesting) 202

You do actually have a point. Trying out Metal Gear Online with a couple of friends a week ago, I found myself stopping to think about just how dodgy our conversation would sound taken out of context. Hell, I remember conversations from my Counter-Strike days about where best to plant the bomb and how quickly we should be aiming to rush to the nuke. If the NSA really are listening in on everything we do on a "keyword" basis, then the average online game must be a hilariously massive flood of false positives for them.
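As a toy illustration of the false-positive problem (the keyword list and chat lines below are invented, and this is obviously not a claim about how any real surveillance system works):

```python
# Naive keyword flagging pointed at game chat: every line below is
# harmless Counter-Strike-style banter, and every line gets flagged.
# The keyword list and messages are made up for illustration only.

KEYWORDS = {"bomb", "attack", "nuke", "rush", "plant"}

def flagged_terms(message: str) -> set[str]:
    """Return the keywords a message trips on."""
    tokens = {word.strip(".,!?") for word in message.lower().split()}
    return KEYWORDS & tokens

chat_log = [
    "plant the bomb at B and hold the corridor",
    "rush the nuke site before they rotate",
    "nice attack, regroup at spawn",
]

for line in chat_log:
    hits = flagged_terms(line)
    if hits:
        print(f"FLAGGED {sorted(hits)}: {line}")
```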

Comment The 360 library (Score 2) 63

The initial list of games supported through backwards compatibility is relatively disappointing. That said, MS have acknowledged they are working to improve it and, in particular, to address the current problem with support for multi-disc games. On that basis, MS deserve at least two cheers for making a good start.

But in terms of prioritising titles for support, I'd like MS to focus on the titles where there is the greatest historical interest in keeping them playable on new hardware. I traded in my 360 almost two years ago and was surprised at how little reluctance I felt in doing so. There was only a tiny range of games that I felt regret about losing access to.

In truth, there are relatively few 360 games that would genuinely benefit from preservation. The two categories that I would de-prioritise are:

- Multi-platform games which remain available on modern hardware. This might be because they've had a "remastered" version for a current-gen console. In many cases, however, it is because PC versions remain available, on sale and playable. So a lot of the big cross-platform shooter franchises would not, to my mind, be a priority, as they are generally cheaply available on Steam and/or Origin.

- Instalments in iterative franchises which are effectively replaced by later instalments. So the FIFAs, the Maddens and, at a push, the Forzas (Forza 6 is a valid successor to Forza 4 in a way that Forza 5 wasn't). These are games with no storytelling component, whose features are improved on a year-by-year basis and where there is really no particular reason to go back to the older versions.

That actually covers a remarkably large portion of the 360's library. I was struck when I traded mine in by how few really significant exclusives the 360 had; its strength was always as the "best console to play multi-platform games on" rather than as an exclusives machine. That said, there are a few titles I would dearly love to see rescued:

- Lost Odyssey (arguably the best JRPG of its generation and, in some respects, the "real" Final Fantasy 13. It's multi-disc, so that issue would need to be sorted.)

- Blue Dragon (not quite as good as Lost Odyssey, but still a seriously good JRPG).

- Ace Combat 6 (the final decent instalment in the series when it was still unafraid to be wacky, before it turned into Call of Duty with planes).

- Gears of War 2 and 3 (almost certain candidates for an HD remaster at some point, I guess).

- Deathsmiles (technically also available on mobile platforms, but the 360 port is the definitive version and one of the best bullet-hell shooters available for home consoles).

There is a much larger list of PS3 titles in need of rescue. My original highest priority, Valkyria Chronicles, is thankfully now on Steam, but I would dearly love to see the Ratchet & Clank games and a lot of those PS3-exclusive or best-on-PS3 JRPGs (such as Eternal Sonata) make it out from under the wire.

Comment Ok for a year or two post-graduation (Score 5, Insightful) 412

The problem with this idea is that people will be fine with it for a year or two post-graduation, but it's going to start to suck fairly quickly after that.

It's not unusual for people to cling to elements of their student life after they graduate and get their first jobs. I did the same myself; moved into a shared house with a few people I'd known at university and tried to keep a student-ish lifestyle running alongside a full-time job.

It lasted 18 months. Then I gave up and rented a place on my own.

The demands of being a full member of the workforce are very different to the demands of being a student. When you have to get up at a set time every morning (and generally pretty early), are getting older and need a regular sleep pattern, and need a quiet space to do work that actually matters (rather than work that is essentially for your own benefit, as your work as a student was), the whole shared-living thing breaks down pretty rapidly. Irritations about your cohabitees' different body-clocks, cooking smells, personal hygiene and expectations of reasonable noise levels all start to feel much more important than they did when you were still studying. And as you get more and more irritated with them, they are getting more and more irritated with you.

On top of that, this is generally the time when many people are going to be getting into more lasting romantic relationships, which might eventually lead to marriage and kids. This is not easy when you're sharing accommodation with a bunch of other people and personal space is a scarce commodity.

I guess they might make this work as a commercial proposition if it's a short-term rental affair. The problem is that if you get longer-term residents who age significantly past the incomers, this is going to turn into a vision of hell pretty fast.

What this certainly isn't is an alternative to providing sufficient quantities of decent quality new housing suitable for long-term occupation and family life. That's what we're very short of here in the UK. The issue here for Millennials is that whether or not they want to live like this, they may well have no choice. The option of renting my own place that was open to me more than a dozen years ago (let alone buying one, as I later did) is a lot less accessible now, due to rising rents.

Submission + - Fallout 4 release raises questions about reviews of buggy games

RogueyWon writes: Fallout 4, the latest instalment in the long-running video-game series and one of the most hyped titles of the year, was released on 10 November. The game has generally been reviewing well, currently holding a Metacritic score of 89. However, a number of reviewers have noted the very large number of bugs present in all versions of the game and have, in some cases, reflected on the difficulty that these pose for reviewers, despite still awarding positive overall write-ups. Can it be ethical to recommend a product to consumers on the basis of its strengths, despite knowing that it contains serious faults?

Comment Re:Exclusives (Score 1) 86

Exclusives have been in decline for a couple of generations now and have become very scarce indeed in the current generation. To understand why, you have to understand why games go platform-exclusive in the first place. The four main reasons are:

1) Because the platform owner is developing the game themselves (sometimes as a loss-leader) or through a partly or wholly owned subsidiary, usually so as to grow the installed base of their platform and tempt third party developers to it.

2) Because the platform owner has paid a sum of money to a third party developer/publisher in exchange for exclusivity.

3) Because the platform in question is so dominant that porting the game to other platforms doesn't make financial sense.

4) Because the platform in question has unique features that are integral to the design of the game.

Of the four reasons listed above, only 1) isn't in significant decline. All three platform owners invest in first and second party development to varying degrees. It's a bigger deal for Nintendo than for the others (historically, hardware and first-party games have been their main revenue source, though this has been in serious decline for about 5 years now), but all three of them do it. This is why we get the iconic platform-exclusive brands like Halo, Forza, Killzone, Gran Turismo, Mario and Zelda. Don't forget that in some cases, these games don't make a huge profit (particularly the ones developed as launch titles, when the installed base is still small). What they do is sell hardware and convince third party developers that they want to support the platform. For a platform owner, a healthy third party ecosystem is the holy grail; other people sink the development costs and take the risk on each title, while you just cream off your licensing fees.

Reason 2) is still around, but is in rapid decline. This is largely a result of the rising cost of game development. Third party exclusivity deals are unsentimental affairs; the money that the platform owner is putting on the table has to be greater than the money that the developer/publisher would make from sales on other platforms. In a world where AAA games routinely need 3-5 million sales or more to break even (up from around 500k to 1 million sales a decade ago), the cost of these exclusivity deals for the platform owner is getting very high. This is why franchises such as Final Fantasy and Metal Gear Solid have drifted out of exclusivity deals.
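A back-of-envelope sketch of that comparison, with all figures invented purely to show the shape of the calculation:

```python
# Back-of-envelope maths on third-party exclusivity deals: the platform
# owner's payment has to exceed the profit the publisher would forgo by
# skipping rival platforms. All figures here are invented placeholders.

def exclusivity_floor(forgone_units: int, net_per_unit: float) -> float:
    """Minimum payment that makes exclusivity worthwhile: the profit the
    publisher would otherwise earn from sales on rival platforms."""
    return forgone_units * net_per_unit

# Hypothetical AAA title: 2 million sales forgone on other platforms,
# ~$25 net to the publisher per unit after platform fees and costs.
floor = exclusivity_floor(2_000_000, 25.0)
print(f"An exclusivity deal has to beat ~${floor:,.0f}")  # ~$50,000,000
```

As break-even sales figures rise, the forgone-sales side of that equation grows with them, which is exactly why these deals are drying up.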

Reason 3) has largely declined due to the fragmentation of the market. It was only ever really a factor at very specific points, during the SNES era and, in particular, during the PS2 era, when one platform had a commanding lead over its competitors. When the PS2 was busy stomping the Xbox and GameCube into the dust and PC gaming was a fairly niche affair, there was often little point in small and mid-sized developers aiming for anything other than the PS2. In a world where Sony, Microsoft and Valve all have user-bases of the same order of magnitude, going cross-platform tends to be a no-brainer (and the recent convergence of both hardware specs and controller designs helps a lot as well).

Reason 4) has always been a slightly odd proposition. Certain genres which depend on mouse and keyboard, such as RTSes and hardcore MOBAs, remain largely PC-only propositions. But Microsoft's and Sony's hardware is, outside of the fanboy wars, pretty standardised these days. The "unique features", such as the XB1's Kinect and the PS4's touch-pad, just end up not being used. The Wii U, meanwhile, is dead to most developers; its underpowered internals, lack of sufficient buttons on the controller and awkward touch-screen make multiplatform development for it very hard, while the small installed base guarantees limited returns. As for developing games around gimmicks like motion control and voice activation: too many developers got burned by commercial flops in that space during the previous generation. The concept of alternative controllers is effectively dead for the time being (though VR may give it a shot in the arm).

So yeah, console exclusivity is thin this time around. Certainly, I have far fewer XB1/PS4 games than I had at the equivalent point in the 360/PS3 cycle. In a world where I can buy almost everything for my PC and get a better performing version for less money, that's what I'm going to do. I only have one current-gen console game which wasn't an exclusive at the time I bought it (Assassin's Creed 4, which came bundled with the PS4).
