Comment Re:One possible solution... (Score 1) 131

I am a researcher in the UK, and this is pretty much happening here. It's required by the national body through which the government funds research, and that requirement gets passed on to the subject-area-specific research councils.

Basically, work I publish needs to at least be available on a university-level preprint server ("green" open access); many publishers allow this now. For those that don't, the research councils have arrangements with research institutions to pay the fees for the final published versions to be made publicly available ("gold" open access). It's not ideal, as we're still overpaying the publishers, but it's a compromise that sets a pretty clear direction. In addition, I'm required to make my research data readily available. Rumour has it that my research council will soon be picking random papers, trying to get hold of the data, and kicking buttocks where they encounter a problem.

I don't think legislation is necessary; policy at the funding level seems to be doing the trick, and is a bit more flexible.

Comment Music analogy is good (Score 1) 205

The music analogy is more correct than they realise. A huge proportion of music is poorly served by cherry-picking the most appealing tracks; any kind of suite or conceptual work is much better understood when you experience the whole structure. If you can pick a couple of tracks from an album and get a comparable or better experience than someone who listened to the whole thing, it probably wasn't a very good album to start with. Similarly, a course with little depth and populist subject matter might lend itself well to cherry-picking, but that would be a symptom of a shallow course.

The solution, of course, is to create analogous "Double A-Side" and "EP" courses: short, self-contained offerings that add some breadth to the student experience at a reasonable level of quality. The standard should still be the LP.

Comment Re:Throw the book... maybe literally at him. (Score 1) 220

Hi, I use research supercomputers. There are rarely many idle cores: these machines use queue systems that let many researchers simultaneously submit multiple jobs with different requirements, and the scheduling software juggles the parallel job sizes to minimise unused nodes. When cores do sit idle it is usually for a reason (e.g. draining certain queues at certain times to support particular patterns of use), and it wouldn't be possible to simply squeeze in a little bitcoin crunching.
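To illustrate the draining behaviour, here's a toy Python sketch of a priority scheduler holding cores free for a queued parallel job. The job names, core counts, and policy are all made up for illustration; real schedulers like PBS or SLURM are far more sophisticated.

# Toy sketch: why "idle" cores on a shared cluster are usually spoken for.
# All names and numbers are hypothetical, not any real machine's config.

TOTAL_CORES = 96

# (job_id, cores_needed, runtime_hours), already queued in priority order
queue = [("medium-run", 48, 4), ("big-sim", 72, 12), ("tiny", 8, 1)]

free = TOTAL_CORES
running = []
for job_id, cores, hours in queue:
    if cores <= free:
        running.append(job_id)
        free -= cores
    else:
        # The scheduler "drains": the remaining free cores are held back
        # so the big job can start as soon as enough cores accumulate.
        # They look idle, but filling them with opportunistic work
        # (e.g. bitcoin mining) would delay the reserved job.
        print(free, "cores idle but reserved ahead of", job_id)
        break

print("running:", running)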

Comment Re:Throw the book... maybe literally at him. (Score 1) 220

I don't buy it, but I do use it. The UK academic supercomputer ARCHER has a handy cost calculator:
http://www.archer.ac.uk/access/au-calculator

150,000 USD is about 90,000 GBP; playing with the numbers there, that would buy a non-partner research council about 1,200 hours of 3,072-core jobs, i.e. about 3.7 million core hours on 2.7 GHz 12-core Ivy Bridge CPUs with 64 GB of RAM per 2-CPU node. A non-trivial amount of computer time!
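As a back-of-envelope check of those figures (the exchange rate is the rough one implied above, and the job size and hours are the calculator outputs I quoted, not official rates):

# Back-of-envelope check of the figures above. The exchange rate and
# the hours-per-price figure are taken from the text, not official.

usd = 150_000
gbp = usd * 0.60           # roughly 0.60 GBP/USD -> ~90,000 GBP
print(f"{gbp:,.0f} GBP")

job_cores = 3072           # cores per job
job_hours = 1200           # wall-clock hours that budget buys
core_hours = job_cores * job_hours
print(f"{core_hours / 1e6:.1f} million core hours")   # -> 3.7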

Given that this is what would be charged to national, European, and academic research projects, and that the facility is run by a UK research funding council which will have taken some national funds to set it up in the first place, I doubt the pricing is wholly unreasonable. It's a shame the article is not clearer on this point, but the most obvious assumption is that they are talking about $150k at research council rates, in which case the 'fantasy' lies in the ignored subsidies to capital costs.

Comment Re:GoG? (Score 1) 373

PlayOnLinux also makes it pretty easy, and explicitly supports a lot of GOG installers... Currently enjoying Neverwinter Nights from the GOG Insomnia sale on my Linux music production rig. Still, native versions are nice, and I won't buy a game from them if I have reason to suspect a native version is available elsewhere.

Comment Re:Unfortunate Card Naming (Score 1) 142

I happen to like being able to choose a video card based on specs. I can find what I want at the price I want.

The difficulty is in understanding what you want. If I sometimes get choppy performance in a game, does that mean I need faster memory or more memory? If I want good rendering performance in Blender using OpenCL, what is the break-even trade-off between core clock speed and core count?
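To make that break-even question concrete: under the crudest possible model, compute throughput scales with cores times clock, so a card with a third more cores breaks even against one clocked a third higher. A sketch with made-up specs (real performance also hinges on memory bandwidth, architecture, and the workload):

# Naive first-order model of OpenCL compute throughput: cores x clock.
# The card specs below are hypothetical, chosen to land exactly at
# break-even; real cards rarely compare this cleanly.

def naive_throughput(cores: int, clock_mhz: float) -> float:
    return cores * clock_mhz

card_a = naive_throughput(cores=1536, clock_mhz=1100)  # fewer, faster cores
card_b = naive_throughput(cores=2048, clock_mhz=825)   # more, slower cores

# Break-even: the clock ratio that offsets the core-count ratio.
print(card_a == card_b)   # True: 1536 * 1100 == 2048 * 825
print(2048 / 1536)        # ~1.33, the clock ratio card_a needs to keep up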
