
Comment Re:Lets help the readers of /. ;-) (Score 1) 225

Intel were forced to place that notice there as part of the remedy for a criminal antitrust investigation. Putting it there as an image is a way of sticking two fingers up at the judge - I'm surprised they were allowed to get away with that.

Intel charge a not inconsiderable amount of money for their compiler. Try going to their website or one of their resellers and buying a license, and see how prominently that (image-only) notice features. Hint: it doesn't.

Comment Re:Permanently disabling? (Score 1) 225

Hmm, again this bogus argument about "support". I'm not saying that Intel should "support" AMD CPUs. Intel's compiler should produce the very best code it can for Intel CPUs - they are under no obligation to optimise the code for AMD CPUs, or even to test the code on AMD CPUs. It's AMD's problem to ensure compatibility. However, Intel shouldn't *add* code whose only function is to nobble AMD CPUs.

If Ford opened a chain of gas stations, and optimised the exact formula of the gas to the properties of Ford engines to give them better mileage, that's fine. If they then secretly installed a widget on each pump that detected whether the car was a Ford or not and added water to the fuel if it wasn't, would you say that was acceptable? Would asking them to remove that widget be "Forcing them to support a competitor's product"?

Intel have always sold their compiler as a generic x86 compiler. If they had always been straight up and said that it only compiled code for Intel CPUs, and code compiled with their compiler just refused to run on AMD CPUs, then (a) I wouldn't have a problem, and (b) they'd have zero market share. So in that sense I agree with you: Intel shouldn't support AMD at all if they don't want to. They just shouldn't pretend they do and then secretly degrade performance.

Comment Re:Permanently disabling? (Score 2) 225

Sorry, but that is bullshit. Intel started doing this very early on (in version 8 of their compiler), and none of their CPU capability checks looked at the specific architecture at all. The only thing they checked was the CPU capability flag, and they deliberately skipped that check unless the chip was from Intel.

They even cocked this up with their first iteration, such that instead of producing binaries that ran slowly on AMD chips it produced binaries that segfaulted on AMD chips. See for the details.

This idea that "Ooh, although this chip advertises that it supports SSE2 we'd better not actually let it run any SSE2 instructions just in case they run more slowly than generic 386 code" is a valid excuse is just crap.

Comment Re:The Intel compiler still anti-competitive (Score 5, Informative) 225

Oh come on, this has been beaten to death. Nobody is saying that Intel has to optimise for AMD products. There is a standard mechanism (introduced by Intel!) to query a chip to find out what instruction sets it reports. Intel's compiler uses this mechanism to decide what code branch to run, but *only* for Intel chips. The literal code path is

if (Intel chip) then
    if (supports SSE2) then run SSE2 code
    else run non-SSE2 code
    endif
else
    run non-SSE2 code
endif

All people are saying is that the code path should be

if (supports SSE2) then run SSE2 code
else run non-SSE2 code
endif

See the difference?

Yes, you can get extra speed by ordering the instructions differently for different architectures, and Intel's compiler quite rightly does that to produce Nehalem-optimised code or Skylake-optimised code. I don't expect the compiler to produce Bulldozer-optimised code, but I expect it to allow me to run the Nehalem-optimised code on a Bulldozer. Where does this meme come from that this request is "forcing Intel to optimise for the competition"? I want Intel to do *less* work, not more - all they need to do is *remove* a small amount of code from their compiler and I'd be happy.

Comment Re:HypnoToad says (Score 3, Informative) 174

*sigh*. If you're going to quote the scientific literature in support of your argument, you need to at least make some effort to understand it first.

The paper says that cosmic rays strongly correlate with ozone depletion. The data point to cosmic-ray driven reactions of halogenated molecules as being the cause of the correlation. The *only* halogenated molecules present in the stratosphere in any significant concentration are CFCs. I'll repeat that: where the paper talks about "halogenated molecules", it's talking about CFCs, HCFCs and other man-made chemicals.

Hence, this paper is presenting an alternative explanation to *why* CFCs damage the ozone layer. The prevailing hypothesis is that photolysis of CFCs (i.e. UV from the sun breaking them apart) is what kicks off the ozone-depleting catalytic cycle. This paper says "Nah, it's not photolysis, it's cosmic-ray-induced ionisation of the CFCs that sets the whole thing off".

From the paper:

In the CR-driven mechanism, the O3 -depleting reactions depend on halogen concentrations, CR intensity, and PSC ice (to hold the electrons) in the stratosphere [6,8]. From 1992 up to now, the Antarctic O3 loss has shown a clearest correlation with the CR intensity. This is because the total halogen amount of the stratosphere, particularly those of CFCs, is nearly constant in that period of time [30]; thus the regulating effect of CRs on O3 loss becomes manifest. In contrast, such a time correlation is hardly seen in the enlarging spring polar O3 loss during 1980s, since at that period of time, the halogen loading increased dramatically and thus ozone showed a drastic decreasing trend blurring the CR-O3 loss correlation. And in the pre-1980s, no significant halogen loading was found in the stratosphere, and thus no significant O3 loss was observed.

Summarising that: since 1992, there's been loads of CFCs in the atmosphere, and hence the rate-limiting step in how much ozone gets broken down is how many cosmic rays there are. Before 1980, there were no CFCs in the stratosphere, and hence cosmic rays didn't destroy any ozone. Your bet is thus meaningless: this paper is part of the argument over *why* CFCs cause ozone depletion, not *whether*.

Comment Re:I trust (Score 1) 910

If, instead, parents were given freedom of choice in schools and teachers, the good ones would be oversubscribed, the poor ones undersubscribed and laid off / fired, and quality would improve dramatically and quickly.

So how are you going to choose who gets into the oversubscribed "good" schools? Ability to pay? What's actually going to happen is that the "good" schools all end up in nice little wealthy enclaves, which pushes up the property prices there just nicely. The "poor" schools end up even more starved of funding than they are now, so can only afford to hire the useless teachers who can't get a job anywhere else. Congratulations! You've just effectively ended any chance of social mobility!

Comment Re:Crowdsourcing (Score 1) 556

Kickstarter works very well for small projects. It seems to work OKish for medium-size projects - there was a recent Slashdot article about someone getting ~$1M for development of a sequel to a popular old computer game.

However, developing a new drug is going to cost you at least $100M, and probably closer to $1B[*]. How the hell do you think you're going to raise that much?

[*] Yes, I know there are articles out there by loons that claim the "actual cost" of drug development is $2.50 plus the handful of small change that we found down the back of the couch. They're talking crap. AstraZeneca spent US$24B over the last 12 years on R&D, and has released 3 new drugs in that time. If you want another data point, Cancer Research UK, an enormous UK-based charity, spent >£200M last year on research. It's spent roughly that much every year for the last 20 years (accounting for inflation). Do you know how many drugs they've developed? None.

Comment No, because there's no money in it (Score 2) 140

Developing a new drug takes hundreds of millions of dollars. Suppose you spend that and you come up with a completely new class of antibiotics. You've now got two problems. Firstly, for 99.9% of the infections out there the existing antibiotics work very well and are now generic, meaning low cost. Quite apart from that, no clinician is going to prescribe your new "last resort" antibiotic for someone coming in with a sniffle: you keep it in reserve for the people with vancomycin-resistant MRSA.

The combination of these two means that your total market is maybe a couple of thousand people per year. You've just burnt >US$500M on development costs, so how are you going to get that money back? The solution to the antibiotic problem is to quit misusing the ones that we already have. Nobody's going to develop new ones any time soon.

Comment Re:Worrying state of affairs (Score 1) 374

You're missing something - VAT is ultimately only borne on the final sale, to the consumer. All businesses (above a certain size: ~$100K turnover) are VAT registered. This means that they reclaim the VAT on anything they buy, but have to charge VAT on anything they sell; the tax passes straight through each business-to-business transaction and only sticks at the final consumer, who can't reclaim it. It's basically the same effect as the state sales taxes that you have in the US - business-to-business transactions don't end up bearing the tax.

Comment Re:We need *new* antibiotics. (Score 1) 193

OK, let's crunch the numbers. A new class of antibiotics is going to cost you at least $1Bn to develop[*]. Suppose you spend that and discover a totally new class, with no existing bacterial resistance. The clinical choices here are (a) prescribe it to every schmuck who thinks it might make his flu get better (and feed it to cows as well - why not?), or (b) give it only to people who are dying of MRSA. Option (a) is stupid, as within ten years MRSA is resistant to your new antibiotic as well, and the FDA (quite rightly) won't sanction it. Option (b) means that your total market is maybe 5,000 people per year in the western world. How are you going to recoup your $1Bn?

This "pharma aren't interested in making you better" meme is a pile of crap. Pharma are interested in anything that will make them money. The first company to bring a cure-all for cancer to market is going to make so much cash that they'll drown in it. However, antibiotics are a place where the wonderful free market just fails. Unless there is some sort of subsidy, significant numbers of new antibiotics aren't going to be developed unless the drug resistance problem gets a whole lot worse.

[*] Yes, I know there are pointless Salon articles claiming that the real cost is 47 cents. They're talking bollocks.

Comment Re:Patents aren't helping (Score 1) 437

Their $55M estimate is a complete fantasy, though. If it were that cheap, there would be an awful lot more new drugs out there. There aren't. For example, go to and look at the graphic. Over the last decade, there have been on average 20 new drugs approved per year. If each one only cost $55M, then any one of the twenty biggest pharma companies could have afforded the lot out of small change. Given the financial trouble that all of them are currently in with existing drugs coming off-patent, you have to wonder why they haven't been doing that.

Comment Re:Ambiguities (Score 2, Informative) 110

And how is this different to the USA refusing to recognise or honour European copyrights for most of its history? Charles Dickens' novels were widely published in the US without any payment whatsoever to him. It was only when the US developed a big enough internal copyright industry (who wanted their copyrights recognised in Europe) that any attention was paid to any non-American "intellectual property". Before that, the USA "took it all for free whilst sat on their ass", as you so delicately put it.
