
Comment Re:But that's not the real problem. (Score 1) 1651

Of course there are. The statistics most people do is 2-variable, but statistics generalises perfectly well to multiple variables. The quality of a study may need a subjective assessment, but you can assign it a value and it becomes another dimension of your data; then you can find out how it correlates with the other dimensions. Some of the stuff I do on finding hidden structure in large, complex graphs/networks uses techniques that are also used in multivariate statistical analyses.

I agree there may be a need to filter for medical treatment studies. I agree pharmaceutical companies have manipulated the public record (from which meta-studies must draw) with poor studies. However, they have also manipulated the public record by _withholding_ studies from it. (By coincidence I'm reading Ben Goldacre's "Bad Pharma" at the moment, which takes issue with that withholding trick and its distorting effect - a very interesting book so far.)

One of the problems with judging helmet efficacy, of course, is that it is *very* hard to do well-controlled studies. Intrinsically, if we want statistics that measure real-world efficacy of helmets, we're going to have to work with extremely noisy signals. The best way to deal with noise is to use as much data as possible (and control for the differences in methodologies).
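A rough sketch of why "more data" beats noise: when you aggregate n independent, equally noisy estimates, the standard error of the pooled mean shrinks roughly as 1/sqrt(n). The sigma value below is an arbitrary assumption for illustration.

```python
import math

# Assumed noise (standard deviation) of a single study's estimate - made up.
sigma = 0.5

# Standard error of the mean across n pooled, independent estimates:
# se = sigma / sqrt(n), so 100x the data gives 10x the precision.
for n in (10, 100, 1000):
    se = sigma / math.sqrt(n)
    print(n, round(se, 4))
```

This is the statistical argument for aggregating over methodological differences rather than throwing data points away, though in practice the estimates aren't fully independent, so the gain is slower than 1/sqrt(n).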

BTW: The limited-time period trick has been used by studies that found helmets had a large positive effect too. ;)

Comment Re:But that's not the real problem. (Score 1) 1651

I'm using bias in a non-pejorative, technical, statistical sense. Clearly one of the following holds: a) there is some underlying bias whereby studies in which helmets were less effective were also less likely to meet Thompson's Cochrane review standards; OR b) there is a bias in the Cochrane review; OR c) there is a "biased" coincidence (random chance creating a pattern that looks like bias).

FWIW, I have both studies open. They both have good discussions on this. I think we'd both be better off just reading them and deciding from that. I'm sticking to my belief that bias is best dealt with by gathering and aggregating more data and using mathematical tools to deal with any differences in methodology and results, rather than using more subjective "quality" criteria to exclude data-points.

One thing: the Elvik meta-study uses a different data-set to the Thompson Cochrane Collab. paper, which is interesting. The Elvik paper updates a 2001 meta-study, Attewell, with new data-points. The Cochrane paper uses some of the same primary papers, plus additional ones. There are primary papers in the Attewell study, though, which the Thompson paper did not find at all - even though it found the Attewell meta-study itself. Similarly, there are primary papers in the Thompson paper which pre-date Attewell significantly, but which Attewell does not mention. This makes me think the search strategies in both those meta-studies might have been sub-optimal - unless there's some important factor I've missed. If I haven't missed anything, it'd be interesting to see a meta-study with search criteria that caught at least all the primary works in those two meta-studies.

Also worth noting is that several of the studies excluded from the Thompson paper were published in Accident Analysis & Prevention, as was the Attewell meta-study.

Comment Re:But that's not the real problem. (Score 1) 1651

I don't have expertise in or much knowledge of the medical world, but my mathematical instincts are not comfortable with "less is more" with respect to statistical analysis. The answer to noise is to aggregate over more data, not to have humans apply their judgement as to which signals are and are not representative. That path is certain to result in misleading conclusions at times, even with the best intentions from the best experts.

Comment Re:But that's not the real problem. (Score 1) 1651

And it remains curious that the studies the Cochrane review felt had to be excluded have such an effect on the result. Whether studies that show less benefit and/or increased other injuries with helmets are biased towards less rigorous methodology, or whether it was selection bias by the Cochrane study authors, I don't know. However, the Elvik study strongly suggests there is some kind of bias *somewhere* in the previous work (the primary research and/or the meta-study).

Comment Re:But that's not the real problem. (Score 1) 1651

I don't doubt him. Still, we don't know which of the two meta-studies is more representative of reality. I'm going to stick to my general view though that these questions are best answered by aggregating more data, rather than less. I look forward to future meta-studies revising the results of these two with even more data, when it becomes available.

Comment Re:Erroneous explanation correction (Score 1) 1651

I'd love to know where you live, because I really want to go there to see these somersaulting 90-year-old grannies. Or perhaps you're not being honest about how you cycle, or you're confusing and mixing together different parts of your life. Maybe your beloved helmet didn't quite protect you as well as you believe from all those potentially fatal impacts you've had while doing geriatric-style freestyle moves? (Laugh: that's humour - hope you can see the funny side.)
