
Comment Re:Two to five YEARS??? (Score 1) 180

In the UK, there is already an accelerated progression through the trial stages for acute, terminal disease. Several cancers fall into this category (e.g. late-stage pancreatic and hepatocellular, IIRC), meaning that patients suffering from the late stages of these very-hard-to-treat diseases are brought together with researchers who're keen to test their new treatments.

It seems like a great idea in principle, but it has a slightly weird consequence. It's now much easier to run a trial on late-stage, terminal cancers than it is to run one on early-stage cancers. This means that:

(a) New treatments are much more likely to be tried only on patients whose disease is already advanced and resistant to standard treatments, and who're likely suffering from a load of secondary problems. As a consequence of testing therapies only on the most difficult patients, it seems likely that we're dismissing novel therapies that might've provided benefits at an earlier stage, or that would've been an incremental improvement for the patients already helped by current therapies.

(b) When applying for research funding and planning out projects, groups are biased toward treatments that are likely to show a benefit in late-stage patients (because that's where the first trial is likely to be), at the expense of research into intervening at a relatively early stage, or improving the lot of patients who respond to current treatments but with horrible side effects.

My impression is that this accelerated/compassionate licensing is probably a significant net benefit for patients, but it's important to think about how shifting the regulations about trial design will modify the pressures on -- and therefore output of -- the research community.

Comment Original paper? (Score 2, Insightful) 180

Can anyone find the original journal article? From a fairly quick PubMed search, James' group last published on TRIM21 back in 2008. There have been a few papers on TRIM21 in 2010, but they're not from James' institution and they don't share any authors with James' 2008 paper.

Or is this being reported before the paper has been published? Do we know that it has even been properly reviewed?

This is really cool if it's true and it's relevant to my research, so I'd love to see the original paper.

Comment Re:Long nursing shifts (Score 1) 520

I know a lot of doctors and medical students in the UK (although they're all from two or three hospitals), and can tell you that these communication skills are a huge part of the modern curriculum. Hearing the younger doctors talk to each other about work, it's obvious that some of the information exchange is formulaic; they've had standardised patterns of communication drilled into them to make sure that everything about a case is put across. In formal settings, this "protocol" includes error-checking, i.e. making sure that the recipient has understood the message. As part of this, the responsibility for communication has shifted: if, for example, information is lost when a doctor talks to a nurse, the doctor is responsible for failing to communicate effectively, rather than the nurse being responsible for misunderstanding.

Comment Re:Stop animal testing - cruel and ineffective (Score 4, Informative) 59

Really, really, no. I've co-authored a paper on a stochastic model of a particular biological system, so I have some insight here. Think about weather forecasting: we have a firm understanding of the underlying physics, the environment isn't terribly complex (air and moisture of various temperatures, flowing over landmasses and seas, heated by the sun) and yet we're absolutely shit at it. We simply don't have enough information or processing power to build a decent model of this relatively simple but chaotic system and see where it's going to go.
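To make the chaos point concrete, here's a minimal sketch (my own toy illustration, with the textbook Lorenz parameters, not anything from a real forecasting model): integrate the Lorenz system twice from starting points that differ by one part in a million, and watch the trajectories fly apart.

import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz equations (crude but fine for illustration)."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])   # a tiny "measurement error"

for step in range(3001):
    if step % 500 == 0:
        # separation grows exponentially until it saturates at attractor size
        print(f"t = {step * 0.01:4.1f}, separation = {np.linalg.norm(a - b):.3e}")
    a, b = lorenz_step(a), lorenz_step(b)

Even with the physics known exactly, the initial-condition error swamps the prediction within a few dozen time units, which is exactly the problem forecasters face.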

Now scale this to a human cell. The environment inside a cell is enormously complex, containing millions of proteins, nucleic acid structures, lipids, carbohydrates, etc. of many thousands of different types. For the vast majority of these, we have at best incomplete guesses about their function, shape, charge distribution, stability, etc., or about how any or all of this changes in response to pH, temperature, or binding to one or more other proteins/carbs/lipids/etc.

Now scale this up from a cell to a section of tissue. We don't have a clear understanding of all the signals that cells send and receive between themselves, how they sense the extra-cellular environment and what their reactions might be. We have a huge amount of solid evidence, but we know that there's a lot going on that we can't currently detect or understand. Now scale up to a whole organ, a whole biochemistry, a whole patient...

Computer modelling is coming along, but a model of a system can only ever be as good as your understanding of that system. As the computer types say: Garbage In, Garbage Out. Our understanding of biology is in a period of truly inspiring growth, but still woefully incomplete. The paper I worked on was a bit of a breakthrough in the techniques it used (it wasn't my breakthrough, I'm not a mathematician), but for the model itself we had to make some really ugly assumptions and omissions, and had to start with some very dubious input data.

Fantastic advances are being made and it's a tremendously important field of research, but it's limited by the progress of "proper" biology. I'd bet patients' lives on the weather forecast before I bet them on the current state-of-the-art biological computer models.

Comment Re:Stop animal testing - cruel and ineffective (Score 5, Insightful) 59

Animal testing has never really worked. Animal tests proved penicillin deadly, strychnine safe and aspirin dangerous.

In fact, 90 percent of medications approved for human use after animal testing later proved ineffective or harmful to humans in clinical trials. It is humbling to realize that the flipping of a coin would have proved five times more accurate and much cheaper.

Animal testing has never worked perfectly. I can't find citations for your claims about those three drugs (although I happen to know that the first use of penicillin was in mice injected with staphylococcus: it saved the mice and led to a very rapid research programme that culminated in large-scale production, saved many thousands of soldiers' lives in WWII, and ultimately gave us all the antibiotics we rely on today), but I'll cheerfully concede that drug tests in animals can give misleading results. A lot of this is arguably because the results are misinterpreted, but there's no denying that our biology differs in various ways. Some of those differences are well understood; others occasionally take us by surprise.

Your other point is an obvious statistical fallacy. It may be true that 90% of trials fail after animal testing. What's important to know is how many unnecessary trials of useless drugs have been prevented. Without animal testing, instead of 90% of human trials failing, the number would be more like 99.999% failure. Even ignoring the astronomical costs of these trials (in terms of both money spent and extra lives lost while waiting for a cure), while some of these failures would be benign, others would visit terrible side-effects on the volunteers.
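To see why the "a coin would do better" line is a base-rate fallacy, here's a back-of-envelope version with invented but plausible-order numbers (the specific rates are my assumptions, purely for illustration):

candidates = 1_000_000
true_rate = 1 / 10_000      # assume 1 in 10,000 candidate compounds actually works
false_pass_rate = 0.001     # assume animal tests let 0.1% of the duds through

true_successes = candidates * true_rate                        # 100 real drugs
false_passes = candidates * (1 - true_rate) * false_pass_rate  # ~1,000 duds

trials = true_successes + false_passes
print(f"human trials run: {trials:.0f}")                   # ~1,100
print(f"trial failure rate: {false_passes / trials:.1%}")  # ~90.9%

# the coin-flip "filter": half of everything goes to human trials
coin_trials = candidates / 2
coin_failures = coin_trials - true_successes / 2
print(f"coin-flip trials: {coin_trials:.0f}")                      # 500,000
print(f"coin-flip failure rate: {coin_failures / coin_trials:.2%}")  # ~99.99%

A ~90% post-screening failure rate is entirely compatible with the screen having rejected 99.9% of the junk; the headline failure rate tells you nothing without the base rate.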

Animal-tested drugs have killed, disabled or harmed millions of people and lead to costly delays as well.

Probably true. However, animal-tested drugs have also saved many, many more. Gigantic net benefit. As a side note, the eradication of smallpox directly killed thousands of people through reactions to the vaccine (the earlier versions were less safe than the modern ones). But we still say it was a good thing, because it has saved many millions more. Like it or not, public health is a numbers game, where all we can do is shoot for the best net benefit.

We have spent billions of dollars to cure cancer in mice, but so far have failed to replicate human cancer in any animal, let alone close in on a cure. All but a very few diseases are species-unique, and the only efficient and effective way to discover cures and create vaccines is through the use of the same species cells, tissues and organs.

Cancer is, at best, a family of diseases, not a single disease. There is not and will never be a single "cure for cancer". There are, however, excellent treatments for certain kinds of cancer, many of which (chemo drugs and oncolytic viruses) could not exist without extensive work in animal models. Animal models teach us a huge amount about cancer development and progression, the tumour micro-environment, interactions with the immune system, the kinetics and diffusion properties of drugs, etc. You can join the argument that the data we get isn't perfect, but everyone involved already knows this. The counter-argument is that we have a choice between this and nothing at all. "Efficient" and "effective" might be true if we had an unlimited supply of human tissues, organs and whole people to experiment with. Sadly, the ethics boards at my university are all up-tight and like to see that *something* living can tolerate and show benefit from the treatment before we start injecting random chemicals into cancer patients. Killjoys, I know.

The use of animals as models for the development of human medications and disease almost always fails, simply because humans and animals have different physiologies.

Different in some ways, very, very similar in others. The trick is to work out which ones are which, and the people running multi-$million research institutions are often pretty smart. Not perfect, but this objection is something that has occurred to them and that they strive to take into account.

yet you can't visit a laboratory and see how the government has spent your money.

Largely because they're worried about a constant stream of protesters and/or random uninformed members of the public disrupting and occasionally destroying the work in progress. A possible additional factor is that animal labs are generally run as quarantine zones to prevent infections getting into the stocks. A stream of untrained people, even with the best of intentions, would make contamination rates skyrocket. For what I imagine are broadly similar reasons, there are all sorts of places I'm not allowed to go to monitor my tax spending... I can't turn up to inspect my local hospital, monitor my local school, or play with the machines on my local air force base. Money-grabbing govt conspirators!

Animal experimentation is a multibillion-dollar industry fueled by massive public funding and involving a complex web of corporate, government, and university laboratories, cage and food manufacturers, and animal breeders, dealers, and transporters. The industry and its people profit because animals, who cannot defend themselves against abuse, are legally imprisoned and exploited.

Animal research is not perfect, and has never been claimed to be so. Even with all the strict regulations and procedures in place to minimise animal suffering (the propaganda from animal rights groups is generally decades out of date at best, deliberately misleading or lies at worst), it's ethically and emotionally troubling: everyone involved in it regards it as a necessary evil. But there's no question that, without it, improvements in our understanding of and treatment for disease would grind to a virtual halt. At the end of the day, you have to make a choice: animals or people. I've made my choice, and am willing to work on animal models in the hope that the fruits of my work will save human lives.

Comment Re:Free papers (Score 4, Informative) 59

Scientists would love for all of our papers to be open-access. Even ignoring the big ideological reasons (what's the point of discovering this stuff if we can't tell everyone?), our career progression is almost entirely dependent on people's recognition of our published work. We want as many people as possible to read, build on and cite our work, because that's how we build the reputations we need to get funding, jobs and groupies.*

The problem is that a big part of the way our publication record is assessed is whether our work was published in "high-tier" journals, i.e. the journals that print the most often cited (therefore deemed to be best quality) papers. These journals are almost all closed-access (Nature, Science, Cell, etc.). Worse, they demand that you transfer copyright over to them so you're forbidden from giving copies of your papers away.

A few larger organisations have managed to negotiate better terms. For example, work funded by various governments (most or all of the EU states, USA, etc) or big, influential charities (e.g. Cancer Research UK) can (and must) be released for free, generally at least six months after initial publication. This sort of negotiation is possible for influential funding bodies, who could otherwise insist that labs receiving funding boycott closed journals. However, an individual scientist can only try to fight the system by submitting their work to open-access journals. This is noble but, without work published in high-tier journals, they're really destroying their chances of getting ahead in a fiercely competitive funding and job market. A lot of scientists hate the current publishing system but, really, they have us by the balls.

*I can dream. Shut up.

Comment Re:Special 2-D glasses needed (Score 3, Informative) 495

Yes. Or at least, my red-blue colourblind dad could.

It makes sense because the colour filters are used to make sure that each eye only gets light from one of the two overlaid images; each eye is effectively just measuring the intensity of the light that gets through the colour filter of the glasses, and has no need to distinguish between colours.
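As a sketch of the idea (my own toy example, assuming a plain RGB frame stored as a numpy array):

import numpy as np

rng = np.random.default_rng(0)
frame = rng.random((4, 4, 3))   # stand-in for one red/blue anaglyph frame

left_eye = frame[:, :, 0]    # red lens passes (roughly) only the red channel
right_eye = frame[:, :, 2]   # blue lens passes (roughly) only the blue channel

# Each eye now receives a monochrome intensity map; the brain fuses the two
# viewpoints into depth. Colour discrimination plays no part after the filter.
print(left_eye.shape, right_eye.shape)   # (4, 4) (4, 4)

The filters do the colour separation before the light ever reaches the retina, which is why colour blindness doesn't break the effect.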

Comment Re:Justice (Score 5, Informative) 353

I can't speak for the rest of the EU, but in the UK the "fit for purpose" law is a surprisingly powerful bit of pro-consumer legislation. As well as requiring that a product actually does what the manufacturer claims that it does, the law also covers:

a) Functions that any reasonable person would expect the product to have, based on the advertising but also on similar products on the market. This doesn't obviate the customer's responsibility to do some research; it just covers too-obvious-to-check omissions, like a brand new DVD recorder turning out not to play DVDs.

b) A robustness and lifespan that any reasonable person would expect the product to have. In the UK, all electrical goods worth more than a certain value (and some other classes of goods) are automatically guaranteed for one year as part of the customer's statutory rights. More interestingly, each type of product may also be guaranteed for a longer period based on what seems "reasonable". For example, a washing machine or cooker would be expected to last for several years under regular use before needing replacement or major repairs; if it fails within that timespan, the customer can return it (making those rip-off "extended guarantee" offers doubly useless). Better yet, the onus is on the shop to prove that the failure was due to your misuse, not on you to prove that it was a poor design or a manufacturing defect.

Surprisingly few people know about these rights, and there's a simple reason why. If a product lacks features or develops a major fault too quickly, it's the shop's responsibility to replace the product or offer a refund to the customer; the shop owner is then left with the problem of getting that money back from the manufacturer. As you might imagine, they're not exactly keen to be in this position, and so consumers are never told about it.

If the shop says "no" or tells you that you need to talk to the manufacturer yourself, they're either ignorant or lying. In which case, your next step is to get in touch with the Citizens' Advice Bureau and/or the Trading Standards Office, who are responsible for advising people about and enforcing the relevant laws, respectively.

Comment Why Not? (Score 1) 194

I've never really followed the arguments behind why everyone hates software patents. I'm not trolling here, please help me understand.

As I understand it, the idea behind a patent is to encourage an inventor to invest resources in R&D and then to share their new technology with society, in return for a time-limited monopoly on exploiting that new technology. This is arguably a bit broken at the moment -- largely because patents seem to be overly broad and to last too long -- but the basic idea seems sound.

If I invent a new physical device -- an array of levers and cogs to build something, or a new chemical process to manufacture something -- I can patent it. I've put loads of time and effort into finding a new way to manipulate physical objects to either perform a new process on them, or to perform a new function. If it's useful and novel, I can submit my plans and society grants me a patent.

However, if I invent a new algorithm or piece of software, society isn't willing to make the same deal with me. I see these as analogous to inventing a new machine part or a new device for someone's home. My invention is manipulating information instead of physical objects, but it's still useful and novel, and it's still improving a process or performing a new function. It's also still the result of considerable investment of time and resources.

I've seen the argument that information isn't patentable because it's easily copied; this doesn't work because it's the plans that are patented, and the blueprints for a machine part are as easily copied as a new algorithm or search routine. I've also seen the argument that patenting an algorithm harms companies that need to use that algorithm in their products, but I don't understand that either: obviously a really broad patent for e.g. "using subroutines" shouldn't be awarded any more than one for "using levers" in a physical device. However, a new technique for manipulating information with a specific and narrowly-defined purpose seems more analogous to patenting the coaxial escapement, an innovative improvement to the efficiency of a machine part with wide application. That seems pretty reasonable to me. Finally, I've seen the argument that the field of software development moves too quickly for patents to have a net benefit; this may be true, but it seems like an argument for shortening patent life rather than abolishing patents entirely. All the other arguments I've seen are basically along the lines that the system is poorly administered and should therefore be removed entirely; why not push for a better-administered system instead?

So I see arguments for improving the current system, but few for destroying it. Conversely, there do seem to be advantages to keeping software patents. For example, let's say that tonight a radical new process for handling search results comes to you in a dream. You could put in time and effort to research it, hone it, prove that it works, and prepare your product. In a world with a functioning software patent system, you can then sell your IP to Google and live out your days on a private island populated entirely by scantily-clad people of whatever gender floats your boat. Without patents, you could put in all that time and effort, but the only way you could benefit from it is by starting your own Google competitor (good luck) and praying that no-one else ever works out or steals your algorithm to immediately copy it (again, good luck).

The software patent system may be in need of repair, but is it really worth throwing the baby out with the bathwater?

Comment Re:Thank you Facebook (Score 1) 375

Does the fact that you uploaded the data onto their system give them ownership of it in perpetuity?

IIRC, when I signed up to FB, the Terms and Conditions explicitly said that they own the rights to anything you upload: pictures, text, video, etc. I think they said it wouldn't be sold to third parties, but they do have the right to keep it forever, show it to other members and use it for any other purpose, including advertising the site.

If you believe the Consumerist and similar slightly-hysterical sites, the newer T&Cs do give FB the right to sell your data, e.g. selling photos to image banks if FB ever goes bust. I haven't looked at the newer agreement in detail, though, so I can't vouch for this.

Comment Re:Long winded troll (Score 1) 429

A lot don't. I work in biological science, and even with my mediocre maths education (I had four lectures on stats during my undergrad, plus one afternoon at the start of my PhD; everything else I've had to teach myself) I see a lot of people talking about statistical tests that they clearly don't understand.

It's sad but true that a lot of people end up in biology because they love science but can't handle the maths required by physics or even advanced chemistry. While there are plenty of exceptions, there's a very strong tendency to treat statistical tests as black-box tools: plug in the numbers, get an answer and don't worry too much about whether it's an appropriate test or what the answer actually means. The article's example of people misunderstanding the meaning of a p value from Student's t-test is actually distressingly common. Other things -- like designing and drawing conclusions from experiments without ever considering power calculations -- crop up a lot too.
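For what it's worth, here's the sort of power calculation that often gets skipped, sketched with statsmodels (the effect sizes and thresholds are just illustrative choices of mine, not from the article):

from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# How many samples per group does a two-sample t-test need to detect a
# medium effect (Cohen's d = 0.5) at alpha = 0.05 with 80% power?
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"samples per group needed: {n_per_group:.0f}")   # roughly 64

# Conversely: with the n=3 replicates common in bench biology, the power
# to detect even a large effect (d = 1.0) is dismal.
power_n3 = analysis.power(effect_size=1.0, nobs1=3, alpha=0.05)
print(f"power at n=3, d=1.0: {power_n3:.2f}")

Five minutes of this before running the experiment tells you whether a null result will mean anything at all, which is exactly the step that gets missed.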

The best area I've encountered so far is bioinformatics, which tends to be the realm of programmers and statisticians who've become interested in biology, rather than the other way around. I'm not in a position to give an informed assessment of their work, but the sheer pain on their faces when advising maths-impaired biologists on study design is a pretty solid sign that they're used to a much higher standard :).

Comment Targeting is the big problem (Score 4, Informative) 97

This is a cool variation on a basic idea that's been used before, and will make a great payload for cancer treatment. However, killing cancer cells is not all that difficult; targeting cancer cells is the hard part. It's all about the therapeutic index, i.e. the ratio of the damage done to cancer cells to the damage done to healthy tissue.
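In its classical dose form, the index is just a ratio; a toy illustration with numbers I've invented purely for the example:

def therapeutic_index(td50, ed50):
    """TD50 / ED50: the dose harming half the healthy tissue, divided by
    the dose achieving the desired effect in half the targets. Higher
    means a wider safe dosing window."""
    return td50 / ed50

print(therapeutic_index(td50=200.0, ed50=10.0))  # 20.0: comfortable window
print(therapeutic_index(td50=12.0, ed50=10.0))   # 1.2: barely any safe window

A brilliantly lethal payload with an index near 1 is useless: you can't give enough of it to clear the tumour without wrecking the patient.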

Talking about cancer as "a disease" is a misnomer; at best it's a huge family of diseases (really nice explanation in this comic). Patterns do emerge -- certain tissues tend to have similar patterns of gene expression between people and therefore tend to give rise to similar cancers -- but each cancer that arises comes about in a different way, and evolves in response to different selective pressures within the body. The biggest of these pressures are fairly obvious, like the need for nutrients (so "successful" cancers are the ones that evolve the ability to encourage blood vessels to grow around them) and evading the immune system. So, almost by definition, the outside of a cancer cell is forced to look as similar as possible to the outside of a healthy cell in the same tissue, to avoid detection.

There are some exploitable internal differences. Most cancers (but by no means all, or even close to all) express hTERT, a gene responsible for repairing the telomeres, whose degradation would otherwise limit the cells' replication. So some researchers (including my former lab) are working on techniques to exploit that, e.g. viruses that can only kill cells expressing hTERT. The downside is that some legitimate cells also express hTERT, most notably your stem cells (bone marrow, some other tissues).

Another popular method is just targeting all cells that are highly metabolically active. Cancer cells tend to be working unusually hard (most cells in your body just sit there gently ticking over most of the time), so some cancer therapies target any cells that are burning through a lot of glucose (e.g. radiolabelled glucose is used as a tracer for imaging techniques like positron emission tomography) or that are doing a lot of DNA replication as part of cell division. Again, though, this targets many cells in your body which are working this hard as a normal part of their programmes.

So, yeah, this is a cool payload but targeting is the hard part. If we knew what ligands to tie these particles to for targeting and how to persuade these huge particles to move against a pressure gradient and through a dense, disorganised extra-cellular matrix, cancers wouldn't be half the problem that they actually are. We could be using targeted viruses (piece of piss to do if you know what you're targeting and the surrounding tissue isn't too dense), metal nanoparticles, targeted liposomes (little hollow balls of fat) containing toxins or toxin precursors, modified antibodies to alert the immune system to the cancer cells, etc, etc.

Curing a cancer would be pretty easy: throw enough researchers and resources at one patient's specific tumour and we'll come up with a damn fine treatment. But curing all cancers -- different tumours arising from different tissues in different patients -- is seriously hard. We'll see fantastic advances in treating specific cancer types, but I seriously doubt that "a cure for cancer" is possible within our lifetimes. Although, heh, if you prove me wrong I won't be too upset :).

Comment Not their fault: scientific publishing model sucks (Score 2, Insightful) 67

Given the choice, all scientists would probably publish their research freely; it's actually pretty common practice in physics and maths. However, in other fields -- including biology -- this isn't realistically possible.

A scientist's career and a department's funding are entirely dependent on their reputation, which is almost completely dependent on getting your work published in high profile (a.k.a. "high impact factor") journals. In order for these journals to accept and publish your work, you have to sign over copyright to the publishing company and agree that you won't distribute the article for free. Scientists get completely shafted in this system: we raise money, do the work, write the article, sign over copyright to the publisher, then pay for the privilege of them selling our work for their own profit. Then we're contractually forbidden from passing on copies of our work to interested colleagues (or potential employers, etc.), much less the wider world.

There are some exceptions to this. In the UK, certain funding bodies and research charities insist that all work funded by their money must be made freely available, either at time of publication or, more commonly, after a delay of half a year or more. In the USA, work funded by the NIH must be made freely available. This is still generally restricted to the researcher's own version of the paper (i.e. without the journal's professional typesetting), but at least the information gets out.

Scientists hate this system, but an individual scientist simply doesn't have the bargaining power. You want to negotiate with a journal? They'll simply refuse your paper and run one of the tens or hundreds of others competing for your spot. Want to make a principled stand and only submit to open-access journals? You can, but you can basically kiss your career and funding prospects goodbye. So it's simple pragmatism: not many people are willing to risk throwing their careers away in the fight to let non-professionals (and a huge number of cranks, if you've ever read the Nature comment boards) read their articles for free.

Comment Re:Fix how it handles tabs (Score 1) 223

I use saved tabs as a sort of transient, rolling favourites folder for sites that I'll need next session but probably not after that. For example:

1) Today I installed OpenSuse for the first time. I've had lots of tabs open on wikis, FAQs and HowTos while sorting out various issues (Take pity, I'm a n00b). Every time I reboot or log out then log back in, all of those tabs re-open and scroll down to where I was last reading them. Very handy.

2) In work I need to read a lot of scientific papers. My normal pattern is to run a few searches and open up all the likely-looking articles in new tabs, then screen them for the articles that I actually have access to. Finally, when I have the relevant articles for the points I want to write about (10-30 tabs), I start reading them one by one. When I get to the end of the day I need to shut down the computer, but don't want to lose all my latest searching/screening results. I could store them as favourites, but I'll probably never need 99% of them again; saving the session is ideal as it'll remember all the tabs, what order they were in, where I'd scrolled to and which one I had open. It's perfect for the way I work, analogous to leaving my textbooks, printouts and notes open on my desk for the next morning.
