
Comment Wider problems in how to design tests (Score 2) 109

The article made good points, but the issues are wider still: much more than the choice of test-animal strain can make for problems in effective testing.

Nowadays some new pharmaceutical product-candidates are designed and intended to work by interacting specifically with very human-specific features of molecules present in the eventual treated patients. Sometimes product-candidates of this kind are not expected to interact with the corresponding non-human animal substances in any similar way at all.

An example lies in the specificity of human-antibody-related products (some of them intended for use against types of cancer). Their effects may be hard to mimic and test in any non-human animal subject whatever.

This makes for much harder problems in test design than in the more straightforward old days of (for example) testing candidate antibiotics. That involved checking that the material does kill the target bug and does not damage the treated animal or human subject; in the past, observations of test animals often gave very good indications of what would happen next when the substance was given to humans. (Caution is still needed, and clinical trial regimens accordingly have to include careful human safety testing as a follow-up to successful and careful animal safety testing.)

But when the product candidate is supposed to interact in a special way only with very human-specific substances, its safety and efficacy somehow have to be tested effectively before it gets to humans -- but how, when no non-human animal can be expected to show the same type of effects, whether wanted or unwanted?

This new twist to the problem of test design has not always been addressed successfully. A tragic example occurred a few years ago, when a modified antibody whose design incorporated very unusual and specifically human molecular interactions passed the animal safety testing that had been decided on, but then went on to injure the first few human test volunteers severely, by causing major acute inflammatory effects not seen in the animal tests.

The issues go well beyond selection of strains of test animals, and sometimes the solutions may have to be developed on a case-by-case basis.

-wb-

Comment Re:Overprescribing and sources of drug resistance (Score 2) 433

I believe you're making some unjustified presumptions. It wasn't relevant in the original post to go into so much detail, but the patient in question was an insulin-dependent diabetic, and the first physician certainly should have seen a clear reason for using an antibiotic in an infected -- or even a probably infected -- patient of this category. There was no mistranslation or misunderstanding by the patient; but there are frequently a whole lot of unjustified putdowns of patients by folk who assume, with much hope but little reason, that the physician is always right.

btw, you also misrepresented my earlier post, which did not mention any role of 'not finishing the prescription': what I did mention was antibiotic courses being too short. There are two causes, not one, of antibiotic courses being too short: the patient may not finish the course, as you suggest, or the course prescribed may not be long enough. So here is another example of presuming, in the absence of facts, that the patient is to blame: it seems you could not conceive of any cause of a too-short course other than the patient not taking the medicine as prescribed.

-wb-

[ColdWetDog wrote:]
I suspect that the 'ideologically taken' physician is relatively rare (never say never in medicine). It is much more likely that the first physician did not see a clear reason for using antibiotics, the second doc already had the benefit of the first doc being 'wrong' and he/she did the obvious second thing - start the antibiotic. The rationale gets mistranslated or misunderstood by the patient - or simply was never explained in the first place. In that respect, the system did it right - try to avoid the antibiotic if at all possible, if not use it wisely.

Comment Overprescribing and sources of drug resistance (Score 1) 433

The article misrepresents the position - antibiotics don't "encourage bacteria to develop new ways of overcoming them", they just leave behind bacteria that have more resistance. It's therefore very important to go Darth Sidious on their ass and "wipe them out. All of them.", or the few that remain will multiply, unobstructed by their cream-puff peers who are all dead now.

Yes, the parent post makes an important point -- at least, the chance of fostering resistant bacteria is higher if a treatment course of antibiotics leaves any of the bacteria still alive, which might happen if the course was too short, or if the dose was not high enough to reach a lethal concentration in some poorly-perfused place in the body that the bacteria were using as a hideout (e.g. sometimes tooth or bone). Then the surviving bacteria, descended from those exposed to a sublethal dose, _may_ have more resistance.
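
To illustrate just the selection arithmetic, here is a toy sketch in Python. All the starting numbers, kill rates, and growth rates are invented for the illustration -- it is not a pharmacokinetic model -- but it shows the shape of the argument: a full course clears both strains, while a too-short course leaves survivors enriched for resistance.

    # Toy model: two strains, one drug. A short course leaves survivors,
    # and the survivors are enriched for resistance; a full course clears both.
    def course(days, sensitive=1_000_000, resistant=10):
        s, r = sensitive, resistant
        for _ in range(days):
            s = int(s * 0.1)   # drug kills ~90% of sensitive bacteria per day
            r = int(r * 0.8)   # ...but only ~20% of the resistant ones
        for _ in range(7):     # after the course, any survivors regrow
            s, r = s * 2, r * 2
        return s, r

    for days in (3, 10):       # too-short course vs full-length course
        s, r = course(days)
        total = s + r
        share = r / total if total else 0.0
        print(f"{days}-day course: {total} bugs left, resistant share {share:.1%}")

With these made-up numbers the 3-day course leaves a population whose resistant share has risen several-hundred-fold, while the 10-day course leaves nothing to regrow.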

There's a problem, though, with the existing stories about how to discourage the growth of bacterial resistance to antibiotics. The stories tend to concentrate on human antibiotic prescriptions, while often ignoring the enormous use of antibiotics in non-human animals, especially in agriculture. Human physicians sometimes forget the veterinary role in causation. Some of them are so ideologically 'taken' with the message 'be restrained in your prescriptions' that I have seen a patient who really needed an antibiotic actually refused it on environmental grounds. She got sicker and sicker until another physician at last acted with some common sense, and then (thankfully) the effect of the medication was as dramatic as in the early accounts of antibiotics from the mid-20th century, when their use first spread.

-wb-

Comment Would switch, kept with Windows by one function (Score 1) 1880

Much of the discussion about why/why not switch to linux seems to stick to very generic arguments.

For me, the generic argument has been made, and it does come down on the linux side. But the specifics then simply compel me to the opposite decision. It might be useful if I describe how this happens.

I run dual-boot systems with both Windows and linux, but I find myself booting up linux less often rather than more often. There is one specific function that I can't work without, and I can't get a linux system to do it. No matter what else linux will do, failing at that one specific thing is what bars my way from exiting Windows for linux -- which otherwise I would like to do. Maybe it is not a very difficult function, but it seems to call for more cooperation between sub-systems than actually exists.

Maybe the details don't matter so much in this thread but here they are in case they show more than what's just been said.

It's only about viewing and printing pdfs. When I want to print a pdf that is basically a scanned image on my laserjet, I want to see a thumbnail of the image as it will sit on the page, one that reflects the current state of the image magnification control. Then I often adjust the magnification, to adjust the fit with the page, before I hit the 'print' button.

Only certain s/w combinations in Windows do this; the rest (and all the linux options I have tried) provide instead uninformative symbols that do not include the needed adjustable pre-print thumbnail. My questions at various times and places have got me nowhere: answers have been either generic ('use xpdf' -- which I've tried), without relating to the specifics of the question, or no answers at all.
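
Fwiw, the scaling step itself can at least be scripted, even without the live thumbnail I'm after. Here's a minimal Python sketch of that clumsy workaround, assuming the pypdf library is installed; the filenames and the 0.9 factor are just examples. Rescale, write a preview file, eyeball it in any viewer, and print it if the fit looks right.

    # Rescale each page of a scanned pdf, then inspect preview.pdf in any
    # viewer before sending it to the printer. Filenames are examples only.
    from pypdf import PdfReader, PdfWriter

    reader = PdfReader("scan.pdf")
    writer = PdfWriter()
    for page in reader.pages:
        page.scale_by(0.9)        # try different factors to adjust the fit
        writer.add_page(page)

    with open("preview.pdf", "wb") as out:
        writer.write(out)

But that is exactly the iterate-by-hand loop that a decent print dialog would collapse into one live preview.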

This looks to be a function that no-one else is interested in, so by itself it obviously can't account for any slowness of linux adoption. But if there are other folk who might switch, each with their own different individual wants, and they meet similar obstacles, then in the aggregate this kind of gap could account for a lot.

-wb-

Comment js problems (Score 0) 185

The reason Javascript is the most popular is obvious (to me at least): the web is based primarily on three languages - HTML, CSS, and Javascript.

The article said it is overrepresented in SO queries. That's 'popular' in something like the way a highly prevalent disease might be called 'popular'. (I don't deny it may be genuinely popular as well.)

But the features or bugs of js that make its problems overrepresented on SO are probably linked to the characteristics of js that make me groan whenever I see that a webpage demands the use of it.

I wish it could be outlawed!

-wb-

Comment Re:Frivolous patents and lawsuits (Score 1) 151

Ok, Novell put up a stalwart defence and won it, yes, kudos to them. And as you say, it took 6 1/2 years.

But did they get SCO nailed for bringing an action that was frivolous or vexatious in the first place? I don't recall reading that this was part of it. So we might still be looking for an example of that.

-wb-

Comment Frivolous patents and lawsuits (Score 5, Interesting) 151

>> "there should be stiff penalties for frivolous lawsuits"

> There are, if you can prove that it's frivolous and/or using the court systems as an anti-competitive hammer. If the court really decides that you're a nuisance, they can nail you pretty hard.

But it's so costly and difficult to run that particular legal marathon, hardly anybody has ever completed the course. (Really, has anybody _ever_ actually completed it?)

The problem is more fundamental: "The grant of invalid patents is a serious evil insomuch as it tends to the restraint of trade and to the embarrassment of honest traders and inventors..."
That was the Fry Committee in 1901, recognizing that fundamental truth. The same applies, of course, to other forms of IP as well, not only patents. But which policymakers and legislators in power remember that now?

-wb-

Comment Misuse of the law (Score 1) 297

Still doesn't make it right to get a secret court order to inflict willful damage on a person's property.

I agree. Reads like a sickening abuse by the complainant company that asked the court to grant this disproportionate remedy. I've deleted their s/w application from the two computers of mine that had it, and I will watch out to make sure I never use any product of theirs again.
 

Comment Version information can be important (Score 4, Insightful) 683

Maybe the developers want me to have the latest version, but it's not always what I want, and above all, whether latest version or not, I want to know what I've actually got.

From my pov, this will ensure that I never go back to Firefox (after abandoning it a while back because of the memory leaks and the denials that there was a problem).

-wb-

Comment 'Verifiability, not truth' stops it working well (Score 1) 137

In areas where it "works" -- science, engineering, other technical subjects, reference information (e.g. documenting the stations of a country's rail networks) -- Wikipedia has vastly increased the consistency, coverage, and quality of easily-available information on a huge number of subjects.

It would be good to think that it "works" at least in those areas -- but I'm afraid that's not true without qualification.

There are many technical and scientific topics where popular misconceptions circulate. In too many cases it is unfortunately the popular misconception that survives on Wikipedia.

In some cases, editors favor a popular but mistaken view, and even take down sound citation-supported posts, deleting good references to replace them with substandard material (supported by poor citations that occasionally are even journalistic hogwash).

They can do this within Wikipedia policies, because Wikipedia has a policy of "verifiability, not truth" http://en.wikipedia.org/wiki/Wikipedia:Verifiability. A journalistic-hogwash print source can still be a perfectly "reliable" (in the Wikipedia sense) supporting citation to "verify" (in the Wikipedia sense) a popular misconception.

So when editors put in substandard references and delete better references, in support of edits pushing their preferred point of view, this false 'Verifiability' standard means there is nothing much in practice to stop them.

The Wikipedia "verifiability, not truth" standard is false because a verifiability that is 'not truth' is only a pseudo-verifiability. There's seemingly no policy that (with any reasonable force) facilitates and encourages exclusion of pseudo-verifiable but untrue material.

I would like to wish Wikipedia success in building a truthful and reliable encyclopedia, but the current pseudo-verifiability policy means that it can't even be moving in the right direction.

-wb-

Comment Problems over 'verifiability' instead of truth (Score 1) 219

This is why it's important to cite reliable sources, so any reader can verify the information. The real problem with Wikipedia is not the incorrect information (although there certainly are some errors), but all the useless crud that keeps gathering in articles.

Ok, it's important; but first, there's no really effective mechanism to make sure it happens at all: there are plenty of articles with unsupported outlandish statements.

Second, even when sources are given, there's nothing to stop any old rubbish that finds its way into half-credible print from being used as a 'supporting' citation to peddle some false line. Some editors will even put in substandard references and delete better references in support of their chosen point of view, and again there's nothing much in practice to stop them.

For me the clincher is the Wikipedia policy for "verifiability, not truth" http://en.wikipedia.org/wiki/Wikipedia:Verifiability. A verifiability that is 'not truth' is only a pseudo-verifiability. There's seemingly no policy that (with any reasonable force) facilitates and encourages exclusion of pseudo-verifiable but untrue material.

I can't understand how Wikipedia's credibility can survive its 'verifiability' 'standard' in its current state.

Proper verifiability should be expressly a means or a route towards truth, not a substitute for it, and subject to scrutiny and recall in that light.

-wb-

Comment Re:Why can't the text of these books be clearer? (Score 1) 58

the book-scanning process is completely automated.

Well, if it really is automated, how come some of the scanned pages show part of the hand (complete with finger rings!) of the person who was doing the scanning? It looks as if the scanning was done by someone who didn't realise that the text can't be read if there's a hand between the page and the scanner-glass!

I reckon that's a manual process.

-wb-

Comment GPL limits rights that employee has to transfer (Score 1) 504

The employer can only claim copyright over the portion of the work that was created under their employ. However, the sticky bit is that they licensed the original work under the GPL, so they must either not distribute their additional code or, if they do, make it available under the GPL AND share copyright on the code with the original author.

Yes, and perhaps another way of putting it is this: the contract between employer and employee can obligate the employee to transfer his rights, but in such a transfer the employer can't receive wider rights than the employee had in the first place.

-wb-
