Comment Re:Cha-ching (Score 1) 64

Imagine a world free of public health. Imagine a world where the free asshole thinks that when the public health authority demands you take a vaccine, your liberty is being encroached upon. So imagine a world where no public health authority exists. Imagine a world where another free, unvaccinated asshole sneezes at your face, and imagine that in that free world, you simply died, because you were all free to do as you like, such as deciding not to spend your money on vaccination. Except, you're an ignorant fuck who knew diddly-shit about vaccines or epidemics, so you become a dead free man. That's real smart.

In general terms, there's no free lunch. When you mentioned "free", you probably meant "public health". And, just like the army knows best sometimes, the medical authority knows best sometimes because that's what they do: they study and take care of the health of large populations.

Comment Re:The Load (Score 1) 64

Fear not. The most cost-effective model is the one based on general-practice family medicine. It achieves an 85% resolution rate, with the other 15% of patients referred to secondary and tertiary health centers. That is the most cost-effective model in the world, and even private companies are slowly gearing towards it. All of the large, effective public health systems in the world work like that. Slowly but surely the US will move towards something more like the British or Canadian systems (to give examples you can relate to).

It's either that, or systems break under increasing costs. It's cool to have a super-robot perform surgery on you. Better yet, take preventive steps so you won't need the super-robot surgery. You will likely benefit from a better quality of life, too.

Comment Re:Poster should consider going back to the clinic (Score 1) 64

There will be some benefits in the selection of oncology protocols in the short term, but knowing cancer genomics does not actually lead to new chemotherapeutic agents except in the long term.

What about vaccines? Any informed physician who looks at the data can tell you that bioinformatics has contributed close to nothing in terms of new vaccines. Why is that?

With regard to cancer genomics, there are a bunch of questions arising from potential treatments that the pharmacogenomics peddlers never mention: how will you conduct trials? Will you run small clinical trials, with chemo agents that represent small molecular variations? How will you manufacture such molecular variants? What would they cost? When you finally give them to humans, how will you monitor the clinical trials? Will you have a large enough sample (in the statistical sense)? What's the control? BTW, when I said peddlers, I meant peddlers specifically. I don't want to generalize.

The one area where whole exome sequencing and related technologies are likely to change care in a meaningful way is pediatrics and fetal medicine where there are tons of rare, fatal things due to rare point mutations.

I beg to disagree. With very rare diseases, you get extremely low frequencies. That means no single doctor will get to specialize in, or gain experience with, such rare diseases. Every time the disease pops up, it will be a novelty for the doctor. What you need, what we need in Medicine, is a way to keep extremely long-term databases. That way, data can be accumulated and sifted, and patterns that arise here and there over decades can be mined. The deployment of such databases would revolutionize the care of patients burdened with extremely rare diseases. It's of little use detecting the condition if you don't know how to care for the patient.

Have any of you read A Fire Upon the Deep, by Vernor Vinge?

In this book, the idea of databases spanning centuries is part of a central plot in the story.

Comment Re:Somewhat tangential (Score 1) 64

If you're spending thousands of dollars for genetic testing for a $4 a month drug like warfarin, you're doing it way wrong. It's like the proverbial million dollar cure for the common cold.

Right on, brother, right on!

Do these guys even know the algorithms for warfarin dose adjustment real doctors use for their real patients?

When you think about a test, you have to think in terms of large scale. This is thinking in terms of public health (a term unfamiliar to much of the American public, but one whose meaning all doctors understand). It means you gotta factor in things like cost-effectiveness. For a test to cost thousands of dollars when we have simple, tried-and-true algorithms for warfarin dose adjustment is insanely stupid.
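For illustration, a rule-based dose adjustment costs essentially nothing to compute. Here's a toy Python sketch of what such an algorithm might look like; the thresholds and percentages are invented for illustration only, not clinical guidance:

```python
def adjust_weekly_dose(weekly_dose_mg, inr, target=(2.0, 3.0)):
    """Toy rule-based warfarin adjustment. Numbers are illustrative only,
    not a real clinical protocol."""
    low, high = target
    if inr < low:
        return round(weekly_dose_mg * 1.10, 1)   # below range: raise ~10%
    if inr > high:
        return round(weekly_dose_mg * 0.90, 1)   # above range: lower ~10%
    return weekly_dose_mg                        # in range: no change

print(adjust_weekly_dose(35.0, 1.6))  # 38.5
```

A lookup table like this, tuned by decades of trial data, is exactly the kind of cheap tool that a thousands-of-dollars genetic test has to beat on outcomes before it makes public-health sense.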

This is yet another prime example of stupid research, a prime example of not having a clue about what's relevant in clinical medicine!

Comment The academic game (Score 1) 64

As we know, science has a lot more to do with the sociology of research than we like to think (say hello to Alan Sokal). Bioinformatics has fallen short of its lofty initial goals because it became a prime example of the nefarious effects the struggle for publication can cause, and also of the alienation of a whole field of scientists by another field of...scientists(?)

The failures of Bioinformatics have to do with it becoming a gold mine for publication-hungry CS PhDs who - if you're familiar with some of the field's exploding literature in the early 00s (or "noughties" for some) - knew close to nothing about the biology of what they were researching, and produced pile after pile of useless algorithms and "data-driven discoveries" far removed from biological or clinical phenomena. A lot of these PhDs seemed to assume the medical and biological professionals had little to contribute, since they were utterly incapable of doing Math. That is how you got things like a room full of Mathematicians and Physicists discussing how to model viral activity without a single actual Biologist in the room (I am not making this up).

So, the field quickly became inundated with research whose sole purpose was carving out a name in the publication game for Physics and Computer Science majors who had failed to land a position in their original field. Naturally, real doctors and biologists looked at the sometimes infantile simplifications (of Immunology, Metabolism, brain electrical activity, genomic modelling, etc.) and sometimes downright asinine assumptions, and just walked away. The literature became a huge pile of impenetrable research, to the delight of the graph-algorithm researcher (to name one of these sub-fields), but completely alienated from bench biology. Since the clinical phenomena became an excuse for abstraction while the field exploded with new journals, the real doctors and biologists continued on with their research, largely unimpressed with Bioinformatics. The bioinformaticians could play their publication game all they liked, while doctors and biologists would continue doing their Real World research. It's a tale from the history of science that needs to be written, because it amounts to almost a decade of high hopes and lost expectations.

And all I mentioned can probably be researched, too. If you look at the papers and their impact, what do you see? How much of that is relevant for research that came later? What's the quantity of dead-end "data driven research"? I'm not thinking about the seminal algorithms, of course, but the spike of...noise that came later...

And that's to say nothing of the lack of real paradigmatic change in the way Computer Science was done, with systems full of state and no theory for real concurrent biological systems - which is nothing but a reflection of the poor state of Computer Science for such systems. Likewise, the field seemed to approach things with brute force: throw more computing power at it, and metabolic networks will be solved, protein folding will be solved. Seems like all that C++ and all those clock cycles weren't really cost-effective...ya think?

Comment Yes, absolutely (Score 1) 387

Yes, absolutely, let's take the "mystique" out of coding.

Right now, I have software (written in Forth - because it's the ultimate language for DSLs, I found) deployed for 6,000 patients that we use every day to churn out patients' prescriptions. In our public health system, patients get their medicines free for diabetes, hypertension, depression, hypothyroidism, asthma, etc. (whatever "continuous" medicines the municipality pays for). They have a legal right to get them renewed. Doctors used to go crazy renewing them by hand. It's no fun when you care for, say, 900 people with hypertension. Of course, this is not the U.S. (it's Brazil).

Why did I write it in Forth? Because it was so fast to write a mini-DSL. I could not even believe it. I got so much for *free* in Forth - a free "readline", a free parser, strings of whatever size they come in (Forth strings are better and safer than C's), a database for free, etc. Concatenative stack-oriented programming is very restrictive. If your stack is wrong, the Forth compiler will bork right back at you. Since you're doing concatenative programming, it's basically like a functional programming paradigm: you fit Lego blocks of code onto other Lego blocks. Since you must do a lot of stack-testing, unit testing is immediate and obligatory, instead of a "practice" or "discipline" you must incorporate into your workflow. If your stack ain't right, you break things now, so you know pretty soon whether your code is OK or not. And since keeping track of the stack can become burdensome, you automatically start refactoring code (everybody's heard of Leo Brodie's refactoring masterpiece Thinking Forth, right?). You keep it simple. Basically, everything I read about how awesome Forth is, is true.

As a matter of fact, I first got turned on to Forth when I saw a guy who'd written his very own CAD software (which blew my mind). When I asked him what language he'd written it in, he said: "Forth, because you can go from very low level (in his case, graphic primitives directly on the screen - no OpenGL, etc.) to a very abstract high level (Forth words that concatenate into a DSL)." I understand now that it wasn't so much that he was a genius (although he did have a degree from MIT...), but that he got the tools he needed (after all, you can get projective geometry "recipes" from a book).
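The "Lego blocks" composition above can be sketched outside Forth, too. Here's a minimal, hypothetical Python stack machine (an illustration, not the actual system) showing how a new word is just a sequence of existing words, and how the shared stack makes every word trivially testable:

```python
# Minimal stack machine, illustrating concatenative composition.
# Hypothetical Python sketch - real Forth compiles words; this just
# chains functions over a shared stack.
def run(program, stack=None):
    stack = [] if stack is None else stack
    for word in program:
        word(stack)
    return stack

def lit(n):                                  # push a literal: ( -- n )
    return lambda s: s.append(n)

def dup(s): s.append(s[-1])                  # ( a -- a a )
def add(s): s.append(s.pop() + s.pop())      # ( a b -- a+b )
def mul(s): s.append(s.pop() * s.pop())      # ( a b -- a*b )

# A new "word" is just a sequence of existing words:
square = [dup, mul]                          # ( a -- a*a )

print(run([lit(3), lit(4)] + square + [add]))   # [19], i.e. 3 + 4*4
```

Testing a word means pushing known values, running it, and checking the stack - which is why unit testing in this style is "immediate and obligatory" rather than an optional discipline.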

Initially, the software was in C (C and Forth can be deployed to any old trashy piece of hardware around, right? - this might actually be relevant if you're financially constrained, which I don't expect anyone from a rich country to really understand), but then pointer indirection, parsing, data structures, file I/O in C, etc., just bored me out of my mind. Obviously, I understand people feel smart when they can write C-oid code with pointers to pointers, but I found that to be stupid. This has to do with the fact that I met better languages (Common Lisp, for instance) before I met C, so I remain largely unimpressed whenever I see someone claiming to perform advanced magic in, say, Java. Really? Java is broken Smalltalk semantics with a C-oid syntax.

Having no strings attached (meaning: no managers breathing down my neck, me being a physician), I decided to scrap the C code and go all Forth (because I actually think that Forth, Smalltalk, the MLs and the Lisps are the best languages out there). I used Win32Forth because it's public-domain, but Forth compilers are very cheap (and just go look at the benchmarks...) and I might change to a proprietary Forth (better documentation and support would be the reason). The code is 99% ANS Forth (and the part that isn't is easily changed) - by which I mean "portable".

Although we're already deploying it in Real Life, I keep perfecting it (addicted to Forth). Right now, I'm fiddling with the Forth black magic - turning the compiler on and off, etc. Stuff you really can't do in C/C++/Java without a huge amount of tooling. Only in Lisp. But Forth is kind of easier than Lisp, actually, I found (the model is simpler: throw things on the stack, then consume the stack). As a matter of fact, I barely know Forth. I recommend reading the tutorials out there and the Forth books from Forth, Inc. (Forth Application Techniques and the Forth Programmer's Handbook).

Forth (and other "weird" languages) force you to re-think your problems. Do you really need object-orientation, string-based stuff, regexes, etc? Maybe not... Maybe you have a hammer, and you think everything is a nail...Maybe u r doin' it rong.

I taught the doctors to alter key files in Forth. Like "in the old days", they don't *exactly* know they're coding. It doesn't matter *at all*. All they care about is that they can customize the software and feed the database with the mini-language I developed, which is so intuitive for them. Actually, I did it for myself, but then other colleagues became interested. I had to do this, since the contract for the EMR hasn't come through yet ;-) When the purchased software finally arrives, it'll probably suck compared to my little Forthy thing - it'll probably have no DSL for prescriptions. Printing is delegated to more competent software, such as office packages.

Here's an example of the DSL we use. For doctors, it's very intuitive - as a matter of fact, I just show them snippets of code in the database, like the one below, and they immediately get it:

hctz25 30cp 1cpm
simva20 30cp 1cpn

This expands to (it may seem strange; I don't know how doctors prescribe in the US):

[ Patient's name goes here]
Internal use - continuous - 6 months

Hydrochlorothiazide 25mg ----------- 30 pills
Take 1 (one) pill VO* 1x/morning

Simvastatin 20mg -------------------- 30 pills
Take 1 (one) pill VO* at night (before bed)

[ Town, date ]

*[VO = per os, i.e., orally]
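For a rough idea of what the expansion involves, here's a hypothetical Python sketch - the drug table, schedule table and formatting are invented stand-ins for the real Forth code:

```python
# Hypothetical sketch of the prescription mini-language expansion.
# The tables and line format are illustrative, not the deployed system.
DRUGS = {"hctz25": "Hydrochlorothiazide 25mg",
         "simva20": "Simvastatin 20mg"}
SCHEDULE = {"1cpm": "Take 1 (one) pill VO 1x/morning",
            "1cpn": "Take 1 (one) pill VO at night (before bed)"}

def expand(line):
    """Expand one DSL line, e.g. 'hctz25 30cp 1cpm', into prose."""
    code, qty, sched = line.split()
    pills = qty.rstrip("cp")                      # "30cp" -> "30"
    name = DRUGS[code]
    dashes = "-" * (36 - len(name))               # pad to align quantities
    return f"{name} {dashes} {pills} pills\n{SCHEDULE[sched]}"

print(expand("hctz25 30cp 1cpm"))
```

The point is how little machinery the notation needs: a token per drug, a token per quantity, a token per schedule, and a table lookup for each.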

So, if I, a physician, can code in Forth to solve my problem, why shouldn't everyone? I think everybody should learn to code, as it helps people solve real problems. That's what computers are for: coding.

Comment Re:Just no (Score 1) 380

The changes that were made in the kernel and Objective-C are there for a good reason. Blocks, Grand Central Dispatch, LLVM, Clang - all highly non-trivial and important. You do not HAVE to use them if you don't want to. Anyway, the point is that you get amazing tools. Ever since Mac OS X became a Unix, we knew we could write lasting software. Mac OS X is not going away anytime soon - hopefully.

Now, the OS upgrades were - are - much easier and safer than, say, your average Linux, where shit broke *all* the time. The OS upgrade price was much smaller than Windows'. All in all, with the top-notch hardware, the OS kept patched, and the reasonably-priced and easy upgrades, Apple proved to be a nice investment in the long run, with good amortized cost.

Just because you've still got a nice Mac OS X workstation doesn't mean you won't buy new Apple hardware. At least, that's my experience, personal and social. This has to do with Apple aiming for consumers with higher income (meaning individuals and companies). The other hardware vendors have to scramble and figure out who their market is - that's why they aren't really brands, and no one can really figure out why a Dell notebook is better than the HP competitor (in fact, they're both unoriginal, and one is probably just as bad as the other).

The fact that Apple kept your OS patched made you respect Apple, because it showed Apple respected and valued its customers. They were there for you in the long run. They didn't just make up shitty policies overnight, like Microsoft. Apple was hassle-free.

Now, by leaving users out in the cold, Apple is really shooting itself in the foot. It's a fundamental shift in customer relations, and I predict Apple will pay dearly in the long run - this might be the turning point. Maybe it's downhill for Apple, if the perception becomes one of a company that no longer cares about its customers.

Comment Re: Here is one thing that I do notice (Score 1) 289

This is a fine example of how retarded Google's strategy is.
I mean, does anyone even remember how Google Docs was supposed to be the Microsoft Office killer??? Nowadays, you'd best not lay all your chips on their shit. They might do some "Spring cleaning" or some shit, and your SOHO workflow goes BOOM!

Comment Re: Yeah. (Score 1) 289

You mean, won't notice until they do notice, right?
Android has no updates for it.
The Google Play store will leave them dead in the water, sooner or later.
Google hasn't a clue about how to treat customers. Customers are just numbers they crunch as big data. Look how they simply pull features people used every day from their products (or how stagnant those products are - is there any other reason Evernote, Dropbox, etc. were able to eat away at Google's mindshare for niche products?)
Fact is, Apple will give more return on your investment. That's why people remain loyal to that brand.
There's really only one brand name that you attach to Android (Samsung).
Android sells more because it's cheaper. It's the Windows 95 of mobile phones.
