Comment The academic game (Score 1) 64

As we know, science has a lot more to do with the sociology of research than we like to think (say hello to Alan Sokal). Bioinformatics has fallen short of its lofty initial goals because it became a prime example of the nefarious effects the struggle for publication can cause, and of the alienation of a whole field of scientists by another field of... scientists (?)

The failures of Bioinformatics have to do with it becoming a gold mine for publication-hungry CS PhDs who - if you're familiar with some of the field's exploding literature in the early 00s (or "noughties" for some) - knew close to nothing about the biology of what they were researching, and produced pile after pile of useless algorithms and "data-driven discoveries" far removed from biological or clinical phenomena. A lot of these PhDs seemed to assume the medical and biological professionals had little to contribute, since they were utterly incapable of doing Math. That is how you got things like a room full of Mathematicians and Physicists discussing how to model viral activity without a single actual Biologist in the room (I am not making this up).

So, the field quickly became inundated with research whose sole purpose was carving out a name in the publication game for Physics and Computer Science majors who had failed to land a position in their original field. Naturally, real doctors and biologists looked at the sometimes infantile simplifications (of Immunology, Metabolism, brain electrical activity, genomic modelling, etc.) and the sometimes downright asinine assumptions, and just walked away. The literature became a huge pile of impenetrable research, to the delight of the graph-algorithm researcher (to name one of these sub-fields), but completely alienated from bench biology. Since the clinical phenomena became an excuse for abstraction while the field exploded with new journals, I guess the real doctors and biologists just continued on with their research, largely unimpressed with Bioinformatics. The bioinformaticians could play their publication game all they liked, while doctors and biologists would continue doing their Real World research. It's a tale from the history of science that needs to be written, because it amounts to almost a decade of high hopes and lost expectations.

And all I mentioned can probably be researched, too. If you look at the papers and their impact, what do you see? How much of it is relevant for the research that came later? What's the quantity of dead-end "data-driven research"? I'm not thinking about the seminal algorithms, of course, but about the spike of... noise that came after them...

And that's to say nothing of the lack of real paradigmatic change in the way the computer science was done, with systems full of state and no theory for truly concurrent biological systems - which is nothing but a reflection of the poor state of Computer Science for such systems. Likewise, the field seemed to approach things with brute force: throw more computing power at it, and metabolic networks will be solved, protein folding will be solved. Seems like all that C++ and all those clock cycles weren't really cost-effective... ya think?

Comment Yes, absolutely (Score 1) 387

Yes, absolutely, let's take the "mystique" out of coding.

Right now, I have software (written in Forth - because it's the ultimate language for DSLs, I found), deployed for 6,000 patients, that we use every day to churn out patients' prescriptions. In our public health system, patients get their medicines for diabetes, hypertension, depression, hypothyroidism, asthma, etc. for free (whatever "continuous-use" medicines the municipality pays for). They have a legal right to get the prescriptions renewed. Doctors used to go crazy renewing them by hand. It's no fun when you care for, say, 900 people with hypertension. Of course, this is not the U.S. (it's Brazil).

Why did I write it in Forth? Because it was so fast to write a mini-DSL. I could not even believe it. I got that for *free* in Forth - a free "readline", a free parser, strings of whatever size they come in (Forth strings are better and safer than C's), a database for free, etc. Concatenative, stack-oriented programming is very restrictive: if your stack is wrong, the Forth compiler will bork right back at you. Since you're doing concatenative programming, it's basically like a functional programming paradigm: you fit Lego blocks of code onto other Lego blocks. Since you must do a lot of stack-testing, unit testing is immediate and obligatory, instead of a "practice" or "discipline" that you must incorporate into your workflow. If your stack ain't right, you break things now, so you know pretty soon whether your code is OK or not. And keeping track of the stack can become burdensome, so you automatically start refactoring code (everybody's heard of Leo Brodie's refactoring masterpiece Thinking Forth, right?). You keep it simple. Basically, everything I read about how awesome Forth is, is true. As a matter of fact, I first got turned on to Forth when I saw a guy who'd written his very own CAD software (which blew my mind). When I asked him what language he'd written it in, he said: "Forth, because you can go from very low level (in his case, graphic primitives directly on the screen - no OpenGL, etc.) to a very abstract high level (Forth words that concatenate into a DSL)". I understand now that it wasn't so much that he was a genius (although he did have a degree from MIT...), but that he got the tools he needed (after all, you can get projective geometry "recipes" from a book).
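To make the "Lego blocks" and the immediate-unit-testing point concrete, here's a tiny sketch of what I mean (illustrative only - these word names are made up, not from my actual program; it should run on any ANS Forth):

\ Each word consumes and/or produces values on the data stack.
: pills   ( n -- )  . ." pill(s) " ;
: morning ( -- )    ." 1x/morning" cr ;

\ Words snap together like Lego blocks by simple concatenation.
: one-a-day-morning ( -- )  1 pills morning ;

\ "Unit testing" is immediate: type the word at the prompt...
one-a-day-morning   \ prints: 1 pill(s) 1x/morning
.s                  \ ...and check the stack (should be empty here)

If the stack comment and the actual stack disagree, you find out the moment you type the word, not three modules later.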

Initially, the software was in C (C and Forth can be deployed to any old trashy piece of hardware around, right? - this might actually be relevant if you're financially constrained, which I don't expect anyone from a rich country to really understand), but then pointer indirection, parsing, data structures, file I/O in C, etc., just bored me out of my mind. Obviously, I understand people feel smart when they can write C-oid code with pointers to pointers, but I found that to be stupid. This has to do with the fact that I met better languages (Common Lisp, for instance) before I met C, so I remain largely unimpressed whenever I see someone claiming to perform advanced magic in, say, Java. Really? Java is broken Smalltalk semantics with a C-oid syntax.

Having no strings attached (meaning: no managers breathing down my neck, me being a physician), I decided to scrap the C code and go all Forth (because I actually think that Forth, Smalltalk, the MLs and the Lisps are the best languages out there). I used win32forth because it's public-domain, but commercial Forth compilers are very cheap (and just go look at the benchmarks...), and I might change to a proprietary Forth (better documentation and support would be the reason). The code is 99% ANS Forth (and the part that isn't is easily changed) - by which I mean "portable".

Although we're already deploying it in Real Life, I keep perfecting it (I'm addicted to Forth). Right now, I'm fiddling with the Forth black magic - turning the compiler on and off, etc. Stuff you really can't do in C/C++/Java without a huge amount of tooling. Only in Lisp. But Forth is kind of easier than Lisp, actually, I found (the model is simpler: throw things on the stack, then consume the stack). As a matter of fact, I barely know Forth. I recommend reading the tutorials out there and the Forth books from Forth, Inc. (Forth Application Techniques and the Forth Programmer's Reference).
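By "black magic" I mean things like switching between interpreting and compiling in the middle of a definition. A trivial, illustrative example (again, not from my program) of what that buys you:

\ [ turns the compiler off, ] turns it back on; the arithmetic runs
\ once at compile time and LITERAL compiles only the result (86400).
: seconds/day ( -- n )  [ 24 60 60 * * ] literal ;

\ Words marked IMMEDIATE execute even while the compiler is on, which
\ is how Forth lets you extend the compiler itself with new syntax.

You get compile-time evaluation and compiler extension with zero extra tooling.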

Forth (and other "weird" languages) force you to re-think your problems. Do you really need object-orientation, string-based stuff, regexes, etc? Maybe not... Maybe you have a hammer, and you think everything is a nail...Maybe u r doin' it rong.

I taught the doctors to alter key files in Forth. Like "in the old days", they don't *exactly* know they're coding. It doesn't matter *at all*. All they care about is that they can customize the software and feed the database with the mini-language I developed, which is so intuitive for them. Actually, I did it for me, but then other colleagues became interested. I had to do this, since the contract for the EMR hasn't come through yet ;-) When the purchased software finally arrives, it'll probably suck compared to my little Forthy thing - it'll probably have no DSL for prescriptions. Printing is delegated to more competent software, such as office packages.

Here's an example of the DSL we use. For doctors, it's very intuitive - as a matter of fact, I just show them snippets of code in the database, like the one below, and they immediately get it:


UC6
hctz25 30cp 1cpm
simva20 30cp 1cpn
L\D

This expands to (it may seem strange; I don't know how doctors prescribe in the US):

[ Patient's name goes here]
Internal use - continuous - 6 months

Hydrochlorothiazide 25mg ----------- 30 pills
Take 1 (one) pill VO* 1x/morning

Simvastatin 20mg -------------------- 30 pills
Take 1 (one) pill VO* at night (before bed)

[ Town, date ]

*[VO = per os, i.e., orally]
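Just to give an idea of how little machinery such a DSL needs in Forth, here's a hypothetical sketch (illustrative only - the real code reads these lines from the database and handles the patient header, dates, L\D, etc.; all definitions below are made up for this post) of the tokens as plain Forth words:

\ Hypothetical sketch: each DSL token could simply be a Forth word.
: UC6     ( -- ) ." Internal use - continuous - 6 months" cr cr ;
: hctz25  ( -- ) ." Hydrochlorothiazide 25mg" ;
: simva20 ( -- ) ." Simvastatin 20mg" ;
: 30cp    ( -- ) ."  ----------- 30 pills" cr ;
: 1cpm    ( -- ) ." Take 1 (one) pill orally 1x/morning" cr cr ;
: 1cpn    ( -- ) ." Take 1 (one) pill orally at night (before bed)" cr cr ;

\ With those in place, the snippet above is already executable Forth:
\   UC6  hctz25 30cp 1cpm  simva20 30cp 1cpn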

So, if I, a physician, can code in Forth to solve my problem, why shouldn't everyone? I think everybody should learn to code, as it helps people with real problems. That's what computers are for: coding.

Comment Re:Just no (Score 1) 380

The changes that were made in the kernel and Objective-C are there for a good reason. Blocks, Grand Central Dispatch, LLVM, Clang - all highly non-trivial and important. You do not HAVE to use them if you don't want to. Anyway, the point is that you get amazing tools. Ever since Mac OS X became a certified Unix, we knew we could write lasting software. Mac OS X is not going away anytime soon - hopefully.

Now, the OS upgrades were - are - much easier and safer than, say, your average Linux, where shit broke *all* the time. The OS upgrade price was much smaller than Windows'. All in all, with the top-notch hardware, the OS kept patched, and the reasonably-priced and easy upgrades, Apple proved to be a nice investment in the long run, with good amortized cost.

Just because you've still got a nice Mac OS X workstation doesn't mean you don't buy new Apple hardware. At least, that's my experience, personal and social. This has to do with Apple aiming for consumers with higher income (meaning individuals and companies). The other hardware vendors have to scramble and figure out who their market is - and that's why they aren't really brands, and no one can really figure out why a Dell notebook is better than the HP competitor (in fact, they're both unoriginal and probably one is just as bad as the other).

The fact that Apple kept your OS patched made you respect Apple, because it showed Apple respected and valued its customers. They were there for you in the long run. They didn't just make up shitty policies overnight, like Microsoft. Apple was hassle-free.

Now, by leaving users out in the cold, Apple is really shooting itself in the foot. It's a fundamental shift in customer relations, and I predict Apple will pay dearly in the long run - this might be the turning point. Maybe it's downhill for Apple from here, if the perception becomes one of a company that no longer cares about its customers.

Comment Re: Here is one thing that I do notice (Score 1) 289

This is a fine example of how retarded Google's strategy is.
I mean, does anyone even remember how Google Docs was supposed to be the Microsoft Office killer??? Nowadays, you'd best not lay all your chips on their shit. They might do some "Spring cleaning" or some shit, and your SOHO workflow goes BOOM!

Comment Re: Yeah. (Score 1) 289

You mean, won't notice until they do notice, right?
Android has no updates for it.
The Google Play store will leave them dead in the water, sooner or later.
Google hasn't a clue about how to treat customers. Customers are just numbers they crunch as big data. Look how they simply pull features people used every day out of their products (or how stagnant the products are - is there any other reason Evernote, Dropbox, etc. were able to eat away at Google's mindshare for niche products?)
Fact is, Apple will give more return on your investment. That's why people remain loyal to that brand.
There's really only one brand name that you attach to Android (Samsung).
Android sells more because it's cheaper. It's the Windows 95 of mobile phones.

Comment Re:Anti-linux (Score 1) 279

You thought wrong. Linux was a free software project backed by IBM so that Sun Microsystems' and HP's Unix and every other proprietary Unix could be put out of business, because IBM sold Big Iron.

And now we're in a much better world because stuff that ran certified Unixen (such as radar and medical equipment) moved on to fucking Windows, because they couldn't trust the no-vendor-is-responsible business motto of the likes of Mr. Torvalds and Mr. Stallman.

Because of that situation, we began to see airports shutting down because the air traffic optimization software went berserk, banks and financial transactions that go belly up these days during market hours, medical emergency systems that hang, and so on and so forth. And now the world's biggest asshole, that Oracle CEO dickhead, owns Java. Whoopee!

Comment Re:This is how shuttleworth kills ubuntu (Score 1) 279

Mass-market Ubuntu, based on Unity?!

Unity was the single most retarded move in Free Software. People mistakenly complain about the Dash, but the Dash is a piss-poor implementation of the modeless interface invented by Humanized, which was a genuinely good invention (all my home Windows machines run Humanized's Enso). And Qt is making C++ viable again for fast development, with its C++11 support - and the KDE guys have stuck with Qt through and through - but at Ubuntu they just thought it better to break shit up and stick with Gnome. Sure, Gnome is nice to have, and it's all C, but it's showing its age. Like so many free software products, it is yet another proof that good quality demands a full-time staff, and there is none for Gnome.

However, Unity is the piss-poor implementation of the piss-poor Gnome 2, which was the piss-poor implementation of the Mac OS 8 Human Interface Guidelines.

So it's recursively bad. It's just an endless spiral of shit. Ubuntu developers really need to get their heads out of their asses and take a look at the hardware-accelerated Windows 8 (I know the UI is seriously broken, but the visual rendering is stunning) or Mac OS X. That is what the mass market has come to expect, not an orange dock bar with low-res icons in primary colors. Get f*ing real!!

Comment Re:Shuttleworth shills ubuntu (Score 1) 279

Face it: Linux distros have failed. Every one of them. Debian crumbled under its own weight (now the official purveyor of royal free-software jelly for the Ubuntu Queen), Red Hat once was good, but now they sell per-seat licenses and have fed the dejecta (known as Fedora - or "bad smell") to the community, SuSE never really was, Mandriva is a stupid distro with a stupid name, Slackware died (it was a one-man show), and the other small distros are just orbiting parasites that never get any traction.

The BSD distros are the complete opposite: resilient, tried & true, solid, no-holds-barred, no-bullshit, truly free software, disciplined, community- & business-oriented, quietly toiling away and perfecting their solutions, and recently offering an edge over Linux, with many technically superior innovations that will never even be ported because of the GPL.

Suck it up, Linux fanboys.
