Comment: Re:cis and mi regulation is not "bad" code (Score 2) 13

by rockmuelle (#48638267) Attached to: Machine Learning Reveals Genetic Controls

For small genomes, yes, but for large genomes, there is a lot of "unused" material.

Only about 6-10% of the human genome is transcribed into RNA, either the protein-coding kind or the non-coding types used in regulation. (Small genomes are almost always entirely coding and even include overlapping coding regions; large genomes are the ones that have "junk" DNA in them.)

Transcription is most closely related to a processor reading machine code and doing something with it. In a computer program, we know that we can safely remove dead code paths and the code will still function. This is not true for DNA. Remove a portion of someone's genome and they usually die.

It's much more likely that the "junk"/"noise" regions of the genome are structural and help the DNA conform so the chromosomes can specialize for different functions. DNA folds differently depending on the cell type in multicellular organisms. Because the nucleus of a cell is a fairly crowded place, the way the DNA folds determines which sites on it are even accessible for transcription. Muscle cells expose one set of gene coding regions, fat cells expose another.

Taken from this perspective, large genomes are more akin to an origami fortune teller than machine code. Depending on the series of folding/unfolding events, a specific fortune is revealed. The fortunes are encoded directly onto the paper, but the paper also forms the structure used to access the fortunes. Another actor reads the instructions and acts on them (a person in the origami case or polymerase for DNA).

Comment: Re:Not in the hospital I work at. (Score 1) 73

by rockmuelle (#48498343) Attached to: Intel Processor Could Be In Next-Gen Google Glass

I do similar systems for genomics. Despite all the hype around cloud services in our space, we're finding more interest in local copies of the standard databases with links out to the canonical sources as needed. The local copies keep hospital IT happy and ensure access if the network is wonky.

And, it turns out that most clinicians are comfortable sorting through database records on their own and don't like magic algorithms attempting to do it for them. Access to the basic data is what they want.

-Chris

Comment: Re:Oh no (Score 3, Informative) 297

by rockmuelle (#48350223) Attached to: Study: Body Weight Heavily Influenced By Heritable Gut Microbes

The first few weeks of any training program typically suck. That's where willpower (or encouragement if you're in a group) plays such an important role.

Once I'm past the initial hump, I always feel the "addictive" need to get more exercise and chase the high. In my specific case, the "high" comes after sustained exertion in the med/high effort range. I rarely see it biking (I'm a bike commuter and never really push myself). But running, climbing, mountaineering, and snowboarding all bring it out. For running, on long runs at a moderate pace it kicks in around mile 5 or 6. For short, faster runs, it kicks in about 30 minutes after the run and lasts for a few hours. Other sports have similar patterns. In my experience, the feeling is most similar to hydrocodone (which, unfortunately, I also know about from running).

Wikipedia's description of the "runner's high" covers some of the suspected mechanisms for it.

-Chris

Comment: Programming complexity (Score 2) 181

by rockmuelle (#48345235) Attached to: There's No Such Thing As a General-Purpose Processor

A big reason we accept the trade offs of modern processors is that it's generally easy to program a broad range of applications for them.

In the mid aughts (not very long ago, actually), there was a big push for heterogeneous multi-core processors and systems in the HPC space. Roadrunner at Los Alamos was a culmination of this effort (one of the first petascale systems). It was a mix of processor types, including IBM's Cell (itself a heterogeneous chip). Programming Roadrunner was a bitch. With different processor families in the mix, you had to decompose your algorithm to target the right processor for a given task. Then you had to worry about moving data efficiently between the different processors.

This type of development is fun as an intellectual exercise, but very difficult and time consuming in practice. It's also something compilers will never be good at, requiring experts in the architectures, domains, and applications to effectively use the system.

Another lesson from the period (and one that anyone who's done ASICs has known for years) is that general-purpose hardware generally evolves fast enough to catch up with specialized hardware within a reasonable timeframe (usually 6-18 months; see DE Shaw's ASIC for protein folding as an example).

While custom processors are cool (I love hacking on them), they're rarely practical.

-Chris

Comment: One semester is a start (Score 1) 173

by rockmuelle (#48342589) Attached to: Codecademy's ReSkillUSA: Gestation Period For New Developers Is 3 Months

So, there is some truth to the 3 month number. I learned C from a minimal programming background in one semester as an undergrad, or about three months. Of course, 20 years later I'm still refining my skills. The rest of the CS degree gave me a much more solid foundation than I'd have if I had gone straight to work after learning C. Surprisingly, basic theory like complexity analysis comes in handy when building applications.

-Chris

Comment: Re:Bruce Perens (Score 1) 240

by rockmuelle (#48039329) Attached to: Back To Faxes: Doctors Can't Exchange Digital Medical Records

Open Standards and Protocols are what this space needs, along with regulations requiring vendors to allow interoperability for free or a nominal fee.

Open Source software, on the other hand, won't really solve any problems. Someone has to write the software and vet it. EHR software isn't an itch people typically want to scratch. Of course, an EHR platform could leverage Open Source software for development. A Web-based EHR could use an entire Open Source stack and even contribute libraries for protocol support.

Open Source is great for infrastructure components, not so great for user-facing applications. At some level in the stack, someone needs to do the UX work, testing, and validation to create an application people can actually use.

I would never advocate for a fully Open Source solution for EHRs or any other complex, user-facing software, but I would put incentives in place to leverage as much Open Source in the stack as possible. Plus, any company that does that right will have much cheaper dev costs and will be able to undercut the competition a bit (though for supported software, dev costs are usually only 10-20% of the costs, with support, marketing, sales, etc taking up the bulk of the costs).

-Chris

Comment: Re:sounds like a job for (Score 1) 240

by rockmuelle (#48039217) Attached to: Back To Faxes: Doctors Can't Exchange Digital Medical Records

Um, Google tried the whole Google Health thing a few years back and gave up: http://en.wikipedia.org/wiki/G...

This is not an easy space to play in. Hospitals and doctors are slow to change. Once an investment has been made in a particular platform it's very difficult to replace it.

-Chris

Comment: If High School is sufficient for CS, then why not? (Score 1) 392

by rockmuelle (#47919747) Attached to: Ask Slashdot: Any Place For Liberal Arts Degrees In Tech?

The question is interesting in relation to the current bias against four year degrees for software developers in some circles. If, as Peter Thiel claims, you don't need a degree, then it shouldn't matter what your degree is if you get one. So, from that perspective, a tech degree or a liberal arts degree shouldn't make a difference. If a liberal arts degree makes for a more intellectually well rounded person, then it could be argued that that's the better degree for tech.

Of course, I don't buy Peter's argument at all. A good CS degree teaches foundational methods that can be applied throughout a career. Don't get me started on the number of times basic complexity theory or knowledge of the full memory hierarchy has helped improve performance of web pages. Most hobbyists don't have those skills and write them off as just academic oddities. A good CS degree also exposes you to a range of technologies and methods for developing software (no, CS is not just math, no more than physics is just theoretical physics). It gives you an environment where you can develop your skills and gain exposure to the breadth of topics in the field. It's a Good Thing(tm).

Should all programmers have CS degrees? Of course not, but those that do are always going to have an edge over most of the other ones (there are always exceptions - I know a few great developers without degrees).

-Chris

Comment: Re:Some classes would be AWESOME! (Score 1) 182

by rockmuelle (#47908539) Attached to: Oculus Rift CEO Says Classrooms of the Future Will Be In VR Goggles

VR simulations are only as good as our ability to model and simulate the things we're studying. Physics, maybe. Chemistry and Biology, no way. The latter two are messy and don't lend themselves to simulation except in a few very specific situations. If it's simply for information retrieval and watching videos, a book or screen is sufficient.

I've spent a lot of time with various 3D immersion technologies and scientific applications (old-school VR, CAVEs, polarized goggles, etc.) and the reality is that they don't add much. Don't get me wrong, they make GREAT demos. I love playing with the technology. But, spend any amount of time doing real work with them and their limitations quickly become apparent. It's not that the technology doesn't work, it's that most content doesn't really lend itself to the medium and, for content that does, getting the user experience right is a difficult and expensive task.

-Chris

Comment: Re:Wait a minute, a few years ago I recall and AA (Score 1) 819

And that is how our current implementation of the free market actually works. No business action is made for the customer's benefit. It's always about making one more dollar off a captive customer base and pretending you're doing them a favor. America needs to return to stakeholder capitalism rather than the current shareholder model (yes, there actually are different models for market-based economies).

Comment: Re:What about (Score 1) 193

by rockmuelle (#47705781) Attached to: C++14 Is Set In Stone
Yes!!! I wish I had mod points. They basically had them ready to go for C++11 and then committee infighting killed them (Bjarne stubbornly backed the wrong horse - not that I have a strong opinion on this or anything ;) ).

Syntactic support for generic programming would be the single best addition to C++ to breathe new life into the language and get a whole generation of developers who've written it off interested in it. Generic programming is as paradigm shifting as OOP. It just kills me that it's so thoroughly obfuscated by template meta-programming in C++.

Comment: Re:good (Score 2) 125

by rockmuelle (#47655481) Attached to: The Fiercest Rivalry In Tech: Uber vs. Lyft
Markets are defined by their rules, plain and simple.

The rules for an ideal free market are pretty straightforward: everyone is free to do whatever they want. There's also another term for this approach in the political sphere: anarchy.

What most people really mean when they say free market (in America, at least) is a market defined by the rules of property law (the foundation of most western legal systems). As soon as you have some basic rules, you no longer have a free market.

A real free market is a theoretical extreme, like an ideal gas. It's useful for reasoning about things, but doesn't actually exist in any practical form in real life.

-Chris
