
Comment: Nutritional deficiencies! (Score 1, Interesting) 219

by Theovon (#47916215) Attached to: Schizophrenia Is Not a Single Disease

Probably at least a few of those sub-disorders are actually nutritional deficiencies. We have this myth (perpetuated by MDs who have ZERO training in nutrition) that we don't have nutritional deficiencies in America. In fact, the American diet is horrible, and we all know it. B12 deficiencies are common (which is one of the reasons shots are often prescribed), as are deficiencies in magnesium and numerous other vitamins and minerals. Since the mid-90s, the FDA has mandated "enrichment" of foods, but the forms of the additives are NOT the biologically active forms, so some people have trouble processing them. For instance, MTHFR gene defects are common (my wife and I have different ones, and she has the really bad C677T defect), making folic acid (which is artificial) range from useless to poisonous for some people, who need to take methylfolate instead. In fact, since the mid-90s a lot of people have reported declines in their health, which may be correlated with that FDA mandate (although without a more complete study, we have to treat this as anecdotal; it COULD be correlated more strongly with something else).

Anyhow, my point is that many psychological disorders, such as bipolar disorder, are associated with vitamin deficiencies. If you look at the symptom lists for various B vitamin deficiencies, for instance, you'll see that it's already established what kinds of psychological effects can occur in cases of "extreme" deficiency. If we can get past the idea that nobody in America can have an extreme vitamin deficiency (you can have plenty of some vitamin and still be anemic if you can't USE it in that form), then we can start treating mental disorders with carefully controlled diets and supplement schedules. I'm sure it won't work on everyone, but it would be insane not to try it before loading people up on antipsychotics just because the "doctors" at mental hospitals have no fucking clue about nutrition.

And just to reinforce: folic acid is basically poison to about 10% of humans. Different vegetables contain methylfolate and/or folinic acid, NOT folic acid. Defects in the genes that encode the enzymes converting folinic acid and folic acid into methylfolate are more common than most food allergies, making this a serious problem!

One interesting side effect of this is the proliferation of the bad genes. People with a homozygous C677T mutation have only about a 30% conversion rate from folic acid to methylfolate. (Meanwhile, the unconverted folic acid itself interferes with the methylation cycle.) If a woman gets pregnant and takes folic acid in large quantities (which is what doctors instruct), the fetus will take all of the methylfolate, and the mother will get very sick. Meanwhile, the fetus will be allowed to develop when otherwise it would have naturally aborted due to its own inability to convert the folinic acid you get from food. As a result, more people are born with this defect, while the FDA and the medical profession are too ignorant of the consequences to deal with them properly. Mind you, if these mothers took methylfolate instead, the viability of this defect would increase, but at least they wouldn't get as sick.

Comment: Personalized medicine... and nutrition (Score 3, Interesting) 291

by Theovon (#47881899) Attached to: Link Between Salt and High Blood Pressure 'Overstated'

Yeah, much of what we "know" is being overturned. Some of the disinformation was probably created by food companies that wanted to make cheaper food. Back in the '70s we were told that fat was bad, so all these processed foods got lots of extra sugar instead. Now we find out that sugar is bad and you need to consume more of the right fats. We're also starting to see that the "food pyramid" they taught us about should basically be inverted. The food pyramid has more to do with cost (grains are cheap) than with nutrition.

Today, we know a hell of a lot about the impact of genetics, gut flora, and many other things that affect individuals differently. For instance, many people have mild sensitivities to various food proteins, though not always enough to notice more than some unexplained lethargy at unpredictable times after eating certain foods. Of course, for some people it's bad, like those with celiac disease.

Here's an interesting one: apparently, about 10% of the population (US or worldwide, I'm not sure) has a homozygous MTHFR C677T mutation. These people cannot effectively convert folic acid (which is artificial anyway) or folinic acid (found in lots of vegetables) into methylfolate. As a result, they suffer from massive B9 deficiencies (which indirectly cause others, like trouble absorbing B12). Moreover, it's not just that folic acid and folinic acid are useless to them; they're functionally poison, interfering with the normal operation of the methylation cycle. So these people need to take large quantities of methylfolate and cut out certain "healthy" vegetables. They also have to cut out "enriched" foods. We're starting to see a correlation between increasing health problems in these people and the mid-90s FDA mandate to enrich certain foods with folic acid. Lovely.

Comment: iWatch is a pretty cool prototype (Score 1) 730

by Theovon (#47865415) Attached to: Apple Announces Smartwatch, Bigger iPhones, Mobile Payments

If I had disposable income, I might get an iWatch for my wife (who actually uses a watch). This is a great little toy for people with some disposable income and an itch to collect expensive gadgets. It looks cool and probably has some great functionality.

For those complaining about it, they're expecting too much from a first-generation product. Give it a few years, and the features, battery life, and price will improve to a point where more of us would consider buying one. Meanwhile, Apple gets to test out ideas that will improve its later products, and some of what is learned from this will also positively impact other Apple products.

Some time next year, I'll go check one out in an Apple store for about a minute. (Which is about how long I can stand to be in an Apple store since I already know everything about the products before I go there, so I get bored quickly.)

Comment: Supply and demand (Score 1) 385

by Theovon (#47864631) Attached to: Unpopular Programming Languages That Are Still Lucrative

In general, having a niche skill pays more if you can find a job in that niche. Companies that need that skill know it's a niche skill and one way or another end up realizing they have to offer higher salaries to attract the right talent. On the other hand, if you are an expert in COBOL and go to work doing web programming, they're not going to care, and your COBOL knowledge will have no impact on your income.

That all being said, I don't understand the general fear people have of learning new languages. Sure, it takes time and effort, but it should be considered part of the craft. At Ohio State, the CSE department used to teach a language called Resolve in its beginning courses. It was DESIGNED to make things like data structures and algorithms easier to code, avoiding a lot of the cruft in other languages that is unrelated to the code that matters. But students complained and complained about learning a language they'd never use again. My thinking is that if they're going to be successful, engineers need to be willing and able to learn new languages on their own time.

At OSU, they eventually caved to student pressure and switched to Java. Why they chose Java is beyond me. Where I teach now, they use Python, and that makes a hell of a lot more sense to me. If you're entirely new to programming, you have to write a lot of Java boilerplate (from the perspective of the uninitiated) just to do simple things, and none of it is necessary in Python.
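To make the comparison concrete (a hypothetical beginner exercise, not from the original comment): squaring a number and printing it is two lines of Python, while Java buries those same two lines inside class and method declarations before any actual logic appears.

```python
# The whole program in Python: no scaffolding, just the logic.
n = 7
print(n * n)  # prints 49

# The equivalent Java program needs a class declaration and a
# public static void main method just to get started:
#
#   public class Square {
#       public static void main(String[] args) {
#           int n = 7;
#           System.out.println(n * n);
#       }
#   }
```

For a student in week one, every extra keyword (`public`, `static`, `void`, `String[]`) is something they must type on faith, which is exactly the kind of cruft the comment above is complaining about.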

Comment: Re:There is no slump in open positions (Score 1) 250

by Theovon (#47842715) Attached to: IT Job Hiring Slumps

Either that or they don't want to come to the DC area. Considering the massive cost of living there and the fact that "6 figures" can mean "just barely over 100K", maybe you're just not offering enough. I turned down a really nice offer in Arlington, VA, because $150K was worth substantially less there than $85K where I work now.

Probably the only place worse than DC would be NYC.

Comment: Re:Me too. I teach CS part time. (Score 1) 546

by Theovon (#47840325) Attached to: Does Learning To Code Outweigh a Degree In Computer Science?

In my university, we accept huge numbers of international students because they pay higher tuition. If we didn't do that, we wouldn't survive financially, because the state's economy is poor. So in terms of keeping the institution alive, this is the best thing to do. And in any case, this never seems to negatively impact our very good reputation in the northeast. (Besides, it's not like we're giving good grades to bad students anyhow.)

As long as those poor students aren't TOO distracting, the revenue they bring in is good for everyone else. (And some of them get brought around to actually finding the subject interesting.) Our domestic students are almost all very good. And there is always a nontrivial portion of the international students who are also very, very good. I like to think about the cases where a student who had trouble getting into other schools was given an opportunity to unexpectedly shine at ours. This happens plenty.

Comment: There is no slump in open positions (Score 4, Interesting) 250

by Theovon (#47840293) Attached to: IT Job Hiring Slumps

The companies say there aren't enough IT workers. The IT workers say there aren't enough jobs. It really comes down to there being huge numbers of IT workers but very few good ones.

As someone who educates CS students, I see the whole spectrum. There are lots of students who seriously have no interest in learning the material. All they care about is getting a diploma. Where I teach, those students don't make it all the way through the program, due to a combination of poor grades and being caught cheating. But when I was getting my undergrad degree, I was always angry about the fact that employers couldn't distinguish my A's from those of people who didn't actually learn the material.

Not surprisingly, supply and demand is a factor here. With low numbers of CS students, standards have to be lowered to keep the tuition revenue going. As the student population grows beyond capacity, schools are able to be more selective based on SAT scores, high school GPAs, and weed-out courses.

Comment: A "degree" is useless to those who don't care (Score 1) 546

by Theovon (#47822101) Attached to: Does Learning To Code Outweigh a Degree In Computer Science?

As someone who teaches computer science for a living, I can tell you that if you're only majoring in computer science because you think you need to get a degree, then the degree will be useless to you. You'll do the minimum work to pass (if that) and not retain anything you learned. Then you'll have a hell of a time trying to find a job. Employers have become jaded and assume that although you have to be a college graduate to apply, almost all college graduates are morons. This is because most of them are there just to get a degree, and employers have to go through gargantuan efforts to find those few who are actually good.

On the other hand, if you're the kind of person who is good at learning to code and you actually find computer science interesting, then getting a degree will help you immensely. If you're really smart, you would learn most of this stuff on your own anyway, but classes help you organize the knowledge, and professors can help you with the difficult questions. If you go to a good school, you'll learn more than you would alone, because degree programs direct you and force you to practice as you go. Finally, when you're done and graduate with good grades, you have verifiable evidence that you've been exposed to this knowledge. If you just learned it yourself, you'd have to ask employers to take your word for it, and they're not going to do that.

Also, let's not forget that finishing a degree is proof that you can start and finish a long-term project. It means you have an attention span and can be dependable. Being able to finish things is another rare trait that employers put a lot of effort into finding.

Comment: Re:All about the brand (Score 1) 132

by Theovon (#47806641) Attached to: Apple Reveals the Most Common Reasons That It Rejects Apps

I don't know what your professors were like, but I instruct my graders to (a) do the assignment themselves to get an objective set of answers (to later compare to mine) and (b) look for common "wrong" answers and evaluate them carefully. For (b), there are three reasons we might mark a "wrong" answer correct. One is that I just screwed up my own work. Another is that I screwed up the question. And another is that I may have given a misleading explanation that led students to commonly produce a wrong answer. We also consider carefully how many people got it "right." A few times I have just dropped a question from the grading and given extra credit to the few who got a good answer.

I suspect I put more time into grading than I "should" given tenure requirements, but I can't bring myself to do a shit job at teaching.

At least not intentionally. :)

Comment: Can the executive branch be held in contempt? (Score 1) 248

by Theovon (#47785227) Attached to: US Government Fights To Not Explain No-Fly List Selection Process

What would happen if the executive branch (which is supposed to enforce the law) simply refused to comply with a judicial order? Can someone be held in contempt? Who would take on the role of enforcing the judicial order (in terms of compelling the action or executing punishment)?
