I swear I remember doing this exact thing when I was a kid.
Much as most don’t understand the scientific definition of “theory,” you seem to be using the wrong definition of “doubt.”
Proper scientists recognize that a currently held theory is merely the best explanation we have so far for a phenomenon. In light of the evidence, they believe it’s PROBABLY MOSTLY true, but they are readily willing to accept that it isn’t if new evidence demonstrates that the older theory doesn’t explain all the facts. This isn’t “doubt” so much as “critical thinking.”
The doubters the article is referring to are people who, DESPITE the evidence, believe the theory is NOT true. Of course, most of them are painfully unaware of the evidence, they have no idea how to get to it, and they wouldn’t know how to interpret it if they had it. A lot of that is due to a broken educational system.
People say there’s “mounds of evidence” for evolution. So I’ve asked biologists whether there is a compendium of the major publications in the area, but I didn’t get very far. There are decent college textbooks, but many don’t present the original evidence; they only recount the findings from the literature. Part of the problem is that most of the “evidence” is boring tables of measurements of fossils and bones. If you don’t know what the numbers mean and how they relate, they’re just numbers. They ARE the evidence, but they don’t help the layman at all. Another part of the problem is that any summary of the evidence would leave out too much. A proper treatment of the topic would be on the order of “every peer-reviewed publication on the topic since Darwin.” This is because publications cite each other so they don’t have to reinvent the wheel. They make “assumptions” they don’t have to justify because someone else already did, but it’s a major undertaking to follow all the rabbit holes. Biology PhDs have trouble with that. A farmer will be hopelessly lost.
With most sciences, most people are clueless. But since they have no other reason to doubt it, this doesn’t cause any conflict. People have heard of chemistry and astronomy and mostly just consider them to be overly difficult or esoteric. It’s only biology (and some of cosmology) that makes any statements that go against things people have been taught to believe. They have no hope of understanding the science, but they do believe what their religious leaders tell them, and there is nothing intelligible to them that says otherwise.
It’s this lack of understanding of what “common folk” go through that makes me really angry with people like Richard Dawkins. As far as many people are concerned, he’s nothing more than an arrogant jerk who thinks that everyone who believes differently from him is a moron. I’ve seen dozens of videos of him on YouTube, and I never see him present evidence. He merely claims that it’s there and believes that it should just be obvious to everyone what it means. It’s like me (the computer nerd) in high school, treating people unkindly because they didn’t understand computers as well as I did. Now I’m a CS professor, and I have to teach basic CS concepts to young adults. It’s VERY challenging to get some concepts across, but I work hard to do it. Dawkins is terrible at this. Perhaps if he deigned to teach an undergraduate course now and then, he might do okay, but he strikes me as one of those all-too-common lecturers who has no patience for anyone who questions what he says. His attitude reminds me of so many religious people who insist that you’ll go to hell if you don’t believe blindly exactly as they do. I guess calling someone a moron isn’t as bad as telling them they’re going to hell, but it’s a similarly intolerant attitude, intolerant of people who don’t share your training or ideology.
To me, people who doubt evolution are very unfortunate. It’s sad to miss out on such interesting science or to go through life believing things that are provably false (note that not all of religion is provably false, but the world clearly wasn’t created in 6 literal days), given a long history of scientific discovery. And I’m also really afraid of some of them putting their beliefs into legislation. But I think it’s important to understand that these people have not been equipped by their environment (culture, schooling, etc.) to understand some very abstract ideas. Only once you accept that these people are not evil for what they believe can you start finding clever and interesting ways to convey what we have learned from science and why we believe certain theories are probably correct.
Usually double-blind is a good thing, like when doing a scientific study or reviewing one. But in the case of Google, the hiring method (for software engineers) involves a sequence of engineers asking you to solve toy problems and scribbling notes on a single sheet of paper. That single sheet of paper is mostly what the hiring committee sees, along with your resume (which nobody looks at any more than superficially) and maybe some comments from your recruiter. There is absolutely no consideration of things like personality, teamwork, cross-pollination from other fields, or even CS disciplines outside of software engineering (they do 90% algorithms, 10% computational complexity, 0% operating systems, 0% computer architecture, 0% programming language theory, 0% anything else).
I have a PhD in computer engineering, and I currently work as a CS professor at a major SUNY research center. At Google’s request (they called me!), I interviewed at Google's NYC office for a software engineering position (although my research area is computer architecture, which they didn’t quite seem to understand). I went there, I was friendly and didn’t stick my foot in my mouth, and I answered all of their algorithms questions (some I could have done better, but I think I did a good job). A few weeks later, I got a call from my recruiter. They were declining to make me an offer for two reasons. One was some vague statement about me not fitting with their culture. No idea why. The other was that I appeared to have jumped around jobs too much. That last one made no sense. I worked one industry job for almost a decade, then I went to grad school (where I worked as a research assistant and did a couple of internships), and then I got hired as a professor. How does that constitute jumping around too much?
I checked out Google’s hiring practices on Glassdoor (before I interviewed, of course), and I see a similar trend. Google has no compunctions against wasting people’s time. They regularly cold call people to interview and then decline to make an offer, even people with doctoral degrees and/or substantial industry experience. I have two good friends who work at Google, and they’re brilliant at computer science theory, but even so, I still really don’t know what Google is looking for.
Of course, maybe I just suck, and Google figured it out. I doubt it, though. I have a PhD from Ohio State, my dissertation is 120 pages (not including references), and I currently have 13 major publications, three at top-tier conferences, first author on 9. I recently won an NSF CAREER award ($450,000 over 5 years). I started the Open Graphics project, which is basically dead right now but did produce real open source graphics hardware. And before all that, I worked in a small company where I had to do everything from tech support to IT to software development in a dozen languages to chip design. Among many other things I can’t keep track of, I designed a graphics accelerator ASIC that’s present in most air traffic control towers around the US. In the early ’90s I released ANSITerm for the Atari ST, which was very popular at the time and is still a very popular BBS terminal program among retro computing enthusiasts. I’m pretty sure I don’t suck.
When you buy a medical prosthetic from a medical company, almost none of the sticker price covers materials or basic engineering. Most of the money is split between liability insurance and extra R&D and testing overhead to make damn sure that someone won’t misuse the device, thereby generating a lawsuit. In law, products liability is a huge area; big companies have deep pockets and often lose in suits where the user of their product was clearly doing something really stupid. (Chain saw instructions: Do not use hands to stop chain!) The fact is, people are sue-happy, and that’s the primary reason why all medical devices cost so damn much.
If someone is selling 3D printed prosthetics, they are GOING to get sued, and they’ll get put out of business very quickly by some moron who found a way to hurt themselves in a heretofore never-conceived-of manner. It’s just inevitable.
If someone were to make open source designs available for prosthetics so that people could print them themselves, you’d think that the user would be taking all the liability into their own hands, right? Ha! When something goes wrong, the maker of the 3D printer will get sued. And no matter what kind of disclaimer they put on it, the maker of the 3D schematics will get sued too. All because people find amazing ways to hurt themselves and sue over it. Especially with medical devices.
Why do you think airline food is so damn expensive? When something goes wrong with a plane, everyone gets named in the suit. The airline, the airplane manufacturer, all subcontractors of said manufacturer (including the company that made the rivets), the supplier of the airline food, the pilot, you name it.
When my parents were threatened and decided to get guns, they both trained hard. My dad is a really accurate shot, and my mom is even better. Unfortunately, not everyone is quite so well trained. And some people seem to think guns are toys that you don’t have to be careful with. I’d be in favor of stricter regulations on certifications beyond what you get from an internet course. But I also think it should be a lot harder to get a driver’s license. Too many idiots on the road.
As for our right to bear arms, there are two reasons why that right should never be taken away: (1) the government hasn’t demonstrated its ability to protect people otherwise and also often borders on presenting its own threats to the people (I don’t advocate violence against government representatives, but the government DOES need to generally operate in fear of at least political retribution from the people it’s supposed to serve), and (2) we’re too reliant on the government anyway — I think people need to be more self-reliant, not reliant on the nanny state.
In the US, state and federal governments no longer function to serve the people. They fund themselves at gunpoint, taxing the shit out of citizens, but operating primarily to further the goals of a minority of big corporations and major political party agendas that keep politicians fat and in office. Andrew Cuomo has given talks about restoring trust in the government. The government should NEVER be TRUSTED (per se). Trust doesn’t enter into it. It needs to be a centralized institution that pools resources and enacts laws to meet the needs of the people, and it is “trusted” only so far as the people judge its success at those goals, based on sets of internal and external checks and balances.
I trust federal and state governments like I trust Windows: It’s okay as long as I run antivirus software and firewalls and perform regular reboots. And watch it carefully and aggressively save my work and perform regular backups and perform regular maintenance, etc., etc.
I’m not saying that ALL companies are like this, but in many of the larger ones, the first people looking at your resume are non-technical. Many just have a checklist, and if the overworked HR person looking at your resume does not perceive that you have every one of the listed qualifications, it goes straight into the bin. An over-abundance of applicants leads to a superficial and stochastic filtering process that isn’t especially good at figuring out which applicants can do the job.
I’ve worked as an engineer, and now I’m faculty in a CS department. On a note unrelated to the above, I find that it’s easier to get a job with a CS degree than with other major engineering degrees. Not necessarily a GOOD job, though. Compared to EE, for instance, there are way more jobs for CS graduates, although many of them are low-paying grunt work that could indeed be done by lots of people with only a high school diploma.
Except that they won’t hire people without the degree, because it’s one of the required checkboxes on the HR form.
We’re exposed to environmental toxins CONSTANTLY. Vaccines, we give once every month or so? Wouldn’t it be a more productive use of our energies to clean up our environment and diets?
Of course, it costs no money to avoid vaccines, and all it requires is a bunch of whining about conspiracies. Eating better and not ruining the planet actually takes EFFORT, something many Americans don’t seem to believe in much.
It’s not mythical that some vaccines used to contain thiomersal, a mercury-based preservative. This was replaced with an aluminum compound, and aluminum is correlated with diseases like Alzheimer’s. Of course, we have no evidence that aluminum accumulation causes Alzheimer’s; it could just as well accumulate as a side-effect. Still, it’s cause for investigation. Some flu vaccines are grown in chicken eggs, which may be of concern to someone who has an allergy to eggs. In general, most preservatives aren’t a good thing to be putting into your body, although I’m at a loss how else you’d give vaccines a reasonable shelf life.
As for autism, there is a growing but confusing and often conflicting body of evidence that it is associated with a variety of different things: inability of the liver to keep up with metabolizing toxins, over-activation of the immune system, food sensitivities, and a number of things I can’t remember right off. Actually, the three I listed aren’t entirely unrelated. Food sensitivities can cause heightened immune response (depending on the nature of the sensitivity), some of which are auto-immune, like celiac disease. As for the liver, I don’t fully understand its role, but there seems to be some issue with competition for a limited resource (which is why taking too much Tylenol and/or alcohol can cause liver damage), and it’s involved in doing some cleanup during immune response, I think, and if your body is busy dealing with a pathogen (perceived or real) then it won’t deal with other brain-affecting toxins well enough. (If you want to spend the time to check this, please do.)
One hypothesis regarding autism is that there is an accumulation of toxins in the system that the liver can’t keep up with, and those toxins impair brain function. If you eliminate foods you’re sensitive to, the liver has less work to do and can better keep up with the remaining toxin workload.
So the reasoning seems to be that vaccines cause an overactivation of the immune system and that that response is somehow different from the normal one if you contract the real disease, that over-activation lasts a long time, and during that period, the liver is too busy to metabolize toxins that cause autism.
Ok, fine. Let’s go with that. So vaccines may add ONE contributing factor that may, in some circumstances, overload liver function. Also, so do allergenic foods, polluted air, polluted ground water, BPA, pesticides, etc., etc. But the one thing they pick on is vaccines? Of course, because we HAVE to eat our shitty American diet and drive our gas-guzzling cars and blast our farms with neurotoxins. Oh, NO. We couldn’t possibly boycott those other things with the same vehemence (and possibly ignorance) that we do with vaccines!
So my opinion is this. If you think that vaccines cause autism and you’re being a responsible parent by keeping your kids off vaccines, then you’re a moron unless you also:
- Drive only solar electric vehicles or use horses
- Use reverse osmosis and only glass containers for ALL of your water consumption
- Eat a 100% organic paleo diet
Just to name a few. Because only then will you at least have some semblance of consistency in your reasoning. I can’t say for sure whether or not you’d be RIGHT, but at least you’d be CONSISTENT.
As for me, I get my kids vaccinated, but we also eat a mostly organic diet, high in nutrients, low in junk food, and we filter our water. Also, we live out in the country and get fresh air. So IF there is some kind of convoluted link between vaccines and autism, I think we’ve more than offset that risk by removing some of the OTHER potential environmental factors sometimes vaguely linked with autism. Also, we feel better because we eat healthier food, and I’ve lost 30 lbs (down from almost 190) since December 2013 by putting myself on the paleo diet (actually, it’s SCD, but you’ve probably never heard of it). BTW, although my wife and I both have family histories of ASD, neither of our kids shows any sign of it, despite the fact that they get vaccinated.
Only problem is that many MDs I have met are just as much quacks. They superficially assess the symptoms and prescribe something that only treats the symptom so they can charge the insurance company a ridiculous fee and move on to the next patient.
Urticaria, for instance, can be a symptom of a number of more or less serious underlying causes. Most doctors will merely prescribe an antihistamine. An antihistamine is a good short-term measure to make the patient feel better, but the symptom should also be cause for concern and prompt deeper investigation. That almost never happens.
“Alternative” doesn’t enter into it. “Lazy” is the word we should be using here.
Sorry. I remembered it wrong.
The only way to know what alternative medicines to take is to educate oneself. Unfortunately, that is a daunting task for people not experienced in research.
My wife and I have had health problems that were helped only through the assistance of some alternative medicine practitioners. There was this one nutritionist we went to in Ohio, and her main advantage over the typical MD was that she was willing to investigate to figure out underlying causes. MDs invariably would dismiss us because they were unfamiliar with our ailments and were never ever interested in spending more than 15 minutes on a patient. They would NEVER do research. Even specialists weren’t interested. We went around in circles for years, never getting any help, and a lot of the advice they’d give us would directly contradict advice we’d get from other MDs and also from articles we’d read in places like JAMA.
The thing with MDs is that they’re really just normal people who are a bit smarter than average and have advanced clinical degrees. Very few of them want to go into research. Most just want to do basic practice. Just like my PhD in computer science doesn’t make me an expert in all of CS or competent to teach all areas, an MD doesn’t make you magically able to treat every illness. And when you get into something super unusual, an MD is unlikely to know about it, even if you manage to find the right kind of specialist. (I’ve noticed, for instance, that most endocrinologists don’t know a damn thing about thyroid disorders because they all specialize in diabetes.) In my life, I’ve only met a couple of MDs who were super smart and had a mind for research and advanced diagnosis. Most are just people who want to do a regular job and not get sued for malpractice.
So, like so many other people not helped by mainstream medicine, we turned to alternative practitioners. (Some MDs, more DOs and nutritionists. We haven’t gone to any naturopaths.) Occasionally, one would suggest something homeopathic, and we would just ignore it. But what they did that was useful was run tests that regular MDs wouldn’t think to run. For instance, we found out that we had protozoan infections because our nutritionist had us submit fecal samples to a lab that does diagnosis by DNA testing. The treatment involved presenting the findings to a DO who wrote us prescriptions for tinidazole, which is a standard anti-parasitic medicine. So, the irony is that in order to diagnose our condition, we had to go to an alternative practitioner who was interested in actually doing diagnosis and did that by running standard blood and fecal tests and treating problems with standard pharmaceuticals. Who’d have thunk it.
However, there are numerous herbal and natural treatments that work because they’re based on similar chemicals to those found in regular medicine. Here are but a few examples of “alternative treatments” that work:
- Taking 5-HTP instead of an SSRI to treat depression (it’s a precursor to serotonin that easily passes through the blood-brain barrier)
- Taking desiccated porcine thyroid gland for subclinical hypothyroidism (because it contains all the thyroid hormones)
- Taking desiccated bovine adrenal gland for norepinephrine and cortisol insufficiency (because it contains them)
- Using oil of oregano to treat some kinds of microbial infections (because it’s antimicrobial)
- Taking goitrogens concentrated from cruciferous vegetables to treat hyperthyroidism
- Using a neti pot to clean out the upper respiratory system to help clear/drain infections faster
- Taking low-dose naltrexone to treat fatigue and auto-immune disorders (this treatment is shifting from alternative to mainstream now)
- Eating less grain or eliminating it altogether to improve digestive function
- Identifying food allergies/sensitivities and eliminating those foods to reduce misdirected immune response and tissue inflammation
- Eating a diet high in probiotics and cultured foods
- Supplementing with a variety of amino acids and neurotransmitters to help with mood problems (e.g. theanine, which is a great mood enhancer)
- Having willow bark tea if you feel like it instead of taking aspirin (because willow bark contains salicin, which the body converts to salicylic acid, the same active form that aspirin’s acetylsalicylic acid breaks down into)
- Drinking green tea for a variety of health benefits, including antioxidants
- Echinacea for immune support
- Rhodiola rosea to help with energy and concentration problems
- Valerian to aid with sleeping
Different people reading this will disagree on which of these are alternative and which are effectively mainstream. Some alternative treatments have undesirable side-effects as well as benefits. Some have purported benefits that are questionable. Basically all of them are unregulated, so you don’t always know what dosage you’re getting. But a lot of them work, and to know which do, you have to research them carefully. For every herbal treatment you’d consider, there’s a Wikipedia article that tells you all about the pros and cons.
Homeopathy is bizarre, though. Supposedly you dilute something so much that there’s nothing left. How in the hell does that help you? Because you drink more water? But I know perfectly rational people who tell me about problems that persisted until they tried homeopathy. If it worked, why? Is it psychological? Or might it be the case that not all homeopathic treatments are really “homeopathic” and that they actually contain active ingredients? But what then are you taking? Scary. At least with ginkgo, you can read about the chemical content and look at peer-reviewed studies that tell you what the good and bad effects are. With homeopathy, you have no idea, because it’s shrouded in mystery and magic.
All three of my installations of Windows are completely legit, and I intend to keep it that way. However, I don’t use Windows enough that I feel any urge to upgrade. Two are in VMs, and one is on a super-old laptop that I let my kids use. The two in VMs may be upgradable, the laptop probably not. But why do I want to spend the money to upgrade something I don’t use much? Actually, one of those XPs may get upgraded, but only because I’m getting a company I consult for to pay for it.
And of course, everyone else on slashdot waits with bated breath to see you and your ilk post grammar complaints. Surely, nobody just gets over the minor typos and actually concentrates on the article, which (unlike so many other articles) is actually really interesting news for nerds.
What astounds me is the arrogance of some people who seem to imply by their behavior that they believe that they themselves never make mistakes. I would assert that losing sight of the forest for the trees (what’s more interesting, quantum telescopes or complaints about grammar?) IS a mistake, which means that you ARE imperfect and therefore might want to stop making yourself look like an ass by unnecessarily complaining about grammar and spelling.
Some people have a really hard time separating “truth” from “fact,” and they also have difficulty with how these relate to science.
A novel may contain truth, in that it is not a factual account of anything, but you might learn a life lesson from it. Indeed, many children’s books (and certainly many other genres) are specifically intended to teach valuable lessons. Religious practitioners often conflate the two. If your scriptures are (as they are taught) “true,” does that mean they are factual? You might learn something from the Bible, but there are many things in it that are provably non-historical, consistent with the Hebrew penchant for taking other people’s oral traditions and adding a “moral” component. Anything historical in it doesn’t necessarily convey useful truth, and anything non-historical is not necessarily devoid of truth.
Conversely, it’s common to mislead by the use of facts. Propagandists often present accurate factual information, followed by specious reasoning that leads the listener to an incorrect conclusion. It’s all in how you present things, what you emphasize, what you downplay, and what teleological conjectures you want to draw to explain those facts. Politicians are brilliant at making the statistics say whatever they want.
Then there’s science. It is indeed fact-based. And we hope that it is true. But in fact, it is not a truth generating engine. It is a MODEL generating engine. A true model is, of course, better than one that is merely numerically accurate, but there’s a limit to how sure you can be (or maybe even care) just how true it is. Sometimes, you just need something predictive. A recent Ars article about zebra stripes mentioned how scientists developed and tested several different explanatory models before they found one or two that were fully consistent with all of the facts. Every single one of those models, even the wrong ones, was scientific, because they were falsifiable (a term that few people really understand). Another example is the prevailing theory of the moon’s origin; we have a model that is consistent with what we can measure today, but there’s so little physical evidence that we only accept the model because we lack any better explanation. If it turned out to be wrong, nobody would be the least bit surprised. Even a blind, non-explanatory model, like using a neural net for numerical regression, is of scientific value, because it can be used to do engineering, and it may aid in further analysis that leads to a falsifiable explanatory model. Once a model has gone from postulate to hypothesis to theory, consistent with the evidence, we can say that it is consistent with the FACTS, but as for truth, we can only say that it is PROBABLY MOSTLY TRUE. Each time we discover some more evidence that we haven’t explained or which contradicts the model, we have to adjust it, making it incrementally more probably mostly true.
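Tangentially, the “blind but predictive model” idea is easy to demonstrate in a few lines of code. This is just a sketch of my own (plain Python, ordinary gradient-descent curve fitting, made-up data, not anything from the article): the fitted parameters predict new points just fine, yet they carry zero explanation of whatever process actually generated the data.

```python
def fit_linear(xs, ys, lr=0.01, steps=5000):
    """Fit y = w*x + b by minimizing squared error with gradient descent.

    The result is purely predictive: w and b say nothing about WHY
    x and y are related, only how to interpolate between observations.
    """
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of mean squared error with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Observations from some hidden process the model knows nothing about
# (here, roughly y = 2x + 1 plus noise).
xs = [0, 1, 2, 3, 4, 5]
ys = [1.0, 3.1, 4.9, 7.2, 8.8, 11.1]

w, b = fit_linear(xs, ys)
print(round(w, 1), round(b, 1))  # recovers roughly w=2.0, b=1.0
```

The fit can forecast y for a new x within the sampled range, which is engineering value; turning it into science would mean proposing a falsifiable mechanism for why the slope is what it is.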
This brings me to pseudoscientific ideas like intelligent design. Even if it were, hypothetically, true, it isn’t and can’t be science. Why? Because it isn’t falsifiable. Anything you can’t explain, you can dismiss as being the result of some outsider tweak, so it’s impossible to prove it wrong. It’s also not predictive. It makes no interesting testable claims that evolutionary theory doesn’t, so it doesn’t yield any new knowledge. Finally, it’s useless to engineering. Not all scientific theories are necessarily going to be used by engineers, but in the case of intelligent design, it CAN’T be. A potentially useful scientific theory must be based entirely on predictable naturalistic mechanisms. This way, engineers can develop new systems that rely on or leverage those natural phenomena. Intelligent design, on the other hand, requires miracles or alien interference that we’re (by definition) too primitive to understand. And unfortunately, engineers can’t perform magic and don’t have access to alien hyperspace nano-wormhole entanglement bioengineering technology.