The alcohol flush reaction they refer to isn't just about feeling unpleasant. Yes, people with two copies of the gene variant rarely drink. But those with only one copy (that is, with some working enzyme to metabolize acetaldehyde), while less likely to drink, often do so anyway, because they can tolerate it and they enjoy the feeling or feel socially obligated. And when they do, they raise their risk of esophageal cancer (and, I believe, a few other cancers) significantly more than someone who drinks the same amount but lacks the flush reaction. Acetaldehyde is highly carcinogenic; most people just get rid of it quickly enough to limit the damage.
In short: if you give this to alcoholics, a large number of them will tolerate the side effects, and you've just dramatically increased their risk of cancer.
- Learn the theory, identify weaknesses (not gaps in your knowledge) and develop experiments to confirm your doubts
- Come up with a better model that either rests on simpler assumptions (no, "God did it" is not simpler, because it would take thousands of assumptions to explain how he manages to exist in a way that is undetectable and yet constantly alters reality) plus some evidence from experiments or studies, or rests on more assumptions but has strong experimental or observational support.
I can come up with hypotheticals all day. None of them require much in the way of assumptions. My previous argument rested on four pillars: 1. Stronger eggs survive in more situations. 2. Stronger eggs require either stronger hatchlings or better tools. 3. Stronger hatchlings require more energy, and therefore tend to fare worse in times of drought and famine than weaker hatchlings. 4. Existing species carry genetic variation that can shift by degrees without new mutations. I doubt you have any significant problem with any of those assumptions, yet the result is somehow unbelievable to you.
Only because you make invalid assumptions about how it must have evolved. Let's start with an amphibian and its egg. Now let's say a mutation makes the exterior a bit more rubbery. Initially, 10% of hatchlings that could have escaped the old exterior can't get out of the tougher one, but 10% more eggs survive being trodden on by large animals. Except it's not static: each generation that gets out of the egg carries a greater concentration of the genes that give it the strength to escape the tougher egg. Repeat the process a dozen times over the course of a million years. Eventually you reach an equilibrium: the shell can't get any tougher, because the resources needed to escape it are expensive enough that the animal would have a higher energy burn and fare poorly in times of drought or famine.
Fast forward a few tens of thousands of years. Another mutation causes the animal to develop one tooth earlier than it should. It's weak, but it allows weaker hatchlings to escape an egg of equivalent strength. The mutation spreads, aided by the occasional drought or famine in which the "weaker" animals survive. Later, another mutation makes this early, poorly formed tooth drop off; it was getting in the way, and it's better to grow strong teeth later. The eggshell toughens further and starts becoming less water-permeable as some individuals find a niche laying eggs near the water line, where egg-eating marine life has less access to them.
Lather, rinse, repeat. Tougher, less water-permeable eggs survive more often, and in more places. Small changes can be absorbed by existing intra-species variation, but if a novel mutation arises that handles the costs of the new strategy more effectively, selective pressure will spread it. Follow this chain of events for a hundred million years and you get from fish to amphibian, and from amphibian to reptile. It's not a whole bunch of lucky coincidences at once; it's one coincidence, adaptation to take advantage of it, then another coincidence and further adaptation, over and over, across millions upon millions of years. It took billions of years to go from single-celled life to multicellular life, a hundred million more to go from marine life to amphibians, and so on. This is a mind-boggling scale of time; continents circled the globe in the time it took mammals to evolve from reptiles. You don't see the continents shifting, but it happens all the same.
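The "repeat the process" dynamic is easy to underestimate, so here's a toy sketch of it. The function, the 1% fitness advantage, and the starting frequency are all hypothetical numbers chosen only to illustrate how a small, consistent edge compounds over generations; this is a textbook haploid-selection recurrence, not a model of any real species.

```python
# Toy model: a gene variant with a small, consistent advantage
# spreading through a population. All numbers are hypothetical.

def allele_frequency(p0, advantage, generations):
    """Track the frequency of a beneficial variant under simple
    selection: carriers leave (1 + advantage) times as many
    offspring as non-carriers each generation."""
    p = p0
    history = [p]
    for _ in range(generations):
        w = 1.0 + advantage              # relative fitness of carriers
        p = (p * w) / (p * w + (1 - p))  # renormalize to a frequency
        history.append(p)
    return history

# A mere 1% advantage, starting from one carrier in a thousand:
freqs = allele_frequency(p0=0.001, advantage=0.01, generations=2000)
print(f"start: {freqs[0]:.3f}, "
      f"after 1000 generations: {freqs[1000]:.3f}, "
      f"after 2000 generations: {freqs[2000]:.3f}")
```

Run it and the variant goes from vanishingly rare to nearly universal in a couple of thousand generations, which for many animals is an eyeblink against the hundred-million-year spans above.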
The tiny changes and recombinations occurring in animals today won't produce many new species "naturally" in your lifetime, but over the next 10,000 years? A million years? A hundred million? I wouldn't bet on animal life remaining unchanged.
There's good money in it, assuming you can find motivation in making the already absurdly wealthy incrementally richer. I spent time at a hedge fund; it paid better than any job I've had before or since, but it was really hard to go to work every morning, because I felt no sense of accomplishment. I just felt like I was squandering my education skimming off the work of others (see high-frequency trading, the entire speculative commodity futures market, etc.).
The few people who benefited from my work (besides myself) were already so wealthy (the minimum net worth requirements are ridiculous) that every single one of them could stick their money in a savings account and spend it at a rate of $200K a year for the rest of their lives with no risk of going broke. Hard to get excited by the prospect of letting them spend $300K a year instead...
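To put rough numbers behind that claim: the $10M net worth below is a hypothetical figure for illustration (actual fund minimums varied), but at that level even a modest 2% savings rate generates the entire $200K burn, so the principal never shrinks.

```python
# Back-of-the-envelope on "spend $200K/year forever with no risk".
# net_worth is hypothetical; real minimums differed fund to fund.
net_worth = 10_000_000   # assumed starting balance, dollars
burn = 200_000           # spending per year, dollars
rate = 0.02              # modest savings-account interest

balance = net_worth
for year in range(50):
    balance = balance * (1 + rate) - burn  # earn interest, then spend

print(round(balance))  # interest exactly offsets the burn: 10000000
```

At 2%, interest income (0.02 × $10M = $200K) equals the spending, so fifty years later the balance is untouched; any higher rate and the pile grows while they spend.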
> but this is almost the definition of monopolistic behavior.

They only have like 5% of the market?
Closer to 10% now, though your point still stands. That said, it depends on where you draw the line between products. Sure, virtually any application could be written to run on virtually any OS. But if you want to run OSX-exclusive apps without reinventing them from scratch (which runs into all sorts of other IP law), OSX is your only choice. If Apple machines were some sort of special-purpose device, the argument for tying them together would be stronger, but they're clearly not special-purpose: the software is sold separately, the hardware is off-the-shelf, etc.
I'm not saying you're wrong. But there is something very odd about a business model that becomes illegal simply by growing in market share. And if OSX were really "just" another desktop OS, then no one would bother making clones. But if you treat Apple as having a monopoly on "OSX" rather than a small share of the "desktop OS" market, then the picture is very different. There's nothing wrong with having a monopoly on OSX, but abusing the monopoly to improve sales of their other product lines is problematic.
Of course? If they sell the software separately, what makes it so obvious that they have the right to say how it will be used? We don't have this sort of system for physical objects: if I buy a car, I can do whatever I want with it (within the law) without checking the rules laid down by the manufacturer. Sure, it may void the warranty, but it's not illegal. Beyond that, lots of software specifies the OS it's supposed to run on. If I run a Windows app under WINE, have I somehow broken the law?
It's a much harder line to draw than you make it seem. In my opinion, Apple might be in the right on this specific point, but this is almost the definition of monopolistic behavior. Only Apple can sell OSX, and they're using the software monopoly to artificially prop up their hardware division.