While I agree that limiting intellectual discussion to the higher halls of the intelligentsia is dangerous, destructive, and ultimately pointless, it does help to get some perspective.
Blogs are often the realm of the amateur philosopher where people repeatedly rehash things that have been either established or disproven decades ago. They talk about other things, though, and provide information to people who don't have it yet. Just because this information is redundant to what's going on somewhere else doesn't mean it isn't useful.
But the writer is correct in that many people present arguments about things that really have no validity, and then they expect others to act on their arguments. The concept of "family" is a good example. What is a family? Why is an adoptive family less valuable than a genetic one? Why would gay parents be less loving than straight ones? Does a family suffer if it's a man and his mother raising a child instead of a man and the child's mother? Or is it all equivalent, if slightly different in a way that we personally disapprove of? We don't even bother to define the word "family" before we start insisting that "family values" are of paramount importance, and yet people fall for it anyway.
The mind-versus-brain argument is similar, but another order of magnitude tougher to understand. Nonetheless, people who have never bothered to think about the definitions make statements about them that they expect everyone else to take seriously. It's fine if we all recognize that this is all in fun, with maybe a bit of brain candy thrown in for good measure.
This is a question of who is being more rude - the person with the laptop, or the person asking to borrow it. Laptops are a personal resource, not a community one, and I somehow doubt that anyone is providing you with any benefit in exchange for its use, short of "not hating you for life."
There are lots of reasons for not letting others use your laptop. School resources get pretty beaten up over time, and you don't want your laptop to wind up looking like one of those. Battery life is a limited resource, not just the charge, but also the number of times you can charge the battery, and they are EXPENSIVE. My laptop now has three ports that have just given up the ghost from regular plugging and unplugging.
Let's face it. Nobody in high school NEEDS to check their email during school hours, but you do need to keep your laptop working. Damage caused by casual users is inevitable, not just to the OS, but to the hardware itself. Asking you to allow that damage simply because you're supposed to be nice is RUDE, and shows no respect for your property.
So, unfortunately, it's not just a matter of saying no, but of educating them regarding why borrowing your laptop is unacceptable. I hope this provides you with a good start.
Yes, "too good to be true" is exactly the description that I gave it five years ago when I ran into it. It doesn't help much that very few of the authorities on the subject will even recognize its existence, although most psychiatrists accept that the concept is sound.
However, both the scientific and the anecdotal evidence support it. All of the seventy or so studies performed with naltrexone and alcohol either support the results or at least fail to contradict them. Numerous people have taken up the process since the release of the book, and their success rate does seem to fall in the range of 4 out of 5. There are literally tens of thousands of people in Finland who have undergone the treatment with the same results.
The concept of it being usable for any addiction is close but not quite correct. It's been demonstrated to work for opiate addictions and for endorphin-based behavioral issues like kleptomania and gambling addiction. Smoking, however, isn't on the list, because nicotine addiction is acetylcholine-based.
You're smart enough to ask for a peer-reviewed article. Good. Here's one for you.
Well, no, not standard naltrexone therapy. Naltrexone is distributed with instructions not to drink, and it is often cocktailed with Antabuse, which makes you sick if you drink. The problem with this is that, if you don't drink, the urge to drink doesn't go away.
Given standard naltrexone therapy, most alcoholics will stay abstinent until the craving overwhelms them, and then give up the naltrexone and start drinking again.
I'll take this offline and we can compare notes.
The basis of the treatment can be summed up fairly quickly. Drinking alcohol releases endorphins, and the endorphins addict us to the alcohol with a force comparable to morphine addiction. Taking an endorphin blocker reverses this effect: drinking while blocked makes you lose interest in drinking over time.
The treatment that results from this effect is equally simple. You have the alcoholic take an endorphin blocker (naltrexone is typical) and then have them pursue their normal drinking habits. After about three to six months, 78% have a significantly reduced desire to drink, and 25% stop drinking entirely with no desire to pick it back up again. I think you can see how this would put Betty Ford out of business, and it is in direct opposition to AA.
The fine details are a little more complicated, but only because the treatment goes against a lot of intuition. For instance, most people expect it to have a "diet pill" effect, where it suppresses the urge to drink, and that's how naltrexone tends to be prescribed. Used that way, you'd actually have better results with a placebo, and people give up when it doesn't work.
But they wouldn't have to write a book if there were nothing else to say, would they?
Well, setting aside what "the man" has to say, the unusual barrage of snake oil that often comes with trying to find treatment for alcoholism is another solid reason why this treatment has had difficulty with adoption.
Unlike other treatments, though, this one is backed up by about seventy studies, and has a fairly large one that specifically identifies its effectiveness at around 78%.
Most of what we call alcoholism has been cured. The problem is that anybody who might tell alcoholics about it is either financially or emotionally invested in an existing treatment. It's like religion (see responses to this post as demonstration), and it's very frustrating.
For all the details, see the recently published book on the topic. I'm not selling the book, and if you want the details for free, I can provide you with that, too.
When an idea becomes one of the foundations of our world view, then any threat to it is like a threat to our own body. So you lose a finger. Big deal. You can just say that you didn't need that finger to keep your arm, right?
Religion is like that. When new information conflicts with what we insist must be true, it causes cognitive dissonance. CD is like the evil opposite of the sense of beauty we all experience when we listen to great music or check out a hot bod: instead of feeling that this is something we want more of, when something doesn't mesh with what we're absolutely certain of, we get the feeling that it's something we want less of. The more people stick it in our faces, the less we want of it, until we start using legislation to keep it away.
There really ARE good applications of cognitive dissonance, but this isn't one of them.
Microsoft makes their INCREDIBLY EXPENSIVE knockoff (which still flops).
These people wouldn't be dangerous if they weren't brilliant. It's something called "the halo effect": when someone does one thing well, everyone starts to assume that they do everything well.
I've seen people like that ruin entire departments. They can code like a demon and produce spectacular, extremely functional software. They do great things and then move on to the next project. They start to form a following, which adds to their renown. Small religions form around them.
And then some poor schmuck is handed last year's effort to make a few minor adjustments and finds that it's thoroughly undocumented and uncommented. The call structure averages thirty functions deep, there are more interfaces than there are classes and as many classes as there are functions. Everything is extremely efficient because there is no segmentation of functionality, and subsections have no clear interface boundaries. But because this person is such a great corporate asset, said poor schmuck has absolutely no traction with management in terms of calling attention to this.
More years go by while the superstar wraps entire departments around their coding habits. Great chunks of the company's IP are written in this person's style because he's become a shining example of how things should be done. Maintenance costs go through the roof until enough software engineers with a clue point out the problems and insist on policies to address them, and major projects have to be initiated to rewrite huge chunks of incomprehensible spaghetti code.
At which point our shining prima donna takes his experience and awards and goes and finds a job with a fat salary at some other poor company.
One of the things I looked for this most recent time around was a phone that didn't feel like a brick in my pocket or make my keys smash into my leg. That phone does some cool stuff, but it's darn near a laptop. No thank you.
Speaking as a long-term member of the software engineering community, IMHO you're covered and can reasonably get out. Your previous employer is being greedy and wants more of a good thing.
Since you're part of a development effort and may have a body of unique knowledge, it would be a good idea to offer to help them transition in a new person to replace you, and to be willing to answer questions they might have on a contract basis after you've left. This is really up to you, though. You're not required to do this, and if they get mean or greedy about it you should definitely cut them off.
This is no more than we can expect from employers these days. We get our two weeks' severance and we're out the door, and we consider ourselves lucky to get that. You may want to weigh any personal relationship you have with management, but professionally speaking, I think you're doing just fine.
The program isn't debugged until the last user is dead.