A "vacation" is a work week wherein you're only mandated to do urgent things, like fix actual problems (usually: printers). You don't have to work on any long-term projects, and you don't have to sit at your desk certain hours just because it's that time of day. If you somehow manage to get all of the computers in the building working right, you can actually leave for a while, until they have another problem and call you.
It's not straightforward to convert the cost into dollars. There's an opportunity cost, because the people who are working on the XP codebase could be doing other things. If they're at all good at what they do, Microsoft would much prefer to have them working on other things (say, on bug fixes for Eight).
Part of the problem with maintaining an old code branch is that at some point you have to decide whether you actually want to keep maintaining it. Sooner or later the answer is always no: the newer versions are better, and we no longer want to mess with doing X on the old version. Over time, the scope of X keeps growing. There's an inherent progression here, because the less work you do on the old branch, the more obsolete, less familiar, and less well maintained it becomes.

When you stop doing new feature work on a branch because you're getting ready for release and want to shake out the bugs, you have entered the "Golden Age" for that branch of the code and started down an inevitable slope. Without feature work, there is no motivation for infrastructure work or refactoring. With nobody doing feature work, infrastructure work, or refactoring on the codebase, familiarity with it fades. Bugs take longer to track down and fix. Worse, the consequences of any change you make are no longer obvious to anyone (because, remember, nobody is intimately familiar with this branch of the code any more), and meanwhile users have come to expect a certain level of stability, so the amount of testing needed for each change increases.

At some point, bugs that aren't security relevant and don't cause loss of data no longer seem worth fixing, so you stop bothering. Now your developers spend even less time working with -- and are even less familiar with -- the code. You go from "bug fixes only" to "important bug fixes only" to "critical and security-relevant bug fixes only" to "security fixes only" and eventually "critical security fixes only", and sooner or later you throw in the towel entirely.
This is not specific to Microsoft. Ask the guys at Debian why they no longer provide security updates for sarge (which is newer than XP by several years; in fact, I think it's newer than SP2). They no longer provide security updates for etch or lenny either. Updates are available for stable (currently, that's wheezy) and oldstable (currently squeeze). The precise economics of how security updates are provided and what resources are expended in providing them are of course very different for Debian as compared to Microsoft. But certain things are the same, and one of those things is, producing security updates for old no-longer-actively-maintained branches is proportionally more resource intensive than producing security updates for current and still-actively-maintained branches. Given the tendency of old branches to accumulate, at some point you have to have a cut-off date.
I say this as a network administrator who still has a number of Windows XP systems on the network at work, and not enough budget to replace them all in 2014. My current plan is to replace as many as possible of the remaining "front-line" Windows XP systems (i.e., the ones that are connected to the internet and directly used by ordinary users on a day-to-day basis). Non-internet-connected Windows XP systems will not be replaced in 2014, nor will ones used mainly by IT personnel, and a couple others might get converted to Debian wheezy (which runs better on old hardware than Seven -- we are not deploying Vista or Eight at this time). That'll only buy them an extra year or two, but it might allow our replacement hardware budget to stretch just far enough. Not every system is eligible to be considered for conversion to Debian, for various reasons, but it's a possibility for some of them.
Nonetheless, I don't begrudge Microsoft the privilege of discontinuing support for XP. You know when you deploy a new system that eventually it's going to be end-of-lined. If anything, we artificially shortened this timeframe for ourselves by choosing NOT to deploy any Windows XP systems until after SP2 came out. If I had it to do over again, I wouldn't change that.
(Calm down. It was a joke. We actually do know there's no felt, and we clean all the grime off the rollers, and we do it every couple of years. So all you germophobic neat freaks can just chill.)
It probably has little or nothing to do with the story in the book. In the first place, that would be typical for a Hollywood treatment of any book. Additionally, this particular book doesn't have enough story to fill out an entire 20-minute sitcom episode, let alone a feature film.
> Take the Lord of the Rings for example, I remember the language and style
> of the Fellowship in particular being awkward and simplistic
Tolkien may have used simple language, but he didn't spend a page and a half detailing the appearance of a particularly mundane shrub in the dullest words possible. Also, not all of his characters were strictly one-dimensional and remarkable primarily for their unexceptional ordinariness. LOTR had a detailed plot, as well.
As a movie, LOTR had exactly the opposite problem from Of Mice and Men: it was fundamentally impossible to cram the entire story into a series of three longer-than-average movies. Even if they'd gone with six movies (one per "book" instead of one per volume -- there are two books per volume), they still would have had to leave out a lot of the action.
Yeah, as an American, I've heard about it all my life. However...
> I was underwhelmed and to this day still do not understand what all the fuss is about.
Yeah, I think this is how most Americans who have actually attempted to read the book feel about it. It's one of those works that gets by on pure reputation: people don't want to publicly admit that they didn't like it, because then they would not seem intellectual, because everyone knows intellectuals all like the book. (Of Mice and Men has almost exactly the same reputation and is even more poorly written. The Scarlet Letter isn't very much better, and lest I pick exclusively on American authors, I'll throw War and Peace into the mix as well, though I suppose maybe it's better in the original Russian; I've only attempted to wade through it in English.)
We need somebody famous but with no pretensions (someone like a Letterman or a Foxworthy) to speak out in a voice that will be heard and tell everyone the obvious: the emperor is butt nekkid.
Please don't mistake me for saying that classic literature isn't good. There are a lot of classics that I really like. In fact, most of my favorite books are classics. Hamlet deserves its reputation. So does Tom Sawyer. To Kill a Mockingbird is pretty decent even just viewed as fiction, and furthermore can contribute significantly to understanding certain historical social issues. A Tale of Two Cities is if anything underrated. The Bible is grossly underrated. I'm not saying that classic literature in general isn't good. I'm only saying that certain specific works traditionally listed among the greats don't actually deserve to be included.
Lousy marketing, mentioned in the article summary, is one factor. Traditional vehicles are marketed extremely aggressively, with the result that people often have a significant emotional investment in the vehicle before they even find out how much it costs. I've yet to see or hear anything about electric-vehicle marketing that would make me think it compares.
Up-front price is another. When making a "big purchase", and especially when buying an item that will no longer be available in the same makes and models by the time they go to buy another one, people almost always weigh the nominal price tag MUCH more heavily than later maintenance costs. If you want to see the extreme end of this, you only have to look at the market for printers. Inkjets *own* the market, despite the undisputed fact that the TCO of laser printers is FAR lower if you print anywhere near the median household quantity of pages per month. But new laser printers cost quite a bit more than new inkjets, so everybody buys inkjets. I think they outnumber laser printers by something like twenty to one in domestic deployments. (In business environments, the margin is somewhat narrower, admittedly.)
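To put rough numbers on that (these are made-up round figures for illustration, not actual prices): say an inkjet costs $60 up front and $0.10 per page in ink, while a laser costs $180 up front and $0.03 per page in toner. At 100 pages a month over four years -- 4,800 pages -- the inkjet's total comes to $60 + $480 = $540, against the laser's $180 + $144 = $324. The laser wins handily on TCO, but the $60 machine is the one that leaves the store.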
Another factor is that electric vehicles were initially brought to market and heavily publicized significantly too early, when the technology was clearly not ready for prime time, resulting in a lot of rather unfavorable reviews and press. This kind of thing sticks in people's minds, and while the newer models are significantly improved, a lot of people still have the overall impression that electric vehicles are not very good, for reasons that, while they still have some truth to them, were undeniably much MORE true ten or fifteen years ago. (One of the best examples of this is the impression most people have that electric vehicles are impractical if you have to drive more than a few miles per day. The range is still not practical for long trips such as going on vacation, nor will it be soon; but many folks are under the impression that electric vehicles are impractical even for moderate commutes, which was true in the early nineties but is not so true today.)
There's no way to pack even remotely enough resources to last anywhere near a lifetime, and there aren't any meaningful resources to be found on Mars. If you can somehow manage to haul in enough solar collectors from Earth, it might be possible to keep yourself in air and water until the equipment breaks down, but food's going to be a serious problem, and you can just forget about anything complicated like medicine or the ability to repair the air-making equipment when it breaks.
CPAN is almost as good for upgrading (in some ways maybe better), but it lacks the ability to easily *downgrade* packages, which isn't a big deal for what it's used for but would be a significant deficiency in an OS distro package manager.
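To make the difference concrete, here's a minimal sketch. The package and distribution names are hypothetical placeholders, but the command forms are the real ones: with apt you just name the version you want and apt-get will downgrade to it, whereas CPAN's "install Some::Module" only ever fetches the latest release, so to get an older version you have to know and spell out the author and exact tarball yourself.

    # Debian: explicitly request the older version; apt-get will
    # warn about the downgrade and then do it.
    apt-get install somepackage=1.2.3-1

    # CPAN: no version syntax; you must give the full distribution path.
    cpan AUTHOR/Some-Module-1.23.tar.gz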
> I personally don't have much gripe against sudo
The gripe probably isn't with sudo as such so much as the way it's configured on Ubuntu by default.
In particular, on Debian you use the root password to do admin functions with sudo, whereas on Ubuntu you use your *own* password to gain root privs. I suspect this is what the other poster is complaining about.
Which way is better depends on the circumstances. For the systems I administer, as it happens, the Debian way is significantly preferable; but I can easily imagine multi-admin scenarios where the Ubuntu setup would result in better overall security and accountability. What's really needed, IMO, is some good documentation on how to decide which configuration is right for any given system (and how to make the change if necessary).
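For anyone who wants to switch between the two, the relevant knob is a single sudoers setting. A minimal sketch follows (the group name is Ubuntu's stock one; your /etc/sudoers will differ, and you should always edit it with visudo):

    # /etc/sudoers fragment
    %sudo   ALL=(ALL:ALL) ALL    # members of group "sudo" may run any command

    # By default sudo prompts for the invoking user's own password
    # (the Ubuntu-style behavior). Uncommenting this makes it prompt
    # for root's password instead, i.e. the Debian-style behavior
    # described above:
    # Defaults rootpw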
I assumed so.
> I haven't been to King's Island since it was Coney (and moved).
I was there when it was owned by Paramount, but I haven't been there since Cedar Fair took over. I imagine they've probably made improvements (and by "improvements" I primarily mean coasters), because that's how they roll. But I live in Galion, which is closer to Sandusky than it is to Cinci, so when we want to go to an amusement park we normally go to the Point. (As wimpy as Ohio may be in terms of spices, we're as hardcore as any place on earth when it comes to roller coasters.)
> I've been to the "bug house" at the zoo!
Yeah, the bug house was one of my favorite parts of that zoo too.
> spaghetti in it. That's Cincinnati style.
Oh, I've heard my dad talk about Cincinnati-style chili, but I've never had it. I've only been down to Cinci a handful of times (five or six maybe, all told), and chili was never high on my list of things to experience while there. (There *are* things worth going there for. King's Island isn't Cedar Point, but it's not chopped liver either; the Beast alone is worth the admission price, if you ride it about three times. They've also got an excellent zoo and some nice museums -- though one of the best ones is across the river in Kentucky -- among other things.)
> I've heard about "midwest spices". They called it "Spam"
> because it was Ham with Spices. Salt and pepper!
That "salt and pepper" thing is actually a misnomer. In fact, pepper is not widely used around here. Even sausage doesn't always contain pepper. My mom and one of my sisters categorically won't eat anything that contains it, period. (My other sister will eat sausage that contains a small amount of it, though, and Dad likes it in sausage and on eggs. Then there's me: I routinely use cayenne and have been known to cook with small amounts of habanero -- but I did get the idea to do so from anyone in Ohio.)
The most widely used spice in Ohio cuisine is almost certainly cinnamon, which shows up in about a third of all our non-chocolate desserts (and a lower percentage of the chocolate ones).
The other possible contender is onion powder, which is widely used in lieu of fresh onion.
Other spices that are more common here than pepper include garlic (frequently used with onion powder in meat dishes; but a lot of people don't like it, including my mom), cloves (usually with cinnamon in desserts; also used in pickling and occasionally with oranges), nutmeg (usually with cinnamon in desserts, but in tiny amounts), oregano (mainly used in pasta sauce and pizza sauce), and basil (sometimes added to the oregano, albeit in smaller amounts).
Oh, and I realize someone from outside the Midwest might not agree, but my mom considers imitation vanilla flavoring to be a spice. We use a LOT of that. Whole tablespoons of it, practically every day. I bet I've eaten (food containing) more vanilla flavoring this month alone than black pepper in my entire life, past and future.
> spaghetti in it, we thought she was trying to poison us!
I don't know about putting spaghetti in it (that doesn't sound _bad_, but it does sound rather _odd_), but my mom makes "chili" that does not have any seasoning in it other than salt and maybe a half teaspoon of onion powder. It's basically a hamburger broth soup with kidney beans and diced tomato. Why is it called "chili"? Well, I don't know. I guess the name was available, because nobody in her circle of acquaintance has any experience with the genuine article.
Traditional Midwestern homemade cuisine is, as a general rule, not real big on spices, especially the "hot" ones. Despite this, a lot of the food manages to be highly palatable. The canned fruits and jams are superb. The pasta is good, and the casseroles are generally not bad. The baked macaroni and cheese is great. I'd go through fire for a serving of mom's corned beef and apples. The desserts as a rule are fantastic, as long as you stick with ones made by people over age 40. (Young people keep getting the urge to make desserts out of trendy magazines, which tend to fall into one of three categories: either the recipe is so focused on being easy to make that it's completely lame, like making box-mix cupcakes in the microwave -- seriously, my sister does this -- or else it's so focused on being exotic and new that it's unaccountably bizarre, like Jell-O salad with cheddar cheese and beets and mayo in it, or else it's Yet Another Cream Cheese Thing. Skip all that. Always go for the desserts made by people who are using a recipe they got from an older person, such as their grandmother or an elderly woman from church. Those are always good.) Scalloped potatoes, sloppy joes, cornbread, muffins, biscuits, zucchini bread, pumpkin bread, deviled eggs, goulash (nothing at all like the Hungarian dish of the same name, but that's neither here nor there), baked beans, applesauce, pancakes, bacon and eggs, glazed carrots, cookies, cakes, pies, puddings, buckeyes: all very good.
But the chili... yeah, not so awesome.