Comment Re:Kitchen Knives (Score 1) 137

I've used Wüsthof, Henckels and Chicago Cutlery, and have settled on Chicago Cutlery as the most practical choice *for me*. There's no doubt that the more expensive knives are lighter, better balanced and more elegant, but the Chicago Cutlery knives work every bit as well for most people, and that includes very serious home cooks.

Why spend the extra money for a fancy knife made by laser-wielding German craftsmen? Well, I suppose if you spent eight or ten hours a day cooking like my Dad and older brothers did (I'm the only one who didn't go into the restaurant business), then the tiny advantage of a slightly nicer knife might add up over the course of a long shift in the kitchen.

That said, my Dad was a professional cook from the time he was twelve years old until he was 70, and he didn't use fancy knives. He had these ancient hunks of razor sharp steel, forged in some previous age of the world, that could bone a chicken faster than you could unzip a jacket. Nobody in his kitchen would be sissy enough to complain about fatigue from using a knife.

Comment Re:why don't we keep them and use them? (Score 1) 288

The job of the lawyer who works for you is to make you a discouraging target for the lawyers who work for someone else. It's also his job to tell you how much exposure you have to litigation, so unless he does a bad job (shame on you for choosing a bad lawyer), hostile lawyers are no excuse for mis-estimating the cost of future business operations.

The idea that Americans sue each other at the drop of a hat is a myth, unsupported by data. A mere 2% of people injured file lawsuits -- this even holds true for victims of negligent medical practices [citation: http://jama.jamanetwork.com/article.aspx?articleid=408791]. Tort litigation is less than 5% of civil caseloads; the vast majority of civil cases are contract disputes, and that's *definitely* an area where you pay your lawyer to keep you out of trouble. Since the lion's share of civil lawsuits is contract disputes, it'd be more fair to attribute the relatively high number of civil lawsuits to *business culture*, not *legal culture*. Lawyers don't make you sue; *you* decide to sue.

Comment Re:why don't we keep them and use them? (Score 4, Insightful) 288

A lawyer can't make you do anything. I once had a business partner who froze like a deer in headlights whenever our lawyer opened his mouth. As I said to him, the lawyer's job is to advise you of the trouble you might get into, but there's always *something* to be concerned about; it's *your* job to make a decision and shoulder the consequences. Business people choose which risks to take, and lawyers help them figure out what those risks are, simple as that. If your plans go kaplooie, it's your fault; possibly for hiring the wrong lawyer, or possibly for hiring the right lawyer but letting him run your business for you.

This "it's all the lawyer's fault" business is childish baloney. It's not lawyers that keep owners from continuing to use these old reactors, it's the fact that these reactors are old and obsolete. It's not lawyers that made decommissioning the plants more expensive than projected, it's that nobody had ever done such a thing when the costs were estimated, and everyone chose a best case scenario in their plans because they wanted to see the things built. That's a *business* mistake, and an engineering mistake, but unless the lawyer was telling them they'd be able to cart their waste off to the town dump it's not a *legal* mistake.

Comment Re:Star Wars, now with Lens Flare (Score 1) 325

Despite the annoying camera work, the thing about J.J. Abrams' Trek reboot is that he and his scriptwriters really demonstrate understanding of the characters. If you're a TOS fan you can name those moments from the first Abrams reboot movie that make you sit up with that thrill of recognition, even if you didn't care for the movie overall.

Trek is cerebral, even when it is being stupid. At its worst Trek is like Newt Gingrich, who someone once described as "a stupid person's idea of what a smart person sounds like." But occasionally Trek really is thought provoking. In this respect I think the second Abrams Trek movie is the more authentic *Trek* movie of the two, though both movies bolted a *Trek* ethos onto a blockbuster action movie.

Abrams was 12 when the franchise debuted, so it's virtually certain that Ep IV will be his touchstone. I expect we'll also see some of the action-movie set pieces, gimmicky camera shots and quick pacing of his Trek movies. Ep IV is a different kettle of fish from Trek; it's not cerebral, it's visceral, it's movement, it's surprising and arresting images. Most of all it's mythological, with characters quite literally concocted according to mythologist Joseph Campbell's guidelines for fairy tale characters. The thing about those archetypal characters is that, compelling as they can be, they don't have a lot of internal complexity. The best line in the original trilogy was ad libbed.

So I'm thinking Abrams the storyteller might be tempted to punch up the Star Wars characters a bit, to shade them so they feel a bit more human. If that's true, I think we might see Abrams' Star Wars movie approaching what he did in his Trek movies, but from the opposite direction.

Comment Re:Not Internet Connected (Score 1) 481

Actually, the idea of having an entire parallel infrastructure consisting of obsolete, unconnected machines is somewhat reassuring, even if the cost is exorbitant. After all, would you feel better if software upgrades, launch codes and targeting data were installed on the launch hardware with a Windows formatted USB flash drive?

Comment Re:Was FORTRAN really that hard? (Score 4, Informative) 224

I've actually programmed in Fortran and BASIC way back in the day (late 70s, early 80s). From a language point of view, early dialects of Fortran (e.g. Fortran IV, ca. 1961 and still in widespread use in the 70s) and BASIC were in fact *very* similar. What was different was that Fortran was *compiled* and BASIC was *interpreted*.

It was common until the mid 1970s for Fortran programmers to physically drop off a deck of punched cards at the operator's window. They'd get their results some hours later, if not the next day, after the operators got around to running the job. Most of the time those results wouldn't be the desired computation, but a compilation error. So to be productive in Fortran you had to think about your *entire* problem in advance, carefully preparing your deck to get as much as possible correct before handing the job off.

BASIC was an interpreted language initially. That meant you could type in little snippets of your program, even individual expressions, to see how or if they did what was expected. If you typed in a program and there was a syntax error, you'd know as soon as you hit "return". This allowed a more exploratory approach to programming and learning to program. Of course, you could get the *same* interactive experience in a much more sophisticated language by using Lisp.
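
To make that concrete, here's roughly what that exploratory loop feels like in a modern interpreted language. This is a short Python session used purely as a stand-in (the real thing would have been line-numbered BASIC at a teletype, not Python): every line is evaluated the moment you hit "return", so results and mistakes both come back in seconds rather than hours.

>>> def f_to_c(f):            # try out one small piece of the program
...     return (f - 32) * 5.0 / 9.0
...
>>> f_to_c(212)               # the answer comes back immediately
100.0
>>> f_to_c("212")             # and so does the mistake
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 2, in f_to_c
TypeError: unsupported operand type(s) for -: 'str' and 'int'

The interpreted BASICs of the era gave you the same kind of immediate feedback, minus the modern conveniences, and that's what made them so much friendlier to learn on than a batch Fortran job.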

I started programming in C in the 1980s, and this use-style distinction between compilation and interpretation remained. A full compile and link of our modest application took something like 30 minutes on the minicomputer we were using, which had a clock speed in the single-digit MHz range. So we prepared our source changes *very* carefully, and used our down time to read the full 8-volume Unix manual from cover to cover, over and over again. There was something to be said for such an approach to programming, but it was not for the faint-hearted.

By the 90s this had changed. Compilers were orders of magnitude faster; you'd actually hit "compile" and *watch* the compiler do its thing. A decade earlier that would have been like watching paint dry. Editors became syntax-aware too, and debuggers went from requiring mad voodoo skills to being interactive and usable by ordinary mortals. So now compilation vs. interpretation is a packaging and runtime issue; there's not much to choose between them with respect to how *hard* a language is to use. Naturally someone who cut their teeth in the modern era looks at BASIC and Fortran as they were in the 60s and wonders what the big deal was. But it *was* a big deal, at least for people who weren't going to learn Lisp.

Comment Re:Evolution has given humans the following: (Score 1) 499

Well, famine occurs when pests or weather cause a staple crop to fail. Preagricultural humans would adapt to an anomalously dry year or the emergence of a 17-year locust swarm by shifting to an alternate food source (e.g., locusts). Humans are both omnivores and apex predators, so we're very well adapted to extracting calories from even a distressed ecosystem. Obviously that won't help with something like the shift of the Sahara from grassland to desert, but well before that we'd pull out our evolutionary ace in the hole: our ability to migrate long distances.

So human famine is an agricultural phenomenon, and agriculture has existed for barely 5% of our species' existence: long enough perhaps to exert *some* evolutionary influence, but not the dramatic effects you posit.

Comment Re:Missing the point (Score 1) 399

You can get a pretty nice analog watch for $50-$100.

There's a certain satisfaction in something that does a limited number of things very well. For me the perfect thing for telling time is an analog "dive" style watch with tritium hands and markings and a high contrast face. You can always tell the time instantly without digging your phone out of your pocket or using two hands, and the bezel is frequently handy for timing stuff. Of course with the tritium you're talking closer to $200, but it's still not exclusive "rich guy" territory. Ironically, in the mid-to-low end of the market simpler is often more expensive, because simpler is better and some people will pay a premium for that.

That said, I've recently switched from my dive watch to a Pebble smartwatch. It looks like hell as far as I'm concerned, but it does two things really well: tell the date and time, and deliver notifications (calendar is mostly what I'm interested in). So despite its somewhat kiddie-toy look, the Pebble is elegant from a use standpoint. Most of the apps are redundant given that the Pebble is essentially slaved to the phone; with the exception of the MultiTimer app, nearly every app I've tried has a much better counterpart on the phone.

The limited usefulness of the Pebble is a good thing in my opinion. It means you can focus your device usage on information you need instantaneous access to. It also means the device can get by with an e-ink display, so you're OK if you forget to charge your watch for a couple of days. One of the hallmarks of making good design tradeoffs is that, relieved of some requirements (a high-resolution color display), you're free to do a better job on others (battery life, sunlight readability).

Apart from its looks, the Pebble itself is nearly perfect in my opinion. The companion app on the phone, on the other hand, leaves a lot to be desired. It's somewhat squirrelly, so it doesn't pass the "grandma test". But that doesn't matter for the "early adopters" buying these things now. As a whole the Pebble system works great for everyday tasks.

Comment Re:"State takes custody of teenage girl" (Score 1) 329

And as for facts, we have wildly conflicting opinions from two regularly-reputable sources: Tufts and BCH (who was referred by Tufts).

True, but the problem is that the parents went shopping for a diagnosis they'd previously settled on. So it's not just Tufts vs. Children's. It's Tufts vs. Children's and all the other doctors who gave the parents a diagnosis they didn't like.

This doesn't necessarily mean the parents are deliberate medical abusers. Diagnosis shopping *is* a red flag, but it's entirely possible they settled on "mitochondrial disease" because they have an older daughter who received that diagnosis and were familiar with some of the symptoms. This would also explain the form the younger daughter's somatoform disorder took. Given a daughter with somatoform disorder inspired by her older sister, and a disease with somewhat fuzzy diagnostic criteria, it wouldn't be hard for well-intentioned but stubborn parents to find a doctor who will give them the diagnosis they seek. Maybe even a very *good* doctor, particularly given that repeated rejections give them an opportunity to unconsciously tweak their presentation.

In the meanwhile, I side with the parents. Yes, I am biased because I am a parent myself.

I side with the hospital, and *I'm* a parent myself. And it's not because I'm a fan of authority, because I'm not. It's because I think there is no motive for Children's to act in bad faith here. This is a bad situation for them, and the easiest, most self-interested thing to do would have been to send the girl back to Tufts to be treated for a disease they didn't think she had. BCH may be *wrong* in their diagnosis, but I think the prima facie evidence tends to indicate that they're acting in good faith.

I *assume* the Tufts doctor is acting in good faith, but given that Children's has implicitly accused him of being duped into giving substandard care, he may be a little defensive. This is understandable, even though, if the patient's parents had been physician shopping, they'd have learned, consciously or not, how to be very convincing in obtaining the diagnosis they wanted.

As for the parents, their good faith is neither here nor there; it would not be probative on the question of their child's diagnosis. Crusading advocate parents making emotional (read "bad") decisions can be hard to tell from Munchausen-by-proxy parents. So for now I choose to believe that they're acting in good faith. There can be horrible situations which arise from *everyone* trying to do the right thing.
