Amtrak employees are NOT federal employees. Amtrak is a publicly subsidized private for-profit corporation with common stock held by four other railroad companies. The Federal government is an investor, but holds only preferred stock.
Well, speaking of Amtrak employee accountability, I have a story about that. A few years ago my family took a train ride across the country. When we changed trains in Chicago I noticed that the reading light in my sleeping compartment was stuck on, which of course was bad if I wanted to actually sleep. I found the friendly and helpful attendant and reported it, and her reaction was like watching a balloon deflate.
"What's wrong?" I asked.
"If we report damage they take it out of our wages," she said.
"What! What do you mean take it out of your wages?" I asked.
"If a car is damaged under my watch I have to pay for it," she said.
"Well," I said, taking out my swiss army knife, "I guess there's nothing to see here."
I have to say that I've never encountered such a nice, enthusiastic, friendly group of people with such abysmally low morale as the crew of a cross-country train. With passengers they're great, but all through the trip I'd see two or three of them congregated, having low, muttered conversations. It didn't take me long to figure out they were talking about management. And while the experience was wonderful, the equipment was in horrible shape. It was like traveling in a third world country.
With management that bad, more data doesn't equal more accountability and better performance. It means scapegoating.
If 99/100 scientists agree one thing is true, it's more likely to be true than the alternative backed by 1/100 scientists.
Which is beside the point. Consensus isn't about truth, it's about burden of proof.
Suppose Alice and Bob both try to make a perpetual motion machine. Alice claims she has failed, but Bob claims he has succeeded. The scientific community treats Alice's claims of failure without skepticism but it automatically assumes that Bob has made a mistake somewhere.
Does that seem unfair to Bob? Well, imagine you're a rich guy and Alice and Bob are both applying to you for a job. Bob says you should give the job to him because he's your long-lost fraternal twin, whom your parents never told you about and whose existence the hospital hushed up for some reason. When you mention this to Alice she freely admits she is not related to you. You automatically believe Alice, so is it fair to Bob to be skeptical of his claims?
It's a case of "extraordinary claims require extraordinary evidence." In either case Bob *can* prove his claim; it's just more complicated and time-consuming because he has to explain what went wrong with all the prior knowledge. Alice's claims, in either case, are consistent with what you reasonably believe to be true, so you can reasonably assume she's correct.
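The asymmetry between Alice and Bob falls straight out of Bayes' rule. Here's a minimal sketch with made-up numbers (the priors and evidence strengths are purely illustrative assumptions, not measurements): even when Bob's demonstration is just as convincing as Alice's report, his tiny prior leaves his claim almost certainly wrong.

```python
def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """P(claim | evidence) via Bayes' rule."""
    num = p_evidence_if_true * prior
    den = num + p_evidence_if_false * (1 - prior)
    return num / den

# Alice claims failure: her claim is consistent with a prior near certainty
# (perpetual motion machines don't work), so her report is readily believed.
alice = posterior(prior=0.999999, p_evidence_if_true=0.9, p_evidence_if_false=0.1)

# Bob claims success: give his claim a one-in-a-million prior, and grant his
# demo the same evidential quality (90% convincing, 10% false-positive rate).
bob = posterior(prior=0.000001, p_evidence_if_true=0.9, p_evidence_if_false=0.1)

print(f"P(Alice failed | her report) = {alice:.6f}")
print(f"P(Bob succeeded | his demo)  = {bob:.6f}")  # still tiny
```

Same evidence, wildly different posteriors: that's all "extraordinary claims require extraordinary evidence" means. To be believed, Bob needs evidence with a far lower false-positive rate than Alice does.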
...when we replaced the scientific method with scientific consensus?
Er, no. That's like positing science going off the rails because it replaced instrumentation with data.
As ShanghaiBill says, bats aren't rodents. I'll just add that bats and rodents are about as taxonomically unrelated as two mammals can possibly be.
Bats are more closely related to horses, bears, rhinos, even whales -- they're members of the huge and diverse superorder Laurasiatheria. Rodents are in the other great superorder, Euarchontoglires, the only living members of which are: rodents, rabbits, hares, pikas, tree shrews, flying lemurs, and the various primates.
Pressure cookers have actually made a comeback among foodies. The difference from grandma's pressure cooking style is that times for anything but pot roast are *extremely* short. For example if you're cooking broccoli it's done after two minutes at pressure. Grandma would have kept the broccoli in the pressure cooker for five minutes and removed it as a pale gelatinous goo.
A pressure cooker is a good acquisition when you're setting up a kitchen because even though you might use it under pressure only a couple of times a month, if you don't lock down the lid what you have is just a nice, heavy pot. Slow-cooked is still the way to go for chili, but if you don't have eight hours you can get passable results in well under an hour with a pressure cooker.
I believe the metaphor you're looking for is "re-arranging the deck chairs on the Titanic".
I wouldn't call it a metaphor, nor would I say that Asimov's point is that you can't codify morality. His point is more subtle: a code of morality, even a simple one, doesn't necessarily imply what we think it does. It's a very rabbinical kind of point.
I nominated "dingleberry".
Spin, sure, but it's a way bigger minority than I expected. I'd even say shockingly large.
The genius of Asimov's three laws is that he started by laying out rules that on the face of it rule out the old "robot run amok" stories. He then would write, if not a "run amok" story, one where the implications aren't what you'd expect. I think the implications of an AI that surpasses natural human intelligence are beyond human intelligence to predict, even if we attempt to build strict rules into that AI.
One thing I do believe is that such a development would fundamentally alter human society, provided that the AI was comparably versatile to human intelligence. It's no big deal if an AI is smarter than people at chess; if it's smarter than people at everyday things, plus engineering, business, art and literature, then people will have to reassess the value of human life. Or maybe ask the AI what would give their lives meaning.
Dear moderators: "Troll" is not a synonym for "I disagree with this".
That said, I disagree with this.
We've known since the investigation of 9/11 that suicide bombers are not necessarily dead-enders except in the literal sense. Economic powerlessness might play a role in the political phenomenon of extremist violence, but it is not a necessary element of the profile of a professional extremist. These people often come from privileged backgrounds and display average to above average job aptitude.
Mohammed Atta's life story makes interesting reading. He was born to privileged parents; at the insistence of his emotionally distant father he wasn't allowed to socialize with other kids his age, and had a lifelong difficulty with relating to his peers. At university he did OK but below the high expectations of his parents. He went to graduate school in urban planning where his thesis was on how impersonal modern high rise buildings ruined the historic old neighborhoods of the Muslim world.
That much is factual; as to why he became an extremist while countless others like him did not, we can only speculate. I imagine that once he decided modernity was the source of his personal dissatisfactions Al Qaeda would be attractive to him. Al Qaeda training provided structure which made interacting with his new "peers" easier than ever before. And martyrdom promised relief from the dissatisfactions of a life spent conscious of his own mediocrity. Altogether he was a miserable and twisted man -- but not economically miserable.
And that point is encapsulated in a single adverb: still. "Still" is what makes this news; it wouldn't have been news twenty or thirty years ago.
I am old enough to remember when genital equipment was considered employment destiny. When my wife went to oceanography graduate school, the sysadmins of the school minicomputers were all female. The all-male faculty called them -- I kid you not -- "Data Dollies". Data dolly was considered a good job for a technically inclined woman because it paid well for an entry-level job, involved computers, and was an easy job to hand off when you quit to marry the professor you'd snagged. Plus they'd have a hard time getting work in industry. Clearly that was a transitional moment, because there was a substantial minority of women among the graduate students in the program, but *no* female professors, much less senior administrators.
But given the strong cohort of women in that class, it is surprising that thirty years later there is still a lingering perception in this country that science isn't for women. But maybe it shouldn't be surprising. Change doesn't happen instantaneously, nor does it necessarily ever become complete. When I was in college the notion that women had to become full-time homemakers was still predominant -- not among students, but among people over thirty or so, practically everyone in positions of hiring and authority. That attitude seems weird and foreign to a young person today; I expect it's hard for a young person to grasp how pervasive and indeed how genuinely oppressive that belief was. It's a bit like the difference between the way I experience watching Mad Men and the way my kids do. I actually *recognize* that world where smoking was everywhere, big shots drank during office hours, and "womanizing" was a word people actually used without irony. It was fading fast, but still there. To my kids it's like an alien civilization in Doctor Who. So yes, the news that many Americans see science as a profession that somehow belongs to men is a bit like discovering a Silurian in the closet.
The women of my generation fought hard to establish a beachhead in male-dominated professions, and if they're sometimes a bit snippy about it, well, they earned the right. It wasn't easy to be an oddball among your peers and a freak to your parents, teachers, and people in authority generally. And this was at a time when there was no such thing as geek chic to offset the disadvantages of being an oddball. Being a geek was bad, period.
Now that cadre of pioneering women is at or approaching the apex of their careers. They're still a minority in their age cohort, but they left a wide-open hole in their wake for the next generation. It's taken a while for that hole to fill up, because when opportunities open for a group, they go first for the more high-profile professions (47% of medical students are women, as are 48% of law students). But in another generation I am sure the view that science belongs to one sex or another will be a truly fringe belief.
"Accidentally" isn't certain here. If I was part of something that was wrong and I wanted it to be known, I would very well "accidentally" leak it too.
Except I don't see how that applies in this case. Stay or leave -- it's not the bank's call. But if politicians are putting leaving the EU on the table, even as an empty gesture, then naturally the bank has to start thinking about contingency plans. That's just common sense, even if you think the very idea of leaving the EU is mad.
It's also common sense to keep that on the DL to prevent misguided overreaction to what is, after all, still a hypothetical scenario. The Bank of England is a central bank, so people are constantly scrutinizing it, hoping to glean inside information on future monetary policy. That's to say nothing of having to deal with the conspiracy theory nutters.
Well, with electronic toll-paying that could work, but it would still shift the burden from low MPG to high MPG cars.
The great thing about a gas tax is that it's a simple way to kill two birds with one stone: encouraging higher mileage and paying for infrastructure. The problem is that not everyone agrees that both birds are important. Two-birders think that low-mileage vehicles should be discouraged because of externalized costs -- pollution mainly, but also space required in parking lots, greater risk to other road users, etc. One-birders don't care about externalities but understand that the roads and bridges need to be repaired. Zero-birders are just idiots.
I'm a two-birder myself, so raising the gas tax is a no-brainer. I'd also issue everyone a flat rebate per driver, because in fact I'm a three-birder: I'm concerned about the effect of a regressive tax on the working poor who have no options but to drive to their jobs.
But I'm also a realist. There are a lot of one-birders out there and the roads need repair. It's also politically easier in one-birder territory to sell something as a fee rather than as a tax, even though from my perspective that's an irrelevant difference if you're raising the same revenue either way.
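The flat-rebate idea above is easy to sketch with numbers. This is an illustrative back-of-the-envelope calculation, not a policy proposal: the tax rate and the gallons-per-year figures are made-up assumptions. The point is that a revenue-neutral per-driver rebate makes the average driver whole, leaves the frugal commuter ahead, and still charges heavy users for their extra consumption.

```python
# Hypothetical added gas tax, $/gallon (illustrative assumption)
GAS_TAX_PER_GALLON = 0.50

def net_cost(gallons_per_year, rebate):
    """Annual gas tax paid minus the flat per-driver rebate."""
    return gallons_per_year * GAS_TAX_PER_GALLON - rebate

# Assume the average driver burns 500 gallons/year; a revenue-neutral
# flat rebate is then whatever the average driver pays in: 500 * $0.50.
rebate = 500 * GAS_TAX_PER_GALLON

print(net_cost(300, rebate))   # frugal commuter: comes out ahead (negative cost)
print(net_cost(500, rebate))   # average driver: breaks even
print(net_cost(1200, rebate))  # heavy user: pays the difference
```

The incentive to burn less gas survives the rebate because the rebate is flat: every marginal gallon still costs the full tax, regardless of what you get back.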
Well, they're already opting to have damaged natural joints like hips and knees replaced. That's a case of upgrading from natural to artificial to gain function. As the performance of artificial limbs increases, limb replacement might become an increasingly commonplace treatment for older people, just like knee or hip replacement.
If we project that trend forward for twenty or thirty years, I wouldn't be surprised at all to see artificial legs that outperform natural legs for the purposes of walking or even running. But I don't think people with normal abilities will be trading in their limbs just to be able to walk a little longer, run a little faster, or carry more weight. That won't happen until the replacement is subjectively indistinguishable from the real thing; until you can feel the grass under your toes.
I'm comfortable predicting locomotion parity in the next fifty years, but I wouldn't care to speculate on when we'll see sensory parity.