People with bluetooth earbuds are already walking around like schizophrenia patients talking to themselves.
I predict that non-contact hand motion detection will be made practical soon. 3D cameras will "read" hand gestures so that you are essentially typing in the air. Everyone will look like magicians, waving gestures at their watch or gizmo.
Past paid experience with obscure and legacy languages saved my arse after the dot-com crash dumped tons of coders back into circulation on the west coast. I got no offers until I down-played my web experience and highlighted pre-web experience and languages. You never know what experience will come in handy. It's similar to investing advice: diversify your portfolio.
I've been around a while, and I agree there are what can be called "elite programmers", but with a caveat. Such people are "masters of code", but generally they are not very good communicators, and work on projects and niches that require a high degree of accuracy, fastidiousness, and debugging skill. They often work on systems software, such as OS's, database engines, network control software, weapons control systems, etc., and are usually paid quite well.
But they don't do so well when the requirements are fuzzy or change often. They don't handle ambiguity well.
Of course, this is probably an oversimplification, and possibly a feedback loop: someone who is highly "code oriented" will tend to be given yet more code-centric projects, away from fuzzy office politics and fickle customers, so that they never develop their "fuzzy" analytical skills. Thus, they are not necessarily inherently "bad" at dealing with Dilbertian chaos; they just lack experience with it because they went into a field or niche that is more technology-focused, which keeps them away from office politics and goofy users.
It's hard to master both human nature AND machines in one lifetime. Those who focus on a mix of both probably land somewhere in the middle on each skill, and this is where the majority of developers and developer/analysts are.
And of course there are exceptions to the rule: some master both, but those are rare in my experience. (Those who believe they have mastered both are common, but egos tend not to be accurate to their owners.)
When they say that, I sometimes reply, "Yes, one CAN often win Russian Roulette, actually. But that doesn't mean it's a good game to play." Communicating risk is a tricky balancing act, though. Generally if somebody wants to be a jerk, they can and do spin it either way.
I may end up replying, "since you often seem to disagree with our risk assessment, how about we schedule a meeting to discuss it?" They often then pipe down because usually such mouthy people don't have the solid reasoning to plead their case: they just play sound-bite politics, but are fish out of water in detail-land.
There are also some careers where nobody wants a mediocre practitioner. When one's freedom is on the line, nobody wants a mediocre lawyer; when one's life is on the line, nobody wants a mediocre doctor. So why should it be any different for programmers? Some programming does have lives on the line: software in cars, planes, nuclear reactors, or Therac-25-style radiation machines. Some has people's or companies' finances on the line: software in banking or stock trading.
There are also some careers where you simply can't succeed at being mediocre, for example any kind of research scientist: if you don't publish good work (and have the kind of innate ability that enables you to do good work so you can publish), you simply won't succeed. How do we know whether programming is the kind of job where one can be mediocre and still succeed?
I've interviewed lots of candidates, many of whom claim N years of experience in language X. I'm often stunned at how much many don't know -- stuff that anybody who completes a CS 101, algorithms, or data structures course should know. Is that mediocre?
That study is full of ships!
When a company wants to do something risky, I try to make sure I practice C.Y.A. with a well-CC'd email with wording similar to, "I believe it's notably risky to do X. I highly recommend against it. A lower-risk alternative is to do Y."
Management can go ahead and choose X if they want, but at least you've documented that it's against your recommendations.
Some people simply enjoy blaming and pointing fingers, and will jump at the chance to do so. (Sometimes there's also sticky politics behind it that a techie isn't made aware of.)
And don't expect outright apologies. Many people really hate to admit they are wrong. Humans are just that way. The best you can hope for is that they respect your opinion more in the future because your prediction turned out correct and theirs flopped.
Carnegie's "How to Win Friends and Influence People" is a great book on office relationships and human nature, even if it's a bit disturbing in places. I highly recommend all geeks read it. It should be required reading in college.
Well, we have an ex-Prime-Minister who paints puppies, goats, and feet sticking out of bath-tubs!
Probably. It's difficult to project such in an interview and I'm not very good at faking not being nervous. Most geeks are probably more like Jamie Hyneman than Adam Savage. "Oh goody, a beta DLL, watch this mamma smoke! Heee he he...!"
I knew the risk of startups at the time and was willing to accept it then. I was trying to transition off of desktop application dev, expecting it to be a shrinking field, and knew I'd probably have to eat some salary for a year or two in exchange for web-oriented experience.
There is a middle ground between groundbreaking and dud. We could learn something new about the interaction of fairly well-known forces, for example, even though it won't provide anything of significance in space due to some yet-to-be-found limit. Or it could turn out to be an inadvertent testing snafu that will make future testers smarter for having learned this hard-won lesson.
If I had to guesstimate the probabilities right now, I'd go with:
10%: Revolutionary breakthrough
50%: Somewhat interesting lesson with only incremental practical value for new technology or testing methods
40%: Dud or scam
Representative Louie Gohmert (R-TX) is worried that scientists employed by the U.S. government have been running roughshod over the rights of Americans in pursuit of their personal political goals...
And politicians, corporations, and the wealthy have NOT?
Let's not have a double standard here. If we are going to hunt down bias, hunt down ALL bias.
We use standard units around here, none of that "furlongs per fortnight" crap.
By political standards? "Hillargrams"
Apptly named. Even better than "fappler".