So until the software (or hardware) necessary to make systems more secure improves a great deal, people won't use it. I can't say what the benchmark is for user tolerance / acceptance, but if I had to guess I'd say it was about 1 second of "automatic" activity, zero intellectual input and one simple mechanical movement. Implement that and you've probably invented computer security.
Social scientists will be able to understand and predict the interactions of people the way physicists understand and predict the interactions of objects
Since social scientists are completely unable to quantify anything in their field of study, I somehow doubt that they will ever come anywhere close to doing "real" science in the way that physicists can: with their "laws", measurements and equations.
But maybe this person doesn't really have much idea how proper scientists do their work?
As it is, in other parts of the world, the quality of degree is more of a door-opener than the subject - or the university. So you're sometimes better off getting a 2:1 from a less rigorous college than a 2:2 from a more prestigious establishment, since so many places only ask for an upper second or better and care little about the subject or where you studied.
You need to be very adaptable, so that you have a baseline skill set that allows you to be a call center operator today
If they want to approve every change, then just flood 'em with paperwork. 1 day spent automating your process should keep them busy for at least 6 months. Meanwhile you won't have any changes that have been approved, so you can get on with the interesting stuff.
Oh and if anything fails, dies, gets a virus (presumably security updates and virus scanner downloads count as changes) or lets the world and his/her dog steal your company's secrets then it's not your fault: the board hadn't approved the change you submitted weeks ago.
The good thing is that the change board are taking on responsibility for the changes. By approving them - provided you execute them exactly as described - they are to blame for any problems, as they gave the approval. Make sure you keep a paper trail and have a record of everything you do.
They will quickly tire of the burdensome, boring and ultimately futile work. So enjoy the honeymoon period. It won't last forever, but if you handle it properly, you can shed the blame for any problems for at least a year - even if the board disbands. The confusion and lack of clear indications of who should have approved what can be spun out for a long time - in the right hands.
Meantime, you will have plenty of opportunity to look for another job.
Any newspaper that does NOT carry a horoscope and limits sports coverage to a single page (2, tops) must have a sensible set of priorities. In addition, it takes the realistic view that pretty much everything of importance has a business or financial driver or consequence (though it does cover natural disasters and upheaval in non-financial terms, usually with a much more level-headed and unsensationalised tone, too).
The weekend FT, especially, is the closest I've ever seen to well-balanced, non-partisan, grown-up content (more in-context F-words and nudity than any other newspaper manages, but it all fits in with the mature nature of the writing) - closer than you'll find anywhere else.
And full-sized newspapers are so much better than tiny little tablets or even PC screens for getting the BIG picture
The Economist has always had a penchant for saying very little with the largest number of words.
I find that the Economist has a very high information density. Not just in its headline topic but in many other areas of journalism, too.
As for "half-truths and over simplifications", that's not my experience. Maybe you just don't understand a lot of the rather complex concepts and language that their professional and technically proficient writers use?
You do a disservice to everyone involved when you force your brightest people to take on additional roles.
This sounds as if the author is saying developers are brighter than other people. Well a few may be, but when you look at most of the dumbass bugs that appear (not to mention the spelling mistakes, tortured logic, crappy coding styles, and mistaken ideas of what constitutes "good") I really can't see that being the general case.
As it is, I feel that it does developers GOOD to get them into a position where they see apps and O/S's from the other side. After all, these were developed, too. So all you're asking the developers to do is see what the results of software development look like, rather than allowing them to live in an ivory-tower, isolated development world and then tossing their deliverables over a wall for other people to munge into something workable. If they don't like that, then maybe the problem is with their own craft - producing bad products - rather than with the operational work.
The issue is not that some open source software has a bug in it. We're all grown-up enough (I hope) to realise that NO software is ever perfect.
The only interesting point about this situation is how the Open Source world reacts to it and what processes get put in place to reduce the risk of a similar situation arising in the future.
We are bad at predicting the future because it cannot be predicted.
The gadgets that we think about as "the future" (actually only the future of technology - the broad-brush future of the planet is very easy to predict: we know how high the population will grow, when the maximum will be reached, where all those people will live and when they will die; barring disasters, natural or man-made, wars and pestilence, our future is easy to map) are totally subject to random decisions: which standard will be adopted, which advertisements will be used (and therefore the success or otherwise of an appliance), which bugs will be fixed and which ignored - making the difference between choosing product "A" or product "B" as the next big thing.
Since the next generation of gadgets is built on the one before - think of video games, an easily described lineage right back to "pong", or PCs - the random decisions made every couple of years compound those made before.
While those paths are easy to see in hindsight, guessing (and it IS only guesses, no talent required) which decisions will lead to the next generation of successful gadgets and form-factors is not possible.
You also have to remember that the cover (and all articles about "the future") are written for a contemporary audience. Therefore all the stuff mentioned or described has to be acceptable to those people. If the artist had just drawn a small plastic chip, it would have been meaningless. A floppy disc, although nobody who could ever claim to be a Byte reader would consider it viable, signposts the idea of miniature storage.
In that respect it was prescient.
Write as you wish, you're not bound by any rules
This was (maybe still is) the fashion in UK schools for a long, long time. So long, in fact, that the current generation of teachers were brought up this way. The idea being that correcting grammar and spelling mistakes would somehow "stunt" creativity - and that creativity was more important than, you know, being understood or communicating clearly.
Since the teachers were not taught that there was a correct way of writing, they cannot possibly pass on to the next generation a skill they never gained, themselves.
Downward spiral, anyone?
So I wonder if any of these people actually did any of the "cool projects" they claimed, or did they just pose around with their newly acquired status (or otherwise) symbols.
How does a 500 year data set apply to a 4.5 billion year old planet?
Extremely well, as it turns out. You don't need weather records going back to the dinosaurs to forecast tomorrow's weather; they would simply be irrelevant. All you need is enough information to establish a valid model for NOW and then use its predictive powers. The climate people have all of that and they've run the numbers. Guess what? It works.
So what if the model only holds for a few decades? That's long enough to forecast some rather disturbing possibilities. Ones that may (or may not - but that's a different issue) need some people, somewhere, to do something.
The scientists have done the science bit. It's now a political game to actually get people to do something. Questioning the science at this stage is a bit like questioning the properties of gravity - just because it may have been different 10 billion years ago.
So what do we have to give up to have a zero change in the global temperature?
Only one thing: having so many offspring.
The problem isn't that we have an excessive lifestyle. The problem is that there are TOO MANY of us having an excessive lifestyle. Get the population down to a billion or so and we can all have diesels, coal-fired power stations and as much beef as we could ever desire.
It's just that all 7 billion of us can't all do that at once.