IMO (and I'm older, so may be biased) I want older programmers. I was talking to a young guy where I work who had written his own code-generating ORM. "Why don't you use Entity Framework or NHibernate?" "Because I wanted to build one."
And that's a young programmer's attitude. To some extent, in the days of mainframes, building cool stuff inside a company was a good thing, because you had no other option. But in the days when you can just download something open source that someone else has built, wire it in and test it, or maybe buy something for £100, it makes no sense. We know about things like technical debt, which young guys don't: you want to write as little code as you can to solve the problem.
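The "write as little code as you can" point can be sketched in a few lines against an existing library. Entity Framework and NHibernate are .NET libraries, so this illustration uses Python's standard-library sqlite3 instead; the table and names are invented for the example:

```python
# Persisting and querying data by wiring in an existing library,
# rather than writing and maintaining a hand-rolled persistence layer.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("Alice",))
rows = conn.execute("SELECT name FROM users").fetchall()
print(rows)
```

Five lines of glue code, and every one of them is code someone else has already debugged and documented; the homemade equivalent is code you have to maintain forever.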
When you get into the "real world" use of tablets, not Jony Ive lounging with one in an ad but actually using one, you realise that they're a pretty bad idea. The positioning is all wrong. How do you watch a two-hour movie on one? Hold it up, or look downwards. Neither is as comfortable as watching a movie on a TV, or even a laptop, where the screen is supported. Touch screens might be new, but they're not progress over more accurate and faster keyboards and mice.
A tablet offers less than a laptop, with few upsides. OK, they're smaller and lighter, but not so much smaller as to give any real advantage. Anywhere that you can take an iPad, you can take a laptop.
This is what I can never get about this "great media consumption device". At best, it does a lot of different media consumption quite badly, meaning that you still can't replace all the dedicated devices.
It'll play MP3s, but you can't exactly use it down the gym. It'll play movies, but a 10" screen with no stand isn't as good as a laptop, let alone a 32" TV. It'll give you eBooks, but a Kindle is far easier to read.
A laptop is actually a better "media consumption device". Larger screen, self-supporting. Every single service that can give you content on an iPad is also available on a PC, and a PC allows you to watch a load of internet media that you can't get on an iPad. You can watch a Blu-Ray or DVD. You can plug in your movie library via USB. You have far more storage space. If you want to put it on a TV, you need nothing more than an HDMI cable.
The problem is that until someone makes the design more productive than a laptop, it won't be as good. That's not just about the keyboard, but also that the laptop is designed to give you a vertical, self-supporting screen. An iPad may be smaller and lighter than a laptop, but it really isn't any more portable. Anywhere that you can take an iPad, you can also take a laptop.
They're not even better as a "media consumption device". A 10" 4:3 device with no stand is better than my 14" 16:9 laptop? Does it support as many formats? Can it store as many movies? Does it have an HDMI connector built in? If I use LoveFilm, can I watch it on my TV, or does the app block that?
Look at Slashdot readers, who you would think would be on the vanguard of this technological shift. Instead they are some of the clingiest whiniest buggy-whip holdingist resistors of change to be found, simply because post-PC devices cannot yet replace high-end CAD workstations or some other such uber-specialized nonsense that do not matter to the general trend.
No, we're technology skeptics. Show us why these are better than what we had before and we'll use them. And we're about to be proven right about the "bonnet welded shut" thinking, as the iPad 1 is about to stop getting upgrades. It's likely to be a redundant device two years after it was sold.
I give it a year, maybe two, before the 3D thing dies out. The studios liked hyping it because you couldn't see it on TV; the theaters liked it because they could charge a premium. But the fact is that audiences have got bored with it. They're opting for 2D.
As I understand it, there's plenty of evidence for a warming trend; in that sense, climate change is a fact. The acrimonious debate (for people with enough mental capacity to get past a knee-jerk reaction) revolves around two questions: 1) whether or not it is caused by human activity, and 2) whether it in fact represents a continuing trend, and therefore a crisis for humanity. Neither point has been proved definitively, but many minds much more knowledgeable about the facts than I seem to think so. Unfortunately, this doesn't really seem like a provable proposition: given the complexity of the environment, one might as well try to prove that String Theory is correct. I support and admire the scientists who struggle to understand, explain, or prove either String Theory or climate science.
Some of the global warming is man-made. You drive a car, it burns fossil fuels and produces CO2, and the planet warms. You won't find many skeptics doubting that.
The problem is all the stuff about feedback effects and just how much of a problem they are. Over the past 15 years, the warming we actually saw was not within the range the models predicted. I know scientists working in other fields who say that the methodological approaches, and the attitude to open presentation of results, are poor and would not be tolerated in their field of science.
HR are mostly about covering ass and are fine for checkout/call centre staff. Beyond that, they're a pain in the ass.
I had a manager who had a huge fight because HR wouldn't hire IT guys without degrees: they'd screen out people with 10-15 years' experience.
Seriously, I can only hope that the people at UEA DIAF. A week ago a fireman was suspended for having a joint, because their technology made it easier to drug test him.
If you're working to support the war on drugs, you're a money-grabbing fascist. Go and research how to make drugs that can't be detected by sniffer dogs, make the law a farce, and we might see the laws change.
Maybe Computer Science should be in the College of Theology. -- R. S. Barton