Who says he has knowledge of business? Just because he's 40-something? And a PMI certification? For the love of God. Here comes another useless PM.
Please do not move to management simply because you're "of a certain age" and haven't kept up with technology. The last thing IT folks need is a manager who became a manager for the wrong reasons. "I can't find any other job" is a wrong reason.
I am a 44-year-old Perl developer (turned manager). But I haven't worked at a Perl shop in about 4 years. At my last full-time Perl job, I became a manager (because the team needed a new manager, and I felt I was ready to do it), and then from there I moved to a management job at a
1. That there are some useful things in the
2. That I prefer not to work in a
From there, I joined a small PHP shop as their CTO. PHP is a bit frustrating for this particular Perl guy, but fortunately I don't have to write much of it. However, we are small and I do write a certain amount of code. I actually use Perl for a lot of the stuff that I write. I hired another guy who writes some Ruby. I also dabble in node.js and some other stuff. It's an amazing time. In fact, it has BEEN an amazing time for our entire careers. Have you done anything with AWS? It completely changed the game. Web services in general? Back to Perl: Moose? Dancer? cpanm?
You are not a dinosaur because of your age. You are not a dinosaur because of your programming language. You are a dinosaur because you have let the world pass you by. Get busy learning: Node, Redis, PHP, Python, AWS. Pick one that sounds interesting and learn to do some cool new tricks. Nobody is stopping you.
While that's a good point, it would surprise me if that much thought goes into it. I assume it's simply a matter of zero-tolerance HR policies for all employees. That's the easiest and least risky thing for them to do, whether it's the right thing to do or not.
I'm an IT manager. The right thing to do is to treat every hire individually, and to examine the circumstances of every case. Not everybody's career prospects should be ruined forever, based on youthful mistakes (I made a youthful mistake or two myself).
I have benefitted from changing jobs every few years, but I never switch jobs after only a year. I'll stay at pretty much any job for a minimum of a year, and I try to stay everywhere 2 years. My career so far has been 1 year, 2 years, 3 years, 6 years, 3.5 years, and 2.5 years. I felt like I stayed at the 6-year job too long. It wouldn't have been too long if I had grown in the right way while I was there, but I didn't. I kind of plateaued and stayed because it was comfortable.
And that is the reason why I think switching companies every few years is essential, especially in IT. There are just so many things you will never learn if you don't work at different companies, even if the things you learn are simply how jacked up and ridiculous some companies are. I've been at startups, Fortune 100 companies, and companies in between. I highly recommend giving each company your full commitment for at least a couple of years, but always be willing to move on when you're no longer growing your skills or doing interesting things.
See, this is the attitude. He says his job is to make sure his employees go home every day. He's wrong. While nobody should be reckless with the lives of law enforcement, and it is a great tragedy when the lives of law enforcers are lost in the line of duty, it is their job to protect and to serve the public. It is their job, from time to time, to be injured and/or killed in the act of protecting and serving the community. That is the job. It is voluntary. It is not for the faint of heart, or for the cowardly. It is not, in short, for me.
When the number one priority of law enforcement becomes protecting the life and safety of officers, above all else, then the public suffers. And the lives and safety of citizens have been sacrificed on numerous occasions in the name of protecting law enforcement.
I am not anti-law-enforcement. But I am deeply concerned about their low tolerance for risk.
I manage development teams. First of all, like it or not,
OK, back to the other question: Do you have to be a 7 or higher? Heck no. I find that there's very little "computer science" in what we do in a lot of professional software development. It's a lot of CRUD and knowing how to put various things together (REST plus AJAX plus responsive plus ORM plus third party integration plus memcached plus S3, etc, etc). You don't have to be a brilliant programmer for that. You have to be able to learn quickly, retain knowledge (at least enough to remember what you Googled to do it again), be a good communicator, and be a really good team player.
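To make the "it's mostly CRUD" point concrete, here is a toy sketch of the pattern (the in-memory dict and function names are invented for illustration; real projects swap the dict for a database behind an ORM):

```python
# A toy illustration of CRUD: create/read/update/delete against a store.
# In a real app the dict would be a database table accessed via an ORM.
store = {}

def create(item_id, data):
    store[item_id] = dict(data)
    return store[item_id]

def read(item_id):
    return store.get(item_id)

def update(item_id, changes):
    if item_id in store:
        store[item_id].update(changes)
    return store.get(item_id)

def delete(item_id):
    return store.pop(item_id, None)
```

Gluing operations like these to a REST endpoint and a front end is the bulk of the work described above; none of it requires deep computer science.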
In summary, I would gladly take a 4 with potential and a great attitude, over a 9 who is a dick. I would do that over and over again. You just need to make sure that you have a few 7s around, but not too many.
Take this for what it's worth, but I find it relevant. I, too, laughed at Microsoft's Surface tablet, and its dismal sales figures. I was the happy owner of a personal iPad and a work iPad, and I figured that was what the tablet experience was, and what it needed to be.
I was wrong. At least for me, I was wrong.
I have the strange situation of working at a Microsoft shop, as a manager of a development team (though I am an open source guy, by background). The local Microsoft sales office invited several of us to an Azure educational session. Part sales, part hands-on training. I am a big user of Amazon Web Services, and my team uses it as well. So I figured instead of just being an Azure hater, I would come along and do the hands-on walk-through, because it's some dedicated time for some busy people to really dive in and see what all the fuss is about.
But this comment is not about Azure. Azure was fine. I'm still an AWS fan, and Azure was just similar enough, and just different enough, that I'm not chomping at the bit to start using lots of Azure.
But here's the thing: They had a door prize. And I won it. It was a Surface with Windows RT. And I love it.
Would I love it as much if it had not been free? We will never know. But all I can tell you is my story.
One might argue that it's stupid for this thing to have a "Desktop mode." But it also has a USB port (shocker!) and supports Bluetooth. I already have a Bluetooth keyboard for my iPad (which I never use), and a USB mouse that I use for my laptop. I put my Surface on its kickstand, pair the keyboard, and plug in the mouse dongle, and the dang thing is transformed. It's a little laptop. It has Office installed. It has the full Windows version of IE. Yes, I hate IE, and it sucks that there's no Chrome or Firefox for this platform. But the point is that, in a jam, this makes a damn fine little PC.
And then it's NOT a PC. You pick it back up and it's a tablet. And a NICE tablet. The apps are nice. The Reddit app is my favorite Reddit experience, in fact. Netflix, Facebook, etc.
Oh, and the best part... the Surface team designed it to be used with mouse, or a finger. It behaves differently in each case. And it behaves generally the way you expect (want) it to.
So, call me a fanboy. I did not expect to really like it. But I do. I really like it. Anybody slamming Windows 8, and particularly Windows 8 on a tablet, isn't really giving it a chance. It's really nice.
I'm a big fan of 3D printing, I am. And at the beginning of this video, I just thought this was the coolest thing. Well, except for a couple of things.
Tape measures are widely available and inexpensive.
This one is REALLY short (just over 4 feet), yet it was comparable in size to a standard 25-foot tape measure.
Worst of all, it's not accurate. It's off by a 16th of an inch at its maximum length, and it would only get more and more inaccurate as the length increases.
Other than that, it's perfect.
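For scale, the inaccuracy described above works out to roughly a tenth of a percent (numbers assumed from the comment: about 1/16" of error at the ~4 ft, i.e. 48", maximum length):

```python
# Rough arithmetic on the accuracy complaint: ~1/16" of error at the
# printed tape's ~48" maximum length.
error_in = 1 / 16          # observed error, in inches
max_length_in = 48         # maximum measurable length, in inches
relative_error = error_in / max_length_in
print(f"relative error: {relative_error:.4%}")  # prints "relative error: 0.1302%"
```

Small in relative terms, but a 16th of an inch is plenty to ruin a woodworking cut, which is the point.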
Computer Science has its place, certainly, but it's not in every IT shop in America. I've been giving this a lot of thought lately: How do you take those unemployed and underemployed people, whose jobs have basically disappeared and are never coming back... and intersect SOME of those people (not all of them will be able to do it) with the enormous shortage of talented and capable IT people?
I've come to almost accept, over the last couple of years, that there's such an insatiable demand for IT, and such a shortage of competent IT people, that it's just a reality that we're going to have lots and lots of crappy people in IT, and there's nothing that can be done about it.
But I'm having difficulty completely accepting that. Because I know that the skills that you need to be good at solving technology problems are not extraordinary. I just barely started college, and then quit to join the Air Force. Five years later, I got into the web business (in 1996) and I've had a great career for 18 years. I recently decided to finish my degree, but that's a different story.
The point is: I'm not a computer scientist. There have been a few times in my career when I would have benefited from a CS degree, but not many. Mostly, what I have needed is intelligence, verbal and written communication skills, the ability to quickly learn new things, a passionate interest in technology, the three Larry Wall traits (laziness, impatience and hubris), and an understanding of how users think and act. Editorial skill has not hurt me, and neither has graphic design skill.
While I would be really interested in helping to build an educational program, one problem I have is that I'm self-taught, and therefore don't really know how you're supposed to teach this stuff. But I would love to be part of a workshop where industry folks come together for a week and brainstorm on this topic, or something.
My big sticking point is this: I honestly believe that the one non-negotiable requirement for being a good technologist is intelligence. And this seems to be controversial, because it makes it sound like I'm calling other people stupid. And, well, I am. I really wrestle with this. I wonder how good a web developer you can be if you're not quite smart.
I've heard about that awful EHR (Electronic Health Record) integration effort between the Veterans Administration (VA) and the Department of Defense (DoD) for years. It's a failure of a lot of things, but if open source is even on the list of those things, it's low on the list. At the top of the list are dotted lines and bureaucracy, of course. Heck, IT projects often go off the rails, particularly big expensive ones. Let alone one done for the DoD. And of course, it's not just the DoD, it's also an inter-department collaboration. Doomed to failure, unless it's managed excellently.
It appears that one big reason that this integration project is so hard is because the VA can't compete when it comes to process and bureaucracy. They don't have nearly as large a budget. This quote is telling:
"The iEHR demise was expected by all, accordingly," one VA source said. DOD officials "outspend, outtalk and outlast us at every engagement. We try to emulate much of their process-based decision-making as if we could afford to. We can't. The overhead is crippling, and we are not funded equivalently."
It pains me to see any IT project get out of control and ultimately fail. I hate it even more when it's the government. As a veteran, I especially hate to see this one. And as an open source user, contributor and advocate, Oracle blaming that massive failure on open source adds insult to injury.
Not only is this reduction in redundant staff probably appropriate, but this is one of the rare situations in which "synergy" is used in a non-lame, non-stupid way.
When two companies merge, the hope is that the two joined as one company will be more effective than they were when working together as separate companies. Synergy is a reasonable word to describe that.
But unsurprisingly, that synergy does not always happen. You're combining two companies, with two different cultures, perhaps incompatible systems, perhaps conflicting ideals. And you're certainly going to have some redundancy. For example, in each company, you probably have one person ultimately in charge of technology. Now you have two, and have to work that out. You also have one person ultimately in charge of smaller things: The phone system, for example. Now you have two. These things have to be worked out.
A successful merger in which some people don't get laid off would be very surprising.
I've been using vi/vim for almost 20 years. I hate emacs. It's a perfectly fine piece of software, it's just not for me.
But I'll come to the defense of emacs on this one. Let's not blame his editing software for his RSI.
Well, as far as Linux on the Desktop goes -- although I continue to be impressed with what Ubuntu and others have done in this area, Linux on the Desktop for the masses seemed to be pretty much killed by Apple's brilliant move to an operating system that eventually drew an absolutely massive number of people who would otherwise have been part of the migration to desktop Linux. As much as Apple has contributed to open source since then, I can't escape the fact that Linux on the Desktop adoption definitely suffered. I know I never seriously considered Linux on the Desktop again.
However... a couple of years ago, after having been a Mac desktop user for about a decade, I found myself working at a Microsoft shop, and they handed me an HP with Windows 7 on it. I was really quite worried about how well I would do, even though I had, of course, run Windows as a desktop prior to Mac's move to OS X (Windows 95, Windows 2000, etc).
With the exception of certain things that Windows 7 does very poorly (WebDAV client? Hello?!?), I've been overall fairly happy with both the OS and -- even more so, I think -- Office 2010.
But the best improvement that has come on the Windows platform in recent years is the continuous improvement of Cygwin, most notably a decent terminal in which to run it, mintty. If not for mintty, I would probably have struggled much more with Cygwin, and therefore with Windows. The Windows/Cygwin power punch may be the most productive setup available (aside from those environments where no MS interoperability is necessary at all, in which case I would still want a Mac).
As a fairly experienced technologist with increasing responsibility over the last several years, and who has had a certain amount of success and gathered some decent ideas along the way, I do actually think of myself as either a future CTO or future business owner.
But I almost NEVER think of myself as a future CIO. CTO definitely. But you can *have* CIO.
- I've been on slashdot since almost the beginning
- I'm a recreational musician who fantasizes about recording and distributing music
- I'm a web developer who has implemented DRM to protect the intellectual property of my employer
I decided to post here, so that I could say that I don't think there is any good use of DRM. I have heard lots of stories of people who distributed their own non-DRM'd music online and who do very well, for example. I think the good stuff will always pay off. People will recognize the value and the artist will be compensated.
I also hate the properties of DRM that inconvenience the consumer. Having to repurchase your content, for example. But before I started typing this comment, I thought of one use of DRM that could be considered legitimate. A streaming subscription such as Netflix, or computer training videos and stuff like that, is something that works very well, is transparent to the user, and does not need to stand the test of time. As long as your subscription is active, you can access your content. You have no need to access the content after the subscription is over.
I've also taken advantage of software subscriptions lately. For example, I need Photoshop sometimes, but not all the time. Instead of paying a ridiculous amount of money to buy Photoshop, I can pay for a month of Photoshop, which gets me through whatever project I'm working on. This is a form of DRM, and without it, Adobe would not offer the product the way I want to consume it. The same with Netflix. I love it, and without that protection, they could not offer it.
Yes. Gimp. I know. Sorry, I like Photoshop.