After working professionally with computers since 1997 and earning only one professional certification (an NT4 MCSE, which was dead simple to get), I've often wondered about going back to school for a CS degree. But is it worth it for a 34-year-old guy to do this?
My college degree was a BS in Communication (Telecommunication with an emphasis on audio production), and it was basically just to appease my folks. I've been out of college for over ten years now, and the degree has really only helped me as a line on my resume. Considering that it wasn't a hard science and I suck at math, I imagine I have a good deal of catching up to do before I could even pursue a Master's in CS. The thing is, over the past seven years I've picked up a lot about networking, general OS support, scripting in CMD, Bash, Perl, and a little Tcl, plus programming in C (my big project is to really dig into C after the kid is born). Computers as a profession never occurred to me before because I mostly used them for music and graphic design. But once I moved to Linux, I got bit by the bug and started playing with computers at a much more advanced level, and I found it enjoyable. The whole D.I.Y. feel of the *nix OSes is much more fun than just using applications on Windows. So, the big question: should I take this interest and really hone my skills by getting a CS degree?
Finally, the setback. Like I said earlier, I suck at math. It's not that I don't understand it; I have a severe problem seeing my own mistakes unless someone points them out to me. When I took a geometry class as an undergrad, I would literally check my work five times over, which turned what should have been a 45-minute homework session into 4-5 hours. But I STILL couldn't catch my mistakes. When I went in to talk to the teacher, they would instantly spot the error: usually a pair of transposed digits or a minus sign where a plus should be. Even today, when I write something in Perl or Bash, I make mistakes that I can't see... until the machine points them out. Fortunately, once the machine flags them, I "get" the problem and can usually fix it about 99% of the time. But working with computers hasn't required much actual math, whether programming in C or writing scripts in Perl/Bash. Instead, logic seems to be much more important, and at that I excel. I knew this back in undergrad when I took a deductive logic course and aced it without trying while the rest of the students bemoaned their Cs, Ds, and Fs. So, in the "real world" of programming, how much math (as in getting the calculation right before the computer can tell you that you made a mistake) is required?