Comment Re:stats (Score 1) 235

Agreed -- I left in 2008 because it was painfully clear that despite good performance reviews and two promotions, my raises over 7 years amounted to inflation plus the two bumps at promotion (a total of 15%). IBM wants to pay everyone below upper management an average salary, no matter how much better than average a job they do. It's entrenched in their compensation model, and no first-line manager can do anything about it for his/her best employees. Every time someone is paid above the average for their band, it's a constant battle to get them even cost-of-living adjustments, since they're already above average.

Comment Re:NYC guy is an idiot (Score 1) 235

If you're still doing Theory X stuff like "annual performance reviews," you're doing it wrong, and deserve to die in a fire.

I had to google Theory X, but I don't see what it has to do with annual performance reviews, which have existed at all 5 places I've worked as a professional software engineer. What's wrong with an annual performance review?

Comment Re:Moral Quandary Perhaps? (Score 1) 235

Some of it is less obvious than you might think. Everyone wants personal growth and development, as well as feeling appreciated. Some people are on two-year tracks and are destined to change jobs every 18-24 months.

In my experience, one does not really know what one is working on after only 18 or 24 months. It takes about 3 years to have sufficient depth in an area to be a solid contributor, and somewhere between 5 and 7 years, if you're still working on the same thing, you start losing flexibility.

I've done several jobs over the years, and while no individual project is multi-year, even on a codebase as small as 400 KLOC it takes years to know enough of it well enough not to introduce as many bugs as one fixes.

But maybe all these young kids don't work on anything as hard as 400+ KLOC systems.

Comment Re:Think ahead, move sideways, not up.... (Score 5, Insightful) 473

The drawback for me is that I'm finding it harder to continue to get energized to learn new technologies. I can still do it, but it's becoming more of a hassle. Not so much the languages, but the specifics of frameworks and technology domains (i.e. web vs. traditional client-server vs. realtime). Probably more a personal limitation, I'm not the smartest guy in the world.

This. It was a hard enough transition for me leaving all the various little office habits I had from 7 years at IBM. I had to learn a new source control system, a new way to build and install the OS, etc., in addition to spending several years where I didn't know the details of the code I worked on intimately. After 7 years I was a subject matter expert on a decent-sized chunk of the AIX kernel. After two years at the new place, I finally felt like I knew enough code to say something authoritative about it. That was hard and frustrating.

However, it's also left me feeling sure that the only way to avoid irrelevance is to regularly make myself uncomfortable, so that I don't get too attached to the comfort. At this point my personal feeling is that it takes 5-7 years for me to become saturated on what I'm working on and to need that new thing.

Having kids taught me the same lesson too. As Kahlil Gibran wrote, "Verily the lust for comfort murders the passion of the soul, and then walks grinning in the funeral."

Comment Re:I've crossed that threshold, but it concerns me (Score 5, Interesting) 473

I suspect it's harder to hire someone who's older simply because the pool is smaller. That is, almost everyone at 21, or 23, or 25, whenever they finish college or graduate school, will be interviewing for a job. A lot fewer people at 40 will have a reason to leave, especially if they've become Senior and somewhat indispensable at their company.

I left IBM three years ago to work for a company not far past its startup days. At 33 (at the time) I was one of the oldest developers at the company. Now, though, as the company has grown (and been acquired), not only are there more older people at the company, but plenty of people who were young when it was founded 10 years ago are in their mid 30s and now have spouses and children. Several senior people have gotten married or had kids, so in that sense the whole company has aged up toward me in just the three years since I started (age is often as much about one's position in life -- how long one has been married, how old one's children are -- as about the number itself).

And very few of these people now in their late 20s or mid 30s are looking for a new job, because they have one they like. So the pool of available interviewees continues to be heavily biased toward college graduates.

Comment C is still relevant (Score 5, Insightful) 473

Yes, I've noticed no one is writing operating systems or anything else in C anymore. I better learn the language du jour.

Except that my experience with multi-threaded systems programming is still useful. Even when everything is virtualized, there will be C code running on the bare metal that someone needs to create and maintain. New hardware products will need drivers written in C, or entire embedded systems written in C.

Sure, the next social media website won't be done that way, but for some of us, writing applications at that high a level was never that interesting.

And didn't I just read that Facebook had to highly optimize malloc(3) to support its operations? What's malloc written in? Oh yeah, C.

Comment Re:The Internet is based on C (Score 5, Informative) 201

- I think people who put the "*" of pointer syntax near the variable name and not the type name when declaring pointers should be shot. It should always be int* pointer_to_int, not int *pointer_to_int.

I'm sure my complaints are unwarranted except for the first point.

But that's backwards of what the compiler really does. Consider this:

int* p, q;

What types do p and q have? p is a pointer-to-int; q is an int. Putting the * next to the type name makes it look like all the declared names are int*, but they're not. By putting the * with the type (which I did for my first year of C coding) you're making the code harder to read rather than easier. It'd be like writing

a = b * c+d;

and trying to convey that the '+' binds tighter because it doesn't have spaces around it. That's not what the compiler will do, and writing it that way only serves to confuse the reader.

In addition, what you see at declaration is representative (modulo the weirdness of array subscripting and pointer dereference) of what you'd write to get back to the base type. That is, int ***p means that you'd have to write ***p to get an int; *p would need another ** to get an int, and so on.
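
To make both points concrete, here's a minimal sketch (my own illustration, not anything from the parent post): the spacing gotcha in the declaration above, and the "declaration follows use" reading.

#include <stdio.h>

int main(void)
{
    int n = 42;

    /* Despite the visual grouping, only p is a pointer here; q is a plain int. */
    int* p, q;
    p = &n;
    q = n;

    /* "Declaration follows use": the declarator shows the expression that
     * yields the base type. For int **pp, the expression **pp is an int. */
    int **pp = &p;

    printf("%d %d %d\n", *p, q, **pp); /* all three expressions are ints */

    return 0;
}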

Comment Re:Not many people want you to support consumer te (Score 1) 533

Sorry, but how is supporting your personal mobile phone, a job for your company's IT department?

All right, let me explain.

If having access to company email on my iPhone gets me working more efficiently, or lets me do work on the bus commute that I couldn't do before, then supporting my iPhone has a business justification and should be part of IT's job. It's that simple. No one is asking for easy access to their work mail just for shits and giggles; they ask because it's relevant to doing their job, and IT's job is to support the rest of the company doing theirs.

Comment Re:What's the point of this? (Score 1) 315

I don't think I'd heard that education costs compared to salary were bad for computer science. For IT that may well be true, but I went to a state college, had a small stipend in grad school, and made more in 2006 than my dad, who was a Mechanical Engineer with 30 years' experience working at NASA.

So I think it depends what you do with the degree. Working in IT may not pay well, but systems programming will probably put your household in the top 10% of income within 10 years. Pay at IBM, EMC, Amazon, Microsoft, Google, etc. is all pretty comparable.

Comment Re:FlowCharts can be very useful (Score 1) 315

In 10 years of professional experience writing system software, I've never drawn a flowchart. In the 7 years of college and grad school prior, I didn't either. Code with comments can be very clear. Well-designed and documented data structures, with simple methods for manipulating them, are how I get my job done most days (something like the sketch below). I'm not saying flowcharts aren't useful for you, just that they're not a panacea and they may not click for other people.
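
By way of a minimal, made-up sketch (not code from any real job), this is the kind of thing I mean: a small documented structure plus a couple of simple functions to manipulate it.

#include <stdio.h>
#include <stddef.h>

/* A bounded stack of ints.
 * Invariant: 0 <= top <= CAPACITY; top is the index of the next free slot,
 * so the stack is empty when top == 0 and full when top == CAPACITY. */
#define CAPACITY 64

struct int_stack {
    int    items[CAPACITY];
    size_t top;
};

/* Push v onto s. Returns 0 on success, -1 if the stack is full. */
static int stack_push(struct int_stack *s, int v)
{
    if (s->top == CAPACITY)
        return -1;
    s->items[s->top++] = v;
    return 0;
}

/* Pop the most recently pushed value into *out.
 * Returns 0 on success, -1 if the stack is empty. */
static int stack_pop(struct int_stack *s, int *out)
{
    if (s->top == 0)
        return -1;
    *out = s->items[--s->top];
    return 0;
}

int main(void)
{
    struct int_stack s = { .top = 0 };
    int v;

    stack_push(&s, 1);
    stack_push(&s, 2);

    while (stack_pop(&s, &v) == 0)
        printf("%d\n", v); /* prints 2, then 1 */

    return 0;
}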

To the OP: there's such a wide variety of possibilities across programming, software engineering, and computer science that you could talk about just about any subset for hours. The most useful thing for high-schoolers is probably (1) an overview of the different options, and (2) a few suggestions on what they could do today to try them out, to see if they're enjoyable.

Comment Re:Start with your chair, monitor, keyboard setup (Score 2) 235

IMO, those images are useless. I don't sit in one position all day, and it's not healthy to do so. I slouch, I lean forward, I sit up straight. My legs are stretched out in front of me, tucked behind me, sometimes one leg is crossed under my knee with my heel on my chair, etc.

I have never had a keyboard tray I liked; invariably I bang my knees on it, and it's never wide enough for my keyboard and all the places I want to put my mouse. My arms are usually out to my sides a bit. I want my monitor's center at about head height, so for some things I look up and for some I look down.

To the OP, in my 10 years of experience the only thing that I needed that was in any way uncommon was a split keyboard so my wrists didn't hurt. YMMV, of course, since the things your body complains about won't be the same as mine.

None of the chairs I've had at work have been awesome or terrible; they were somewhere to put my ass so I could type. For that matter, I'd be happy with a standing desk too, but they cost money and my cube isn't well set up for one.
