Code reviews are highly subjective, human endeavors. I've certainly wasted more than a little of my life in "reviews" that were nothing more than
personality-driven, agenda-laden time-wasters, and that usually surfaced little more than grammatical errors in comments. That is, *if* the reviewers
actually bothered to look at the code before commenting.
Here's a little exercise you might want to get your boss involved in:
- Grab an arbitrary piece of code from outside your organization.
- Inject 10 or so errors or other issues into it.
- Divide your usual review crew into 2 groups to review the code separately.
- Tell one group that the code was written by a new intern, so you'd like them to eyeball it.
- Tell the other group that the code was written by your most senior developer (preferably, one with a big ego),
and that they need to review it "cuz the boss says we have to."
- Compare how many issues each group finds/reports.
I suspect you already have a good idea what the outcome will be. That should be enough to tell you how effective code reviews are.
Automated code formatters and inspectors, along with decent compilers/linkers (or interpreters), will surface most of the
issues that code reviews actually find.
Instead of pissing away valuable developer time, put those reviewers to work writing and executing tests. Right away, you'll discover
whether the code is testable. And then you'll discover whether it's actually correct.
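To make the point concrete, here's a minimal sketch (the function and its bug are hypothetical, invented for illustration): an off-by-one that sails past every formatter and linter, looks fine in a skim-the-diff review, and dies instantly against a single test.

```python
# Hypothetical example: perfectly formatted, lint-clean, and wrong.

def inclusive_sum(lo, hi):
    """Sum the integers from lo to hi, inclusive."""
    # Bug: range() stops *before* hi, so the last value is dropped.
    # No formatter or style checker will ever complain about this line.
    return sum(range(lo, hi))

def inclusive_sum_correct(lo, hi):
    """Same spec, with the off-by-one fixed."""
    return sum(range(lo, hi + 1))

# A test has no ego and no opinion about who wrote the code:
assert inclusive_sum_correct(1, 3) == 6   # 1 + 2 + 3
assert inclusive_sum(1, 3) == 3           # the bug, caught on the first run
```

The buggy version would likely get a thumbs-up in a review meeting; the one-line assertion rejects it every single time it runs.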
Tests don't have egos, agendas, personal axes to grind, or coworkers they don't want to piss off. They don't take vacations or sick days.
They don't have opinions about the author of the code. They usually don't leave the company. They generally don't have an opinion about
how many/few comments there are, or if the code has been formatted to corporate spec (unless those tests are executed as part of the
automated tools mentioned above). Sure, they can be drudgery to write, but they're the only real way to know whether the code actually does
what it's supposed to.