*People* are different, and like different things. Men and women, however, aren't that different (roles in reproduction excepted), so a statistically significant difference points to a social or psychological cause, not biology.
That said, the PC isn't itself the problem, as TFA -- or maybe just the summary -- seems to imply. Looking at other professions with gender imbalances, though, one can posit a few underlying causes. a) Secretaries were once men who helped important people with important matters; once the typewriter came in, women seized on typing as a "respectable" way to support themselves, and the modern secretarial pool was born. (See http://www.stuffmomnevertoldyou.com/podcasts/why-is-secretary-the-most-common-job-for-women-in-the-u-s/) b) Bletchley Park and earlier research projects employed female "computers" before they developed electronic ones, because women worked hard and worked cheap. All the mathematical whizzes, however, were upper-class men; who would pay for a woman's education when she would just get married and pop out kids? (See also Disney animators.)
Obviously somebody needs to do solid research, but one could hypothesize that the PC coincided with three trends: the growth of male-dominated "hacker" culture, the use of PCs by Serious Men for Serious Business, and the decline of mainframes (i.e. server rooms in which nobody knew or cared that women worked). Without hard data, though, this is mere conjecture. Still loads better than "women don't like computers".
"Legacy" properly describes a software system, not a language. Languages rise and fall in popularity. Sometimes a language has inherent limits, sometimes the implementation stinks, sometimes the syntax or paradigm falls out of fashion. Sometimes languages and platforms disappear only to re-emerge years later. Back in the late 1990s, NeXTSTEP/OPENSTEP was turning into a "legacy platform"; today its descendants live on in macOS and iOS.
Stay in the industry long enough and you'll see everything come back.
Is this a valid analogy? In short, no. A bit longer answer: NOOOOOOO. For a full explanation, read on.
I can't speak to how construction works, but I know how software development and developers work. Usually software breaks not because of a bad developer, but because of integration issues and subtle interactions that are hard to detect, and even harder to assign "blame" for without a lot of investigation. The investigation is generally the hardest part, so by the time you could pin a defect on anyone, most of the cost has already been spent.
Worse, your boss is proposing a "blame game" where every defect is somebody's fault, almost always somebody's on the current development team. Far from encouraging better software, this will keep developers from entering their own bugs (or any bugs) into the bug tracking system, and encourage finger-pointing rather than collaboration. Meanwhile, your boss thinks he'll save money by making developers work for free "on their own time". In the worst case, whoever last touched a piece of code is "it", whether the defect is a legitimate mistake or a weird edge case. What you'll get is a workplace full of egos, fiefdoms ("don't mess up MY code"), and destructive competition.
Get hold of portable property. -- Charles Dickens, "Great Expectations"