
The IBM lab in Toronto used to have a system like this, back before Celestica was spun off; half the lab did software and the other half was manufacturing. I recall one manufacturing guy got the maximum suggestion award for saying "let's use just one stabilizer foot instead of two on each PC case."
I submitted an idea that a back-of-the-envelope calculation suggested would result in quite a bit of savings: downgrade all the "work-at-home" sponsored phone lines from business grade to regular consumer grade. This was in the mid-90s with very slow modems, not exactly taxing to the phone system. Suggestion declined.
Then, a couple of years later, it was announced that they would do exactly what I had suggested. I inquired whether the award still applied. It was just like when your warranty expires and then your computer breaks: the two-year "statute of limitations" on suggestion awards had run out, and the suggestion was implemented shortly after.
Which is a roundabout way of saying that in the hardware/manufacturing world they may pay out for productivity suggestions, but don't count on it in the software business. (After all, who hasn't had an idea that would speed up some process 1000x and make some slacker in their office redundant? A slacker who serves on the committee evaluating suggestions.)
How does documentation get 'bugs'? With access to the source and the developers, it would be straightforward to get each programmer to write up a high-level description of what each function does, gather that into a spec, and voilà, there's your documentation already.
I take it you've never actually seen any source code, or met any programmers?
This function does something that makes no sense unless you already understand 5 layers of other workarounds.
That function does exactly the opposite of what its name suggests, leading to a never-ending cycle of bug reports because no one believes the documentation.
This programmer reviewed and signed off on the documentation for the last 5 releases, which was prepared from their writeup. A different reviewer discovered 100 mistakes in said documentation.
That programmer is a perfectionist who opens a bug for every nit they want to pick regarding grammar and punctuation. (And they're frequently wrong.)
I think the problem is the false assumptions and analogies that get introduced by these lines of thinking. If a network is "this guy talking to that guy", your thinking will be constrained by what you know about human conversation. If there's a problem, someone can talk louder, slower, etc., and the analogy holds. But if the solution involves something that has no equivalent in the day-to-day world, how are you going to conceptualize it?
My pet peeve, which descends from this same idea, comes from the teaching of object orientation. A car is a type of vehicle; a truck is a type of vehicle; they both have wheels; maybe the number of wheels is different; maybe each has a different turning radius or procedure for backing up.
Great, now you understand inheritance, polymorphism, member functions, etc. However, in practice, you use object orientation to avoid zillions of "if" statements, special-case code, and large blocks of almost-but-not-quite duplicated code.
In my experience, someone who learned OO through simple analogies is likely to be locked into thinking "I have to write every program by making a tree of classes, like with the cars and trucks", rather than "there are different ways to structure the code, and a good OO way will get rid of a lot of branching, magic numbers, and redundant code".
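To make that concrete, here's a minimal sketch in Python (the parking-fee scenario and every name in it are my own hypothetical illustration, not from anything above): the first version dispatches on a type tag with a pile of "if"s and magic numbers; the second pushes each special case into the class that owns it.

    # Branch-heavy version: every new vehicle means another "if" here,
    # and in every other function that inspects the type tag.
    def parking_fee(vehicle_type, hours):
        if vehicle_type == "car":
            return 2.50 * hours          # magic numbers everywhere
        elif vehicle_type == "truck":
            return 2.50 * hours + 10.00  # almost-but-not-quite the car case
        elif vehicle_type == "motorcycle":
            return 1.00 * hours
        else:
            raise ValueError(f"unknown vehicle type: {vehicle_type}")

    # Polymorphic version: each special case lives in the class that owns
    # it, and adding a new vehicle type touches exactly one place.
    class Vehicle:
        rate = 2.50       # default hourly rate
        surcharge = 0.00  # default flat surcharge

        def parking_fee(self, hours):
            return self.rate * hours + self.surcharge

    class Car(Vehicle):
        pass

    class Truck(Vehicle):
        surcharge = 10.00

    class Motorcycle(Vehicle):
        rate = 1.00

    print(Truck().parking_fee(3))  # 17.5, and not an "if" in sight

Adding a Bus later means writing one small class, not hunting down every function that switches on the tag.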
You can now buy more gates with less specifications than at any other time in history. -- Kenneth Parker