One thing that keeps coming up is the constant inflow of new (and intermediate-level) programmers making rookie mistakes. There seems to be an unwillingness to treat software creation, from the academic level onward, as a controllable process toward a working, reliable, secure, usable, maintainable result. It's still treated from day one as a sandbox: rigorous theoretical and mathematical underpinnings in the classroom, but cowboy coding and fluid design rules in day-to-day practice.
Consider the nuts and bolts: coding standards, defensive programming, code hygiene, managing technical debt, refactoring, and, at a higher level, revision control, automated builds, code review, and static analysis. All of these are considered best practices by some, but none is anywhere near ubiquitous.
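To make one of those practices concrete, here is a minimal sketch of defensive programming: validating input at a boundary and failing fast, rather than trusting the caller and letting a bogus value propagate deep into the system. The function name and the specific checks are illustrative, not taken from any particular codebase.

```python
def parse_port(value: str) -> int:
    """Parse a TCP port number defensively: reject bad input
    immediately instead of passing garbage downstream."""
    if not isinstance(value, str):
        raise TypeError(f"expected str, got {type(value).__name__}")
    stripped = value.strip()
    if not stripped.isdigit():
        raise ValueError(f"not a number: {value!r}")
    port = int(stripped)
    if not 1 <= port <= 65535:
        raise ValueError(f"port out of range: {port}")
    return port
```

The point isn't the three extra `if` statements; it's the habit of deciding, at every boundary, what inputs you refuse to accept, so that bugs surface where they are introduced rather than three modules away.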
It may not be unwillingness so much as growing pains, or the fact that the field lacks a P.E.-style certification that could be used to push back on unreasonable business pressures. Don't assume you're entering or working in a field with a well-established set of rules you can rely on, and if your gut tells you that a cult of personality is overriding a technically-based meritocracy, that may very well be the case. The process of software creation is still changing, evolving, maturing.
You can still learn those best practices and apply whichever of them you have the power to in your own environment -- just don't assume everybody will abide by them, or even agree on what they are.