I have worked in and around many environments where no degree of clear documentation would help due to how completely brainfucked the implementations are. Despite the tomes of very thorough documentation, they're still a goddamn mess. Even the high-level overview has caveats and exceptions about how it works, where it works, etc.
Hell, pick an operating system. You typically need at least a high level overview of how it works before you dive in: "This is UNIX. We use pipes and redirects, and everything is a file. Go."
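To make that one-line overview concrete, here's a quick sketch of what "pipes and redirects, and everything is a file" actually buys you (the `/proc` line assumes Linux; the paths are just illustrative):

```shell
# "Everything is a file": kernel state is readable with ordinary file tools.
# (/proc is Linux-specific.)
grep -c processor /proc/cpuinfo

# Pipes chain small programs; redirects point streams at files.
printf 'b\na\n' | sort > /tmp/sorted.txt
cat /tmp/sorted.txt
```

That one sentence of orientation is enough to make the rest of the system guessable, which is exactly the kind of high-level documentation that pays off.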
Now, a system which follows best practices? Absolutely, there should be no 'teaching' required, assuming it's of moderate complexity. "x happens when y occurs in certain scenarios" or the like. But I've rarely seen an environment which even approaches 'best practices' because status quo just-get-it-done has been the order of the day for entirely too long, and people are lazy and/or overworked.
This is coming from the systems/network side of things, often in environments which are developer-centric (and historically 'managed' by the devs). Custom applications cobbled together for functionality across a dozen hosts with half a dozen scripting languages over a period of a decade... it's a nightmare, and a frequent one.
As someone who has invariably come into environments with little/no usable documentation, and who has since been thanked several times for leaving behind useful documentation (yes, in wikis), I'll say there's a time and a place for it. High-level things (need, purpose, etc.) as well as 'gotcha' specifics are what's worth writing down.