Actual coding is the smallest part of modern software development, not just because of all the meetings that agile techniques like Scrum require, but also because we're expected to support the code we write instead of just writing it in isolation, tossing it over the wall, and expecting some other sucker to maintain it. The theory is that if developers have to support the code themselves, they'll pay more attention to quality, reliability, stability, and other factors that improve maintainability.
Of course, other related work like design, documentation, code review, testing, deployment, and performance analysis also contributes to making actual coding a small part of the whole process.
Jobs with 20-hour seat-of-the-pants hackathon sessions in some low-level language that gets dumped straight to production are increasingly rare.
The question is whether all of this overhead is worth the effort. If done right, maybe it turns coding into professional software engineering that can reliably produce high-quality solutions to business needs... or maybe it's just another failed attempt, like waterfall, that adds all sorts of useless overhead to fool management into thinking they have some sort of control.
So far, I'm thinking it may actually help, but the jury's still out, and I suspect it's highly dependent on your organization and individual team. Even great ideas can be messed up by poor implementation.