Businesses will have to devote people, time, and money to a moving target.
They always do. Otherwise they'd still be using MS-DOS because they don't want to upgrade their LOB programs.
There are viewpoints on both sides. I used to work at a company where, once something was stable, it was frozen. They were deathly afraid of making any sort of change for fear it might break something. That's one viewpoint. Of course, it also meant the software they used was horrendously outdated, to the point that if you needed to do anything new and novel, you often had to do it yourself. They deployed CentOS 7 as the standard platform for everything. By the time I got laid off, things were slowly moving to modern platforms - they decided to use Ubuntu because each LTS is supported for 5 years, so they could target the next LTS, wait a year or two for it to stabilize, then spend a year moving to it, so that when the old LTS went end of support, they'd already have migrated.
Old software has vulnerabilities. It doesn't rot, per se, but it gets harder and harder to use, and even things like platform support get iffy at best. They were deathly afraid of moving to newer systems. I even proposed using Docker so we could move to a newer Linux while still keeping the old build environment. No tricks, but it was a non-starter.
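For what it's worth, the Docker proposal would have looked something like this - a minimal sketch, assuming a CentOS 7 based build chain (the image tag, package list, and Makefile-driven build are all illustrative, not what that company actually used; note that since CentOS 7 went EOL, a real setup would also need to repoint yum at vault.centos.org for archived packages):

```dockerfile
# Freeze the legacy CentOS 7 build environment in an image so the
# host OS can be upgraded independently of the toolchain.
FROM centos:7

# Install the old toolchain inside the container (package list illustrative).
RUN yum install -y gcc gcc-c++ make && yum clean all

# The project source gets bind-mounted here at build time.
WORKDIR /src

# Builds run with the frozen toolchain regardless of the host distro.
CMD ["make"]
```

Then something like `docker build -t legacy-build .` once, and `docker run --rm -v "$PWD":/src legacy-build` thereafter, and the old compiler keeps working on any modern host.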
I got laid off from that company. I found a job at a new company, and they don't run bleeding edge, but they do run the latest stable versions. I'm pretty sure their IT department snapshots the servers we use, updates them to the latest version of the software, and checks that everything works. If so, that's what runs. I know because the git servers and such are constantly running the latest versions. Heck, if you fall behind on patches, their VPN software nags you and can even deny access to services (update your software - we don't allow vulnerable machines to connect).
I'm sure they're running the latest not just for fun, but to make sure everything still works, and since they're doing that validation anyway, they might as well deploy those latest versions.
Doesn't hurt that they're in the cybersecurity field, so it will not do to be hacked by anything older than a zero day. If a vulnerability is fixed, the patch should be deployed on everyone's machine yesterday. It also means any software used for development or included in the product has to be the latest available, with all associated CVEs tracked.
At the first company, I kept running into issues because the kernel was so old, and some things in it were so immature compared to the current kernel, that the only workaround was to reboot the machines periodically, because they refused to upgrade. Eventually we gave up trying to support it and forced them to deploy new Ubuntu VMs hosting all the necessary software, instead of dealing with the ever-growing deltas between what we developed against and what the production servers ran. He also started deprecating older services so he could migrate to newer versions. He wants to tackle the firewall next, because it too is ancient.