I disagree.
The Irish Health Service example in the article points up the issues.
External, loosely affiliated parties (doctors, clinics, ambulances) with no infosec knowledge need to contribute vital data (x-rays, patient notes) onto core systems, and it can be a matter of life and death that accurate, complete data is retrievable by authorised individuals with minimal delay, failures, or errors.
The network-focused, ring-based system described is a case of imposing a theoretical defence model that'll be broken before it's even deployed: thinking of service-delivery applications as though they were old-style prisons.
These are infrastructure systems that need to be engineered and architected, and many of the engineered components should be possible to standardize and make repeatable.
Not the minimum viable products we work with every day: software, hardware, OS, internal comms, external comms, operators, maintainers, installers.
So, questions raised:
Should every application be expected to allow for a huge overhead: something like your system of rings,
a cluster of containers for resilience, patching, and testing,
its own application-aware firewall cluster at each major API and boundary,
as current vendor trends seem to be heading,
standardized, vendor-independent specs and methods for defined API boundaries, not set in stone but permitting revisioning,
defined at-rest data storage bunkers with encryption and verification anywhere data might need processing or caching (see the sketch after this list),
digital certificates indicating the training required to access potentially breaking-change areas?
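To be concrete about what even the simplest piece of that overhead looks like, here is a minimal sketch of the at-rest "bunker" idea, assuming Python and the cryptography package's Fernet (authenticated encryption, so decryption doubles as verification). The class name, file layout, and inline key handling are illustrative only; in anything real the key would come from a KMS/HSM and the storage would not be a bare directory.

```python
# Minimal sketch of an encrypted, verified at-rest store ("bunker").
# Assumptions: local directory as the storage location, key supplied by caller.
from pathlib import Path
from cryptography.fernet import Fernet, InvalidToken

class AtRestBunker:
    """Store blobs encrypted; refuse to return anything that fails verification."""

    def __init__(self, directory: str, key: bytes):
        self.dir = Path(directory)
        self.dir.mkdir(parents=True, exist_ok=True)
        self.fernet = Fernet(key)  # AES + HMAC, so decrypt() also verifies integrity

    def store(self, name: str, data: bytes) -> None:
        # Encrypt before anything touches disk.
        (self.dir / name).write_bytes(self.fernet.encrypt(data))

    def fetch(self, name: str) -> bytes:
        token = (self.dir / name).read_bytes()
        try:
            return self.fernet.decrypt(token)  # raises if tampered with or wrong key
        except InvalidToken as exc:
            raise RuntimeError(f"{name}: stored data failed verification") from exc

# Usage (key generated inline only for illustration; use a KMS/HSM in practice):
# bunker = AtRestBunker("/var/cache/xray-staging", Fernet.generate_key())
# bunker.store("scan-1234.dcm", raw_dicom_bytes)
# original = bunker.fetch("scan-1234.dcm")
```

And that's before key rotation, audit logging, access control, or caching policy, which is rather the point about overhead.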
So much for complexity being the enemy of good.
Most IT professionals or consultants might only have an inkling of parts of this. And there's no way we're going to start writing in mathematically proven Ada day to day.
And vendors won't do this. Even an OS sending print jobs to printers remains a half-done, problem-prone bodge.