Rozine writes: My company is being split off from a larger, more hidebound organization after decades. We're using this opportunity to expand our development team and to change many of the development processes we've lived with for a long time. One of the areas we'd like to change is our environment setup.

Currently, we run development, QA, and production environments. Production consists of hundreds of machines running hundreds of different processes in a massively complex, scaled-up system, almost all on customized Red Hat Linux (RHEL 3-5), with a few AIX and Solaris machines we're looking to decommission eventually, and one or two Windows boxes. Dev is always broken and lacks some major features that we develop for production. QA has most of what production has, but managing process rollouts that can conflict with UAT needs is a huge task, especially since developers sometimes end up doing development in the QA environment because dev lacks the features or stability they need.

We've recently discussed adding more environments to the mix: a real UAT environment so that clients can have a stable onboarding experience, and multiple dev or QA environments so that we can isolate changes and stop wasting time on stability issues. We have support from senior management, for whom cost is "not an issue" (although I'm sure that has limits). We've run into trouble, though, because our complex software only supports the three current environments, and adding more would be an insane task.

Has anyone had experience with more sophisticated environment setups? Is it possible to virtualize an entire environment, so that applications think they're in the only dev environment and connect to the same "machines," but are really on separate boxes? Does this scale easily to twenty environments, or should we set our sights lower? Is this the wrong approach?
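One common way to get the effect the submitter describes, where every application connects to the same logical hostnames but lands on different physical boxes per environment, is to do the "virtualization" at the name-resolution layer. The sketch below is purely illustrative, not the submitter's setup: the hostnames (`db.internal`, `mq.internal`, `app.internal`), the environment names, and the one-subnet-per-environment convention are all hypothetical assumptions.

```shell
#!/bin/sh
# Sketch: applications always connect to fixed logical hostnames; each
# environment maps those names onto its own machines. Deploying the
# generated fragment to /etc/hosts on every box in an environment makes
# that environment self-contained without touching application config.
# All names and the subnet scheme are hypothetical.

# Map an environment name to its (hypothetical) subnet prefix.
subnet_for_env() {
  case "$1" in
    dev1) echo 10.1 ;;
    dev2) echo 10.2 ;;
    qa1)  echo 10.3 ;;
    uat)  echo 10.4 ;;
    *)    return 1 ;;
  esac
}

# Emit an /etc/hosts fragment for one environment.
hosts_for_env() {
  subnet=$(subnet_for_env "$1") || { echo "unknown env: $1" >&2; return 1; }
  printf '%s.0.10 db.internal\n'  "$subnet"
  printf '%s.0.11 mq.internal\n'  "$subnet"
  printf '%s.0.12 app.internal\n' "$subnet"
}

hosts_for_env "${1:-dev1}"
```

At twenty environments, per-host files become painful to keep in sync; the same idea is usually pushed into DNS instead (e.g. a resolver per environment, or split-horizon views), so adding an environment is a data change rather than a rollout to every box.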