... equally likely is that Commercial Unix killed Commercial Unix.
As a couple of datapoints: in the mid-90s, when I started my sysadmin career at a university computer science department, the "traditional" Unix boxes were well represented: Ultrix, SunOS/Solaris, HP-UX, AIX, Irix. Faculty members were just starting to buy their own Linux laptops - and what a difference.
Instead of recompiling a kernel and informing Solaris, for instance, that I was going to be adding a disk or a network device, faculty members would simply slam in a PCMCIA card - and their network interface was wired up. Red Hat brought in RPMs and a strong dependency mechanism. Compare that to Ultrix, which used ... tar and cpio as its "package managers". Compilers no longer needed to be licensed - you want a C compiler, or a Fortran compiler, on your Linux box? Go crazy. GNU helped kill that ridiculous license stream.
Linux came with devoted coders who constantly strove for improvement. Once Linux was ported to the Alpha, you had a cross-platform OS, rather than a Unix devoted to just one hardware platform. And Linux adopted the things that had worked well for commercial Unix. All the features were there on Linux.
Commercial Unix could have moved down similar paths: freed up its code bases, dropped licensing for basic functionality like compilers. Innovation still happened in Commercial Unix - DTrace, ZFS, hardware-accelerated OpenGL. But the providers of Commercial Unix used it simply as a means to sell their hardware - which struggled to remain competitive with the glut of cheap and ever-improving PC hardware.