I watched the development of microcomputers in the 1970s. Try re-reading BYTE (and other) magazines of the era. The technology was shockingly primitive, with no standardization. The first standardization was around hardware: the 8080 and the S-100 bus. There was still no significant software standardization, because every system had some cobbled-together custom keyboard, display, or printer setup. Find a used keyboard from a liquidator, figure out its circuit board layout, write your own custom interface software, and so on. It wasn't until 1977 that the holy trinity arrived (the TRS-80, Apple II, and Commodore PET): the first standard off-the-shelf computers. That's when you started to see some commercial software take hold. Just watch the ads in the magazines.
Now to the point.
I am ignoring Unix until the point when it was practical for most people to actually run it. The early 1990s, when Linus created Linux, was the perfect time. At that time, all of the Unixes were walled-off proprietary prison camps that ran on workstations costing a couple of tens of thousands of dollars. Linux ran on a common PC. By the mid-to-late '90s, some people were noticing that a souped-up PC running Linux for ten grand could replace a thirty-grand Unix box.
If Linux hadn't come along, Unix would have faded into obscurity.
Here we are today, where you can get Linux on a $35 Raspberry Pi with 1 GB of RAM, gigabytes of SD card storage, a 4-core processor, etc. And proprietary Unix is relatively obscure.
That makes Linux sure seem innovative to me. It obviously did something VERY right, so much so that now Microsoft can no longer ignore it.