We'd have had PS/2 connectors, floppy drives, beige boxes, flaky suspend/resume, x86 BIOS, 32-bit processors, no built-in 3D acceleration, no built-in WiFi, 100Mb Ethernet, etc. for even LONGER than we did.
There is no evidence for any of these claims. PCs were very customisable, so OEMs and users alike could spec systems the way they wanted. If they didn't want floppies, or wanted to add built-in WiFi, it was all possible (although on-board WiFi implementations often still look to the system like PCI cards). The first version of 64-bit Windows was released two years before Apple started adding 64-bit support to OS X.
Do you remember having to buy PCI-USB cards?
I remember having USB connectors on my motherboard long before we had anything to plug into them (devices didn't appear on the market until version 1.1 of the standard fixed a bunch of problems) and before Apple started using the connectors. It is hardly surprising that the connectors appeared on the PC first, considering that the standard was developed by "Compaq, Digital Equipment Corporation, IBM, Intel, Microsoft, NEC, and Nortel" (according to Wikipedia). The difference was that, unlike Apple, PC makers didn't throw out the old connectors, and so moved to the new standard gradually.
The thing about the "Apple model" is that there is less flexibility in the hardware. With PCs (and other systems) you could make the computers as you wanted them to be by using industry-standard components. Yes, Apple started using those components, but would they have ever done that without the PC to spur them on? Would they have ever had real multitasking without Unix and Windows shaming them into doing it?
Probably. Who knows? It really doesn't matter, since we can't change history to find out the answer.