They already did this. It is called MIMO.
We all understand that.
What you're missing is that:
- MIMO works better, over longer distances, when the antennas are more separated. The greater the separation, the greater the distance, for a given phase accuracy.
- But it also requires the radios to be synchronized to within a tiny fraction of a single cycle, so the patterns add up correctly. At 2.5 GHz an entire cycle lasts only four tenths of a BILLIONTH of a second, and MIMO requires more than an order of magnitude better timing accuracy than that.
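The arithmetic behind those numbers is quick to check. A minimal sketch (the tenth-of-a-cycle budget is my reading of "more than an order of magnitude better", not a figure from the work itself):

```python
# Back-of-the-envelope timing numbers for a 2.5 GHz carrier.
freq_hz = 2.5e9
cycle_s = 1.0 / freq_hz                 # one full carrier cycle: 0.4 ns
print(f"cycle period: {cycle_s * 1e12:.0f} ps")

# "More than an order of magnitude better" -> a budget under a tenth
# of a cycle (an assumed interpretation for illustration).
sync_budget_s = cycle_s / 10            # 40 ps
print(f"sync budget:  {sync_budget_s * 1e12:.0f} ps")
```

Forty picoseconds between boxes connected by ordinary cabling is what makes the problem hard.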
When the radios are all in one box, that's easy: You drive them from the same oscillators, and watch your wiring and components.
When they're in different boxes, separated by hundreds of feet or by miles, it's a whole different can of worms. It takes VERY fancy equipment to generate VERY stable reference signals, plus extra machinery to estimate their drift (which varies from moment to moment), and it's still a massive pain. You don't get that kind of synchronization between boxes, even in a house, when they're connected by inexpensive commodity cabling.
What these guys did is tweak the protocol to add a tiny synchronizing burst from the designated master transmitter just before each packet. Combined with estimates of the moment-by-moment ongoing drift (computed from reception of the synchronizing bursts from previous packets) they were able to get current commodity-quality hardware to stay adequately synchronized to hold the pattern together for at least the duration of the packet. (I'm betting they can do the same sort of thing with the receivers, too, working off the sync burst from the master transmitter.)
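The drift-tracking part of that scheme amounts to fitting the recent history of measured clock offsets and extrapolating forward. Here's a hypothetical sketch of that idea; the function names, the straight-line least-squares fit, and all the numbers are my illustration, not the authors' actual algorithm:

```python
# Each slave radio timestamps the master's sync bursts, fits the recent
# offset history with a line (offset = bias + drift * t), and extrapolates
# the correction to apply during the next packet.

def fit_drift(times, offsets):
    """Least-squares line through (time, measured clock offset) pairs."""
    n = len(times)
    t_mean = sum(times) / n
    o_mean = sum(offsets) / n
    num = sum((t - t_mean) * (o - o_mean) for t, o in zip(times, offsets))
    den = sum((t - t_mean) ** 2 for t in times)
    drift = num / den                  # seconds of offset per second
    bias = o_mean - drift * t_mean
    return bias, drift

def predict_offset(bias, drift, t):
    """Extrapolate the clock correction for a packet sent at time t."""
    return bias + drift * t

# Offsets measured from the last few sync bursts (made-up numbers):
burst_times   = [0.000, 0.010, 0.020, 0.030]        # seconds
burst_offsets = [1.0e-9, 1.4e-9, 1.8e-9, 2.2e-9]    # drifting ~40 ns/s
bias, drift = fit_drift(burst_times, burst_offsets)
correction = predict_offset(bias, drift, 0.040)     # next packet at t = 40 ms
```

A plain linear fit like this only works because the correction only has to hold for one packet; over longer spans the moment-to-moment drift variation the parent mentions would swamp it.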
The result is being able to do MIMO with radio/antenna assemblies in different, disconnected, well-separated boxes, using only packet-quality interconnects and doing the synchronization over a small bit of air bandwidth.
That got MIMO over a major hump in equipment cost, antenna separation, and utility.