And you're getting very close to the Shannon limit with turbo codes. LTE isn't much more spectrally efficient than HSPA+, but it has wider frequency bands and so can deliver more peak speed to customers.
So you can increase the amount of spectrum you have, with the current infrastructure, to get more capacity. That will buy you a few years of network traffic increase.
But eventually you have to figure out how to reduce the capacity demand on each cell and increase SNR. There's really only one way to do that: change the infrastructure topology -- smaller cells, closer to the users. And that has lots of problems.
It's kind of like we're near "Peak Bandwidth".
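To put rough numbers on the "wider bands, not better bits-per-hertz" point, here's a back-of-the-envelope Shannon-Hartley calculation (a sketch in Python; the 5 MHz / 20 MHz carrier widths are the standard HSPA+ / LTE figures, but the 20 dB SNR is just an illustrative assumption):

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Spectral efficiency (bit/s/Hz) depends only on SNR, so at the same
# SNR a 20 MHz LTE carrier beats a 5 MHz HSPA+ carrier by having 4x
# the bandwidth -- not by squeezing more bits out of each hertz.
hspa = shannon_capacity_bps(5e6, 20)    # ~33 Mbit/s ceiling
lte  = shannon_capacity_bps(20e6, 20)   # ~133 Mbit/s ceiling
print(f"5 MHz @ 20 dB:  {hspa / 1e6:.0f} Mbit/s")
print(f"20 MHz @ 20 dB: {lte / 1e6:.0f} Mbit/s")
```

Once your codes sit near that log2(1 + SNR) ceiling, the only knobs left are B (buy spectrum) and SNR (change the topology), which is the whole argument above.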
This makes it very nice for all kinds of embedded environments.
Efficiency matters. Python is great, but you don't want to use it for embedded work.
Compounding this fact, ARM isn't that great of an architecture. It's got variable-length instructions, not enough registers, microcoded instructions, and a horrible, horrible virtual memory architecture.
The big thing that ARM has is the licensing model. ARM will give you just about everything you need for a decent applications SOC: processor, bus, and now even things like GPU and memory controllers. Sprinkle in your own company's special sauce, and you have a great product. All they ask is a little bit of royalty money for every chip you sell. And since everyone is using pretty much the same ARM core, the tools and "ecosystem" are pretty good.
But there's not much of an advantage to the architecture... the advantage is all in the business model, where everyone can license it on the cheap and make a unique product out of it.
And nowadays, the CPU is becoming less important. It's everything around it -- graphics, video, audio, imaging, telecommunications -- that makes the difference.
Hopefully someone will listen to their complaint before they are forced to take matters into their own hands.
And I think everyone also sees the next step, which is retaliation. Google just bought all those Motorola patents, and having them shut down Nokia and Apple with all those 17-year-old cell phone patents would really be a step up in the Mutually-Assured-Destruction conflict, and everyone would suffer for it.
Taking this approach with the nukes in your back pocket seems much more civil than the approach taken by the others.
In a lot of areas where research is done on things which don't work yet -- rockets, bridges, transmission systems, etc -- there's a general idea of how things might be able to "scale up" to meet the goals.
Is tokamak fusion really in sight of being a commercially viable source of energy? If we need unobtanium to make a commercially viable reactor, wouldn't it make sense to wait until the materials exist before building even larger tokamaks? What do we learn from making these new, bigger, more expensive reactors?
Or are we trying to build ever-bigger spark gap transmitters as a way to make radio better? Maybe we should look at other schemes?
Or, alternatively: we know of a nice, large, gravity-fed fusion reactor fairly nearby. Is the engineering to harness energy from that on a large scale simpler?
Not very usable for things where we need super-fast FFTs a gazillion times a second, like LTE.
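For a sense of scale: a 20 MHz LTE downlink runs a 2048-point FFT for every OFDM symbol, and with the normal cyclic prefix that's 14 symbols per 1 ms subframe, i.e. about 14,000 FFTs per second per carrier. A rough sketch of that workload (the FFT size and symbol rate are the standard LTE parameters; the timing loop is purely illustrative):

```python
import time
import numpy as np

FFT_SIZE = 2048            # 20 MHz LTE OFDM symbol
SYMBOLS_PER_SEC = 14_000   # 14 symbols per 1 ms subframe, normal cyclic prefix

rng = np.random.default_rng(42)
symbol = rng.standard_normal(FFT_SIZE) + 1j * rng.standard_normal(FFT_SIZE)

start = time.perf_counter()
for _ in range(SYMBOLS_PER_SEC):       # one second's worth of symbols
    spectrum = np.fft.fft(symbol)
elapsed = time.perf_counter() - start

verdict = "keeps up" if elapsed < 1.0 else "falls behind"
print(f"14k x {FFT_SIZE}-pt FFTs took {elapsed:.3f} s ({verdict} with real time)")
```

And that's just the transform itself, on one carrier, ignoring the demodulation, channel estimation, and decoding that hang off every one of those FFTs.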
I wonder if this is just re-discovering compressed sensing.
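The core compressed-sensing claim -- that a k-sparse signal can be recovered exactly from far fewer random measurements than its length -- is easy to demo. Here's a sketch using orthogonal matching pursuit in NumPy (the sizes are arbitrary toy values, and OMP is just one of several recovery algorithms):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily add the column of A most
    correlated with the residual, then re-fit the chosen columns by
    least squares and update the residual."""
    residual, chosen = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        chosen.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, chosen], y, rcond=None)
        residual = y - A[:, chosen] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[chosen] = coef
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                          # length, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)  # random Gaussian sensing matrix
x = np.zeros(n)                               # k-sparse ground truth,
support = rng.choice(n, k, replace=False)     # coefficients kept away from zero
x[support] = rng.choice([-1.0, 1.0], k) * rng.uniform(1.0, 2.0, k)
y = A @ x                                     # only 64 measurements of a length-256 signal

x_hat = omp(A, y, k)
print("recovery error:", np.linalg.norm(x_hat - x))
```

With m on the order of k log(n/k) random measurements, recovery succeeds with overwhelming probability -- which is exactly the "sample way below Nyquist, then reconstruct" trick the grandparent is describing.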
C++ is the best example of second-system effect since OS/360.