as always - things that HAVE NOT EVER been a problem.
Sure, but much of what I use the internet for doesn't "solve a problem"; it's just convenience. When I wanted to look up a word, I used to grab a dictionary instead of googling it. When I wanted to learn about some event in history or the like, I'd grab the encyclopedia instead of Wikipedia. Once laptops became commonplace, it was about the same speed to look things up on the net (assuming I had to wake the laptop first); now that smartphones are ubiquitous, it's decidedly faster to just whip out your phone.
The only "problem" that was solved was some gains in efficiency. Now it seems we're at the point of diminishing returns to be sure, but that doesn't mean that setting an alarm/adding items to a shopping list/etc. can't be streamlined a tiny but more.
Personally, the killer app as I see it is having a more robust silent alert (I sometimes don't feel the vibration from my phone), along with the ability to quickly see whether I should ignore the alert or address it. It takes about 1s to glance at a watch and determine if something requires my attention, vs. several seconds to take my phone out of my pocket -- and the former can be done without significant hand movement.
My limited experience with this is based on a $15 "smart" watch that was extremely flaky, but when it worked I was certainly happy with the workflow.
...44.1kHz 16 bit audio is relatively trivial...
So, is there an analogous specification for video cards? The 44.1kHz @ 16bit is pretty easily justified (Nyquist–Shannon plus a reasonable dynamic range). Can a visual equivalent be easily justified? That is to say, at a sort of "eye-limited" (Retina, in Apple lingo) resolution and field of view, how many polygons can be said to make up the human perception of reality, and what sort of graphics processing muscle would be required to drive this?
I of course have no idea, just wondering out loud. Just trying to approach OP's question in a pseudo-scientific fashion.
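In the same pseudo-scientific spirit, here's a rough back-of-envelope sketch in Python. The audio side is the known case; every number on the visual side (60 px/degree acuity, ~180°x135° field of view, 90 fps, 10 bits per channel) is an assumption for illustration, not an established spec, and it sidesteps the harder polygon question by counting pixels instead:

```python
# Back-of-envelope: "eye-limited" display requirements, by analogy
# with the Nyquist + dynamic-range argument for 44.1 kHz / 16-bit audio.
# All visual-side figures are rough assumptions, not established specs.

# --- Audio side (the known case) ---
hearing_limit_hz = 20_000                # approx. upper limit of human hearing
nyquist_rate_hz = 2 * hearing_limit_hz   # minimum sampling rate: 40 kHz
audio_dynamic_range_db = 16 * 6.02       # ~96 dB from 16-bit quantization

# --- Visual side (the analogy; all assumed values) ---
acuity_ppd = 60       # ~20/20 vision resolves ~1 arcminute -> 60 px/degree
fov_h_deg = 180       # approximate horizontal binocular field of view
fov_v_deg = 135       # approximate vertical field of view
fps = 90              # rough smooth-motion / flicker-fusion threshold
bits_per_pixel = 30   # 10 bits per channel as a stand-in for eye-limited contrast

pixels = (acuity_ppd * fov_h_deg) * (acuity_ppd * fov_v_deg)
raw_bandwidth_gbps = pixels * fps * bits_per_pixel / 1e9

print(f"Audio: Nyquist rate {nyquist_rate_hz / 1000:.0f} kHz, "
      f"dynamic range ~{audio_dynamic_range_db:.0f} dB")
print(f"Video: ~{pixels / 1e6:.0f} Mpixels per eye-limited frame")
print(f"Raw uncompressed bandwidth: ~{raw_bandwidth_gbps:.0f} Gbit/s")
```

With those assumptions it works out to roughly 87 Mpixels per frame and a few hundred Gbit/s uncompressed, which at least puts a (very hand-wavy) number on how far current hardware is from "reality-limited" rendering.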
A modem is a baudy house.