I have been observing experiments at SIGGRAPH for several decades. Pixel resolution has gotten close to reality. Color systems like Samsung's are getting good; six primaries add some. What is really mind-blowing is good dynamic range: capturing bright light sources and deep shadows. A good HDR display starts to look like that desired window.
It's not just the monitor. The full system requires a compatible camera and encoding protocol. The recently announced HDMI standard upgrade helps.
I am impressed by the speed and accuracy of object recognition in Google's ML self-driving system. These techniques could be used to examine the objects being purchased as a security backup, much like supermarkets use weight now.
For the economics to work, the tag cost matters. It's 7 to 15 cents now according to RFID Journal (Google it).
At one time Walmart was talking about RFID-tagging everything, but settled at the pallet level. I don't know what the bottleneck was.
I like my library's system for automatically checking out and returning books.
The first era of Silicon Valley was the commercialization of the transistor between the 1950s and early 1970s by Shockley and his renegades from the East Coast. This established the culture of quick startup companies and nomadic engineers that you really hadn't seen elsewhere in the world.
Then, when the number of transistors on a single chip exceeded a thousand using cheap CMOS technology, you could put a whole CPU on a chip and a complete computer in a box for a couple thousand dollars. This led to the second era, the personal computer, in the 1970s and 1980s.
Roddenberry said in his book The Making of Star Trek that he eschewed shuttles and invented transporters to cut the cost of filming and avoid the extra time it would take to depict a shuttle in the show. Ditto for many of the other devices. Contrast this to the Iron Man movies, where they glory in showing expensive FX gizmos.