I'm not arguing against 4K resolution per se. Personally, I would really like to have 4K, 8K, or even higher resolution. For tasks such as word processing (or any task that involves working with text) and for gaining desktop real estate, it's a case of the more the merrier, at least at the screen resolutions available on current desktop and laptop PCs. I totally agree with what Linus Torvalds said about this a while ago.
For FPS gaming, on the other hand, I agree that 4K is overkill, at least with the polygon capability of current-gen GPUs. For filmed (photographic) content, I suspect that resolution beyond 1080p (and perhaps even 720p) adds little to the sense of immersion. Then again, I have yet to see a truly highly detailed video clip at 4K; perhaps that would be a mind-blowing experience. IMAX in theatres is indeed a more captivating experience than regular 35mm footage, but the gain is greater for, say, outdoor shots with a nice view and plenty of detail from trees and foliage than for shots taken inside a room with far less detail.
I find the "Uncanny Valley analogy" to be very inappropriate here because firstly "uncanny valley" applies to human-like robots vs humans which is a very different story, some aspects of why this is different is discussed e.g.
here, and secondly, the higher resolution makes the fps games look
less realistic than at lower resolution. The high resolution reveals how "empty" the artificial world really is, something that could be concealed behind a blur or a coarse matrix of pixels which is now floating up to the surface.