8k won't be ready for anything any time soon. HDMI 2.0 doesn't even support 8k at 30Hz, and few TVs have DisplayPort. 4k Blu-rays are taking their time coming to market, and 50GB arguably won't be enough for 8k without a codec upgrade, which would itself require a new disc player. And what portion of existing Blu-ray players have old HDMI ports or processors that can't handle 4k content? It's not like 4k TVs are high-margin items anymore -- I saw a nice 50" one at Walmart for $699 a few weeks ago, and there were cheaper ones online. The price has hit rock bottom before there's even demand for them. And unlike 4k cameras, 8k cameras exist only as a couple of prototypes, so almost all 8k content will be CG-rendered for a while.
I'd read countless arguments on Slashdot that human eyes can't discern resolution higher than 1080p on a 50" TV from 10 feet or so -- then I actually watched a demo 4k TV running 4k content for about 15 minutes. If you have a 50" TV in your bedroom, 5 feet from where you're sitting, you can definitely notice a huge improvement in detail. I stepped about 15 feet away, and in most scenes it was usually still an obvious, substantial improvement over 1080p.
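The "can't tell past 10 feet" argument is just arithmetic, so it's easy to check. Here's a rough sketch (my own back-of-the-envelope calculation, not from any of the cited sources) of the angular size of one pixel for a 16:9 panel, measured against the textbook ~1 arcminute acuity figure:

```python
import math

def pixel_arcmin(diag_in, h_px, dist_in, aspect=(16, 9)):
    """Angular size of one pixel in arcminutes.

    diag_in: screen diagonal in inches; h_px: horizontal resolution;
    dist_in: viewing distance in inches.
    """
    aw, ah = aspect
    width_in = diag_in * aw / math.hypot(aw, ah)  # panel width from diagonal
    pitch_in = width_in / h_px                    # width of one pixel
    return math.degrees(math.atan(pitch_in / dist_in)) * 60

# 50" 1080p TV at 5 ft (the bedroom case): ~1.3 arcmin per pixel,
# well above the classic 1.0' acuity limit, so 4k detail is visible.
print(round(pixel_arcmin(50, 1920, 5 * 12), 2))

# Same TV at 10 ft: ~0.65 arcmin per pixel, just under the textbook
# 1.0' limit -- which is where the "can't tell" claim comes from. But
# if real acuity is ~50% better than the retina-display figure, the
# threshold drops toward ~0.5', and 4k stays marginally perceptible.
print(round(pixel_arcmin(50, 1920, 10 * 12), 2))
```

So the arithmetic only supports the "can't tell" camp if you accept the 1-arcminute figure at face value; with a tighter acuity estimate (or the psychovisual effects mentioned below), the 10-foot case is borderline rather than settled.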
An electronics retailer in Europe held a contest: people had to stay behind a cordon, more than 10 feet from two televisions, and were asked which was the 4k TV and which was the 1080p. 98% guessed correctly. Maybe some people asked others who had cheated, but it suggests that "most people can't tell" is bullshit. I seem to recall that when Apple's retina-display claims first came out, a scientist pointed out that actual human acuity is about 50% better than what Apple was claiming. It's also worth noting that while a single still retinal image may resolve a certain DPI, there are psychovisual effects (like depth perception) that let the brain recover more detail than one retina picks up at one time. The eyes also saccade constantly, which I seem to recall can be interpolated to improve effective resolution.