Your arguments have significant problems, even though your conclusions are mostly sound.
If the only problem with 30 fps were mismatched frame rates, then I shouldn't be weirded out by the "soap opera effect" of higher, interpolated frame rates in movies. Game makers should also be able to lock their games to output exactly 30 fps and simply idle while waiting to process the next frame, resulting in perfect gaming motion. But they don't, even though it would let them run higher resolutions on cheaper hardware with zero sacrifice. They aren't idiots. Optimization is HUGE in the gaming industry.
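For what it's worth, that "lock to 30 and idle" loop is trivial to write. Here's a minimal sketch in Python, with `update` and `render` as hypothetical stand-ins for a real engine's simulation and draw steps:

```python
import time

TARGET_FPS = 30
FRAME_TIME = 1.0 / TARGET_FPS  # ~33.3 ms budget per frame

def run_capped(update, render):
    """Fixed 30 fps cap: do the frame's work, then idle out whatever budget is left.
    `update` and `render` are hypothetical callables, not any real engine's API."""
    next_frame = time.perf_counter()
    while True:
        update(FRAME_TIME)                    # advance the simulation one fixed step
        render()                              # draw the frame
        next_frame += FRAME_TIME
        slack = next_frame - time.perf_counter()
        if slack > 0:
            time.sleep(slack)                 # the "idle while waiting" part
        else:
            next_frame = time.perf_counter()  # frame ran long; resync rather than spiral
```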
The evidence shows that 30 fps is absolutely not an upper limit on human perception, although the brain generally does a good job of smoothing out motion at frame rates as low as 24 fps when the frames are delivered at uniform intervals.
The whole "you can't see pixels" argument is bullshit. Individual pixels are generally detectable at resolutions orders of magnitude higher than 1080p (i.e. something much more in line with the 576 megapixels cited in the original article. To wit, I can not only see individual pixels on screens, I can often see the inter-pixel spacing (screen door) on display devices. Most people report that they are seeing pixels only when they see the gaps that differentiate them, but the gaps are far smaller than the pixels themselves.
What we actually have trouble with is differentiating colors across very tight pixel spacing. If the image you are observing is a high-contrast grid (e.g. a literal screen door) and you have good vision, you will have no difficulty telling different resolutions apart. But if you're viewing a display where the inter-pixel spacing is tight enough that light visually bleeds to fill the gaps, and the color contrast between adjacent pixels is within normal ranges, your brain will smooth the image and you functionally can't tell the difference in resolution.
However, this smoothing doesn't always happen, and the difference can be significant in some applications. For instance, when drawing in CAD, even on my 4K monitor nearly anyone can see the "jaggies" that occur when a line is displayed at a slight angle rather than directly along a row of pixels. The jaggies are obvious not because our eyes suddenly got better at resolving detail, but because the CAD software isn't anti-aliasing the line: blending it across neighboring pixels to trick your eye into seeing it as straight, rather than just jumping on/off from one pixel row to the next.
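That blending is just coverage-weighted intensity. A toy sketch for a shallow line (assuming |slope| <= 1 and single-channel intensities in 0-1), contrasting the naive on/off rasterizer with a coverage-blended one, roughly in the spirit of Wu-style line anti-aliasing:

```python
import math

def raster_aliased(x0, x1, slope, intercept):
    """Naive rasterizer: each column turns exactly one pixel fully on -> stair steps."""
    pixels = {}
    for x in range(x0, x1 + 1):
        y = slope * x + intercept
        pixels[(x, round(y))] = 1.0
    return pixels

def raster_antialiased(x0, x1, slope, intercept):
    """Coverage blending: split each column's intensity between the two pixels
    the ideal line passes between, in proportion to how much of each it covers."""
    pixels = {}
    for x in range(x0, x1 + 1):
        y = slope * x + intercept
        frac = y - math.floor(y)
        pixels[(x, math.floor(y))] = 1.0 - frac   # mostly-covered pixel
        pixels[(x, math.floor(y) + 1)] = frac     # partially-covered neighbor
    return pixels
```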
In general, I'd agree with your conclusion that going beyond 4K isn't needed for TV and movies, since the images they display have inherent anti-aliasing. (Note: this happens to the source material naturally when filming digitally, since sensor pixels along an edge are exposed to light from multiple contrasting sources and therefore report a weighted average for each pixel.) There are still some specific cases in which naturally filmed artifacts may be visible at 4K that would not be at higher resolutions; however, for 99.9% of consumers those artifacts will never rise to a level that justifies anything beyond 4K.
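You can simulate that sensor averaging with a toy example: a hard black/white edge sampled much more finely than the sensor, then averaged down per "sensor pixel" (the 64x64 scene and 8x8 sensor here are arbitrary choices):

```python
import numpy as np

# A hypothetical "scene": a hard black/white diagonal edge, sampled much more
# finely (64x64) than the sensor that will capture it.
scene = np.fromfunction(lambda y, x: (x > 1.5 * y).astype(float), (64, 64))

# Each sensor pixel integrates all the light falling on its area, i.e. averages
# an 8x8 patch of the scene, giving an 8x8 "sensor" image.
sensor = scene.reshape(8, 8, 8, 8).mean(axis=(1, 3))

print(np.round(sensor, 2))
# Pixels the edge passes through come out as intermediate grays rather than
# hard 0/1 jumps: the built-in smoothing described above.
```

Those in-between values are exactly the smoothing that makes extra resolution hard to notice in filmed content.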