The real "ceiling" for games was hit years ago, and it really has very little to do with FPS, resolution, graphic shaders or any of the tech.
That ceiling is the quality of the underlying game - the gameplay, which is a combination of writing, game mechanics, music, and any number of other things. There are games from the early 90s with stories and MIDI 'soundtracks' that I can recall almost instantly, simply because they were good, immersive games. Those games kept opening new avenues for exploration as you progressed; you hadn't seen most of what they offered within 2-3 hours.
The tech itself was 'good enough' for most games right around the time the games themselves started getting horrible - I'd say the 2003-2008 timeframe. There are absolutely exceptions, but that's when the console FPS graphics bonanza took off and started nerfing the quality of nearly every game and franchise, and the industry started pumping out endless titles of the same derivative thing without any attention to quality. They simply rely on the tech to make up for it.
Maybe part of it is nostalgia on my part - almost certainly. But you can't tell me that many studios are releasing games with playability like Civilization anymore - even Civilization isn't as good (though there are games like Stellaris, which is the only exception I can think of in that genre). FPS games are dull and derivative, and the stories in long-format single-player games have become horrible and break the fourth wall constantly. The number of 'good' titles seems to have narrowed to 1-2 every several years, as opposed to several annually - despite the very significant increase in the number of games and the developers producing them.
Even games like Cyberpunk 2077, which in its -current- version is an enjoyable game, took 2 years after release to get there. The first iteration was basically unplayable, with gameplay and world depth that didn't compare even with the original Grand Theft Auto.