The same can be said about generative anything. As I see it, AI can be just another kind of procedural generation. Will it be meaningless? Yes, in a sense. But no more meaningless than any other procedural generation. In sandbox games, you supply a lot of the meaning yourself.
There was an obscure genre of games, I don't remember what they're called, but they were text-based "life sims" where you simply were a regular person in some setting, and you got a series of pick-one choices drawn at random from situations a person in that setting could run into. The game would track attributes like health, money, and so on, which could affect which situations and options you encountered later, until, inevitably, you died. Think Slay the Spire, but only events, and a lot of them.
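The core loop is tiny. Here's a rough sketch of how I imagine these games worked under the hood, just to make the point concrete. All the names (Event, Option, GameState, the sample events) are mine, not from any actual game in the genre:

```python
import random
from dataclasses import dataclass, field

@dataclass
class Option:
    text: str
    effects: dict  # stat deltas, e.g. {"health": -10, "money": 500}

@dataclass
class Event:
    prompt: str
    options: list
    requires: dict = field(default_factory=dict)  # minimum stats needed to draw this event

@dataclass
class GameState:
    stats: dict = field(default_factory=lambda: {"health": 100, "money": 0, "age": 18})

# In the real games this pool was hundreds of handwritten entries.
EVENTS = [
    Event("A recruiter offers you a risky but well-paid job.",
          [Option("Take it", {"money": 2000, "health": -15}),
           Option("Turn it down", {})]),
    Event("You develop a persistent cough.",
          [Option("See a doctor", {"money": -200, "health": 10}),
           Option("Ignore it", {"health": -20})]),
]

def eligible(event, state):
    # An event is drawable only if the character meets its stat requirements.
    return all(state.stats.get(k, 0) >= v for k, v in event.requires.items())

def play(state=None):
    state = state or GameState()
    while state.stats["health"] > 0:
        event = random.choice([e for e in EVENTS if eligible(e, state)])
        print(f"\nAge {state.stats['age']}: {event.prompt}")
        for i, opt in enumerate(event.options):
            print(f"  [{i}] {opt.text}")
        pick = event.options[int(input("> "))]
        for stat, delta in pick.effects.items():
            state.stats[stat] = state.stats.get(stat, 0) + delta
        state.stats["age"] += 1
    print(f"You died at {state.stats['age']}.")
```

Draw an event, show the options, apply the stat deltas, repeat until death. That's basically the whole genre.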
The problem was that all the events were handwritten, and to make matters worse the games were often didactic, meant to "teach" e.g. economic responsibility, or what life is like in some war-torn country. You could of course learn the events and game them by giving the answers the authors approved of, leading to people telling stories of how their poor character from a war-torn hellhole had three children as a result of wartime rape, yet had managed to become a multibillionaire by 35.
So, a game like that, but actually good. Not predictable, not intentionally didactic. Should be possible to do much better with an LLM in the pipeline.
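The swap is small: instead of drawing from a fixed pool, you ask the model to write the next event given the running state. A hedged sketch, reusing the Event / Option / GameState types from the loop above; llm() is a stand-in for whatever chat-completion client you'd actually use, and the JSON handling is the bare minimum:

```python
import json

def generate_event(state, llm):
    # Ask the model for one event, conditioned on the character's current stats,
    # so situations follow from how the life has actually gone so far.
    prompt = (
        "Write one event for a text-based life sim. "
        f"The character's current stats are {state.stats}. "
        'Reply with JSON only: {"prompt": "...", "options": '
        '[{"text": "...", "effects": {"health": 0, "money": 0}}, ...]}'
    )
    data = json.loads(llm(prompt))  # assumes the model returned valid JSON
    return Event(data["prompt"],
                 [Option(o["text"], o["effects"]) for o in data["options"]])
```

In the loop, `random.choice(EVENTS)` becomes `generate_event(state, llm)`, and suddenly the events aren't memorizable and nobody had to handwrite an approved answer for each one.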