It is not sentence generation or regurgitation.
But that's the problem: it is regurgitation, even if a targeted one from a vast array of stomach contents. You've been fooled by a stochastic parrot. Is it a breakthrough in human/computer interface if the human's input has a high likelihood of coming out mangled for reasons we can't understand or automatically correct? Is it really conversing with humans when it spews out a statistically likely response it has no real understanding of, or is it just running the latest successor to ELIZA?
Because AI has no concept of factuality and no ability to actually reason about problems, only to clumsily dice them into steps, it produces nonsensical mistakes (commonly known as "hallucinations"), and that tendency will always make it borderline-useless for real work.
I like to say that the only time it makes sense to use AI for a task is when you have no time to do it yourself and no choice but to attempt it very quickly. If a madman is holding a gun to your head and demands a full report on a book he hands you, one you've never seen or heard of, within the next five minutes or you're dead, that would be a good time to use AI. Otherwise, why roll the dice on unpredictable, incomprehensible wrongness?