Unless this is hard-coded behavior, such an unexpected response would be a sign of agency; that is, a sign that the AI is capable of more than merely correlating input and output based on a dataset.
That mindset is a category error: you're attributing human qualities to an automated system that lacks them.
An AI text-generation model operates like a reflex action: it receives a stimulus and spits out a response shaped by its training, much as a reflex is shaped by evolution.
If the generative model has any level of awareness at all, it's on par with that of an amoeba. If there is any human-like quality, it lies in the humongous amount of human-created training data the model assimilated, not in the generation process itself.
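To make the reflex framing concrete, here's a deliberately trivial sketch (the toy corpus and every name in it are mine, not any real system's code): "training" is just counting which word follows which, and "generation" is a fixed stimulus-response loop over those counts.

```python
from collections import Counter, defaultdict

# A toy "model": bigram counts over a tiny corpus. This sketches the
# stimulus-response loop, not how any production system is implemented.

corpus = "the cat sat on the mat the cat ran".split()

# "Training": correlate each word with the words observed to follow it.
follower_counts: defaultdict[str, Counter] = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follower_counts[prev][nxt] += 1

def generate(prompt: str, steps: int = 5) -> list[str]:
    """Reflex loop: given the last word (stimulus), emit the follower
    seen most often in training (response). No goals, no awareness;
    only the assimilated statistics of the corpus."""
    word, out = prompt, [prompt]
    for _ in range(steps):
        followers = follower_counts.get(word)
        if not followers:  # no stimulus-response pair was ever learned
            break
        word = followers.most_common(1)[0][0]
        out.append(word)
    return out

print(generate("the"))  # ['the', 'cat', 'sat', 'on', 'the', 'cat']
```

Scale the counting up by many orders of magnitude and swap the table for a neural network, and the loop keeps the same shape: context in, likely continuation out.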
It's just like those petri-dish experiments where bacteria get to "solve" moderately complicated mazes by growing toward the paths closest to the exit. There's no intelligence in the bacteria; it's the maze design that contains the information needed both to represent the problem and to solve it.