generative AI is fundamentally just predicting the most likely next token (word-ish chunk) based on the tokens that came before it; it's autocorrect on crystal meth.
it has no comprehension of anything it says or does, it just generates statistically plausible text based on the analysis of immense amounts of previously harvested text.
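to make the "predict the next token from prior tokens" idea concrete, here's a toy sketch: a bigram counter over a made-up corpus, sampling the next word from the observed distribution. this is a drastic simplification (real models use neural nets over huge corpora, not word counts), and the corpus and function names are invented for illustration, but the core loop is the same shape: given context, emit a statistically likely continuation, with no understanding involved.

```python
import random
from collections import Counter, defaultdict

# tiny made-up corpus, just for illustration
corpus = "the cat sat on the mat the cat ate the fish".split()

# count which word follows which word
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_token(prev):
    # sample the next word weighted by how often it followed `prev`
    counts = following[prev]
    words = list(counts)
    weights = [counts[w] for w in words]
    return random.choices(words, weights=weights)[0]

# "the" was followed by cat/mat/fish in the corpus, so the model
# emits one of those -- a plausible continuation, zero comprehension
print(next_token("the"))
```

swap the bigram counter for a transformer trained on a scrape of the internet and you get the same behavior at scale: fluent, plausible, and entirely driven by what tended to come next in the training data.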
hallucinated citations happen because the AI does know what a valid citation should look like and where you'd expect to find one. so it puts a citation where one should appear, and it generates one shaped like the citations it has seen before, so the formatting and the names all sound plausible.