Is AI good at summarizing?
It's astonishingly bad at summarizing text. It will ignore important details and 'hallucinate' others. Oh, and if the thing you want it to summarize isn't accessible or doesn't exist, it will still produce a 'summary'.
or do you just believe it's good because it's convenient?
The output looks really good if you don't bother to check it for accuracy.
Is it really doing what it says it's doing?
They suck at summarizing text because they're not actually summarizing text. All these models do, all they can do, is next-token prediction. That's why it doesn't matter whether there's any text to summarize: next-token probabilities are produced the exact same way, no matter what the context happens to be.
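Mechanically, that looks something like the sketch below: GPT-2 through Hugging Face's transformers library, where GPT-2, the prompt, and the top-5 printout are just illustrative choices, not how any particular chatbot is wired.

```python
# A toy illustration of next-token prediction, using GPT-2 through the
# Hugging Face transformers library. GPT-2 and the prompt are illustrative
# stand-ins for whatever model a chatbot actually runs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# A "summarize" request for a document that doesn't exist anywhere.
prompt = "Summary of the attached report:"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(input_ids).logits[0, -1]  # scores for the next token only

# The model's entire output: a probability distribution over its vocabulary.
# Nothing here checks whether an "attached report" exists; generation is just
# sampling from this distribution and appending, over and over.
probs = torch.softmax(logits, dim=-1)
top = torch.topk(probs, 5)
for p, tok_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(tok_id))!r}: {p.item():.3f}")
```

Run it and you get a perfectly confident distribution over next words for a report that was never attached. Sample from it repeatedly and out comes a fluent 'summary' of nothing.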
do you just have a shortcut in your brain that says confident speech is probably right, so you don't have to waste time thinking about it?
To be fair, I think we're all guilty of that. If not with AI-generated nonsense, then with a book or some other media. It's impossible for us to be experts in everything, so we all lean on expert opinion. We also tend to associate confidence with certainty, which is fine most of the time, provided we don't also mistake certainty for accuracy!