The dirty secret with the current AI "scam" is that there's a lot of manual tweaking and overrides being added to hide shortcomings. A good example: right now, if you ask Google's AI whether you can cook food with gasoline, it will flat out tell you "No." That's demonstrably false; it's not a great idea (for several reasons), but you can indeed use gasoline as a fuel source for cooking. Google's devs added that override because people were asking for recipes to cook with gas, the AI interpreted that as a request for a recipe where gasoline was an ingredient and happily hallucinated responses, and those responses then got posted as memes, leaving the AI with egg on its face.