They didn't have anything worked out in the '60s. What? I could have missed some obscure research, so please point it out if so, but nothing public researchers did at the time actually worked or scaled.
And developing more complex ML AI doesn't bring us meaningfully closer to AGI.
...And this requires self-awareness.
These statements don't cohere. They don't follow from any chain of justification and implication; they make huge leaps without connection. I'm going to invoke Hitchens's razor here: the burden is on you to demonstrate something concrete.
Here are my justifiable, concrete observations. ML has indeed brought us much closer to AGI, for any reasonable definition. Competence at zero-shot question answering, without any constraint on the topic, is by definition general. Answering questions that require reasoning is intelligent. How do you think progress on zero-shot learning has gone? Check the literature: it's been meteoric. You could object that question answering is a narrow task, but the same technology clearly applies to action spaces and any other data modality.
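To be concrete about the term: "zero-shot" means the model answers from the task statement alone, with no worked examples in the prompt. A minimal sketch of the distinction (the questions here are just illustrative placeholders, not from any benchmark):

```python
# Zero-shot: the model sees only the new question, no demonstrations.
zero_shot = "Q: Why does ice float on water?\nA:"

# Few-shot, for contrast: solved examples precede the new question.
few_shot = (
    "Q: Why is the sky blue?\n"
    "A: Short wavelengths scatter more in the atmosphere (Rayleigh scattering).\n\n"
    "Q: Why does ice float on water?\nA:"
)

# The zero-shot prompt contains exactly one question, the few-shot prompt two.
assert zero_shot.count("Q:") == 1
assert few_shot.count("Q:") == 2
```

The point of the claim above is that competence on the first style of prompt, across arbitrary topics, is generality by construction.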
It has demonstrated abstract reasoning, induction, and deduction, and it scales far better than anything else we've tried. The metric by which we may evaluate intelligence (compression) has improved along each of the dimensions of model size, dataset size, and compute, and shows no sign of plateauing on any of them. One estimate puts it at a 10,000-fold increase in compute before we reach the limits of language modelling imposed by the intrinsic entropy of language.
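The compression framing can be made concrete: a model that predicts text well can encode it in fewer bits, so better prediction and better compression are two views of the same thing. As a toy illustration, assuming nothing beyond the Python standard library, even a generic compressor shows that patterned text costs far fewer bits than noise; a language model is, in effect, a much stronger compressor of the same kind:

```python
import random
import string
import zlib

def compression_ratio(text: str) -> float:
    """Compressed size over raw size: lower means more predictable text."""
    raw = text.encode("utf-8")
    return len(zlib.compress(raw, 9)) / len(raw)

# Highly patterned text versus random characters of the same length.
patterned = "the cat sat on the mat. " * 40
random.seed(0)
noise = "".join(random.choices(string.ascii_lowercase + " ", k=len(patterned)))

# The patterned text compresses far better than the noise.
assert compression_ratio(patterned) < compression_ratio(noise)
```

Progress on the compression metric is then just this ratio (measured with a learned model rather than zlib) falling as model size, dataset size, and compute grow.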
That won't stop the progress, though. There is less entropy in other signals, like video recordings of the natural world: things don't happen randomly there, and when they do, it's at the nanoscale, where it doesn't matter for prediction.
Crazy to think: the LLMs of today have a context window of only a few thousand tokens, yet they can pass more skill tests than the average person. Once model architectures are optimized enough to allow multi-million-token context windows, the long-range patterns they discover and model will far exceed our own ability to recognize them.