This is the same tired argument I've seen over and over again, but it's simply not true. While we don't have a universally accepted definition of intelligence, most researchers agree on what such a definition must, at minimum, include (as I noted above: inference, learning, goal seeking, planning, etc.). I don't think AGI will arrive as an announcement from some group that "AGI has been achieved!" Rather, it will creep into our technology over time and probably won't be accepted as true AGI until it can no longer be denied.
Take speech recognition, for example (since it's been in the news recently with the launch of Siri). This type of technology will continue to infiltrate more and more aspects of our lives and continue to get more capable. Though increasing its "understanding" to the point of passing the Turing test may be a ways off, it doesn't matter. It will still offer more and more functionality, even to the point that it's better at "understanding" within a domain-specific application than a human would be.
Think about it this way. Bipedal robots still have a difficult time performing anywhere near a human at the tasks of walking, navigating, and maneuvering over difficult terrain (such as stairs, slopes, etc.). However, we have machines that can zip along our highways at 100+ miles per hour, far exceeding the capability of humans on foot at the task of long-distance travel. In a similar way, AI technologies will first be applied to areas where they can outperform humans (by being better, faster, more accurate, or superior on some other metric). This is already happening in many areas of our lives, whether we are aware of it or not (e.g. navigation systems, high-frequency trading systems, cell phones, information routing systems, etc.).
This idea that AGI implies mimicking a human is simply the wrong way to look at the issue. We already have enough humans; we don't need to create artificial ones as well. What we need are tools that can take the capabilities of our limited organic brains to the next level and solve problems our wetware simply is not capable of solving.