You laboriously hand analyse sonar sensory data from chairs, contrasting it to that from other objects, until you've identified one or more fuzzy sensory patterns that are unique to chairs.
Well, we actually don't laboriously hand code the percepts. We did for the first two or three, but now we let Basil build his own. He still requires some help, but he does most of the work. The dialog looks something like this:
Basil: What is the name of this object?
Human: round-table
Basil: What class does it belong to?
This is where Basil connects the symbolic tag to his ontology. We give him a category to assign the new object to.
Human: table
Basil: What is the distance to the object's centroid?
At present, Basil relies on us giving him the distance, rather than deriving it by rolling around the object.
Human: 1600mm
Basil: What is the orientation of the object?
Human: 0 brads (this is arbitrary)
Basil then takes about a hundred sonar snapshots, with a small wiggle between each one. This added noise lets him analyze the profile statistically.
Basil: Is there more to see?
If the object looks different from different angles, we present Basil with different views, and we can give him views at different ranges. Basil then takes all the data and builds a collection of templates that can be used to recognize this object.
He also compares the new templates against the existing ones to check for overlaps.
Once this is done, Basil can recognize instances of this class of object and add them to his mental model, so he knows where they are and in what pose.
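The thread doesn't give Basil's actual template scheme, but the idea described above, aggregating many jittered snapshots into a per-bin statistical profile and matching new readings against it, can be sketched roughly like this (the array shapes, the z-score metric, the threshold, and all names here are my illustration, not Basil's code):

```python
import numpy as np

def build_template(snapshots):
    """Aggregate ~100 jittered sonar snapshots (one range profile each)
    into a statistical template: per-bin mean and spread."""
    data = np.asarray(snapshots, dtype=float)   # shape: (n_snaps, n_bins)
    return {
        "mean": data.mean(axis=0),
        # floor the std so a perfectly repeatable bin can't divide by zero
        "std": np.maximum(data.std(axis=0), 1e-3),
    }

def match_score(profile, template):
    """Lower is better: mean absolute z-score of the profile
    against the template's per-bin statistics."""
    z = (np.asarray(profile, dtype=float) - template["mean"]) / template["std"]
    return float(np.mean(np.abs(z)))

def classify(profile, templates, threshold=3.0):
    """Return the best-matching label, or None if nothing is close
    (an unrecognized object)."""
    best_label, best = None, float("inf")
    for label, tmpl in templates.items():
        score = match_score(profile, tmpl)
        if score < best:
            best_label, best = label, score
    return best_label if best < threshold else None
```

One template per view and per range would then make `templates` a collection of these, with `classify` run over all of them.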
You've bridged the sensory-symbolic divide (in a rather useless way)! The rest of your robot can now operate at the symbolic level of its internal model and perform party tricks like fetching beer.
I'm not sure the bridge is useless: we have already shown improved performance using these techniques versus not bridging the divide, and we are still collecting data to quantify the benefits. Basil is also capable (with some assistance) of recognizing that it has run into an object it does not know about, and of adding a description of that object to its knowledge base. I totally agree that if the robot cannot learn, it isn't much more than a toy.
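The "run into an object it does not know about" behaviour boils down to a rejection threshold plus re-enrolment: if no stored template matches well, fall back to the human dialog and add a new template. A toy sketch (again, the function names, the z-score metric, and the threshold are my assumptions, not Basil's implementation):

```python
import numpy as np

def recognize_or_learn(profile, templates, ask_label, threshold=3.0):
    """If no stored template matches the profile well enough, treat it
    as an unknown object: call ask_label() (the human dialog step) and
    enrol a fresh single-shot template under that name."""
    best_label, best = None, float("inf")
    for label, tmpl in templates.items():
        # mean absolute z-score against the template's per-bin stats
        z = (profile - tmpl["mean"]) / tmpl["std"]
        score = float(np.mean(np.abs(z)))
        if score < best:
            best_label, best = label, score
    if best < threshold:
        return best_label                # known object
    label = ask_label()                  # "What is the name of this object?"
    templates[label] = {"mean": profile.copy(),
                        "std": np.ones_like(profile)}
    return label
```

In practice the enrolment step would kick off the full snapshot-and-wiggle procedure rather than storing a single profile.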
It may be able to fetch a beer, but it's not going to take over the world, or for that matter even recognize that the "beer" it's brought you is actually a rusty tin of dog shit with the same sonar signature as a beer.
However, your last point is dead on - if two objects have the same sonar signature, Basil cannot tell them apart. Of course, if two objects have identical signatures for a human's sensors, the human won't be able to tell them apart either.
Jim