At least we're past the first hurdle. And I'll take "syntax is insufficient for semantics" as your thesis.
If this seems unclear, consider that everyone who denies the CRA on some variation of the robot reply necessarily accepts the argument.
Not really. They suggest that most symbol manipulation can't produce understanding, but that manipulation of symbols that have a causal connection with the outside world can. Essentially, they're saying "syntax is insufficient for semantics, but the extra needed component is still compatible with computationalism". But I don't find this argument very compelling, at least on its own.
On the sidelines, where you'll find magical thinkers offering variations of the systems reply, there's a strange sort of denial. To accept the systems reply is also to concede the argument entirely, as it necessarily introduces non-computational aspects.
This is simply false - they're just suggesting that two minds can be produced by the same object, like two programs running on the same computer, so the man's lack of understanding doesn't mean that there can't be understanding somewhere else. If I'm wrong about that, just name the non-computational aspect in the systems reply. :)
And it's especially amusing that you think these guys are the magical thinkers. All they're saying is that one part of something can understand things that other parts don't, which anyone who knows what 'subconscious' means or what brain injuries can do to a person should accept. On the other hand, you seem to think that because you don't know how X could produce Y on its own, there must be a Z to produce it, even though you can't point to Z or describe how it produces Y - much like dualists or creationists.
It's probably why you'll find so few condemnations that actually address the crux of the argument (syntax is insufficient for semantics).
Except for all the people pointing out the problems with that assumption - masked-man/problem-of-other-minds issues, the reliance on intuition, the origin of something mental that isn't needed to produce behavior, and above all the complete lack of suggestions (or even hints) about what would be sufficient for semantics.
And I'm serious about that last part. Give me one solid lead on the source of semantics, understanding, qualia or any of the rest of the vaguely-described subjective stuff that separates mere computation from thinking, or a way to test something outside my own mind for any of those things, and I'll cede the entire argument.