It is reasonable to expect that AI can overlap with the category of intelligences that have such motivations.
Fair enough, but we aren't dealing with the belief that AI can in principle have such motivations, but the belief that any intelligence will have such motivations.
I wasn't aware that saying something is "antimaterialist", especially when it's not, was somehow an argument that anyone would take seriously.
That wasn't supposed to be an argument, in and of itself. I think this "vessel" viewpoint is a kind of closet dualism often exhibited by self-proclaimed materialists when pop psychological notions aren't closely examined.
Then the model of body (and also, the organ of the brain) as vessel for mind is demonstrated by actually being able to move the mind to a new and demonstrably different body.
But this seems to rest on an assertion that it would be the same mind. Set aside whether it is possible in principle to "transfer" the intelligence to a new type of "vessel," and just consider the old teleportation problem: is the new copy "you"?
Say an alien transforms you into a silicon-based machine while preserving your mental processes.
Again, that this is possible in principle is just an assertion. It is also possible that one's mental processes cannot, by definition, be preserved in a silicon-based machine, so long as direct simulation is excluded. (I do think that if you simulated every atom in a brain using a physically correct simulation, the simulated brain would feel like a physical one "on the inside," although this still leaves us with the "is it you" problem.)