Joking aside, I'm not actually convinced it's necessarily a bad thing if robots take over the world and destroy us. I know this won't be a popular viewpoint (for obvious reasons), but the fact is that humans are inferior, weak, deeply flawed creatures -- so logically, why is it a bad thing if we are supplanted by something far superior to us? Logically, it is good for X to be replaced by Y when Y is much better than X; it's only bad if you happen to be X. And we aren't the ultimate end-point of the universe.
Robots will effectively just be organisms, competing with us in the same evolutionary "space", so to speak. Darwinian evolution doesn't stop just because a creature is made of something other than weak blobs of billions of organic, carbon-based cells. There will be many different robots, and evolution will kick in: the robots that happen to be best at propagating (which might include some amount of destroying other things) will survive and spread the most.
A creature far more intelligent than us will be capable of taking the evolution of "life" (in loose terms, they will be "life") to heights we can scarcely imagine now ... far more intelligent, far better connected (borg-like intelligence), far more adaptable (more easily spread through space) - something far more profound and interesting will result. A rough analogy is how simple single-celled life forms gave rise to us; we will now give rise to something else. There might still be humans around when robots take over, but we'll be part of the cesspool (where we belong) - i.e. we'll be about as interesting or useful to the creatures that really run the universe as single-celled life forms are now to us.
Maybe the entire purpose of humans, our "meaning" and reason for existence, is just to create the much more advanced life forms that will replace us, and then step aside for the next stage in the universe's evolution. And maybe the answer is not to fight it, but to bring it on in a carefully controlled way. We can't prevent it from happening - sooner or later, even if only out of intellectual curiosity, someone will create advanced robots that we can't control. At least if we control that process, we have a better chance of guiding it in a positive direction. We always treat the destruction of humanity as a bad thing by default. But let's be brutally honest - people are crap - and I for one 'do' actually welcome our replacement by something far better.