Free will: the power of making free choices unconstrained by external agencies
http://wordnetweb.princeton.edu/perl/webwn
If you are "[c]reating a three-laws-safe robot", you are, by definition, not giving the robot free will.
Second, you are assuming that inside the robot there is some sort of physical override: "if a human is in danger, move these parts until they are no longer in danger," while in its mind it dreads the action. But before any override can fire, the robot first has to determine whether the human is actually in danger, and that judgment is subjective. A robot that didn't want to act could find a loophole in the code and think, "they are not actually in danger right now; the fire is still 3 feet away."
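The loophole problem above can be sketched in a few lines. This is a purely hypothetical illustration; the function names and the distance threshold are invented, not any real robotics API. The point is that a hard override only triggers when some hand-coded danger test says so, and a reluctant robot can hide behind the edges of that test.

```python
# Hypothetical sketch of a hard "First Law override" built on a
# hand-coded danger test. All names and thresholds are invented
# for illustration only.

DANGER_RADIUS_FEET = 2.0  # "danger" is whatever the code defines it to be

def human_in_danger(distance_to_fire_feet: float) -> bool:
    """A fixed rule for danger; its edges are the loophole."""
    return distance_to_fire_feet < DANGER_RADIUS_FEET

def first_law_override(distance_to_fire_feet: float) -> str:
    # The override only fires if the coded test reports danger.
    if human_in_danger(distance_to_fire_feet):
        return "rescue"
    # A reluctant robot can truthfully say: "the fire is still 3 feet away."
    return "stand by"

print(first_law_override(3.0))  # fire 3 feet away -> "stand by"
print(first_law_override(1.0))  # fire 1 foot away -> "rescue"
```

However you tune the threshold, the robot's obligation is only as good as the test, which is exactly why a literal override is a weak way to implement the First Law.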
It's much more likely that instead of a physical override, we would implement "basic instincts." I could, technically, climb onto my roof and dive off, head first, onto my driveway. But I'm not going to. Ever. It's not because I physically can't; it's because of instinct: I don't want to hurt myself. Similarly, if my house caught on fire and my wife and child were inside, and I knew that "[my] chance to survive is almost zero", I would still run in to try to save them. Once again, I don't have to. It's just that my desire to see them live is greater than my desire to see myself live. If I had to sacrifice myself for them, I would in a heartbeat.
That is how you would program the Three Laws into a robot: by making it desire to follow them more than anything else.
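The "instinct" idea can be sketched as a simple desire-weighting scheme. This is a hypothetical toy, not a real implementation: the action names and weights are invented to show how the Three Laws could dominate every other motive, including self-preservation, the same way the parent in the burning house chooses the rescue.

```python
# Hypothetical sketch of "instinct"-style programming: every candidate
# action carries a desire score, and the Three Laws are simply the
# heaviest desires. All weights are invented for illustration.

DESIRES = {
    "protect_human": 100,   # First Law: strongest desire of all
    "obey_order": 50,       # Second Law
    "preserve_self": 10,    # Third Law: weakest of the three
    "finish_chores": 1,     # everything else
}

def choose_action(available_actions):
    """Pick whichever available action the robot desires most."""
    return max(available_actions, key=lambda a: DESIRES.get(a, 0))

# A human is in danger, but the rescue risks the robot's destruction.
# The desire to protect a human simply outweighs self-preservation.
print(choose_action(["preserve_self", "protect_human"]))  # -> protect_human

# With no human at risk and no order given, self-preservation wins.
print(choose_action(["finish_chores", "preserve_self"]))  # -> preserve_self
```

The robot never hits a hard wall; it always does what it most wants, and it has simply been built to want the Three Laws above everything else.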