I don't disagree with your points. While I wouldn't consider myself purely utilitarian, I also don't believe that we'll ever truly satisfy everyone. In light of that, and given that there are far too many unknowns to account for, I would argue that we need to take what reasonable precautions we can while making an effort to move towards addressing those unknowns. I'm merely arguing that there are some risks that need to be taken, carefully, and that it's okay if one of the things we learn is that we shouldn't take that same risk in the future.
You mention the hypothermia experiments as an example of useful but morally objectionable research. What if those participants had been willing (and the experiments didn't have the implicit end point of their demise)? What about the Minnesota Starvation Experiment? There's very useful research we could do, using individuals who judge the potential benefit to outweigh the risk, but that we choose not to pursue on moral grounds.
There's a bit of a disconnect where people are idolized for signing up to die on Mars, yet we demonize other attempts to risk lives for science.