I'm not speaking as a nurse, but from what I understand, it's tough to keep that attitude toward your "career" when you aren't paid what you deserve, and when your job becomes more about coping with an employer trying to cut costs at every turn than about actually providing care to patients. Honestly, I think a major outbreak in the US might be a good thing, because it could make people reconsider the value they assign to the people doing most of the actual work of caring for them when they visit the doctor's office.