My initial thought is, I am entitled to healthcare because we live in a civilized country and it is the moral thing to do. People have a basic right to live out their natural lives. If someone gets dragged into the hospital dying, your choices are to help him or let him die. Human beings, for the most part, are designed to be compassionate. The "I'd rather just let him die" people are in the definite minority. (Again, leaving the financial argument out of it.) Now you just have to figure out how to pay for keeping that person alive.
And to me, it makes sense even from a purely financial perspective. I pay $XXX per month, and my employer pays $YYYY, for "Health Insurance". I use quotes because it's not really insurance. I carry car insurance because if something catastrophic happens to my car, I will need a new one. If I don't have car insurance, I don't get a new car. If I don't get a new car, my life will be significantly inconvenienced, but I'll still be alive. I carry health insurance because if I don't carry it and something catastrophic happens, the hospital will still do whatever is reasonable and necessary to make sure I don't die. Wait, what? Yup, it still gets paid for. They will ruin me financially if I don't have insurance, but they aren't going to lock me out of the hospital, and SOMEONE is going to end up paying for it.
So, now we have the insurance companies. Their job is solely to sit between me and my hospital. And they "earn", collectively, over $13B in profit annually to do so. That's a shit load of money. And that is just the profit. That comes after they have paid thousands and thousands of employees' salaries. I know it's the norm to hate on insurance companies, but holy shit, what value do they add to this equation?