Let's be honest here, the free market has absolutely no consideration for the well-being of people. Its sole focus is to create wealth for those who desire to have more wealth. If a company can make more money by not insuring someone or by refusing to pay for a procedure, the incentive to do so is strong. Not to mention, the natural evolution of a market is toward monopoly, where companies not only control their own industry but also vie for control of government. For everything outside of people's health, the market can work, fine, whatever.
I wouldn't consider universal healthcare to be a right, but it is damn sure something I would associate with any society that wants to be considered 'advanced.' Seriously, what's the point of all this stuff we call life if we can't even be healthy enough to live it? The health of a nation's citizens should be the number one objective of any country.