Since when does an employer have the obligation to provide anyone with health insurance? My company does not offer health insurance and I don’t feel one tiny bit oppressed or victimized.
When did we all start to believe that we have a right to insurance coverage provided by an employer? We don’t have such a right.
Good for you.
Employer-provided health plans have been the norm for decades, so that's the system we are dealing with. I never said employer-provided health insurance was a right. In fact, I believe I said in the post you responded to that if an employer doesn't provide health insurance, he should at least pay his employees enough that they can purchase their own.
But I'm all for moving away from the employer/insurance model and going directly to a single-payer system supported by tax revenues.