I think about this quite often and always wonder why it's an employer's responsibility to pay for your health care. You work for them and they pay you for that work; they don't pay for your rent/mortgage, car, food, or credit cards, so why is it their responsibility to pay for your health care? I honestly don't get it. Thanks
Often it's because you have skills they want and they wish to retain you. As a result, perks and job benefits are sometimes included in compensation along with salary. Asking why an employer should offer health coverage is almost as silly as asking why an employer should pay you a salary.
On a side note, I know a bunch of people who get their rent, car, or a portion of their meals covered by their employers.
Who says they should? That said, we've had employer-based health care coverage for quite a while now. What's wrong with that? Employers certainly depend on a healthy workforce, don't they? Seems to me that as long as costs don't get too far out of whack, it's a good thing for everyone involved.
It is not their responsibility. Some employers simply choose to include health coverage in their benefits package so their job offer sounds better than their competitors'. This is capitalism.
It's a benefit to entice good workers to want to be employed by them.
It shows the company cares (even just a little bit) about its employees' health and well-being. My company offers AMAZING benefits and decent pay, so we generally have the best of the best working for us. I've worked for companies that offered bad pay and expensive or nonexistent benefits, and they got crappy workers.
The best answer I've found is historical: employers started paying for their employees' health insurance during World War II.
"During World War II, federal wage controls prevented employers from wooing workers with higher pay, so companies started offering health insurance as a way around the law. Of course, this form of non-monetary compensation is still pay. When the war ended, the practice stuck."
Many professionals who are raising a family simply won't take a job that doesn't provide insurance (given a choice of jobs, of course).
Is the employer responsible for providing insurance? No, but if they don't, the wages had better be higher to compensate. Employers can also get better group rates than an individual can on their own.