launderess
Well-known member
Until Now, the Answer Is Yes
There isn't a mandate for any employer in the United States to provide health insurance, though that will now change under Obamacare.
Indeed, it has been said that one competitive disadvantage for American employers was that many companies and businesses located outside of the United States didn't offer health insurance and thus had lower costs per employee.