All insurance leeches off of someone or something. And it's certainly easier to buy assets when you've got disposable income, and smart to protect those assets however you can. But why is HEALTH INSURANCE tied to your employer?

I'd rather have universal healthcare in America, but I can't help thinking about how much wasted effort, time, and added exploitation comes from having basic health needs tied to where you work. This shit is so outdated, but because businesses are required to supply it (often at the bare minimum), all that money is either kept from the workers or spent keeping up with the latest games of the insurance brokers. You'd think we'd be able to negotiate the coverage we actually need instead of bouncing from provider to provider or accepting a wage that might never increase.
Isn't this just inherently a terrible and un-American idea? Is it too deeply ingrained to separate these things, even though doing so would obviously benefit businesses and remove a HUGE time-suck/anxiety? (obviously we would need effective universal healthcare first, lol)