(Mods please remove if this is not acceptable, I apologize if it's not).
I love how jobs are tied to insurance, which is tied to care, which is tied to PTO (or lack thereof), and it all ties back to policies that we, as employees, have absolutely no control over.
I love how you can “buy up” to the plan you had the year before, spending more money for less care and shittier accommodations (if you're lucky enough to get any), and then Healthcare.gov has the audacity to show spokespeople getting health coverage for $16 a month.
I fucking pay $500 a month for my spouse alone, just for the privilege of them being able to see a doctor. That's not including deductibles, copays, or any of the other bullshit terms they use, which really mean “money you still need to pay even though you have insurance.”
Why the fuck do we work? Why the fuck do we even have “insurance” that doesn't insure against anything?! I get that this isn't a new issue, but holy shit has this gotten bad in the last few years.
I'm so fucking sick of this horseshit, and honestly at this point I would almost welcome a full-blown societal collapse just to get it the fuck over with.