I’m sure there have been a lot of posts like mine, but the fact that insurance in the US is almost solely tied to employment is a joke. I need a necessary medical device that costs thousands of dollars. Because I mostly work in the summer (I’m a full-time student), I have healthcare through the state. Unfortunately, the office that’s supposed to supply this device doesn’t deal with state insurance, so in order to receive the NECESSARY medical device I’m expected to pay for it out of pocket and self-file a claim with the state, which will hopefully reimburse me. WTF is wrong with this country!?