I switched careers about three months ago. I’m now on salary plus commission, I have health insurance, and I’ll start getting paid time off next year.
These “benefits” have made it hard to admit that I am drowning in anxiety and self-loathing from work. Every day I fall short of my daily goals, and it weighs me down. I finally spoke to a psychiatrist today because I feel myself breaking.
I got prescribed a medication that would cost me $400 with my insurance.
Why are work ethic and growth so praised in our society when, in reality, keeping up with the pressures of a “good job” means investing hundreds more into self-care just to stay sane?
I’m not looking for advice or suggestions. I’m just venting because I feel stupid right now for briefly falling for the propaganda that a job could lead to a better life.