Basically the title. I really hate what American work culture does to people, and what it's done to me.
Been at this awful company for nearly 4 years and it's been a roller coaster. Disorganized, unethical, uncaring, stingy as hell (despite the fact that they make plenty of money…the owner is loaded). I'm always incredibly broke and incredibly stressed. The only reason I've stayed is that they send us out to clients, and I feel obligated to them. The clients themselves haven't done anything wrong, and my current ones are so nice. But the nature of the work is so physically and emotionally difficult that 75% of people burn out on this job and leave.
I should've quit ages ago, but something was always in the way – first I had to finish school, then the pandemic happened and there was no work elsewhere, now I really like this particular client and want to work with them while I still can, and on and on.
Got injured on the job and nobody really cared. It was pretty traumatic, and I sobbed in therapy for the entire 50-minute slot after it happened. Afterwards, my supervisor said to me, “Please don't quit. We've lost so many good people lately. We need you.” I said I wouldn't (this was a few weeks ago)…but I've since reconsidered. I can't do this shit anymore.
People in my life are telling me I should wait until I'm done with school, but I'm basically on the verge of getting 5150'd with some of the intrusive thoughts going through my head. I need a break, and I already have a job lined up working for a friend of mine (low stress, admin stuff). I'm planning to send in my resignation in about a week and talk to my clients individually. I'm telling everyone it's due to health reasons, which is the utter truth. But I still feel guilty, and I'm afraid I'm going to chicken out.
If anyone here wants to encourage me to finally just quit, please do. I need all the encouragement I can get.