I used to believe you shouldn’t complain but instead improve your conditions, but this modern world doesn’t seem to allow for that.
I’ve been hopping between jobs for the better part of a decade now, and I internalized a lot of blame for leaving places, but more and more I see it’s bad management most of the time. Everything feels scummier and scummier; the jobs I’ve had have no issue selling things people don’t need, whether it’s a useless product or just not a fit for the customer. It feels like it’s all about doing meaningless work as long as it turns a profit.
Most recently I joined a lawn care company. A few months ago a coworker passed away at 37. It was upsetting, but we haven’t heard any news beyond the fact that he passed. Then one day I was handed his extra responsibilities out of nowhere: no questions asked, no additional training aside from how to use one specific spray.
To make matters worse, I was working with chemicals so dangerous that I was told to bring an extra uniform, and that it would be illegal to use a truck without a fully covered, locking bed to store the chemicals.
No pay raise, no real training or certification, no explanation of what to do in an emergency beyond the basics. On top of that, I was told I needed to help them get through over 60 houses because they “forgot” they needed to replace the man who passed away.
I’m honestly worried and depressed, and it feels like no matter where I go there’s some kind of scam or grift involved.