I’ll start by saying I work in healthcare and I don’t get any sick days.
I’ve never understood the push for healthcare employees to just go to work when they’re sick. You have vulnerable populations and you’re just going to expose them to more illnesses?
On top of that, I’m tired of the general assumption that it’s okay to come into work sick, that you can just sit at your desk in your cubicle, take cough drops, and keep tissues nearby. Not everybody has a job where they can sit all day, and I’m sick of the same standard being applied to employees with physically active jobs.
I interviewed two years ago at a different hospital in my city. They asked me, “Is it EVER okay to call out of work?” From the way they phrased the question and their body language, I could tell they were just daring me to say yes. I don’t know what kind of ass-backwards answer they wanted, during a pandemic of all times, but it blows my mind.
Excuse my ranting, and I’m sorry if this post is considered too low-effort for this sub, but why does healthcare push so hard for nurses and other healthcare workers to come into work sick?