Just curious. Want to hear from men and women. I used to smile a ton at my last job, but one day, after coming back from needing a couple of sick days (because my coworker could never be bothered to stay home when she was sick and got me sick), I wasn't really in a great mood. I had the integrity to stay home so I wouldn't infect others with the cold, but my boss wouldn't even let me work from home, even though I could do 100% of my job from home. The days I had to stay home sick were unpaid.
But anyway, when I got back into the office, the ONE day I didn't feel like smiling, it was "you need to smile, it's affecting the energy," even though I was literally still sick, just not as contagious anymore.
I quit that same day, on top of a plethora of other grievances I had dealt with that I won't get into. Just curious about your thoughts on a comment like that and whether you've ever been told it before.