I worked in accounts receivable and felt compelled to act angry with customers who were delinquent. I didn't give a shit whether they paid or not; it didn't affect me. After calling them I felt conflicted, because the frustration I showed them was fake.
When a co-worker doesn't get me an important report, I feel the need to express how upset I am to my manager and coworkers. I actually don't give a shit about the report.
I feel compelled to behave in ways I never would outside of work, because I need to keep my job security solid. I don't actually care about the work itself. It's not my company.
I think there's a misconception that to look like you care, you have to get angry. My work ethic doesn't operate that way, and it leaves me feeling like an outsider.
How do companies promote this sort of culture? The ones I've worked for don't explicitly teach it, but it's tangible.