I thought work was supposed to be professional, but so many people in the workforce don't act like it. The relationships people have at work can often feel like you're back in high school: ostracizing someone, gossiping behind their back, and so on. There can be real drama too. I remember my first work meeting. Even though it didn't really matter, I wore my work uniform just in case, and I felt left out when everyone else showed up in the clothes they'd wear outside work and sat next to their work friends, while the people who would talk to me during shifts acted like I didn't exist. It's ridiculous, considering most of them were adults.
Don't you hate that? People at your workplace act like there's some kind of social status, when respecting each other should be all that matters. I know work is supposed to be about your goals, not your social standing or what friends you have there, but it can still affect your self-esteem when that's how others see you.