It’s already bad enough that employers try to tell you what you have to look like at work, but then there are tons of people who go along with it and leave snarky comments at anyone who doesn’t. When I was younger and got my first tattoo, all I heard was “good luck getting a job.” I heard the same when I dyed my hair bright red. I’ve taken jobs before that told me I had to dye my hair a normal color, and they were always awful people to work for.
Anytime I see a video of someone getting a tattoo, a crazy haircut, stretching their ears, or anything like that, I always see comments about jobs. Someone always says “I don’t know why they would want to do that, they’ll never find a good job.” Or there are people who did those things in the past, changed the way they look to “be more professional,” and talk about how much they miss it. Why are people so okay with jobs telling them how they can look? And on top of that, why are they so invested in it that they feel the need to make other people follow it too? It’s as if they’re saying this person is a horrible piece of trash because they don’t want to let a hypothetical job dictate the way they look.
It goes even further than appearance, too. For example, I live in a state where weed is legal, yet there are jobs that will fire you over it. I got flamed in the comments of a different sub for saying I would rather smoke weed than take a job that doesn’t allow me to. I got called all kinds of names, including “immature.” I’m immature for not wanting to give up everything in my life that I enjoy just to spend every day at a place that makes me want to blow my brains out? I genuinely feel bad for anyone who’s been brainwashed into thinking that’s all their life amounts to and the only thing that matters. When I’m on my deathbed, I’m not going to be happy that I threw away my entire life and everything I enjoy for a shitty person giving me a mediocre paycheck.