It has long been both a fact and a topic of debate that when you enter a workplace, your constitutional rights are effectively forfeit. There's no freedom of speech, I've been kicked out of a department store for openly filming, your dress code might require a uniform, and so on.
It's said that we agree to these terms upon entering their doors, but the question that raises is: why have constitutional rights at all if you never actually get to exercise them in your life? If your freedom of speech disappears the moment you have to go earn that money, what point, purpose, or significance did it ever have?
How many people have to suffer indignities before everyone stands together with those being treated poorly while trying to earn the same scratch you are?
I've seen so many accounts of blatant abuse by employers, recounted by people here, that I'm sick to my stomach. Partly because it's hard to know it's still happening, and partly because I can relate. I think everyone has been there: having something awful done to them just because some idiot wields power over them in a work environment.
I'm very happy to see people refusing to put up with it anymore. If we actually have rights in this country, they need to come with us into every working environment, in their entirety.