American culture has become so completely focused on businesses and their inherent “right” to profit. We have allowed the exploitation of workers and the destruction of our environment because fair wages and protecting the planet THAT WE LIVE ON are treated as hindrances to profit. We've been taught to feel fucking BLESSED to have a job, ANY job, and to do whatever is asked of us. And, by God, be fucking GRATEFUL!
Here's the thing, though: working for a company is not a privilege. The employer/employee relationship is, at its core, symbiotic. A business has work it needs done; an individual needs money to live. WITHOUT WORKERS, THERE ARE NO BUSINESSES!
It's time we remember our power, bring unions back, and fight for what's right.