It occurs to me that executives and people who build financial/business careers go to school for degrees whose curriculum puts profit over people. I feel like until business schools teach some sense of business ethics when it comes to workers, the people who get into management positions will keep supporting the status quo.
And there's no incentive to teach business ethics, because who would hire a graduate from a program that literally taught them to cut into profit margins to help out the workers?