Obviously we see a ton of bullshit companies that don't give a shit about their employees or pay them what they're worth, but (as someone looking for a career change to escape the worst end of the capitalism nightmare) where is a positive place or industry to work?
Sure, any for-profit business is part of the capitalism game, but surely there are SOME places that aren't total hellholes, right???