More and more, I feel that the idea that I need to provide goods or services to other people in order to be able to take care of myself is demeaning, even though I'm working in a white-collar job. The idea that we must always be contributing to society to be worth anything rubs me the wrong way. Even though I'm not working in the sex industry, it feels like I'm a whore, exploiting myself for someone else's gain. It feels like my employer only cares about me insofar as I provide value to them. I know there's no way around how society is set up, but I would really like to find a way to not have to work anymore. Does anyone else feel this way, or am I being entitled?