I have nothing against working, I really don’t, but no job I’ve had has ever been right. My current job is a dead end: it barely gets me by and doesn’t fulfill me at all. Everyone in my circle seems to think I should move up in this field, but I hate the fucking field. I hate the corporate world. I feel like a zoo animal surrounded by the fakest people to ever live.
I want to do more. I want a job that lets me do something I’m proud of, but it’s so hard. I don’t have a degree because I flunked out of college over my mental health. I wanted to be an animator when I was younger, but I live in a state with no opportunities for that, and as a teenager my family told me I’d have to do graphic design like my dad. I saw the boring stuff he made and decided I didn’t want that, so I went to school for journalism instead, and it ruined me. I have some modest debt. I can live with that.
I thought about becoming a screenwriter, and I was actually making progress toward finding work and had a contact, until he told me that 70% of the studio he worked at had been laid off.
I feel hopeless and kind of angry about it all. I asked my friends for advice and they’re telling me to work at Costco or in a call center, and I don’t want to do that. I actually felt insulted, and then they look at me like I have three heads when I say that if I’m going to be somewhere forty hours a week, it had better be doing something I actually enjoy, something people will recognize me for. Meanwhile some woman can get rich off one dumb sex joke. It’s not fair. I don’t know what to do anymore. I just needed to let it out.