(Also, sorry if this is worded weird, I’m a little scatterbrained right now.)
I’ve been thinking a lot about the culture of the South recently. I’ve lived in Florida my whole life and I’ve never left the southern US, and one thing that’s been eating away at me is the idea that working hard for what you got is something to be proud of.
The thing is, I agree with the statement. I draw and write, and when I work hard on a particular piece I feel really proud of myself. I get the same feeling when I cook an actual meal for myself or push myself during a workout. But I seldom ever get that feeling at work.
When I’m working, everything feels like an obligation, and any sense of pride I feel at an accomplishment is short-lived and dull. It’s like doing the dishes because you want to vs. doing the dishes because you were told to.
I don’t see any reason to be proud of selling my time away to some rich prick for little in return, but I can see myself being proud of building my own house from the ground up, of waking up early to tend to crops and livestock, of making home-cooked meals, of selling products I made myself at market, of fixing my own car, stuff like that.
I believe that’s what this whole “work hard for what you got” thing is actually meant to be: work hard so YOU and your family can reap the fruits of your labor, not so some rich man who doesn’t give a shit about you can get richer and richer.
But people don’t realize that it’s changed, and if they do, they don’t acknowledge it.
This probably isn’t a steaming hot take; I’m pretty sure MANY people have said this before. But it really sticks with me, because that’s the life I want and will probably never have. I can’t afford to buy land, much less build a house on it.