Honestly, I can't think of a coherent reason why this horrific corporate dystopia we live in isn't the life the average American deserves. Like, why the fuck do we deserve anything else as a people? It's not even over America's bloodstained past that we deserve this, nor its belligerent foreign policy, nor its cancerous effect on world politics in general.
I think it's just because of what empty, servile people Americans are at their core. To me, Americans are born cowards, and most US adults are hollow: there's nothing inside the people of this country, and every thought and feeling they convey is corporate stuffing. I used to be a socialist, used to organize tenants, but I just don't care anymore.
We live in the world we deserve. We suffer as we deserve. If the average American spends the tiny bit of free time they have defending landlords, corporate shareholders, CEOs, banks, corporations, and politicians, why should I give a single solitary fuck about any of these people?
Isn't this what Americans wanted and still want? Full capitalism without limits? The market's violent penetration into every sphere of life? Didn't these fuckers want profit-driven dating apps, healthcare subscriptions, the death of communal spaces? Didn't Americans want Hell and worship monsters like Reagan? Doesn't half this country think this hellworld is fine and dandy and constantly getting better?
Convince me these people don't deserve this. Ignorance isn't an excuse; Americans are maliciously ignorant. They don't support evil shit because they're stupid, they choose to be stupid because they're evil, and would rather believe a cruel and absurd lie than a truth that helps another person.