(Apologies for any typos/weird formatting, I'm just ranting) Let me start off by saying the obvious: America was/is built on blood and violence. It has never truly been free or good. That being said, I understand that for decades in the past, America has been one of the best options for people needing jobs or safety.
However, as we all know, that's not the case today. Women's and queer people's rights are being stripped away as we speak, and hundreds of innocent people of color, especially Black people, are killed by cops because the government is too lazy to do anything (and let's be honest, they just don't care). Nearly every week there's another public shooting met with neglectful cops. The justice system is extremely racially biased and isn't even trying to hide it. The poor are dying because businesses and the government refuse to raise the minimum wage to even a livable one. More and more Americans are falling into poverty and the lower class, and genuinely can't afford to get sick or injured.
You need at least two minimum-wage jobs just to scrape by with the essentials (a low-cost apartment, a cell phone because yes, they are essential nowadays, food, bills, a car and gas, etc.), and yet other Americans tell people working ten times harder than them that they should just "give up non-essentials." Nobody should have to give up small, joyful things just to LIVE.
This is just the tip of the iceberg, too. America is a misogynist, racist, transphobic, homophobic, classist, ableist, xenophobic, antisemitic country that tears its citizens apart for every cent they have. It's sad seeing other Americans being sucked into the "American dream" mindset. Seeing non-Americans tell us how good we have it is fucking disgusting. I understand that many Americans still preach about how "amazing" this country is, but that doesn't mean you get to ignore the other people suffering here. We lie to everyone, even ourselves, about how "easy" it is to live the American dream here.
If you aren't a cishet, white, Christian, rich man in America, then you're basically screwed. Non-Americans really need to stop saying how good we have it, because they're clearly still under the delusion we sadly set centuries ago. If you're thinking about coming to America to fix your life, look elsewhere, because here you'll either be in the same situation as before or worse. This country is a mousetrap for everyone in and out of it.
Small edit: Some people really just want a reason to be mad at this. I never said America is the worst, and I know it isn't; the entire intention of this post is summarized in the title.