I was born in this country and used to be proud to be American; sorry to say, that's no longer true. The US isn't a country anymore, it's a corporation, where everything is done for profit to suck the money out of our pockets. Work-life balance is a joke: we work too hard and hardly receive anything back for it. Not to mention healthcare and everything else wrong with this country. I've been toying with the idea of leaving for a couple of years, but now I really don't see myself living here. I'm just not even sure where to begin or where I could go as an expat. Has anyone left the US for a different country? Is the grass really greener on the other side?