I'm fed up. I'm in a serious debate with my husband about why the fuck we live in the US under the current working conditions. Looking for points to bolster my argument that we (everyone) were sold the "American Dream": that if you were well educated and determined enough, you could make it anywhere. That is obviously not the case these days.
Edit: I have a BS, he has an AS. Just very frustrated that we were both told "go to college, they can't take education away from you." We haven't had our education taken away, but the social contract of higher education = higher-paying jobs is total crap at this point. We live in a low cost-of-living area compared to most, and we are still struggling. My work recently made a big deal about giving its lowest-paid people (me lol) a wage increase of $1, which puts us about even with the folks working at our local Subway. My husband works for the county government making barely $20/hour.