I see a lot of posts like this on here. Yes, work in the US is bad and people are mistreated all the time. You can read about it in reputable journals, not just Reddit anecdotes. Congratulations, your country offers vacation time and decent salaries and treats people like human beings and whatnot. Yes, we would like that too. Do a little research before being dismissive.