The U.S. is not the greatest country in the world.
I’m feeling the weight and bruises of our toxic work culture and uncaring politicians.
When I take time off (unpaid, of course), I feel guilty going to stores because I see how undercompensated and overworked the employees are, yet I keep patronizing these places. The problem is we don't really have a choice here in America.
Are there any countries where the government and companies actually care about their people and enforce a healthy work culture? A place where people are treated like human beings? A place where people have lives outside of work?
Thanks, everyone.