
Working in the US is depressing

I just recently moved to the US from a country where labor laws are fairly well enforced. Even before moving here, I had an idea of how companies and corporations treat their workers. I just got a job at a well-known mall. I love the nature of my job, but the managers… gosh! No wonder the staff are often miserable. The managers are, for lack of a better term, egotistical. No care for their staff at all! They don't even address staff by name. Worse, there is never any feeling of certainty or security. I know I'm new, but ever since I started, I keep wondering whether the next day I go in they'll just tell me I've been fired. Yesterday I was doing the job of two people (and I've only been employed for two weeks), and even after staying two hours late, I still left a bit of the work undone. So this is amplifying my feeling of insecurity.

Is this how it really is for everyone? Or is this just newbie syndrome? If it is, it's depressing indeed. Geeezz…
