I feel like with every new job, I get really excited about the new opportunities and am so happy to be working for a company that “supports its employees,” but then down the line I realize they were just sweet-talking the job so hard that I totally believed it.
I just left a job that promised me the world and never followed through. I stayed for a year and was so hopeful the whole time. Now my ex-boss has texted me three times this week, trying to either get me to come back or make me feel bad about leaving. I will do neither.
What are some of your tell-tale signs that a company is just talking itself up and won't put its money where its mouth is?