I have worked for a family-owned business for 20+ years. It used to be the best company in the area to work for: woman-owned, with great pay. But the more money they make, the more they take away from their employees. Every time someone quits, they add that person's responsibilities to someone else's job. They do not pay extra for being trained in multiple departments, and they have lost MANY employees strictly because of their greed. I used to love my job, but now I dread going to work. All departments are losing people regularly, and it is so sad. I truly want to believe that the owner does not realize all the changes that have been made and their effect on the employees. I think all she sees is that she is taking home more money than ever, and she is blind to what is happening in the warehouse/office. We are supposedly an "open-door" company. Should I go to her, or is it a waste of time?