I'm almost finished with my degree, and lately I have been thinking extensively about future career options. My research mostly turns up reasons NOT to work for a particular company or organization, so I want to flip the script, if only for a minute. What companies or organizations do you know of that do right by their employees? Where have you worked that you felt valued as an employee and as a person? Which companies work to make their employees' lives better and easier, have made positive changes recently, or have been doing things well all along? I figure those places must be out there, if only because, statistically, someone must be doing things right.