Teaching should be one of the highest-paid professions, yet teachers earn very little because education is underfunded in the United States.
Imagine spending your best years, your energy, and your nerves teaching, only to receive so little in return; in many parts of the country, teachers earn below the family living wage.
Is it really worth choosing teaching as a career just because it is your dream job?