After a long while of reading post after post about employers requiring this or that and doing things that are illegal or immoral, I have found myself wondering why employers aren't required to pass a hiring/employee-management test so they know the laws and rules of running an actual business. I'm just spitballing here, but what if anyone who registers a business, owns a business, or hires employees had to take a basic test on employee treatment, wage practices, etc.?