I know this sounds lazy, but that's not the case, I promise!! First of all, I feel like people should not HAVE to work to live. At least not in the way society normally defines work. We should work for ourselves and each other. Schools should be required to teach real life skills and how to live off the land, e.g. farming and building, alongside the other classes we take.

Some may say, "then no one will want to be doctors, join the military, etc." But many of the people who choose careers like that are passionate about their work. There may be a shortage, sure, but there will still be people who want to work in those fields. We should also be taught how to handle medical issues just in case we're ever in danger, and self-defense classes should be taught too. We really need to change the approach we take to teaching our younger generations.

I can't stand the idea of a 9-5 job, slaving away the majority of my time for a company, when I could be enjoying art, learning to cook new meals, or keeping a plant nursery.

Another objection might be that no one will want to work fast food jobs. Cool. If we're taught to live off the land, there's no reason for fast food. As adults we should all know how to prepare and cook meals, and fast food is unhealthy anyway.

I hope you enjoyed my ramble. Feel free to leave advice on how to deal with having to work my life away, and any life hacks that might help me out. I'd love to hear it.