Is there a certain point where corporate jobs are actually worth it?
My wife and most of her friends work in the same industry. They all make at least six figures (my wife currently makes twice what I do), get bonuses, have amazing benefits, plenty of PTO, etc. Yet they're absolutely miserable and hate it.
It's amazing to me that there are so many people out there working shitty jobs for minimum wage who don't even bitch about it. Meanwhile, many people I know in corporate office jobs make ten times what a minimum wage worker does, and they're facing existential dread about their lives even though they're otherwise comfortable. A lot of the corporate people I know started off in retail in high school and college, too, so it's not like they never had to work for a living. Yet the corporate jobs get all their ire.
The bonuses and raises make it worth it... for a while. But within a week they're back to complaining.
I'm really curious whether there are any studies on happiness in minimum wage jobs vs. corporate jobs, and which one provides more benefit to the worker over the years. Because I sure as hell don't see factory workers bitching the way corporate people do.