I've been seeing a lot of noise lately that we may be at the tail end of the pandemic.
For some reason, this gave me an anxious feeling in the pit of my stomach at the thought of society going back to life as it was before the pandemic. I feel like the pandemic gave people time to stop, analyze, and re-evaluate their lives. No one wants to work for shit companies. No one wants to work for shit pay. No one wants the stress and pain of trying to make ends meet by side hustling or selling out their hobbies because corporations want to keep paying shit wages. No one wants an imbalanced work-life ratio. People want to actually enjoy life and not spend the majority of their waking hours at work.
Right now, I feel that the majority of the population is really making it apparent to corporations and recruiters that they're not playing their game anymore. Companies are being forced to realize that their shit wages and benefits are the cause of high turnover, and that workers aren't going to jump through hoops or take the abuse anymore: they'll put in notice, walk out, or straight up no-show.
I'm worried that companies will take away remote work. Corporations see the population's growing demands for better wages and benefits, but they're trying to circumvent them by saying the pandemic is over and pushing a return to pre-pandemic life. It's harder to demand change when corporations have the upper hand.
What are your thoughts on this?
Do you feel like this sentiment will die out once we're back to 'normal', or that corporations will be forced to make changes?