Tipping culture in America has gotten ridiculous from what I've seen. You're asked to tip at almost any place now, even if the service provided was minimal, and you're seen as an asshole for not tipping. Employers and business owners have convinced everyone that it's part of the customer's responsibility to cover their workers' wages, just because they're too selfish to give them fair pay. It's their job as owners/management to pay their fucking workers. It's the customer's job to pay for whatever service or goods we want to receive. What do you guys think?