genuine question from someone trying to understand american work culture better. i keep hearing stories about people who are absolutely miserable at their jobs but won't leave because the pay and benefits are "too good to walk away from." like they'll complain constantly about how soul-crushing their work is, but then in the same breath talk about how they can't afford to leave because of health insurance or their mortgage or whatever.