I was with a bunch of other men my age, and someone said something like "yeah, of course we all played football." Which we all did.
And when I was in middle school, football (American) was sort of the default sport: I could play another sport if I wanted, but I had to play football. I always assumed that was just my parents or my school, but maybe it's more of a universal American (or at least regional American) thing?