Sports

Football Has Always Been a Battleground in the Culture War

The NFL has become increasingly central to how America perceives itself, which means that sport and politics can never be divorced.