The End of Victory Culture

The End of Victory Culture is an autopsy of a once-vital American myth: the cherished belief that triumph over a less-than-human enemy was in the American grain, a birthright and a national destiny. This book is a compelling account of how the national narrative of triumph through which Americans had always sustained themselves as a people underwent a vertiginous decomposition from Hiroshima to Vietnam.