Southern United States literature consists of American literature written about the Southern United States or by writers from the region. Literature about the American South began during the colonial era and developed significantly during and after the period of slavery in the United States. Traditional historiography of Southern United States literature emphasized a unifying history of the region: the significance of family in the South's culture, a sense of community and the role of the individual, justice, the dominance of Christianity and the positive and negative impacts of religion, racial tensions, social class, and the use of local dialects.[1][2][3] However, in recent decades, the scholarship of the New Southern Studies has decentered these conventional tropes in favor of a more geographically, politically, and ideologically expansive "South" or "Souths".[4]