Hello, I'm new here but I have a burning question already!
I recently wrote a blog post about dystopian literature (see my profile for the link; any views greatly appreciated), and as an off-the-cuff remark I said there seemed to be a trend where the USA or North America is the society that falls.
I base this suggestion on books like The Handmaid's Tale, The Hunger Games, and, to some extent, 1984, but I wanted to know if anyone else has spotted this trend too. Is there a pattern, particularly in more modern dystopian literature, where the USA is made the fall guy?
And if so, why? I would say it's because the Western world symbolises power and organisation, so it has more of an impact if the US falls.
But what do you think? I'm interested to hear if anyone has evidence to suggest there is a pattern, or that there isn't.
Thanks!