A couple of friends and I are going to the States for the first time in our lives this summer for a road trip. I figured it was about time to start planning what I should read, both before going and while road tripping, so I thought I'd seek your advice!
What I want are the kind of novels that really define the United States of America. They could be old or new, long or short, but they have to be the kind of book that could never have been written by a foreigner. I hope someone understands what I'm aiming at, even though I can't explain it very well here.
I'm really looking forward to your recommendations!