BurtWorm
(1000+ posts)
Wed Oct-15-03 10:45 AM
Original message
What Books Explain America (and Americans) to the Rest of the World?
Edited on Wed Oct-15-03 10:45 AM by BurtWorm
Here's a request from an Italian, I think, on alt.society.liberalism:
Hello,
Could anybody give me some book recommendations?
I'm not interested in straightforward American history; I would like to read books that explain why Americans are the way they are. Why do they think their country is the top of the world and nothing else counts? Why are they interested only in what is going on in their own backyard, with no concern for their neighbors, let alone the rest of the world? Why are the East and West coasts more liberal than the middle of the country? Why don't Americans travel outside their borders? Why don't they care about anything but their own wellbeing?
I want to understand where, when and how it started.
Any book suggestion is really appreciated. Thank you, Roberto