This might be controversial, but nevertheless...
You know, as an American raised with an international background (I moved to Kuwait two months after I was born and didn't move back to the States until seven years later, not to mention living in Thailand, Ethiopia, China, and the UAE), I have to put up with people constantly putting down the States, saying things like "Americans want to rule the world" and such. It hurt when I heard things like that, but what hurt even more was that I almost had to agree with about half of what they said (not the "Americans want to rule the world" part, that's just complete nationalistic nonsense). It's saddening that I have to talk about my own country this way, you know? I remember when it truly hit me. It was during 6th grade, I was living in Hong Kong at the time, and I was on AIM with some of my friends back in the States, when one of them made the following statement:
"Yeah so do they speak Japanese in Korea?"
And that's when I truly realized the barrier around the United States, where nothing truly international ever gets in. We don't see the barrier, and a lot of people never notice it because they've been living in the States their whole lives. The only reason I notice it is that I'm always traveling to and from the States and have to have discussions like the one above all the time. It's annoying. I just wish my fellow American citizens would, for once, consider the international viewpoint instead of only what's good for them or what's going on in America.
And it's true, not all Americans are this way. Whenever I'm outside the country (which is most of the time, actually), I always make sure I'm a good representation of what I believe Americans are supposed to be. I let people know that Americans aren't the bigoted, ignorant nationalists that many people believe we are based on what was heavily exemplified during the Bush administration (sorry, I tried to keep this non-political, but I couldn't help it). But whenever I return to the States for summer holiday or something like that, I'm just flabbergasted. I literally walk out of the gate at the airport and go: "FUCK, nothing's changed since I left!"
This just sort of depresses me, and I hadn't thought about really expressing how I felt about it until this past summer, when I went to Key West to visit my sister and felt like almost everyone I saw was an incompetent slacker or moron. I truly wanted to believe (and still want to believe) that that's not America. America is the land of the free and the home of the brave, not the land of the morons and the home of the Whopper.