It depends a bit on the context, but generally speaking I'd say the term is usually meant to include North America, Europe (western somewhat more so than eastern), Australia and New Zealand, and parts of Central and South America (the stronger the European influence, the more "western"; the stronger the indigenous influence, the less "western"). There is also a certain additional correlation with industrial development, wealth, and urbanisation (the wealthier and more urban a place, the more likely its people are to share "western" cultural values and heritage).
It is a very poor concept, which has become mixed up with "America and her allies versus everyone else." This is a very fluid way to define "Western", one which rests too much on political decisions and not enough on anything concrete.
I would say "Western" is synonymous with "White" in this sense, and therefore all countries with a ethnic European majority group are Western. Whether a country is Western then depends on what the dominant ethnic group is, and this is practical as demography ultimately determines what cultural values are allowed to exist in a country.
Western countries (currently) include all of Europe and North America, Australia, New Zealand, Argentina, Paraguay, Uruguay and Chile.
I always think of Western as any culture whose heritage includes all that biz about identifying with Greco-Roman culture, (post?) Christianity, the Enlightenment, etc. Usually western cultures are marked by an emphasis on the rights of the individual.
I used to assume everyone considered the whole of South America "western". Plus, we aren't "oriental" in any way.
I feel excluded.
"Western" just means countries with people we consider similar to ourselves. It doesn't really describe countries or societies that are actually similar in any way. It just describes what people think is similar. The impact of Europeans and the U.S. on the global stage means that there are a lot of places that I think should be considered "Western" if the term meant anything, but it doesn't.
You could say "Western" means having a basis in Christianity, but Eastern Europe does, and is not always considered "Western". You could say it means having a democratic government, but Japan and India have one, yet are never considered "Western."
It's a stupid term that doesn't mean as much as people think it does.
Good grief, the rest of us have been planning this invasion forever and we still have no idea where or what this "western world" even is? No wonder our scouts keep sending us money and never come back with information.
Or they could've gotten jobs as models and gotten all the hookers, drugs, and booze one could ever want.