The term originally had a literal geographic meaning. It contrasted Europe with the cultures and civilizations of the Middle East and North Africa, Sub-Saharan Africa, South Asia, Southeast Asia, and the remote Far East, regions that early-modern Europeans collectively saw as the East. In its contemporary cultural meaning, the phrase Western world includes Europe, as well as many countries of European colonial origin with substantial European ancestral populations in the Americas and Oceania.
These terms were coined before the Americas were discovered by Europeans.
Europe was the West, then you had the Middle East, then the Far East (China, Japan). That was pretty much the whole world to them. When the Americas were discovered, they were known as the New World, and eventually became known as the West as well, since most of their culture comes from Europe (i.e. Western culture).
Aren't European nations "Western" too?