West Coast of the United States

In general, the term "West Coast" is a nickname for the coastal states of the Western United States, namely California, Oregon, and Washington. The West Coast is thus a subset of the West.

It has also come to be called "The Coast," especially by New Yorkers, or the "Left Coast," a pun on both its left-hand position on a map of the US and its reputation for being more socially liberal, or left-wing, than the East Coast or Midwest.

The term has also been adopted by rap music performers to refer to a particular school of artists. The resulting East Coast/West Coast rivalry has produced much rhetoric and, at times, violence.

See also: Geography of the Western United States