To: CaspersGh0sts; LucyT; theothercheek; Just A Nobody; All
By the West, what's usually implied is the Mountain West.

I don't know who taught you US geography, but the West (notice the capital "W") includes both the West Coast states and the Mountain West. Most also include Alaska and Hawaii in "the West." You left out Montana, BTW, in your list of Mountain West states.

12 posted on 12/10/2009 8:52:12 PM PST by justiceseeker93


To: justiceseeker93

Look at you trying to be all defensive about the West thing. It’s cute, really.

And no, I didn’t leave out Montana. I left out Utah. Nice attempt at a dig.

As for including Hawaii in that list, well, it has nothing in common culturally with the rest of the aforementioned states, but if you want to go with that, run along. Culture is really what it comes down to with these states as much as anything, which is why I say the term “The West” largely refers to the Mountain West.

California, Oregon, and Washington are liberal states primarily for that reason: they've been overrun by outsiders who have changed the frontier spirit forever. The pockets of it that remain can't compete with the coastal transplants.


25 posted on 12/11/2009 12:34:31 PM PST by CaspersGh0sts
