I don't know who taught you US geography, but the West (notice the capital "W") includes both the West Coast states and the Mountain West. Most also include Alaska and Hawaii in "the West." You left out Montana, BTW, in your list of Mountain West states.
Look at you trying to be all defensive about the West thing. It’s cute, really.
And no, I didn’t leave out Montana. I left out Utah. Nice attempt at a dig.
As for including Hawaii in that list, well, it has nothing in common culturally with the rest of the aforementioned states, but if you want to go with that, run along. Culture is what it comes down to with these states as much as anything, which is why I say the term “The West” largely refers to the Mountain West.
California, Oregon, and Washington are liberal states primarily for that reason: they’ve been overrun by outsiders who have changed the frontier spirit forever. The pockets of it that remain can’t compete with the coastal transplants.