The home of racism in America resides, and has always resided, in the heart of the Democrat Party. Yet nearly every writer and every liberal takes it as dogma that Conservatives must justify their "racist attitudes" or some such. Of course this is nonsense, but it has become so deeply ingrained in American thought that even Conservatives themselves seem to feel the need to sheepishly qualify the most inoffensive positions.
Well said, Stormhill. And you're right: this can be described as one of those "sun rises in the east" commentaries. Still, I'm amazed at the number of people in the US who continually look to the west in anticipation of its arrival.