The word “gay” used to be such a joyous word. (e.g. “Don we now our gay apparel, fa la la la lala lala la”). Now, it’s associated with a sexual orientation.
Maybe someone can educate us as to when that cultural takeover started.
***Maybe someone can educate us as to when that cultural takeover started.***
The only time I can think of is an old Cary Grant movie in which he has to wear a woman’s robe. When a visiting woman asks why he is wearing it, he says, “All of a sudden I have gone GAY!”
I did a lot of thinking about when America lost its way, about the moment when things started to go south, and it came to me that the moment was the Civil Rights movement in the 1960s. Why? It had such lofty goals. What went wrong? It was in casting American Blacks as victims, unable to help themselves, who had to be given a hand up, help from the “Government.” They became the first victim class. People got rich off this idea and got votes. Then everyone wanted to be a victim, and as such got special, superior rights (in their minds): homosexuals, women, Latinos, the list is long. Trying to “help” all these groups wrecked everything.