Some think America sucks now. Others think it will in the future.
I personally think America will come back to herself. Right now we have a media and “entertainment” class that has been bought with foreign money and hates what America stands for. I think we should close our borders, remove illegals who truly do hate us, and not knuckle under to “newcomers” who try to change our way of life to theirs. For too many years we’ve changed American words simply because they OFFENDED someone. We need to clean out our school systems, from kindergarten to the universities, of teachers who do not like our America. Maybe even send them to countries they “might” prefer. Just my thoughts.