Shows that Hollywood doesn’t make movies to entertain or even to make money. It makes movies to promote the producer’s vision of the world as it should be.
Oh well. Maybe if Disney and Hollyweird stopped dissing Americans both on and off screen and made a decent, wholesome movie, we'd go to one. I can't tell you how many movies lately start out with blatant sex scenes or some other vile thing.
I personally have not been to a theater in over a year; the last time I went, I had my iPhone stolen.
It's just not fun anymore, either, trying to watch a movie among the crowds that make up the general public these days. Going to a movie is a real risk to your life.
I’d so much rather go out to dinner with friends, or my husband ... and just enjoy some peace.