What is wrong with Western women? I'm an American myself, but lately they've struck me as emotionally immature, making a mess of their jobs and everything else, and disrupting other people's lives in the process.
What is wrong with people in general?