The US is becoming more and more anti-women, and the Left is driving it.
Exactly.
The broader agenda is to softly chip away at our culture and social norms. It starts even before kindergarten, but that is generally when they are first able to influence the minds of future voters.
Today it is fairly obvious to anyone willing to look that it has spread beyond the education system to the arts, the MSM, sports, the churches, the military, you name it: anything that is today "woke".
Sadly, we seem, as a nation, unable to marshal the political forces necessary even to identify the threat and give it a name.