Children are being taught that America was "discovered" by genocidal white racists, who murdered the native peoples of color, enslaved Africans to do the labor they refused to do, then went out and brutalized and colonized indigenous peoples all over the world.
But who is responsible for this?
WHITE PEOPLE! (actually, the White elites)
If you destroy the white elites in government (and this includes education), things will go back to their natural state.
Like Pol Pot and Cambodia?