I don't think "American Studies" is even allowed to be taught anymore. In fact, I don't think any woke DEI companies would hire someone with an American Studies degree.
-PJ
Most universities still have an American Studies department, but the coursework tends to be heavy on cultural and critical theory.