Thank you for your recommendation.
I am currently deciding whether to learn about economics, so that, just like every economist in the world, I will have absolutely no employable skills except on liberal television networks (and likewise in some government posts),
or to support America bringing jobs back home, increasing our nation's savings and paying for things here.
One of those two. I can’t really decide which is more useful.
Let’s hear your ideas. How do you propose to “bring back American jobs?”
Most “economists” don't know the first thing about economics. You do not need a university degree to understand how markets work and how investors act.
Pick up Atlas Shrugged by Ayn Rand, for instance. That will teach you all you need to know about economics.