You’re in dire need of a lesson in economics.
Jobs aren’t created because people “need” them; they are created because investors hire folks so they can make a profit from their labor.
Thank you for your recommendation.
I am currently deciding between two options: learn about economics, so that, just like every economist in the world, I will have absolutely no employable skills except on liberal television networks (and likewise in some corners of government);
or America brings jobs back to America, increasing our nation’s savings and keeping our spending here.
One of those two. I can’t really decide which is more useful.