Posted on 10/31/2006 7:19:14 PM PST by Logophile
After WWII, funding for the sciences lapsed ... until Sputnik.
You'd think that 9/11 and the anthrax attacks would have warned us to stay ahead of the game, but somehow it's not happening.
Any professor who puts teaching ahead of research will find it very difficult to earn tenure or promotion at a major university. The old adage "publish or perish" has been replaced by "get funded or get out" (at least in the sciences and engineering).
The guy I'm thinking of still publishes fairly frequently (and has some good funding). Somehow, however, he seems to take a personal interest in his Organic Chemistry students and tries to help them master the subject. I don't know how he pulled that off, but he always struck me as the exact definition of what a professor should be.
And it was that "publish or perish" that got all the basic research done.
So the theory was fully developed 200 years ago? That was long before the National Science Foundation was created. How did Fourier manage without a government grant?
If you are saying that a sound theoretical foundation is important for developing technology, I will not disagree with you. (Although I would point out that the development of the technology sometimes precedes the development of the theory.) But that does not mean that tax money is required to do theoretical research.
Perhaps so. But it has also harmed teaching at the universities.
The difficulty is that years ago large corporations like AT&T used to fund pure research facilities with no expectation that anything marketable would come out of them for many years, if ever. They were looking for Nobels, not money (in the short term). Bell Labs is an example. NV Philips still has a lab in Briarcliff Manor, New York. But years ago the accountants started demanding, well, accountability. And the scientists who were paid to sit around with their feet on their desks, creating new branches of mathematics and thinking their way to new knowledge, were asked to begin producing some product that could be sold.
I have friends who worked for Philips back in the golden days. These were the people whose random doodlings produced the compact disk. They were expected to spend much of their day staring out the windows and thinking beautiful thoughts about science. Today no one is going to finance that. There has to be direct market applicability in all the work the lab turns out. This means that only the federal government and sometimes major universities are going to generate novel concepts that are pure science with no short-term place in the market.
Well, I'm not sure what golden age of teaching in the universities you are thinking of, but my undergrad days were half a century ago and I can count the inspired teachers on the fingers of one hand...and have plenty left over.
Yes, I have heard that argument frequently over the years. However, the federal funding agencies do not fund truly novel concepts. The people at NSF, for instance, will tell you that they do not fund new or "risky" research; they want to see published results first.
I know they have a lot of fun with different branches of science at NIST, fun with physics at Batavia, fun with math at the University of Chicago, and fun with biomedical work at a lot of major university hospitals.
I said nothing about a "golden age of teaching." No doubt inspired teachers have always been rare. The difference is that now inspired teaching is not rewarded at many universities.
There was a time, before the Second World War, when science and engineering professors at most universities were expected to teach three or four courses a semester. Now a teaching load of two courses a semester is considered heavy.
As teaching loads decreased, enrollments increased. How have the universities managed? One way has been to increase the class size; another has been to hire graduate students and adjunct faculty to teach.
I have observed these developments firsthand. When I was on the faculty of a large Midwestern university, I used to teach a course that enrolled 800 to 1200 students every fall. One department head at the same university bragged that 85% of the undergraduate student contact hours were taught by people other than regular faculty.
I said "fund" not "fun." (I am sure that they have both at the places you mentioned.)
Yes! If it looks like a money-maker, private citizens will invest.
Are you really suggesting that the government should lavish tax money on researchers in hopes that they will discover something that might be useful in 200 years?
Or to put it differently, if Fourier had not done that particular work when he did, would semiconductor devices be impossible to make today? Lithography was used in printing long before it was applied to semiconductor processing. If the diffraction integral is so important to making small semiconductor devices, I daresay it (or its equivalent) would have been derived in the 20th century when it was needed.
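For readers wondering what "the diffraction integral" refers to here: in the standard Fraunhofer (far-field) approximation from optics texts, the field behind an illuminated aperture is, up to constant factors, the Fourier transform of the aperture function,

\[
U(x, y) \;\propto\; \iint A(\xi, \eta)\,
\exp\!\left[-\frac{i\,2\pi}{\lambda z}\,(x\xi + y\eta)\right]
d\xi\, d\eta ,
\]

where \(A(\xi,\eta)\) is the aperture transmission, \(\lambda\) the wavelength, and \(z\) the propagation distance. This is why Fourier analysis is central to photolithography: the smallest printable feature scales with the wavelength and the numerical aperture of the imaging optics. (This is textbook optics offered for context, not a claim made by either poster.)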
No private entity would fund science with no possibility of being used in the near future.
Apparently some of them do. And more did in the past, before government money crowded out the private money.
But without govt. support, there would be little basic science.
According to the article, private money financed basic science before WWII.
The comparison to IBM is deeply flawed. IBM had a monopoly in high-performance computing until a decade ago. As with every monopoly (Bell Labs under AT&T is the best example), they guzzled dollars into basic science without a second thought. The payoff came decades later, and most of the patent income this article cites probably comes from patents filed on the slightly modified basic science. No private company is in a position to guzzle zillions on basic science today without an immediate payoff.
I have to disagree on several points. First, IBM did not have a monopoly in computing. Yes, they were the biggest; but they had competitors, both here and abroad. They did research to stay the biggest. (Full disclosure: I worked for IBM for a short time.)
More to the point, IBM has competitors today. And yet, as the article points out, IBM continues to excel in research, both applied and "pure." (Dr. Kealey argues that the distinction between pure and applied research is somewhat exaggerated anyway, and I agree with him.)
AT&T was a monopoly for many years and yes, Bell Labs was a tremendous source of basic and applied research. But AT&T was unique in that respect. Many other companies that were not protected monopolies conducted research as well.
No private company is in a position to guzzle zillions on basic science today without an immediate payoff.
I see two things wrong with this statement. First, you seem to imply that basic science requires the "guzzling" of enormous amounts of money to thrive. I am not sure that is true. (Much of that money is wasted, at least at the universities.) Second, companies are not the only private entities that fund research: foundations and individuals do too.
By "fun" I meant "doing pure science for the love of it, not because there's going to be a dollar bill at the end of the project." That of course is how scientists operate, or want to operate: many of them would do the work even if they had to pay to get into the laboratory rather than the other way around.