Well, then, here’s some more mind-opening stuff in the area of speed-of-light constancy.
I encouraged a Physics student to go into this many years ago. He was looking for an area in Physics to explore for his PhD, and I suggested that the speed of light showed signs of not being a constant. If he had pursued my suggestion, he’d be at the top of his game right now. So here is my similar suggestion to others: don’t take for granted those points at the far end of the graph. If they all “conspire”, it could mean that they all depend on some variable we have not yet discovered. One of the greatest physicists of our generation, Richard Feynman, won the Nobel Prize following this line of attack. I’ll reprint some of his story here, which I found also posted online at
The 7 Percent Solution
The problem was to find the right laws of beta decay. There appeared to be two particles, which were called a tau and a theta. They seemed to have almost exactly the same mass, but one disintegrated into two pions, and the other into three pions. Not only did they seem to have the same mass, but they also had the same lifetime, which is a funny coincidence. So everybody was concerned about this.
At that particular time I was not really quite up to things: I was always a little behind. Everybody seemed to be smart, and I didn’t feel I was keeping up. Anyway, I was sharing a room with a guy named Martin Block, an experimenter. And one evening he said to me, “Why are you guys so insistent on this parity rule? Maybe the tau and theta are the same particle. What would be the consequences if the parity rule were wrong?”
So I got up and said, “I’m asking this question for Martin Block: What would be the consequences if the parity rule were wrong?”
Murray Gell-Mann often teased me about this, saying I didn’t have the nerve to ask the question for myself. But that’s not the reason. I thought it might very well be an important idea.
Finally they get all this stuff into me, and they say, “The situation is so mixed up that even some of the things they’ve established for years are being questioned — such as the beta decay of the neutron is S and T. It’s so messed up. Murray says it might even be V and A.”
I jump up from the stool and say, “Then I understand EVVVVVERYTHING!”
They thought I was joking. But the thing that I had trouble with at the Rochester meeting — the neutron and proton disintegration: everything fit but that, and if it was V and A instead of S and T, that would fit too. Therefore I had the whole theory!
That night I calculated all kinds of things with this theory. The first thing I calculated was the rate of disintegration of the muon and the neutron. They should be connected together, if this theory was right, by a certain relationship, and it was right to 9 percent. That’s pretty close, 9 percent. It should have been more perfect than that, but it was close enough.
I was very excited, and kept on calculating, and things that fit kept on tumbling out: they fit automatically, without a strain. I had begun to forget about the 9 percent by now, because everything else was coming out right.
The next morning when I got to work I went to Wapstra, Boehm, and Jensen, and told them, “I’ve got it all worked out. Everything fits.”
Christy, who was there, too, said, “What beta-decay constant did you use?”
“The one from So-and-So’s book.”
“But that’s been found out to be wrong. Recent measurements have shown it’s off by 7 percent.”
Then I remembered the 9 percent...
I went out and found the original article on the experiment that said the neutron-proton coupling is T, and I was shocked by something. I remembered reading that article once before (back in the days when I read every article in the Physical Review — it was small enough). And I remembered, when I saw this article again, looking at that curve and thinking, “That doesn’t prove anything!”
You see, it depended on one or two points at the very edge of the range of the data, and there’s a principle that a point on the edge of the range of the data — the last point — isn’t very good, because if it was, they’d have another point further along. And I had realized that the whole idea that neutron-proton coupling is T was based on the last point, which wasn’t very good, and therefore it’s not proved. I remember noticing that!
And when I became interested in beta decay, directly, I read all these reports by the “beta-decay experts,” which said it’s T. I never looked at the original data; I only read those reports, like a dope. Had I been a good physicist, when I thought of the original idea back at the Rochester Conference I would have immediately looked up “how strong do we know it’s T?” — that would have been the sensible thing to do. I would have recognized right away that I had already noticed it wasn’t satisfactorily proved.
Since then I never pay any attention to anything by “experts.” I calculate everything myself. When people said the quark theory was pretty good, I got two Ph.D.s, Finn Ravndal and Mark Kislinger, to go through the whole works with me, just so I could check that the thing was really giving results that fit fairly well, and that it was a significantly good theory. I’ll never make that mistake again, reading the experts’ opinions. Of course, you only live one life, and you make all your mistakes, and learn what not to do, and that’s the end of you.
Evidence of scientific controversy:
LSCI 106: ONLINE RESEARCH 1: INTRODUCTION TO ONLINE RESEARCH
Is the speed of light slowing down over time?
The punch line to the familiar joke says the only things you can count on are death and taxes. In the scientific world of physics, one key fundamental that could be counted on was the speed of light remaining constant, over time, at 186,000 miles per second. Much of physics is based on this assumption. But now the times, they are a-changin’, and so are the fundamental constants of physics, an international group of physicists reports. After analyzing light from distant quasars, the team has concluded that the fine-structure constant, which is related to the speed of light, has shifted over time (Seife 1410).
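To see why a drift in the fine-structure constant implicates the speed of light, here is a minimal numerical sketch. The constant values are approximate CODATA-style figures I am supplying for illustration; they do not come from the quoted article.

```python
import math

# Approximate CODATA-style values (illustrative, not from the cited article).
e = 1.602176634e-19        # elementary charge, C
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
hbar = 1.054571817e-34     # reduced Planck constant, J*s
c = 2.99792458e8           # speed of light, m/s

# Fine-structure constant: alpha = e^2 / (4*pi*eps0*hbar*c).
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha   = {alpha:.9f}")    # ~0.007297353
print(f"1/alpha = {1/alpha:.3f}")  # ~137.036

# Because c sits in the denominator, a fractional change in c
# (with the other constants held fixed) shifts alpha by the
# same fraction in the opposite direction:
alpha_slow = e**2 / (4 * math.pi * eps0 * hbar * (0.99999 * c))
print(f"relative shift in alpha: {(alpha_slow - alpha) / alpha:.2e}")
```

This is only one reading of the quasar result: a shifted alpha could instead reflect a change in the charge or in Planck’s constant, which is part of why the claim is debated.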
Why is this such a big deal? Einstein’s Theory of Relativity would be wrong. The universe would not be as old as previously thought. Scientists cannot find over 90% of the matter needed to make the Big Bang a feasible theory; a faster speed of light in the past would explain those observations while rendering the standard theory unworkable. It would agree with and substantiate the Second Law of Thermodynamics. Much of astronomical theory would need to be rethought. One thing is for certain: there will be much debate and research regarding the constancy of the speed of light.
In 1989 it was claimed that room-temperature, resonant-bar, gravitational-wave detectors saw events that correlated with supernova 1987A. But in the very paper that announced this finding, M Aglietta and co-workers said that if our current understanding was correct, the energy seen by the detectors was equivalent to the complete conversion of 2400 solar masses into gravitational waves (1989 Il Nuovo Cimento 12C 1 75). The authors agreed that this was incredible, but nevertheless thought they should report what they had found in print in case something odd was going on. Nearly everyone else thought that the result was wrong, and a critical paper was published that tried to show that it was the outcome of inadvertent statistical massage (1995 Phys. Rev. D 51 2644). Last year, in an internal report from the University of Rome La Sapienza, the original authors rejected the criticism.
Consider the deep disagreement about SN1987A discussed above. Better observations, better experimentation, more knowledge, more advanced theories and clearer thinking have not settled the argument - at least, not to the satisfaction of all parties. What happens in deep disputes like this is summed up in the grim Planck dictum: scientists do not give up their disputed ideas; they only die.
Raina said: There actually is some real scientific controversy over whether or not the speed of light has changed.
This would not rescue Young Earth Creationism. Consider SN1987A: SN1987A was a supernova observed in the Large Magellanic Cloud in 1987. (The progenitor was a blue-white supergiant star catalogued as SK-69 202.) SN1987A has a primary gas ring that allows us to calculate its distance using simple triangulation. That distance is 168,000 light-years. Ergo: SK-69 202 blew up 168,000 years ago, or about 160,000 years before you believe the universe was created if you’re defending YEC. So we know the universe is older than 6,000 to 10,000 years, because in 1987 we observed the light of a supernova which actually occurred around 166,000 BC.
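The ring triangulation above can be sketched numerically. The figures used here are approximate published values for SN1987A (physical ring radius from the timing of ultraviolet light echoes, angular radius from telescope images); I am quoting them from memory for illustration, not from the post itself.

```python
import math

# Approximate published values for the SN1987A ring (illustrative).
ring_radius_cm = 6.23e17          # physical radius, from UV light-echo timing
angular_radius_arcsec = 0.808     # angular radius, from telescope images

# Convert the angular size to radians: 1 arcsec = pi / (180 * 3600) rad.
theta = angular_radius_arcsec * math.pi / (180 * 3600)

# Small-angle triangulation: distance = physical size / angular size.
distance_cm = ring_radius_cm / theta
distance_ly = distance_cm / 9.4607e17   # centimetres per light-year

print(f"distance ~ {distance_ly:,.0f} light-years")  # ~168,000
```

The beauty of this argument is that it is pure geometry: no assumption about the speed of light in the past enters the distance estimate itself.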
We also know the light from SN1987A has not slowed down in transit, because if it had, among other enormous physical problems, events at SN1987A would appear in ‘slow motion’, and they do not: again, direct observation. SN1987A also gives us rock-solid evidence that radioactive-decay processes operated at the same rate in the remote past as they do today. During the supernova explosion, exotic isotopes with short half-lives were created, such as cobalt-56 and nickel-56. We can observe the decay sequence of those isotopes in the spectral emission of SN1987A. They match exactly the rates measured empirically on Earth, which are also the theoretically predicted rates applicable throughout the universe. Thus SN1987A is a ‘twofer’ in falsifying YEC.
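The decay-rate check can be sketched with simple exponential decay. The cobalt-56 half-life of about 77.3 days is the laboratory value, supplied here for illustration; the point is that the supernova’s late fading followed the same curve.

```python
import math

half_life_days = 77.3   # laboratory half-life of cobalt-56 (approximate)
decay_const = math.log(2) / half_life_days   # decay constant, per day

# Fraction of cobalt-56 remaining after t days: N(t)/N0 = exp(-lambda * t).
def fraction_remaining(t_days):
    return math.exp(-decay_const * t_days)

print(f"after 77.3 days: {fraction_remaining(77.3):.3f}")  # 0.500
print(f"after 300 days:  {fraction_remaining(300):.3f}")   # ~0.068

# In astronomers' units this is a steady decline of
# 2.5 * log10(e) * lambda magnitudes per day (~0.0097 mag/day),
# the slope seen in SN1987A's late light curve.
print(f"decline: {2.5 * math.log10(math.e) * decay_const:.4f} mag/day")
```

If decay constants had been different 168,000 years ago, the slope of the fading light curve would not match the laboratory half-life, and it does.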