Well, the energy output would depend on the rate of reaction. The theory seems to tell us how it might happen, but not how often.
Simple energy tests can surely show us something is going on, but beyond that, they are pretty variable and unreliable.
Spectroscopic and electron-diffraction analysis, if the tests were performed over a wide variety of elements/isotopes, would give much more concrete results. And as a bonus, we could get some idea of the reaction rate, and thus predictions of energy output.
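The rate-to-output link is just multiplication: power = reaction rate × energy per reaction. A minimal sketch, assuming D+D fusion (~3.65 MeV averaged over its two branches) and a purely illustrative rate of 1e12 reactions per second:

```python
# Back-of-envelope: energy output = reaction rate x energy per reaction.
# The 3.65 MeV figure is the rough average Q of the two D+D branches;
# the 1e12 /s rate is an assumption for illustration, not a measurement.
MEV_TO_J = 1.602176634e-13  # joules per MeV

def power_watts(rate_per_s, q_mev):
    """Thermal power from a reaction rate and a per-reaction Q value (MeV)."""
    return rate_per_s * q_mev * MEV_TO_J

p = power_watts(1e12, 3.65)
print(round(p, 2))  # roughly half a watt -- detectable, but easy to mismeasure
```

Even a trillion reactions per second gives only about half a watt, which is exactly the regime where simple calorimetry gets ambiguous.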
But there is an important point: as the paper correctly notes, elements lighter than iron release binding energy when fused. That's what makes a hydrogen bomb blow up.
Elements heavier than iron need to have energy ADDED to make them "fuse" (or assume stable configurations of higher atomic mass).
So even if it works for the heavy, dense elements, there ends up being an energy cost, not an energy surplus.
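The iron crossover falls straight out of the binding-energy curve. A hedged sketch using the semi-empirical (Bethe-Weizsäcker) mass formula, with one common textbook set of coefficients (fits vary slightly by source), showing that binding energy per nucleon peaks in the iron/nickel region:

```python
# Semi-empirical mass formula sketch: binding energy per nucleon peaks
# near A ~ 56-62 (iron/nickel), so fusing anything heavier than iron
# costs energy rather than releasing it.
# Coefficients in MeV (one common textbook fit; exact values vary by source).
aV, aS, aC, aA, aP = 15.75, 17.8, 0.711, 23.7, 11.18

def binding_energy(A, Z):
    """Total binding energy in MeV for a nucleus with A nucleons, Z protons."""
    delta = 0.0
    if A % 2 == 0:  # pairing term: + for even-even, - for odd-odd nuclei
        delta = aP / A**0.5 if Z % 2 == 0 else -aP / A**0.5
    return (aV * A                               # volume term
            - aS * A**(2/3)                      # surface term
            - aC * Z * (Z - 1) / A**(1/3)        # Coulomb repulsion
            - aA * (A - 2*Z)**2 / A              # symmetry term
            + delta)                             # pairing

def most_stable_Z(A):
    """Approximate Z along the beta-stability line for fixed A."""
    return round((A / 2) / (1 + aC * A**(2/3) / (4 * aA)))

# Scan binding energy per nucleon along the stability line.
per_nucleon = {A: binding_energy(A, most_stable_Z(A)) / A for A in range(20, 240)}
peak_A = max(per_nucleon, key=per_nucleon.get)
print(peak_A, round(per_nucleon[peak_A], 2))  # peak lands in the iron region
```

Everything to the right of that peak is downhill the other way: building heavier-than-iron nuclei absorbs energy, which is why such elements form in supernovae rather than in ordinary stellar burning.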
I pretty much agree, but it's been over 20 years for cold fusion. At best, it's still a scientific anomaly. It's getting moldy and rotten from sitting in the basement of science for so long.