Completely wrong.
The degrees of freedom are generally calculated by subtracting the number of independent variables in the model from the number of observations. By itself, the number of variables in the model does not tell you the degrees of freedom in the analysis.
BTW I didn't like what I read in that abstract. If the relationship is not statistically significant, it's not statistically significant. Words like "suggestive" in describing the relationship between the model and the data have no place in a scientific report.
My error- it’s been a while.
Degrees of freedom (df) is a measure of the number of independent pieces of information on which the precision of a parameter estimate is based. The degrees of freedom for an estimate equal the number of observations (values) minus the number of additional parameters estimated for that calculation. As we estimate more parameters, the degrees of freedom available decrease. Degrees of freedom can also be thought of as the number of observations (values) that remain free to vary given the parameters already estimated.
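A minimal sketch of the arithmetic described above (my own illustration with made-up numbers, not anything from the thread): with n observations and p estimated parameters, df = n - p.

```python
observations = [2.1, 3.4, 1.9, 4.2, 3.3, 2.8]  # hypothetical data
n = len(observations)

# The sample variance uses one estimated parameter (the mean),
# so it has n - 1 degrees of freedom.
mean = sum(observations) / n
df_variance = n - 1
sample_variance = sum((x - mean) ** 2 for x in observations) / df_variance

# Simple linear regression estimates two parameters (intercept and
# slope), so its error term has n - 2 degrees of freedom.
df_regression = n - 2

print(df_variance, df_regression)  # 5 4
```

Estimating one more parameter (say, a second slope) would drop the error df by one more, which is the "decreases as we estimate more parameters" point in words.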
In fitting statistical models to data, the vectors of residuals are often constrained to lie in a space of smaller dimension than the number of components in the vector. That smaller dimension is the number of degrees of freedom for error.
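To make the geometric picture concrete (again my own toy example): if you fit a single mean to n observations, the residual vector satisfies one linear constraint, namely that the residuals sum to zero, so it lives in an (n - 1)-dimensional subspace. That n - 1 is the error df.

```python
observations = [2.0, 4.0, 6.0, 8.0]  # hypothetical data
n = len(observations)

mean = sum(observations) / n
residuals = [x - mean for x in observations]

# One linear constraint: the residuals sum to (numerically) zero.
# The residual vector therefore lies in an (n - 1)-dimensional
# subspace, giving n - 1 = 3 degrees of freedom for error.
residual_sum = sum(residuals)
df_error = n - 1
print(residual_sum, df_error)
```

Each additional estimated parameter adds another constraint on the residuals, shrinking that subspace by one dimension apiece.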
It is probably a P.C. thing. They want to keep their day jobs.