I think a more accurate headline is "Redesigned form increases conversion 25-40% (who knows why?)"
"While it's possible these adjustments also contributed to the increase, it's unlikely they were solely responsible for it"
What was he smoking when he wrote that? There's absolutely no justification for that statement that I can see.
I think the mad-lib form technique is interesting, and I'd love to see some good A/B testing of just that one change, without all the rest. You could also test reducing the number of input fields (also likely important), clearly marking which fields are optional and which aren't, and hiding optional fields behind a link instead of presenting them all at once.
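Testing just one change means comparing the variant's conversion rate against the control's and checking the difference isn't noise. A minimal sketch of that check, a standard two-proportion z-test (the function name and the conversion counts are made up for illustration):

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B test on conversions.

    conv_a/n_a: conversions and visitors for the control form,
    conv_b/n_b: same for the variant (e.g. the mad-lib form).
    Returns the z statistic; |z| > 1.96 means significant at p < 0.05.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: a 4.0% -> 5.2% lift on 1000 visitors per arm
# looks like a 30% improvement, but z is only about 1.28 -- not yet
# significant, which is why these tests need real traffic.
z = two_proportion_z(40, 1000, 52, 1000)
```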
But you're right, of course: there's no reason to assume the increase came from the obvious one of the 7 changes rather than from all (or one!) of the other six. Did anybody notice that the button call to action changed? I've seen A/B tests where that did 20% by itself. Did anybody notice that the header call to action changed? I've seen A/B tests where that did 20% by itself. Did anybody notice that they added the words "Thank you."? I've seen A/B tests where... actually, I haven't for "Thank you." (if you've got one, please share). But you get the general idea.
A fractional factorial design (http://en.wikipedia.org/wiki/Fractional_factorial_design) for the experiment, or other statistical techniques such as Bayesian experimental design (http://en.wikipedia.org/wiki/Bayesian_experimental_design), may decrease the amount of traffic needed.
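To make that concrete: a full factorial on the 7 changes would need 2^7 = 128 variants, but a 2^(7-4) resolution III fraction covers all 7 in just 8, at the cost of confounding main effects with two-factor interactions. A minimal sketch using the textbook generators D=AB, E=AC, F=BC, G=ABC (the mapping of the 7 form changes onto factors A..G is an assumption for illustration):

```python
from itertools import product

def fractional_factorial_2_7_4():
    """Build the 8 runs of a 2^(7-4) resolution III design.

    Each run is a tuple of 7 settings coded as -1 (change off) or
    +1 (change on), one per factor A..G. A, B, C form a full 2^3
    factorial; D, E, F, G are derived from the standard generators
    D = AB, E = AC, F = BC, G = ABC.
    """
    runs = []
    for a, b, c in product((-1, 1), repeat=3):
        runs.append((a, b, c, a * b, a * c, b * c, a * b * c))
    return runs

design = fractional_factorial_2_7_4()
# 8 distinct runs; every factor is off in half of them and on in the
# other half, which is the balance that lets you estimate each main
# effect from only 8 page variants instead of 128.
```

In practice you'd assign incoming visitors evenly across the 8 variants and regress conversions on the 7 coded columns.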
I don't think the headline or article is wrong, though I suppose he could say, "your mileage may vary".
Regardless, I think it's compelling enough to try it out on a few forms. I'll try it on our site and crow if there's a meaningful change in either direction.