I'm not an A/B expert by any means, but it seems like far too many changes were made to this form to give credit to just the 'mad lib' aspect. I've seen Patrick (patio11) mention that he's often surprised by how the tiniest change can result in a huge conversion difference. The second form is VERY different from the first.
I think a more accurate headline is "Redesigned form increases conversion 25-40% (who knows why?)"
I came here to say this; you beat me to it. Also, it's important to note the context of this "conversion": it's sending a message to someone you don't know. No one likes to send cold emails, so pre-writing the message takes a lot of the burden off. I'd wager that just having the "Comment" textarea pre-filled with a message would have a similar impact.
Yeah, there are a lot of differences here that could increase conversion. The second form actually includes some information about what the message is about, while the first form is completely bare and doesn't clue users in to what they're inquiring about. The first form is a wall of 9 input fields (with no indication of which are required or optional), while the second has 6, with a relatively discreet link for adding an optional seventh. The second also uses friendlier language overall and better separates the less relevant links at the bottom from the form itself, reducing the amount of information you need to pay attention to when submitting.
I think the mad-lib form technique is interesting, and I'd love to see some good A/B testing of just that one change, without all the rest. You could also test reducing the number of input fields (likely important on its own), clearly marking which fields are optional, and hiding optional fields behind a link instead of presenting them all at once.
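If they do run those isolated tests, checking each one for significance is simple enough. Here's a minimal two-proportion z-test sketch in Python (the visitor and conversion counts below are hypothetical, just to show the mechanics):

    from math import sqrt, erf

    def ab_test_p_value(conv_a, n_a, conv_b, n_b):
        """Two-sided two-proportion z-test: did variant B's
        conversion rate differ from variant A's?"""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        # Two-sided p-value via the normal CDF.
        return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

    # Hypothetical: mad-lib change alone, 1,000 visitors per arm.
    print(ab_test_p_value(100, 1000, 125, 1000))  # ~0.08

Note that even a 25% relative lift on 1,000 visitors per arm doesn't clear p < 0.05 here, which is why isolating small changes takes real traffic.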
They could be more certain with a multivariate test. I count 2^7 = 128 possible variations (counting the mad-lib form as just one point of difference), which would take an awful lot of traffic to discriminate between. So I sort of kind of understand the desire to wrap it up in one simple A/B test.
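Back-of-the-envelope, here's how fast the traffic requirement blows up. A rough per-arm sample-size sketch in Python, using the standard two-proportion approximation (the 10% baseline and 12% target conversion rates are made-up numbers for illustration):

    def sample_size_per_arm(p1, p2, z_alpha=1.96, z_beta=0.84):
        # Visitors needed per variation to detect a shift from
        # conversion rate p1 to p2 at 95% confidence, 80% power.
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2

    # Made-up numbers: 10% baseline, want to detect a lift to 12%.
    n = sample_size_per_arm(0.10, 0.12)
    print(f"{n:,.0f} per arm")             # ~3,834
    print(f"{128 * n:,.0f} for 128 arms")  # ~490,721

And that's before correcting for the multiple comparisons you'd be making across arms, which pushes the requirement higher still.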
But you're right, of course: there's no reason to assume the change came from the most obvious of the seven changes rather than from all (or one!) of the other six. Did anybody notice that the button call to action changed? I've seen A/B tests where that did 20% by itself. Did anybody notice that the header call to action changed? I've seen A/B tests where that did 20% by itself. Did anybody notice that they added the words "Thank you."? I've seen A/B tests where... actually, I haven't for "Thank you." (If you've got one, please share.) But you get the general idea.