    int new_sequence_number(int old_sqn, int foosoft)
    {
        if (foosoft == 2) // version 5.34.2 of the protocol
            return old_sqn + 3;
        else if (foosoft)
            return old_sqn + 2;
        return old_sqn + 1;
    }
    int new_sequence_number(int old_sqn, int step)
    {
        return old_sqn + 1 + step;
    }
One big red flag here is that the new function body isn't what you'd expect from its signature. Why is the sequence number incremented by step + 1 instead of step? The only reason I can think of is that they wanted to keep old callsites working unchanged, but that sits oddly with the change in the argument's meaning, from a protocol flag to a step size.
Ideally, the function signature itself should change so that the compiler will point out all of the obsolete usages to you. If you can’t do that with the type system, change the function name.
(And that one person proving me wrong and reading the 10,000 lines would then proceed to nitpick endlessly on yet more details irrelevant to the blog post's point....)
Because the problems now cut across the responsibilities of components that were built on certain assumptions made for the sake of removing the spaghetti. You now need to account for "global" parameters (the protocol version) in lots of disparate "local" scopes - individual functions that are quickly losing their degree of encapsulation.
We're discussing a hypothetical code-base, so focusing on the details probably isn't going to be very helpful. In any case, I don't see at all how the argument "flows from the outcome" of the refactor you're focusing on.
Specifically, he’s using the failed outcome of this hypothetical refactoring to argue that new_sequence_number should never have been broken out into a separate function in the first place, because doing so separated the foosoft == 2 case from the conditional it belonged to, which made the bad refactor more likely to happen.
Certainly, in this hypothetical 1,500-line function, the two cases might have appeared close to each other, or they might not have. In this particular case, it’s entirely possible that breaking the logic out made the refactor easier to execute successfully, because the programmer could leverage the compiler to find all the locations that needed to be updated.
In the end, he’s demonstrated a flawed hypothetical procedure and asserted that it’s worse than an imagined alternative.
As usual, the argument being unsound doesn’t mean the conclusion is incorrect, only that it hasn’t been demonstrated here.
Personally (and anecdotally), I don't believe that this statement is true. In my experience, an appropriate refactor is always more readable than any spaghetti code. I may be wrong and your statement may be true, but without good examples of such refactors, who can say?
The above commenter is just pointing out that the example given in the article is absurd. Which it is. And with that, the argument is lost.