
Regardless of reuse, controller actions can get awfully large if unchecked. What would you say is the maximum LOCs for a public controller method?

Callbacks in Controllers and AR::Models tend to make things worse IMO for anything other than authentication, and for intricate actions, interactors seem like the best defence.
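For context, the interactor-style service object being discussed is usually just a plain Ruby object. A minimal hedged sketch (all names are illustrative, not from any particular gem):

```ruby
# Hypothetical interactor-style service object: the controller action
# shrinks to one call plus a branch on the result.
class PlaceOrder
  Result = Struct.new(:order, :error) do
    def success?
      error.nil?
    end
  end

  def self.call(user:, cart:)
    new(user, cart).call
  end

  def initialize(user, cart)
    @user = user
    @cart = cart
  end

  def call
    return Result.new(nil, "cart is empty") if @cart.empty?

    order = { user: @user, items: @cart.dup } # stand-in for persistence
    Result.new(order, nil)
  end
end

result = PlaceOrder.call(user: "alice", cart: ["book"])
result.success? # => true
```

The controller then only does `result = PlaceOrder.call(...)` and renders based on `result.success?`, which is the reuse-and-thinness argument in a nutshell.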

I have had the (mis)fortune to jump into a number of large Rails codebases and I can say with hand-on-heart that the only ones that made any sense off-the-bat were ones using a this-or-similar pattern.

The community is embracing policies, interactors and decorators for a reason. Real developers are having real problems when Rails apps get a certain size, and there is no endorsed method on how to handle these problems.

Surely a couple of asides on the Rails docs and some official endorsement could help point new developers in the direction of a possible solution? A solution the industry already seems to be taking; regardless of whether 37Signals deems it fit for their particular domain.

> What would you say is the maximum LOCs for a public controller method?

As small as it needs to be in order to get the job done, and as large as it must be to get it done clearly. Sometimes that's zero lines of code; sometimes it's 500. You can usually factor down a 500-line method, but if the code really is single-use, it's worth asking what the breadth cost of doing so would be.

If your metric is anything else, then you're playing Stupid Metrics Games, like those espoused by Code Climate.

You're really against Code Climate huh? Just what did it do to you to garner such hatred? I'm not personally a fan but I can see how it may help large companies in certain situations.

Anyways, when I asked about a "maximum" I was not asking in absolute terms. What do you see as a reasonable max for the majority of use cases?

At which point do you start feeling bad about your method?

I'm against Code Climate because it promulgates nonsensical metrics in ways that are actively harmful to good engineering. Just like Bob Martin and his concept of “clean code”. All of these things are at best sigil markers. They aren't solutions, and they can't provide any answers.

I don't have a max length to a method, because I'm not a prescriptivist. What I have is a point where I stop understanding what a method is doing, or I can't explain it cleanly. Sometimes I keep the method as long as it is, but comment it better. Sometimes I factor it out. The point at which I do so varies based on the code, time, and many other factors.

I've worked on a C++ codebase that had a 3,000+ line function that was nearly impossible to factor out because of a non-trivial amount of state that would have either:

a. required passing much of that state as parameters to the factored out functions/methods (even if you put it in a wrapper object/struct); or

b. extracting all of the (working) behaviour into one or more objects whose sole purpose is to encapsulate the behaviour of this function.

I absolutely hated working on that function, but it was essentially the main loop of the program. You could step through those 3k lines and get a fairly decent feel for what the program was going to do, when, and how often it would repeat. The hardest part was where people before us had extracted code out that was called exactly once. We tried three times to extract the code, and failed three times; we ended up leaving the code the way it was because our extractions were more complex and less understandable than the existing crappy code.

My point is that there's no such thing as a reasonable max for the majority of use cases. It depends on what you're doing and in what code.

"had a 3,000+ line function that was nearly impossible to factor out"

Sounds like the only reason it was nearly impossible to factor out was because someone didn't stop at line 500 and say "hey this is getting out of hand".

I hope I wasn't a contributor to the 3,000+ line function you're describing.

No, that's an earlier company I worked at. It existed before I got there, and I suspect it's still there now.

Do you have an example of a public controller method larger than even 25 lines that does its job clearly?

Controller method that I can share? No.

This method in mime-types is reported by Code Climate as a code smell. They're wrong: it's the smallest it can possibly be while still correctly performing the necessary goal; any smaller, and you have to break it into multiple smaller methods that provide no value except keeping Code Climate happy. https://github.com/halostatue/mime-types/blob/master/lib/mim... The method is ~35 lines long. There are other cases I can provide from open source work (https://github.com/halostatue/mime-types/blob/master/lib/mim... is a good example: deprecated code, parser for a file type where splitting into multiple methods only complicates the logic flow).

It is true that the larger a method gets the less readable, understandable, and clear it gets—but that should never translate into the sort of nonsensical request you made, asking about maximum LOC. Asking that paints you as a prescriptivist who doesn't bother understanding the context the code requires.

I've been writing software for a long time, and while I try to write methods as short, clear, composable “paragraphs”, I sometimes will write something much longer than is readable because I can't figure out a meaningful way to break it down. That comes over time and reading and interaction with the code.

The best developers in my experience are pragmatic. They are aware of design patterns, but don't treat them like sewing patterns. They are aware of development practices, but don't treat them like holy writ. The GoF design patterns book is an excellent descriptivist treatise, but then people started treating it like a cookbook and looked for reasons to implement Patterns everywhere, rather than extracting the patterns from their code.

Blog posts like this one (that tell me that I should use an Interactor pattern) are actively dangerous, because they provide dicta without properly explaining the pain that the pattern evolved to solve, or the proper evolution of the pattern.

The current code base I work on shows a lot of evidence of Rails Fads just like this one, and I'm trying to get my team to step back and ask why we do things certain ways and to go back to first principles. Don't just reach for a Presenter because it's what was done before. Don't even reach for an ActiveModel::Serializer because it's what we're preferring now for API representations. Figure out your problem. State it clearly. Write the solution clearly. Find code that works similarly and figure out (a) if and (b) how they can be extracted into common code.

There is a cost to adopting things like Presenters and making smaller methods: your interface becomes larger. You can complain all you like of large files and functions, but code bases that have large numbers of classes whose purpose aren't clear…are harder to navigate and understand. (I have, in the current code base, unextracted code from external classes when that external class is used in one place and it makes the behaviour more understandable. It also provides a better place to understand where similar behaviour may appear later so that we can properly extract code if and when it is necessary later.)

> They're wrong: it's the smallest it can possibly be while still correctly performing the necessary goal; any smaller, and you have to break it into multiple smaller methods that provide no value except keeping Code Climate happy.

Wrong? Smallest it can possibly be? Those are strong claims.

I definitely disagree with Code Climate from time to time. But I try to keep an open mind.

I've found that trying to follow dumb rules can be enlightening, even when my first reaction is that I disagree. This goes for Code Climate as well as arbitrary rules, such as Sandi Metz's: https://gist.github.com/henrik/4509394

I looked at your `priority_compare`, for example. As someone who did not write that code, I find it a little hard to follow. It has conditionals four levels deep (counting the ternaries). Your `Mime::TYPE` class is quite long, as is its test.

I would let the Code Climate feedback inspire me to attempt to extract a method object in this case, along these lines: https://gist.github.com/henrik/9355101

That would result in more code overall, but smaller objects on average, and to my eyes readability is improved because the complexity is a little abstracted. Do you truly feel it provides no value?
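To make the shape of the disagreement concrete, a method-object extraction of this kind typically looks something like the following sketch (names and comparison rules are invented for illustration; this is not the actual gist or the mime-types code):

```ruby
# Hypothetical method-object extraction: the comparison logic moves into a
# small class that is instantiated per comparison.
class PriorityComparison
  def initialize(one, two)
    @one = one
    @two = two
  end

  def compare
    if @one.registered != @two.registered
      @one.registered ? 1 : -1 # illustrative rule: registered sorts later
    else
      @one.name <=> @two.name
    end
  end
end

# Stand-in for the real type being compared.
Type = Struct.new(:name, :registered)

a = Type.new("text/plain", true)
b = Type.new("text/x-foo", false)
PriorityComparison.new(a, b).compare # => 1
```

The trade-off both sides are arguing about: each branch gets a name and a home, but every comparison now builds an object and the logic lives in a separate file from the data it compares.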

Your extraction attempt actually proves my point: your code is more complex, is less readable, is demonstrably wrong, and destroys performance in at least two ways (I’d benchmark my “ugly” code against yours every time and win).

More complex and less readable: you’ve taken the comparison out of context of the class which is being used for comparison and put it in a different class and file that has to be opened for understanding a priority comparison. Worse, that separate class and file isn’t going to be reusable for anything else, because it depends heavily on the (admittedly large, but justifiably so) public interface of MIME::Type. So, readability is made worse for the sake of a dubious improvement in testability and no value in reuse.

If your editor supports code folding (it should, and you should be using it), then you don’t even have to see the implementation of `priority_compare` if you don’t need it.

Demonstrably wrong: you’re calling `#simplified_value` twice. You could make that not as ugly by caching, but that’s already known to be a waste of time because your extraction…

Destroys performance: `priority_compare` is used for sorting (as the code comment states outright). Creating a new object for each and every comparison is going to send sort performance to hell in a handbasket with rockets on it. It’s the same problem that plagues people who use decoration heavily: performance sucks and memory use gets ugly quick. The calling of `#simplified_value` twice doesn’t help here, and caching the value only makes memory use worse.
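The allocation argument can be seen with a toy count (assumed shapes, not the mime-types code): a comparison sort invokes the comparator O(n log n) times, so any object built inside the comparison is allocated thousands of times even for a modest array.

```ruby
# Toy illustration: count how often the comparator runs during a sort.
# Each call here would allocate one Pair if we used a comparison object.
Pair = Struct.new(:one, :two) do
  def compare
    one <=> two
  end
end

items = (1..1000).to_a.shuffle
allocations = 0

sorted = items.sort do |a, b|
  allocations += 1 # one Pair per comparator call
  Pair.new(a, b).compare
end

# For 1000 elements the comparator runs on the order of n * log2(n),
# i.e. roughly ten thousand times, not one thousand.
puts allocations
```

That multiplier is why a per-comparison object is so much worse than it looks from reading the extraction in isolation.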

The code comment on `priority_compare` is clear: it’s a specialized implementation of `<=>` used for sorting from MIME::Types#[]. That should immediately halt any consideration of extracting the code into an object that needs creation, and continuing on with that is a failure of good software development in favour of purity suggested by stupid rules. Even extracting it into a class method is of highly questionable value, because now you have to (as you do with your extraction) know the left and right comparison objects, rather than (safely) assuming that the undecorated version is the object itself. It’s rather the difference between operator overloading in C++ where you need to provide both left and right values vs only providing the right value because the left value is the object itself.

I considered separating the if/elsif conditions to separate methods, but you still need the if/elsif conditions, and all you’re doing is pushing code around in ways that, once again, will not be used outside of `priority_compare`.

I had considered implementing `priority_compare` at the call-site, but it’s used at two places in MIME::Types. So, the `priority_compare` is reused, but the component parts aren’t, really.

I will admit to laying a trap with `priority_compare`: I’ve worked on this code off and on for ten years. Code Climate, being a naïve program, cannot understand how the code is going to be used, but only understands how it’s defined and measures it against some fairly simplistic and stupid metrics. They can be interesting when you are first approaching a program to understand hotspots, but their value drops to zero very quickly compared to the understanding you have from working with the code regularly.

Where I think Code Climate (and similar tools; they just happen to be fairly high-profile in the Ruby community) goes wrong is that a lot of inexperienced and naïve developers tend to treat it as Truth that Must Be Followed. Many of these folks are also fans of so-called “Clean Code” techniques and otherwise lack the experience necessary to know when not to follow these sorts of rules; they assume that Code Climate is good because it simplifies the score to a GPA-style grade.

Wrong: I'm sure it is. I did it without tests and didn't go through the logic carefully since that wasn't at all the point. The design was.

Performance: could be. If your context is one where performance is paramount and trumps readability/maintainability, that may certainly be easier to achieve with less factored code. I spent zero time trying to optimize the code.

I suspect this design doesn't make that as hard as you make it sound, though. My team writes code in the style of my example, and we don't have problems optimizing when it's called for. Though we avoid optimizing at the expense of readability unless we see a real problem.

I find it interesting that you think my example is clearly "more complex" and "less readable". I, of course, feel it is the other way around.

The fact that you've worked (mostly on your own, going by the contributor stats) on this library for ten years will of course mean that you know your way around it.

As a visitor to the codebase, I see big classes and big chunks of code that require me to load a lot into my head and digest it before I can make sense of it. Breaking that up a little means I can make sense of a smaller piece of more self-documenting logic, then dig down to lower abstractions as needed.

If you're saying that you find your way around your code better than my refactored code, I believe you.

If you're saying you think the refactored code would be harder to understand for the average developer, I think you're dead wrong.

Re: wrong: it's not only wrong, but your admission that you didn't go through the logic carefully (that is, you didn't pay attention to the comments, you didn't pay attention to the spots where it was called, etc.) says that you didn't actually pay attention to the design of the system. You refactored for a nonsensical version of “object purity” (trying to satisfy for those metrics) instead of correctness. You claim to have tried to make a point about the design and failed, because your refactoring at best did not help readability, didn't actually improve any of the metrics that Code Climate reports about, and objectively made the code perform worse in the context of how it's called.

This is what happens when you refactor in a vacuum, and the danger of following tools like CC or even the “advice” given in the article at the top of this chain. I would never let a developer who works with me get away with the sort of refactoring attempt you did.

As far as your code being more complex and less readable, you have traded minor depth complexity for greater breadth complexity without simplifying the main comparison at all, and (as I keep pointing out) made the performance worse, to boot. Your code is less readable because now I've got some kind of comparison object that I have to instantiate—and it takes two objects called “one” and “two”. Now, when looking at your priority comparison object, I have to jump between mime/type.rb and mime/type/priority_comparison.rb to understand the data values that are being compared.

The reality is that the MIME::Type class is dead simple: it's a validated data object with the self-documenting descriptive properties. MIME::Types is even simpler: it's a container that knows how to search for its contained class. I've done some fairly aggressive refactoring (it just used to be two files) to extract parsing and file interaction from the main classes, and what's left is just what's required to interact with the data provided by a MIME::Type and to work with a MIME::Type collection (although some of it is going away because the previous on-disk data representation was too anemic, and the current on-disk data representation provides just what's needed in a fairly efficient manner).

That you look at these and say “big classes!” says much more about your own biases than about the code. mime-types provides very little deep functionality, but a lot of documented and self-documented code. It just happens that the data included has a lot of attributes, and I'm a big believer in backwards compatibility. Next year, when I release mime-types 3.0, I'll be removing the deprecated functionality—and it'll cut the code in MIME::Type by about 35-40%, but it will still leave things that will trip up naïve complexity advice and refactorers.

I misspoke, though, about improvements to priority_compare: when I feel comfortable making mime-types use refinements, then I would probably make an in-place refinement of NilClass to support #<=>. That probably won't happen until 2015 or 2016, though—I have to keep supporting versions of Ruby that don't support refinements until after the interpreter version EOLs.
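For what it's worth, such a refinement might look like this minimal sketch (module name is illustrative; assumes Ruby >= 2.1, where top-level `using` is official):

```ruby
# Hypothetical refinement: make nil comparable, sorting before any non-nil
# value, without monkey-patching NilClass globally.
module NilMinimum
  refine NilClass do
    def <=>(other)
      other.nil? ? 0 : -1
    end
  end
end

using NilMinimum

p(nil <=> 5)   # => -1 (without the refinement this is nil)
p(nil <=> nil) # => 0
```

Because the refinement is only active where `using` is called, the rest of the program (and other libraries) never see the changed behaviour, which is the appeal over a plain monkey-patch.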

Also glanced at the load_from_v1 code.

I can't argue with the "deprecated code" bit – if you never need to understand or edit that code again, by all means leave it. There will be no payoff for spending time reducing the complexity.

But "splitting into multiple methods only complicates the logic flow" sounds less like Code Climate being wrong and more like you not having found good refactoring techniques, maybe.

I see several things that I think have a good chance of helping readability: extracting the whole thing to an object, then replacing the inline error handling with an error handling method ("begin" is a smell), extracting (and naming) whatever happens on each iteration over the data, and so on.
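As a rough sketch of the kind of extraction being suggested (class name, line format, and error cases are all invented here, not the actual load_from_v1 code):

```ruby
# Hypothetical "extract the error handling" refactoring: the begin/rescue
# moves behind a named method, so the main loop reads as intent only.
class V1Loader
  def load(lines)
    lines.map { |line| with_parse_errors(line) { parse(line) } }.compact
  end

  private

  # Inline begin/rescue in the loop becomes one named, reusable method.
  def with_parse_errors(line)
    yield
  rescue ArgumentError
    warn "skipping malformed line: #{line.inspect}"
    nil
  end

  def parse(line)
    name, flag = line.split("|")
    raise ArgumentError, "missing name" if name.nil? || name.empty?

    { name: name, registered: flag == "registered" }
  end
end

V1Loader.new.load(["text/plain|registered", "|bad"])
```

Whether this helps or hurts a deprecated, single-purpose parser is exactly the judgment call being argued over in this thread.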
