Design Patterns – A comprehensible guide (github.com)
268 points by kamranahmed_se on Feb 16, 2017 | 110 comments

In a galaxy far far away, there was a kingdom where people didn't know structured programming. Their programming languages only had (conditional) branches, jumps, or GOTOs (as they were colloquially called). And they had a very influential programming book, called "GOTO design patterns". Here are a few examples of design patterns from the book:

- "IF pattern" - If you need to execute code only if some condition is true, you write a conditional GOTO that lands after the code that is to be executed conditionally.

- "IF-ELSE pattern" - If you need to execute code only if some condition is true, and some other code if the condition is false, you use IF-ELSE pattern. It's similar to IF pattern, but at the end of the code in the IF pattern, you add another unconditional GOTO that branches after the code that is to be executed if the condition is false.

- "WHILE pattern" - If you need to repeatedly execute code and check some condition after each execution, you add a conditional GOTO at the end of the code that goes back to the beginning of the code you want to repeatedly execute.

- "BREAK pattern" - This is a pattern to use with the WHILE or UNTIL patterns. If you need to stop the repeated execution from somewhere in the middle of the loop body on some condition, you add a conditional GOTO there that branches to just after the repeatedly executed code block.

Everyone was happy with the book. No more messy spaghetti code full of random GOTOs (as was common before the book appeared), because young apprentices memorized all the patterns in the book, and they just applied them rigorously. Also, understanding code became easier, because everybody instantly recognized those patterns in the well written code, and they had a common vocabulary to talk about these things.

One day, a traveler from another world came into this kingdom. He looked at the book and wondered: "Why don't you just add these patterns into your computer language as abstractions? Wouldn't that aid comprehension even more?" Unfortunately, he was executed for the heresy.

This is, and always has been, a poor criticism of design patterns, by people who think they're just a library of code that idiots are supposed to cut and paste to be better programmers.

A common example from the GoF book is the Strategy pattern - most functional programming snarks will jump in and say you don't need this! We have first-class functions you can pass around! And it's entirely true that an implementation of this pattern (a good, simple one even) is just accepting functions as arguments - cool, you probably get it! But the _design_ principle is about making strategies _explicit_ - allowing decisions to be made or policies to be enforced in ways you don't anticipate. I still see _tons_ of libraries written by smart people in functional languages that will still accept an enum of some kind to tell them what to do in certain situations (e.g. a lot of HTTP libraries' extremely limited retry logic). That's an explicit design decision - to limit options instead of opening them up. Having a language to talk about the _design decisions_ at a higher level, instead of conflating them with language features, is the whole point. Everybody knows you can pass functions around as arguments; that doesn't necessarily prompt you or make it easier to discuss implementing a Strategy.
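
To make the contrast concrete, here's a minimal sketch (hypothetical API and function names, not from any real HTTP library): an enum-style retry knob versus a retry strategy supplied by the caller:

```python
from typing import Callable

def make_flaky(fail_times: int):
    """Returns an operation that fails `fail_times` times, then succeeds.
    Stands in for a real HTTP request."""
    calls = {"n": 0}
    def op():
        calls["n"] += 1
        if calls["n"] <= fail_times:
            raise IOError("transient failure")
        return "ok"
    return op

# Enum-style API: the library enumerates the policies it thought of.
def call_with_enum(op, policy: str):
    tries = {"none": 1, "three_times": 3}[policy]  # a closed set
    for i in range(tries):
        try:
            return op()
        except IOError:
            if i == tries - 1:
                raise

# Strategy-style API: the caller supplies the policy as a function,
# so decisions the author never anticipated remain possible.
def call_with_strategy(op, should_retry: Callable[[int, Exception], bool]):
    attempt = 0
    while True:
        try:
            return op()
        except IOError as exc:
            if not should_retry(attempt, exc):
                raise
            attempt += 1

# A policy no enum anticipated: up to five retries, but only for
# errors that look transient.
result = call_with_strategy(
    make_flaky(2),
    lambda n, exc: n < 5 and "transient" in str(exc),
)
```

Both versions "pass a function" at some point; the design difference is whether the policy is an open extension point or a closed menu.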

Design patterns aren't a static, finite set of code snippets, and I actually think things like the linked post that mostly treat them that way aren't helpful. They give you a way of naming and talking about components of a design, possibly at a higher granularity than your programming language (and sure, different languages have different granularities).

We've been hearing this same exact criticism for literally 20 years and it's boring at this point. I get that most people don't bother creating new design patterns in public these days (because they've internalised that modern languages have 'design patterns' built in), but if you look at something like React and think it's just JavaScript and there are no design decisions in there worthy of reifying, I think that's sad and unproductive.

My main point (just to make sure we are on the same page) was that whatever patterns are, we should also tell computers "look, here I use this pattern", and not rely just on humans to see it. Unfortunately, computers are idiot savants, so this very well might require those to be formally defined.

Anyway, I think we have an ontological disagreement here. I don't believe the "design patterns" which cannot be formally defined are useful, or even that they exist. I consider them to be just phantoms, which when you attempt to formally define them, you either understand them much better (to the point where you can see how they are related), or they disappear as something that was misconceived in the first place. You seem to think that there is something real.. fine.

Possibly I see it that way because of my mathematical training. In mathematics, this battle was fought more than a century ago and the formalism camp won. That doesn't mean that intuition is completely useless... just that people mostly keep their intuitions (and inspirations and analogies) to themselves, and when they communicate, they strive to be formal.

To your React example - it's not a pattern, it's a piece of concrete reusable code. We don't need "React pattern", but we need React. Just like mathematicians, we should communicate "patterns" via reusable code.

But you don't _talk_ exclusively in formalisms to other mathematicians, do you? You don't immediately jump into a specific form of calculus upon first hearing of a concrete real-world problem?

And I'm not saying React itself is a pattern. I'm saying if you wanted to create your own React-like framework, or to communicate the design decisions and mechanisms that make it different from other frameworks, there is value in extracting and naming those concepts. This way, people can appreciate, understand, compare and contrast them _outside_ the context of one piece of concrete reusable code. As an entirely throwaway and unthought-out example, you might say 'Differential Rendering' is philosophically different to a 'Mutable Document'.

But if you think all things in the design of software are reducible to reusable pieces of code, more power to you.

> But you don't _talk_ exclusively in formalisms to other mathematicians, do you?

In informal settings (like a lecture or workshop), maybe not. But when publishing, you always talk in concepts that are defined formally. That doesn't mean that you use formal language all the time, but every time you introduce some definition or term, it is understood (obvious) that you could provide a completely formal definition. Also, having a formal definition doesn't preclude using motivational examples.

So no, I don't think a book like "Design Patterns" could happen in professional mathematics today. Mathematicians simply don't introduce a shared vocabulary of informal definitions. But it wasn't always like this - Euler used words like "curve" without having a completely formal definition (and without worrying about it). But eventually we learned this was a problem, and I think we will learn the same about programming patterns (and that's why, IMHO, people largely stopped trying to name new patterns).

> And I'm not saying React itself is a pattern.

Yeah, I understand, I was just taking a rhetorical shortcut. What I am saying is that you should be able to point at a set of functions in React and say "these embody one of the ideas behind React" (although there can be technical reasons that prevent doing exactly that on the current React codebase). Then the "pattern" you have in mind is well-defined by these functions.

> But if you think all things in the design of software are reducable to reusable pieces of code, more power to you.

I think it must be, because the "pieces of code" are the ultimate end goal. So there must be a point where the informal definition of a pattern gets translated into formal code. And to do this translation reliably, you need to be able to enumerate (or parametrize) the choice of formal options that stem from the informal concept, and this process is actually the formalization of said concept.

I would say software design patterns are a rather special case, because they involve code readable by a computer. In contrast, the original design patterns in architecture don't need to relate to computer code (there is no code reuse), so they are perfectly fine with not being formally defined.

I guess I just don't inhabit this world where there is nothing more to software design than reading and writing code. I write comments, I read readmes and documentation, and I find shared, human descriptions of larger granularity design abstractions useful. ¯\_(ツ)_/¯

> I write comments, I read readmes and documentation, and I find shared, human descriptions of larger granularity design abstractions useful.

Oh, I have nothing against that, except the word "shared". I just don't think that "design patterns" (from the GoF book) helpfully contribute to that. In practice, it really depends on what you do or don't want to gloss over as a technical detail, and just saying "there is this pattern" doesn't give you enough flexibility to do it.

What I am saying is that there is a trade-off - if you decide to name something, you should define it well enough not to be misunderstood. If you can't or don't want to define it well enough, better not to name it at all (except maybe temporarily); just describe your intuitions/feelings about it. The authors of "design patterns" seem to be oblivious to this trade-off (although I think it wasn't really their fault that the names for patterns got so popular; I think they just wanted to have fancy chapter titles).

It's interesting though. When I describe how something works or should work, I use terms like "client", "server", "memory", "unique random number", "sort by", "find maximum", "queue", "remote call", "common API" etc. But I never found a compelling reason to describe things in the names of (standard) design patterns. Why are these terms not considered to be design patterns? In my opinion, it is because design patterns tend to be overly specific in some ways (talking about class/interface relations) without adding enough substance to be meaningful on their own.

So I don't find them useful to describe design (overly specific), nor do I find them useful to describe code (not formal enough). They are just kind of slippery for both use cases.

There are downsides to abstraction, too.

The urge to explicitly abstract every pattern can end up producing a brittle network of dependencies. It can become really hard to adapt the formal abstractions to small requirement changes, or else the abstractions end up byzantinely parameterized.

Or you have to spend decades doing category theoretical research to discover the platonic essence of your patterns...

Formalization hasn't really won in math communication, has it? How much math research is published as Coq terms or whatever? Maintaining all of math as a network of formal abstractions is really difficult.

Basically the insistence on maximal abstraction seems to imply that you try to minimize the length of application source code and as a side effect you maximize the complexity of programming languages and frameworks.

By the way, Richard Gabriel discusses this issue in a chapter of his book "Patterns of Software" (free PDF available).

I think formalization and abstraction are two different things, and you conflate them in your comment.

Formalization, in my view, is about "knowing what you are talking about". So you have a perfectly defined set of basic operations on the concepts you use, and you know what you can do with them and what happens if you do; you cannot cheat by making stuff up as you go along.

On the other hand, abstraction deals with "how general I can make this thing" (AKA reusability).

What I mean by "formalization has won in mathematics" is that when you write a paper, all concepts you are using have to be formalized, or at least it must be obvious how they can be, typically through a shared context of standard theories (so you don't have to use formalism everywhere, although it would certainly be desirable and eventually I think it will happen).

But that doesn't mean you have to always use the maximal level of abstraction. It's perfectly valid mathematics, for instance, to prove a theorem just for Euclidean spaces even though there is nothing that could stop you from proving it for topologies. (And I think Bourbaki group learned about the cost of abstractions the hard way.)

The same is fine with code - if you think abstraction hurts it, absolutely go for lower level of abstraction. But it will hurt reusability of said code, no doubt.

But if you want to talk about some reusable concept, you should be able to show the code at the correct level of abstraction (or at least be able to show that you are able to show it).

So that's my beef with software design patterns - they are abstractions, but aren't formally defined. It's like wanting to have your cake and eat it, too. They are at best transient thoughts ("funny, this thing here looks awfully similar to that thing there, how so") before somebody abstracts them formally. And I cringe when people think that they help communication, because in mathematics, we already went through that.

I was using abstraction in the sense of "lambda abstraction", like a formally specified function with parameters.

Well, I had to remind myself what the actual design patterns "movement" is, and I get the impression that it's a dead movement with no development since the classic book and a very limited influence.

It seems that maybe the bigger problem is that there is no "theory of software" aside from a few interesting attempts like "algebra of programming" etc. That's why I can't really blame people for using software concepts that aren't formally specified.

What's your favorite alternative to design patterns, or like the path you think is more fruitful and good?

After a few years of skepticism, I got convinced that pure functional programming (a la Haskell) is the right path forward. (What convinced me was that you can do the same things as in Lisp, with type checking; see http://www.haskellforall.com/2012/06/you-could-have-invented... and more recently http://www.haskellforall.com/2016/12/dhall-non-turing-comple... .)

I think having very strong foundations is very comforting, and the separation of side effects (and introducing them into the type system) has important consequences for composability. But it is still hard and a work in progress.

One of the more subtle criticisms I have of the classic design patterns is that they often rely on interfaces as descriptions of objects (in the general sense). But that is a very weak description, and indeed, the most useful type classes (which are comparable to interfaces) in Haskell also have laws (not expressible in the type system, for better or worse) that describe behavior. This was a point made by Erik Meijer: really, you want to describe the objects (that your abstractions can work with) by the laws governing them, not interfaces.

So yes, I think the type classes with laws that are commonly used in Haskell (which is pretty much an application of mathematical concepts) are the way forward, because they are both better defined and more fruitful (thanks to being explicit about laws) than design patterns.
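
The interface-versus-laws distinction can be sketched even outside Haskell (a toy illustration of my own, not anyone's library): an interface only promises the shapes of `empty` and `append`; the laws say what they must satisfy, and they are checkable.

```python
def monoid_laws_hold(empty, append, samples):
    """Check the monoid laws on some sample values: identity and
    associativity. An interface alone can't express either."""
    for a in samples:
        if append(empty, a) != a or append(a, empty) != a:
            return False  # identity laws violated
    for a in samples:
        for b in samples:
            for c in samples:
                if append(append(a, b), c) != append(a, append(b, c)):
                    return False  # associativity violated
    return True

# Lists under concatenation satisfy the laws...
lists_ok = monoid_laws_hold([], lambda x, y: x + y, [[1], [2, 3], []])

# ...while subtraction over ints has the right "interface" shape
# (a value and a binary operation) but fails them.
subtraction_ok = monoid_laws_hold(0, lambda x, y: x - y, [1, 2, 3])
```

The signature of `lambda x, y: x - y` looks exactly like a monoid operation; only the laws tell them apart.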

In particular, I suspect the whole field of group theory is still waiting to be discovered by programmers; I had this idea of looking at error handling and recovery, or transactional processing, as group operations within a certain subgroup (of valid system states). A no-op or empty transaction is the neutral element, and recovery from an action or a transaction rollback is the inverse element in some group. I am not yet clear on whether this is helpful or trivial, and I am not really sure how to proceed with it. But Monoid abstractions are already widely used in Haskell, so why not Group ones?
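
As a toy sketch of that idea (entirely my own illustration, not an established design): model the system state as a counter and each action as "add k", a genuine group under composition. The empty transaction is "add 0", and rollback applies the inverse elements in reverse order.

```python
class Action:
    """An invertible action on the system state (here, just a counter)."""
    def __init__(self, delta: int):
        self.delta = delta
    def apply(self, state: int) -> int:
        return state + self.delta
    def inverse(self) -> "Action":
        # The group-theoretic inverse: undoes this action exactly.
        return Action(-self.delta)

IDENTITY = Action(0)  # neutral element: the no-op / empty transaction

def run_transaction(state: int, actions, commit: bool) -> int:
    applied = []
    for a in actions:
        state = a.apply(state)
        applied.append(a)
    if not commit:
        # Rollback: apply the inverses in reverse order, returning
        # the state to where the transaction began.
        for a in reversed(applied):
            state = a.inverse().apply(state)
    return state
```

Whether real-world recovery operations actually form a group (rather than just a monoid) is exactly the open question.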

I'm super interested in all that stuff, but I also feel that it's not for everybody. Like just yesterday I saw someone on the Rust subreddit who mentioned that they think Haskell is cool but it tends towards too high abstraction levels for their taste, which I totally understand.

> You can do the same things as in Lisp, with type checking

You can't easily do even simple things like variable-argument functions, or polymorphic lists (required for representing source code for macro programming).

Lisp has loops, variable assignment, stateful objects of various types and so on. Also, strict evaluation for straightforward, deterministic sequencing.

I only touched on that point, but as I said - I used to think the same, but then I actually studied it and changed my mind. You can do all that in Haskell too, including interpreting homoiconic code and building DSLs (which was my main point), and it's all built on more solid foundations with integrated static type checking. It just works quite differently.

I didn't mean to imply that Haskell isn't Turing complete, of course.

It's not about that. Haskell has tools to do all that, and they are better. You can write imperatively in Haskell, you can write strictly or lazily in Haskell, you can have dynamic types in Haskell. There is actually more choice.

However, one (unfortunately) needs to study it to see it. I like Lisp, but curiously enough, it has its limits too.

> "Why don't you just add these patterns into your computer language as abstractions? Wouldn't that aid comprehension even more?"

I often asked myself this question! Why do I have to say

    public class DatabaseConnectionSingleton {
        public static DatabaseConnectionSingleton getInstance() { ... }
    }
and then get an object using:

    var db = DatabaseConnectionSingleton.getInstance();
Why can't I say

    public singleton DatabaseConnection {...} 
and just go

    var db = new DatabaseConnection();
and let my runtime make sure that I get the appropriate instance? Leaving aside the discussion of how useful that pattern is, I would like my language to be more powerful. I am aware of alternatives (Borgs in Python) and that many people consider Singletons an antipattern, as they help introduce globally shared state, etc., but then, why do I have to care about how instances are implemented?

After all, in C#, foreach interacts with iterators, why not push other mechanisms down into the language in the same way?


    // Unit of work, commits changes before the reference gets removed from stack
    public unit CustomerForm { } 

    // There can be only one.
    public singleton DatabaseConnectionHighlander {}

    // Any method called updates values in some way
    public builder QueryString {}
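
For what it's worth, some runtimes already let you push this below the call site. In Python, for instance, overriding `__new__` makes ordinary construction syntax hand back one shared instance (sketch only; `DatabaseConnection` is my hypothetical class, not a real API):

```python
class DatabaseConnection:
    _instance = None

    def __new__(cls):
        # The runtime-level hook: callers write plain construction
        # syntax, but every call returns the same instance.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

db1 = DatabaseConnection()
db2 = DatabaseConnection()
```

The caller never learns (or cares) how instances are managed, which is exactly the separation the `public singleton` keyword would buy.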

Just one of many reasons that Scala is such an attractive option for Java developers:

    // There is only one
    object DatabaseConnectionConnor {}

It's true that other languages often don't require certain patterns. Strategy and Visitor are often unnecessary when a programming language has functions as first class citizens.

I found that it often helps to remember that the GoF book was written about object oriented patterns with C++.

I appreciate the effort, but one of my pet peeves with OO / design pattern examples is when they're completely contrived.

Once you've understood the very basics of OO, I think talking about a door factory or a burger builder tends to obfuscate real-world usage more than it aids comprehension. I have this problem with most science/technology analogies, to be honest.

Reminds me of physics books from college where they might say "a horse is approaching a barn door at relativistic speeds..." - always seemed much too contrived and I'd maybe have preferred "an exotic particle is moving towards an electron at...".

I know I might be in the minority, but this kind of stuff doesn't bother me at all. The silly abstraction is essentially there to highlight which part of the problem / lesson you should be focusing on, and to stop you from going off on a tangent and getting bogged down in the gritty details.

Going with your exotic particle example, once they've framed the problem like that, there's an implied expectation of reality and correctness which is likely going to be missing from an introductory example.

Check out Soviet books. You don't get the feeling the authors assume you're mentally challenged as you do in most other books.

This is all fun, but I think knowing how to apply knowledge is an important part. Many don't have a problem with electrons and pulleys, but give them a real-world example and they're out of their wits compared to someone who is used to dead reckoning.


There's an interesting book on the site by Belikov: "General Methods for Solving Physics Problems".

The physical behavior of a horse is better understood than that of an electron and might make it easier to consider the subject at hand, though. We know what path the horse will travel and we know that when the horse reaches the barn, it will go splat (well, in this case the barn will go splat). There are several competing theories that could describe what happens with the electron and the exotic particle, though.

Also, it's fun to picture a horse travelling at relativistic speeds :)

In addition to contrived examples impairing comprehension, I think even a real-world example fails to help the reader capture the most important piece of information about design patterns: the context around when to use them. Giving a worked-out example while teaching the concept has a limitation: the reader didn't discover the pattern themselves in code. Instead it was ripped out of a code base and handed to them without the context of its use. Rather than being seen as an option selected for a particular code scenario, it is presented as a fully-worked solution that can be plugged in anywhere. And describing when to use it in words isn't as effective as discovering it in use in a code base you've been learning from.

Quite frankly, in my radical opinion, learning about programming through reading about it (books, articles, blog posts, etc.) isn't very useful once you are past being a beginning programmer. To become a better programmer, reading code is the only way I've found to learn new abilities without being led to over-engineer my own solutions. This might not work for everyone, but for me reading code daily has been a silver bullet for programming knowledge that a book or design pattern article could never match.

Right, stop making pained carpentry analogies. I couldn't build a wooden box to save my life, but I do have 7 years of professional software experience. I would much prefer if you drew from that experience.

Be careful what you wish for. We can always take the "zoo animals with varying legs" classes and interfaces out of the box again.

It would be great if there was a resource with real world examples, since I have this exact same problem.

I implemented some of the more popular design patterns using more real-life examples. You might find them useful:


I'd say when I was first learning about many of the design patterns, this was my biggest hurdle as well. The examples and analogies were unrelated to real-world scenarios and didn't do much to help me understand the concepts.

It wasn't until I came up with my own real-world examples that were specific to my domain and really applied the design patterns in practice that I really started to understand everything. I don't think it's really the real-world example that helped so much as the actual process it took to arrive at an implementation.

Grady Booch was working on a "handbook of software architecture". What he was doing was going to companies and looking at how systems have been built.

There's a presentation: https://pdfs.semanticscholar.org/242a/61777fd7f0f7ded8d9326e...

He was looking at the structure, and the patterns, in real-world systems.

Disclaimer: I haven't looked at it in many years, but I seem to recall Fowler's Enterprise Patterns[1], while covering a higher layer of abstraction, is grounded in real-world problems.

[1] https://www.martinfowler.com/articles/enterprisePatterns.htm...

It's not so much that they're contrived, but that starting with real-world, physical examples is ultimately not helpful. OO objects are not physical objects, and the sooner someone realizes this, the better.

A while ago, as an exercise, I decided to implement some of the more popular design patterns in Swift, using more real-life examples. If you're interested you can find it here: https://shirazian.wordpress.com/2016/04/11/design-patterns-i...

Questions like “Which paradigm is better for modeling the real world” are virtually meaningless. The choice depends upon how the world is conceptualized, and what additional properties are expected of the model.

The main value of the design patterns is that they have given names to things that were already common and fairly intuitive to a half decent programmer.

But I've come across many people trying to use them as a guide for how to write code - and misunderstanding them. Factory in particular is often misunderstood. It is mainly useful when you have parallel class hierarchies, not as some odd wrapper around all the constructors in the software.

I'm not sure what you mean by parallel class hierarchies. I use a factory if I have a class that needs to be able to make an instance of some other class, but shouldn't know about how to construct it (i.e. it doesn't need to know about the dependencies that other class might need to function).

I also do not understand "parallel class hierarchies".

Here is another use case for a factory: building an AST data structure in a compiler. There are nodes to represent Constants, Additions, FunctionCalls, etc., and you have to build a tree structure which represents a program.

Now consider the case where you build the addition of two constants ("36 + 6"). Using constructors, it could look like: new Add(new Const("36"), new Const("6"));

Using a factory: f.newAdd(f.newConst("36"), f.newConst("6"))

The advantage of the factory is that it can optimize on the fly and return a Const("42") node instead of an Add.
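
Here is roughly that idea sketched in Python (illustrative names, not from any real compiler): the factory folds constants while the tree is being built.

```python
class Const:
    def __init__(self, value: int):
        self.value = value

class Add:
    def __init__(self, left, right):
        self.left, self.right = left, right

class FoldingFactory:
    """Builds AST nodes, optimizing on the fly where it can."""
    def new_const(self, value: int) -> Const:
        return Const(value)

    def new_add(self, left, right):
        # Constant folding at construction time:
        # new_add(Const(36), Const(6)) yields Const(42), not an Add node.
        if isinstance(left, Const) and isinstance(right, Const):
            return Const(left.value + right.value)
        return Add(left, right)

f = FoldingFactory()
node = f.new_add(f.new_const(36), f.new_const(6))
```

The call sites look identical to plain constructor calls, which is what lets the optimization stay invisible to the tree-building code.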

For a cool compiler, you don't want to optimize this in the AST-building phase. The user may be running in a mode that only prints the AST, e.g. to implement their own autocompletion. Folding in the factory obfuscates the actual source.

I remember that at some point gcc -O0 was doing exactly that, and it is pretty irritating if you write a decompiler and want to test it. Fortunately clang does it properly.

I shouldn't speak for the OP, but I think they are referring to the fact that the factory pattern very naturally shows up when you also have the bridge pattern. You have 2 (or more) parallel class hierarchies and you need a way to construct the objects from the correct hierarchy.

You are also correct that it happens in other places too. Some people overuse it, though. They abstract out the construction of their objects for no reason at all. You already have a constructor -- usually there is no need to have an object that contains all the constructors.
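
To illustrate what "parallel class hierarchies" can mean here (my own toy example, not the parent's exact case): an abstract factory keeps two hierarchies, such as per-platform widgets, in lockstep, so client code can never pair variants from different hierarchies.

```python
# Two parallel hierarchies: Button and Scrollbar each have a Mac
# and a Linux variant.
class MacButton: style = "mac"
class LinuxButton: style = "linux"
class MacScrollbar: style = "mac"
class LinuxScrollbar: style = "linux"

class MacFactory:
    def make_button(self): return MacButton()
    def make_scrollbar(self): return MacScrollbar()

class LinuxFactory:
    def make_button(self): return LinuxButton()
    def make_scrollbar(self): return LinuxScrollbar()

def build_ui(factory):
    # The client never names a concrete class, so it cannot
    # accidentally mix a MacButton with a LinuxScrollbar.
    return factory.make_button(), factory.make_scrollbar()

button, scrollbar = build_ui(MacFactory())
```

When there is only one hierarchy and one constructor, this machinery buys nothing, which is the overuse being described.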

You have to be more clear with your terminology. The thing you are talking about is called "Factory Method Pattern". Don't call it just "Factory Pattern" because it is not really a factory. Instead it uses a factory to achieve a very specific goal.

No, it's an "Abstract Factory Pattern", but clearly I did not succeed in describing my use case adequately ;-).

The Abstract Factory Pattern is one of those patterns which I never fully understood because I never found a good real-world example which relates to my line of work. That's the reason why I misunderstood you ;)

Don't underestimate the power of naming things. If you can't name a paradigm you used then your solution is just something that does the job.

The value of design patterns and giving them appropriate name is to have a common language, a basis for communication. Sharing ideas or giving instructions is much simpler when you can actually name something with a few words.

The one you mean is called "Factory Method Pattern" in the Gang of Four book. The other one should be called "Simple Factory" or something like that.

They are very different things which unfortunately have very similar names.

I don't really see what you call a "Simple Factory" as being covered in the Gang of Four book. I should have been clear about meaning Factory Method, but the terminology is often apparent in class naming. So for the "Abstract Factory" pattern, I would use "Kit" in class naming. I would reserve "Factory" for the "Factory Method" pattern. Pointlessly wrapping constructors in a separate class and calling it a factory doesn't really get you anything. Neither does contriving a parallel hierarchy to pointlessly allow the use of the "Factory Method" pattern - something that I have also seen.

The reason I like Lisp is because programming in it teaches you how to organize your code well using the most basic and important design pattern: the function.

Design patterns are patterns which can be used for designs. All patterns can be used for designs; "design pattern" as a phrase merely signals that the pattern was described in a certain book.

Functions are too simple and concrete to be patterns. A function is a mapping from one set to another. Even category theory manages to fully generalize functions several different ways without managing to define something as general as a pattern.

So, you might ask, what are patterns? Patterns are things which repeat somehow. That's it. There might be more to it, but I have yet to find it.

I'm glad that you like Lisp, but it's not the beginning nor the end of programming. For example, the Post correspondence problem is Turing-complete. So are Wang tiles. Where's your functions now?

I believe ktRolster may be referring to "Procedural Abstraction". The term is used in SICP [1] (the Wizard Book, using Scheme; previously the freshman programming text at MIT).

Abstraction can take many forms, but can often be described as "separating things that change and/or repeat from those that don't", or something like that.

The last paragraph may be a joke, or based on one of the Principles in the O'Reilly Design Patterns [2] book.

[1] https://mitpress.mit.edu/sicp/ [2] http://shop.oreilly.com/product/mobile/9780596007126.do

I've read your comment several times, but I'm not sure what you're saying, actually. Certainly functions have been described in several books. No one said that functions are the only design pattern.

You can implement design patterns with functions, but functions themselves are not a design pattern.

Why not? They are a general, repeatable solution to a commonly occurring problem in software design - i.e., a design pattern. You had non-structured code, then people started using small chunks of code that you'd jump to and back from multiple locations. It's a pattern that just happened to get encoded directly into some languages, just like others (Norvig talks about the levels of implementation in his presentation on the subject: http://www.norvig.com/design-patterns/design-patterns.pdf)

To me, functions alone aren't abstract enough to be considered a design pattern. If they were, any arbitrary chunk of code, however disorganized, could be considered an implementation of that pattern as long as it used functions, which seems counterintuitive.

There are good ways and bad ways to use functions, just like any design pattern.

I think they're just a victim of their own success. In particular, the imperative-language concept of a 'function' is by no means universal. They were kind of grafted onto mainframe architectures, and even the MIX from the original The Art of Computer Programming used self-modifying code to perform subroutine calls.

Functions are an extremely common pattern, more or less expected in every language, but I'd still definitely call them a pattern.

A function is not a design pattern. A design pattern is a design that solves a problem. Without stating the problem to solve, you have not got a pattern. The OP stated that they used functions to organise their code, so you can imagine some problem that they were solving with functions. For another problem you might solve with functions, I was reminded recently that functions are an implementation of the state monad. If you read up about where you might use the state monad and why you need it, then perhaps you can get a better feeling for why a function is a good implementation of that pattern.
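The state-monad remark can be made concrete with a small sketch. Assuming the conventional encoding where a stateful computation is a function from a state to a (value, new state) pair, plain functions give you `unit` and `bind` directly (a hypothetical Python sketch, not from the thread):

```python
# A minimal sketch of the state monad encoded as plain functions.
# A "stateful computation" is a function: state -> (value, new_state).

def unit(value):
    """Wrap a value in a computation that leaves the state untouched."""
    return lambda state: (value, state)

def bind(computation, f):
    """Run `computation`, feed its value to `f`, and thread the state through."""
    def stateful(state):
        value, new_state = computation(state)
        return f(value)(new_state)
    return stateful

# Two primitive operations on the threaded state:
get = lambda state: (state, state)
def put(new_state):
    return lambda _state: (None, new_state)

# Increment a counter and return its old value:
tick = bind(get, lambda n: bind(put(n + 1), lambda _: unit(n)))

value, final_state = tick(41)
# value == 41, final_state == 42
```

The point is that no class, object, or language extension is needed: the "pattern" is fully expressed by how the functions are composed.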

Says someone who's never used a language that only had gosub. Figuring out how to use functions appropriately is one of the most important skills of software engineering.

>Says someone who's never used a language that only had gosub.

If using GOTO in Windows batch files counts, then I kind of have.

>Figuring out how to use functions appropriately is one of the most important skills of software engineering.

Yes, but that's not so much a design pattern as discipline. Knowing what patterns to use and when.

I think the reason that some of these patterns seem so terrible in real code is limited static type systems / OO - I'm specifically looking at Java, with its terrible type erasure and its poor (but getting better) functional programming support.

Stop the "for humans" meme. Try-hards use it to signal that they're "in", but it's outlived any meaning beyond some meta-meaning, which is no meaning at all. Just make your project quality. Don't proclaim it, just do it.

Note: this comment should not apply to Kenneth Reitz.

One can argue these topics until the cows come home, and possibly later. One can strongly argue for/against imperative programming, functional programming, OOP, design patterns, Design Patterns with Cool Names, Design Patterns as Implemented in Your Favorite Language, etc... At the end of the day, the only thing that matters is whether your team thinks your source code is structured or not. If you're on a team of one, well then you have your future self to contend with.

In my opinion, iluwatar's design pattern guide [1] provides more than this. It gives not only implementation examples but also a class diagram and a use case for each design pattern. The examples are implemented in Java.

[1]: https://github.com/iluwatar/java-design-patterns

I wish there were a resource that goes over design patterns with more abstract concepts/classes. I found the examples in GoF to be somewhat useful, but at a certain point (near the middle) I got lost in the details.

That's why the GoF book is available in the library.

People write more concrete tutorials and publish them online. Abstract literature is for researchers; concrete literature is for people who need a quick solution.

You want more abstract examples? Or more abstract patterns?

More abstract examples.

Thanks for this. And I just realised the examples are in PHP!

Object-oriented programming is a disease, and I can't wait until we're collectively done with it.

All of these abstractions, in the rare cases when they are actually needed to express a program, can be naturally expressed within the type system in a proper functional language.

The fact that there's this encyclopedia of hundreds of discrete things with arcane toxic names that practitioners are required to individually learn and carry around in their heads, is a crime against our profession.

> All of these abstractions, in the rare cases when they are actually needed to express a program, can be naturally expressed within the type system in a proper functional language.

If you think FP is mutually exclusive with OOP, you understand neither.

OOP is very powerful for organizing the architecture of your code and capturing very common patterns, such as reusing an existing piece of code with five functionalities while overriding one (something that's still awful to achieve in FP, regardless of the language you pick).

FP operates at a more granular level, basically how you implement your methods.

And by the way, FP has design patterns as well.

FP and OOP are extremely complementary.

>OOP is very powerful for organizing the architecture of your code and capturing very common patterns, such as reusing an existing piece of code with five functionalities while overriding one (something that's still awful to achieve in FP, regardless of the language you pick).

This is not true, ad hoc polymorphism works fine without subtype polymorphism.

You're absolutely right but you're not answering my objection.

Show me an example of a type class with five functions and then how I can reuse it while overriding one of these five.

You don't need overriding, in fact you should avoid it. All five functions are already valid operations on the type class (the data you have). If you want to add an operator, then you define another function (with another name) and call it. If the existing modules need to call that operator instead, change them to be explicit about what you're doing.

This is where OOP gets it wrong, because this situation leads to logical contradictions (breaking the type system), in particular, breaking of the LSP.

> Show me an example of a type class with five functions and then how I can reuse it while overriding one of these five.

Huh? As I'm sure you're aware "type classes" != "classes", so I'm not sure why you're expecting them to be 'interchangeable' in this manner. It would be better to ask for some functionality (not a mechanism, as you're asking) and then show how one could be achieved[1] in one, but not the other.

Personally, I think any kind of implementation override is a code smell. If you need common functionality, just have a plain function outside your class and use that from both interface overrides.

[1] "Elegantly" or "expressively", one presumes. Of course everything can be achieved in any TC language.
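The suggestion above (keep common functionality in a plain function outside the class, and call it from each interface implementation) might look like this minimal Python sketch; all names here are hypothetical:

```python
# Shared behaviour lives in a plain function, outside any class;
# each implementation of the interface simply calls it.

def format_report(rows):
    """Common formatting logic, shared by all sources."""
    return "\n".join(f"{name}: {value}" for name, value in rows)

class CsvSource:
    def report(self):
        rows = [("a", 1), ("b", 2)]   # pretend this was parsed from a CSV
        return format_report(rows)

class DbSource:
    def report(self):
        rows = [("users", 10)]        # pretend this was queried from a DB
        return format_report(rows)
```

No implementation inheritance is involved: the classes share code by delegation to a free function, not by overriding each other.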

If two things (typeclasses, interfaces, whatever) are very similar, I just want to write how they're different. Without something like inheritance I also have to redundantly restate all the ways they're the same, which nobody benefits from; it's only an opportunity to make a mistake.

You do not understand what a typeclass is or how it's meant to be used. If you are redundantly restating a single thing in a Haskell program, you have taken a very wrong turn.

Give me a single example of something that it turns out is truly better expressed through inheritance, and you will be on the shortlist for a Turing Award.

My point is that it's not "(typeclasses, interfaces, whatever)". They're similar, yes, but my point is that they aren't similar enough in the "mechanical" sense that the parent poster seems to be assuming.

EDIT: Btw, I think it may actually be possible to encode what the parent poster wants by just having a type class per method, but it's obviously a weird and non-idiomatic encoding given the presumed lack of any laws.

What you're describing can easily be done by making a module that re-exports four functions from another module and implements the fifth itself (in Haskell, anyway).
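A rough Python analogue of that Haskell trick, using plain namespaces to stand in for modules (all names here are hypothetical): the "derived" namespace re-exports four operations unchanged and supplies only its own fifth.

```python
from types import SimpleNamespace

# The "base module": five operations on an immutable (tuple-backed) stack.
def push(s, x):  return s + (x,)
def pop(s):      return s[:-1]
def peek(s):     return s[-1]
def size(s):     return len(s)
def describe(s): return f"stack of {size(s)}"

base = SimpleNamespace(push=push, pop=pop, peek=peek, size=size, describe=describe)

# The "derived module": re-export four operations verbatim and
# implement only the fifth itself -- no inheritance required.
derived = SimpleNamespace(
    push=base.push, pop=base.pop, peek=base.peek, size=base.size,
    describe=lambda s: f"verbose stack holding {list(s)}",
)
```

In real Haskell the same shape is an export list that re-exports four names from the base module and defines the fifth locally; the namespaces here just make the idea runnable in one file.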

Yes, like I said, OOP does this a lot more elegantly.

Agreed. Anyone who thinks this hasn't played enough with FP languages.

This sort of heated rant makes for a bad HN comment, regardless of how correct your views are. It leads to flamewars, which we don't want here, so please don't post like this.

Even if you took out the rantiness, the comment is still too generic to actually be saying anything. So to convert it to a good HN comment, you would need to both de-flamebait it and add information (e.g. specific examples).

i enjoyed the conviction.

There is far too much conviction in the world. One of the reasons HN is so enjoyable is that the discussions tend to avoid the heat that you can find just about anywhere else.

> There is far too much conviction in the world.

Are you sure? :)

I can appreciate that, but empirically we know from long experience that such comments lead to downward spirals. That's why the site guidelines are the way they are; it isn't to dampen conviction.

so, if i'd noticed you were dang before i responded i probably wouldn't have. only because i was trying to form a consensus. i didn't realize the consensus was being explained to me. :)

Not sure I know what you mean but I'm glad you responded—I like it when people respond :)

i appreciate your rational/educational/hands-on style of guiding culture.

so i was trying to say, without sounding like a pushover, that i may not have responded in fear of undermining your hard work.

cheers & thanks sir!

I dislike OOP. It's my least favorite paradigm. The "objects-first" approach to teaching programming was such a bad idea.

All that said, there is a place for OOP. And while a world without OOP would be a better place than a world without FP (if we had to choose), I think you'll be waiting a looong time for that world to materialize.

I've spent the vast majority of my career in OO-centric projects, but between my experience in Javascript and Clojure I've begun to see OO as FP with a layer of syntactic sugar. Objects are just closures at the end of the day, after all.

The mental fun was when it dawned on me that most of the benefit from the FP I was falling in love with was coming not from FP itself, but from the liberal use of immutability... and that OOP can be written in a way that similarly embraces immutability.

I don't know if my OO code got better or worse from an outside perspective once I realized that, but it certainly made it easier for me to reason about my experimental side projects.
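Both observations above can be sketched in a few lines of Python (all names hypothetical): an "object" as a closure over its state, and an OO class written in the same immutable style.

```python
from dataclasses import dataclass, replace

# "Objects are just closures": a counter whose state lives in a closure.
# Immutable style: "mutating" it returns a fresh counter, never updates in place.
def make_counter(count=0):
    return {
        "value": lambda: count,
        "increment": lambda: make_counter(count + 1),
    }

c0 = make_counter()
c1 = c0["increment"]()
# c0["value"]() is still 0; c1["value"]() is 1.

# The same immutable style in plain OOP, via a frozen dataclass:
@dataclass(frozen=True)
class Counter:
    count: int = 0

    def increment(self):
        # Returns a new Counter; the original instance never changes.
        return replace(self, count=self.count + 1)

d0 = Counter()
d1 = d0.increment()
# d0.count is still 0; d1.count is 1.
```

The two versions are interchangeable from the caller's point of view, which is the "OO as FP with syntactic sugar" observation in miniature.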

That world is already here, just not evenly distributed.

> arcane toxic names

Oh come on. It's not "arcane" and I don't see how "toxic" has anything to do with the names of these patterns.

Just an inflammatory comment.

Yes, and quite rich coming from the crowd which throws around Functors, Applicatives, Monads, Kleisli, CoYoneda, etc.

I downvoted you - these are, unlike design patterns, very well-defined mathematical objects, but furthermore, you shouldn't use such a broad brush.

> these are, unlike design patterns, very well-defined mathematical objects

IMO naming these programming constructs after their arcane mathematical roots is toxic to the advancement of the field.

OO design patterns do not have mathematical foundations, but that does not make them arcane or toxic.

> you shouldn't use such a broad brush

Probably not, I agree.

> IMO naming these programing constructs after their arcane mathematical roots is toxic to the advancement of the field.

I disagree. Why not call a duck a duck? Why invent your own terminology just because you're in a different field? Mathematicians invented those concepts first, whether you like it or not.

It's actually one of my three criticisms of (OOP) design patterns. Often they are just other names for contrived ways of passing functions as arguments, so they are actually bad abstractions (too little bang for the buck). (The other two criticisms are the lack of a formal definition and the lack of representation in the programming language.)
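The "passing functions as arguments" criticism is easiest to see with the classic Strategy pattern, which in a language with first-class functions collapses to an ordinary argument (a hypothetical Python sketch):

```python
# The classic OOP Strategy pattern would define a SortStrategy interface
# with one subclass per ordering. With first-class functions, the whole
# "pattern" is just a function parameter:

def sort_names(names, key_strategy):
    """`key_strategy` is the interchangeable 'strategy'."""
    return sorted(names, key=key_strategy)

sort_names(["bob", "Al", "carol"], str.lower)  # case-insensitive ordering
sort_names(["bob", "Al", "carol"], len)        # shortest-first ordering
```

What needed an interface, subclasses, and a pattern name in classic OOP is here a single parameter, which is the "too little bang for the buck" point.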

> I disagree. Why not call a duck a duck? Why invent your own terminology just because you're in a different field? Mathematicians invented those concepts first, whether you like it or not.

Mathematicians discovered the existence of these structures first and gave them names. Programmers also came across them, and there is no shame in giving the structures better names, more appropriate to their own field - whether mathematicians like it or not.

> It's actually one of my three criticisms of (OOP) design patterns. Often they are just other names for contrived ways of passing functions as arguments, so they are actually bad abstractions.

They are not, and have been employed by a couple of generations of programmers to create useful software.

> Mathematicians discovered the existence of these structures first and gave them names. Programmers also came across them, and there is no shame in giving the structures better names more appropriate to their own field - whether mathematicians like it or not.

Actually this is not what happened here - when programmers added the concepts you mentioned, they were very much aware of the mathematical counterparts.

Regardless, I think it is a shame to call the same thing differently just for the sake of it.

> They are not, and have been employed by a couple of generations of programmers to create useful software.

I can somewhat recommend this presentation: https://www.youtube.com/watch?v=Rmer37g9AZM

The concepts presented are very sound; however, the presentation could be a little better.

> Actually this is not what happened here - when programmers added the concepts you mentioned, they were very much aware of the mathematical counterparts.

Yes, I am aware of it. I have read Wadler's papers pointing out that list comprehensions are a specific instance of a monad and proposing monads as a way to structure functional programs. But I wish Haskell had not directly copied the term Monad into the language, setting this bad precedent. Like CAR and CDR in Lisp, unfortunately, this one will stay in Haskell. Hopefully future languages will introduce more sensible terminology, like how Lisps have evolved over the years.

> Regardless, I think it is a shame to call the same thing differently just for the sake of it.

We have to agree to disagree on this. IMO the intended usage of a concept should dictate its name, not how humans first arrived at it.

So what names do you think should be used instead of Monad, Applicative, Functor, .. ? (There was an interesting discussion about this on HN: https://news.ycombinator.com/item?id=9887728)

Funny, I actually prefer CAR and CDR in Lisp to FIRST and SECOND, and I learned it only a few years ago. (It seems to me that it's more poignant since the lists are composed of cons cells.)

I think it ultimately doesn't matter - the usage of the term dictates its meaning as much as the other way around (but this can again be my mathematical background talking, which explicitly tries to unteach you to intuit based on the names of things).

Why is it a disease? Honest question.

Doing OOP well requires a lot of forward thinking and careful planning. Both are in short supply when you're dealing with large teams of programmers, and you end up with some IFactoryAbstractInterfaceInvoker, e.g. the Kingdom of Nouns game.

The other main concern is that it was over-hyped, with consultants armed with XML running around to all the businesses during the .NET era, with no empirical evidence to back up their claims.

OOP as a whole was a giant experiment.

Just for clarification.

I'm not dissing OOP per se. My main concern is the OOP emphasis (as in Java) that everything needs to be a class. There are times when there's no need for a noun for a bunch of functions that operate over a collection of abstract data types. The other problem is how OOP is first introduced to programmers. They're taught that they need to categorize everything into a taxonomy of classes and inheritance. So the emphasis is incorrectly put on the art of taxonomy instead of getting the job done in the most straightforward way.

Please let's not equate OOP with Java.

There are actually OOP proponents like James Coplien and Trygve Reenskaug who think that Java is more "class-oriented" than "object-oriented", making it really hard to do proper OOP in Java.

There was a great talk by David West (https://www.youtube.com/watch?v=RdE-d_EhzmA&index=2&list=PLe...) that makes the point that objects have little, if any, value as a programming concept. Because all the arguments about objects occurred in the context of programming, they focused on things that really don't matter much, like single versus multiple inheritance, dot notation versus explicit messages for getting and setting or invoking methods, class hierarchies, friends, even the idea of a class.

He is more in favor of conceptualizing the real world as entities interacting with each other, and then designing systems that model those interactions. Thinking in objects allows one to design better systems that may not necessarily be represented as OOP in the language.

Jumping to the conclusion of his talk, he is more or less advocating using abstract data types instead of objects. That's what I got from it, anyway. Though it kind of reinforces that what OOP means changes depending on whom you talk to.

Kudos for mentioning David West! I strongly recommend his book "Object Thinking" as the book about what OOP really is, how object orientation is achieved and which languages' philosophy is OO compatible and which ain't (spoiler: C++ and Java are in the latter group). While Java is my favorite language so far, I certainly hope that some proper OO language will emerge in the near future.

I'm on the fence about Object Thinking. Most of the OOP books I've read are very anecdotal, without any empirical evidence to back up their claims. I've read a couple of research papers that used empirical evidence to see if OOP was better than any of the competing approaches. The papers conclude there isn't really any significant difference. The only difference they did find is that higher-level languages need less code, but the complexity per line is higher than in earlier languages.

One recent paper I read, called "The Confounding Effect of Class Size on the Validity of Object-Oriented Metrics", concluded that when you control for class size, the predictive power of all OO metrics disappears. In other words, you can ignore most OOP metrics and simply run a line count over a single file, and it gives you a better indication of the bugs/faults a class may have than running the OOP metrics does.

I do think David West is a smart guy, but nothing I've seen in `Object Thinking` comes with any significant examples, sample groups, or controlled studies that measure his approach of Object Thinking.

why on github and not on like... medium?

Off the top of my head:

1. GitHub is a broadly-adopted tool that the author probably already knew how to use

2. It's hard to accept PRs on Medium

3. It allows the author to list the repository on their profile, thereby helping them convey competency to potential employers

4. It's easier to archive a git repo than a Medium post, and easier to rehost elsewhere in the future if/when GitHub dies.

5. Git allows change tracking.

6. It uses Markdown, which can be processed through automated tools to produce a variety of formats, from PDFs for email to EPUB to send to a physical publisher.

lol - that's a good list. thank you.
