
Why I hate all articles about design patterns (2016) - lkrubner
http://www.smashcompany.com/technology/why-i-hate-all-articles-about-design-patterns
======
apo
Bizarre. The article doesn't explain much about why the author hates design
patterns. The best I could come up with is in this passage:

 _Perhaps I’m being unfair to Design Patterns, but those articles [about
design patterns] seem like the worst kind of tech writing, the most likely to
inspire useless discussion that doesn’t help move a team forward. Catering to
software developers’ perfectionism tends to be bad for business. And this
particular type of article seems to be the most extreme in that regard._

To recap, DP articles:

\- are the worst kind of writing (still no explanation)

\- likely to inspire useless discussion (sometimes play is a lie?)

\- catering to perfectionism (this can be said about any tool)

The article reminds me of many folks I've met who despise analysis and
planning. They're constantly getting themselves into trouble that requires
quick action. When it works, they chalk the success up to their quick action.
They denigrate project planning tools of nearly every stripe. They're also
typically very bad communicators and don't manage time well.

But in the end, the need for quick action resulted from the immense dis-ease
the person feels around planning, brainstorming, and up-front analysis. The
cycle repeats on every project.

There must be an anti-pattern that describes this kind of behavior...

~~~
freshhawk
I thought it was explained really clearly at the very beginning: It's what
those cargo cult architecture conversations are full of. The ones where people
sound smart but aren't actually designing anything useful.

I think the author might be taking it a bit too far, but they are certainly
describing a real thing that I see all the time: bad engineering choices made
without thinking and then justified with a string of design pattern terms,
used incorrectly.

~~~
wmccullough
> “The ones where people sound smart but aren't actually designing anything
> useful.”

I’ve been part of lots of teams, composed of both great and poor levels of
talent, and I’ve never seen the type of behavior you describe. Usually the
folks getting up in arms over architecture are pissed because folks with
experience shot down their idea for pragmatic reasons and the individual
couldn’t handle it because of ego.

~~~
milesvp
You may have been lucky. One of the worst messes I had to make work was
nonsense a senior dev was working on up until he quit. It was a mess of object
oriented wankery that soured me to even the term 'object' for years. He needed
to implement 6 functions with a couple of sql statements to do session handling
in php, and instead there were factories and visitors and repositories and
facades and I couldn't tell you what else this guy was thinking. He was
supposed to be done, and it just needed deployment, but it ended up taking me
3 months, and a half dozen late night deployments and rollbacks before I
managed to get it working. If I'd been more experienced at the time I would
have ripped it all out and just implemented the 6 functions we needed, but I
really assumed he'd had a good reason for the nonsense he'd written.

~~~
josephg
Years ago I was a TA for first year computer science. I graded my students'
assignments on code style, with reference to automatic grades generated by a
test suite.

One of the most interesting things I learned was how much variance there was
in code size between submissions. I would first read an assignment from a good
student which was completely correct and 500 lines long. The code would seem
reasonably clear, compact and well written. Then I'd pick up the next
assignment, which had also gotten 100% on the automatic tests and it would be
only 150 lines of code. And the shorter code wouldn't seem any more compact or
any harder to read than the first - they both looked like decent solutions; but
one of them had implemented the same spec in a third as much code!

And then I'd pick up a submission by a weaker student who got 50% of the
automarking results done. The submission would be 600 lines, and you could see
the sweat that had gone into writing it. The student was clearly struggling to
keep track of all the moving parts in their code. If that was all I looked at
I would have assumed the assignment would have taken 1000 lines to implement,
and be way too hard for my first year students.

(In case you're wondering, there was no difference in programming languages,
tooling or knowledge. All students were using the same environments to write
plain C code.)

The lesson I learned was that it's very hard to tell whether you or your
coworkers are actually using a good approach to solving your problems. I never
hear people say things like "Oh, I think this different abstraction could save
us 70% of our code". But having seen enough examples I think it's usually the
case that those abstractions exist. We just (bizarrely) don't seem to spend
any time looking for them. Intuitively I would expect that bad abstractions
would be obvious in code, but that seems to be often not true in practice.
After reading the first students' submission I usually would still have no
idea how long the average correct submission was going to be.

~~~
watwut
It seems clear to me that both students described in the second paragraph
produced good code, although the one with the shorter solution did better. The
student in the third paragraph was clearly behind them. It just does not seem
hard.

I heard colleagues say "if we split this away, join these two classes into one
and hide it under interface, it will be simpler/shorter". The word abstraction
was not used, but it was abstraction they were talking about.

So maybe it depends on local culture, whether people talk about how things are
done overall or whether they focus on idioms only.

Edit: fixed typo in clearn/clear

------
deepsun
> They had begun work in 2011, but they rejected all of the existing document
> databases (such as MongoDB or Riak). In 2013 they decided to work on version
> 2.0 of their software, which would be a complete re-write, but they decided
> to continue to keep JSON strings in PostGres.

That speaks in their favor, in my opinion. We actually consciously moved from
MongoDB to a PostgreSQL JSONB column, and are pretty happy with that. Sorry,
original author, but if you recommended us to use shiny new databases like
MongoDB or Riak, we'd probably think of you as a young and inexperienced
engineer.

Although I agree about Design Patterns. For example, you'd use (or not use)
ReactiveX regardless of whether you know about the Observer pattern.

~~~
crb002
The #1 engineering design pattern is to respect the speed-of-light latency of
moving information around. Moving data into and out of MongoDB has that
unavoidable cost, and it is a waste if the only thing you want is JSON
formatting of the same data already in your RDBMS.

~~~
b4lancesh33t
Overall latency is important. Where it comes from isn't very important.
Transferring data a few light-nanos of distance into or out of a colocated
Mongo DB instance simply isn't going to move the needle on any important
metric. (Disk IO latency might, but that has little to do with the speed of
light.)

------
mikekchar
I hate to be negative, but as far as I can tell, the only thing this article
does is indicate a lack of understanding of design patterns. Equating them to
a pseudo-intellectual exercise is missing the point by a proverbial mile.

Here's a brief explanation: Often when you are coding, you find an interesting
solution to a problem. Then when you look at someone else's code, in a
different domain, you realise that they are using a very similar solution. You
ask them where they got the idea and they say that they saw it in some code in
yet a different domain. So you think, "Hey, this seems to be a useful solution
to a problem that occurs in several domains". You tuck it away in the back of
your mind and when you see that problem again you say, "I've got a solution to
that problem!" and you use it.

That's a design pattern. _Everybody_ uses them. The alternative is to bloody
mindedly refuse to learn and to use completely novel approaches to the same
problems every time you see them. The fact that people write down these
solutions and give names to them doesn't suddenly make the solutions
ridiculous. The fact that some programmers mindlessly apply "patterns" in the
hopes that they can escape thinking doesn't make the solutions bad. That's
like saying we should never make libraries because some bad programmers think
they can glue 1500 libraries together to make a website without having to know
how to program.

What the author is complaining about has nothing to do with design patterns.
People sitting around building mythical designs that never get built has been
going on since before the time of Fred Brooks. There are whole classes of
architects, analysts, and consultants who have mastered the art of getting
paid big money to talk impressively about software without ever writing a
single line of code. Quite a lot of them write popular blogs! That's nothing
to do with design patterns.

I am half curious to read the author's rants about OO, but I suspect I'll find
more of the same -- some programmers wrote some terrible, terrible code and
called it OO. It must stop! Sigh...

~~~
Shorel
I'm partial to the explanation of DP as missing language features. In fact,
design patterns all use the human compiler to check the correctness of the
implementation.

Paul Graham said "Peter Norvig found that 16 of the 23 patterns in Design
Patterns were 'invisible or simpler' in Lisp."

With this mindset, functions (or subroutines) are language patterns in
assembler.

At the other extreme, if you use a sufficiently advanced language, all your
language patterns are actually checked by the compiler and/or interpreter,
instead of the programmer. They stop being patterns and are just language
features.

[http://wiki.c2.com/?AreDesignPatternsMissingLanguageFeatures](http://wiki.c2.com/?AreDesignPatternsMissingLanguageFeatures)
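
The Lisp point can be sketched in any language with first-class functions.
Below is a hedged illustration (all names invented): the classic Observer
pattern written out as classes, followed by the same behavior collapsing into
a plain list of callables once functions are values.

```python
# Observer as a capital-P Pattern: an explicit subject/observer class pair.
class Subject:
    def __init__(self):
        self._observers = []

    def subscribe(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        for observer in self._observers:
            observer.update(event)

class LogObserver:
    def __init__(self):
        self.seen = []

    def update(self, event):
        self.seen.append(event)

# The same idea once functions are first-class: the pattern dissolves
# into a plain list of callables.
subscribers = []
log = []
subscribers.append(log.append)
for fn in subscribers:
    fn("deployed")
```

In the second version there is nothing left to name: the "pattern" has become
an ordinary language feature.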

~~~
pasabagi
Isn't there a point where it's more desirable to have 'features' available
through patterns, than through the language itself? The larger and more
complex the language grows, the more cognitive overhead - so programmers of a
hypothetical 'perfect' future language might still use patterns, simply so the
language didn't grow out of reasonable proportions.

~~~
sideshowb
If code is data then in principle you could implement them as libraries rather
than language extensions. I don't use lisp or haskell enough to know whether
someone has already done this.

That said, I'm also not convinced that it would help in many cases. What would
you enforce about a wrapper, for example, other than that it's constructed
from one argument and returns a different type containing a reference to the
original one? If you're not able to do that yourself, then would a wrapper
library really be any use?

------
unclepresent
Not a good read, to be honest. The author does not sound like a solid
programmer. The examples he gives actually suggest he might be a “change
chaser” - a person who would make a change just because he can add something
new to his CV.

The fact that the guys he was talking about used PostGres to store JSON as a
text field literally means that they chose simplicity over introducing another
dependency on a no-sql db, with all the cost of supporting it. Unless one
needs to search over the information stored in the JSON, it is perfectly fine,
and probably desirable, to store it as a text field.

The author's remark that “they spent years on what they could have spent weeks
on just by using a framework” is proof that he has little or no experience of
doing such things himself. No framework can save you that much time. I have
designed UIs both with frameworks and with just basic language tools. The
difference is a trade-off between the quality of standard framework components
and the flexibility of designing your own. The time is not much different, and
might even be longer with frameworks, as the adoption time for a framework can
be long, especially for teams with high turnover.

I decided not to read to the end, because the author did not present himself
as a person whose opinion might be useful.

------
crimsonalucard
I get what he's trying to say. I think he expresses it very unclearly. I'll
try to elucidate the main idea in an example:

Say I give a programmer a task: create an API that can email someone a message
with a subject and tell me whether the message was successfully sent. All I
need is this:

    
    
        func email(address: str, subject: str, message: str) -> bool
    

I do not need this:

    
    
        Class Email(AddressObject, SubjectObject, MessageObject, ConfigObject, ClientObject).send_mail()
    

The definition of the Email object vs the email function is about 10x more
complex, with little benefit. The send_mail method is virtually IDENTICAL to
the email function; the objects and patterns you are building are just extra
cruft layered around it. Not to mention you literally have to define 5 extra
objects as well. It's obvious to me that most of the code is 100% cruft, yet
why does everyone who is knee deep in design patterns gravitate towards that
Email object? I look at the Email object and I think: I get what you're doing,
but WHY... what is the point? Are people really unable to see why this
abstraction is unneeded?

Some clown once told me dependency injection (the example object pattern
above) helps with unit testing. I agree - literally, the Email object has 5
extra objects that need to be unit tested, so more unit tests! More lines of
code = more unit tests = a better program... In all seriousness, the core
method is send_mail, and that cannot be unit tested because it's IO.
Everything you build around that method is extra BS that only needs to be
tested because you made it exist.

Sometimes it baffles me how people just start using design patterns
everywhere. I think what's going on is that people reach a cathartic state
when they understand a new mind-bending concept. Dependency injection is a bit
of a mind bender, and the catharsis people feel when they reach a point of
understanding blinds them to the downsides of patterns in general. Actually,
dependency injection is a stupidly simple concept; it's the phrase "dependency
injection" that adds artificial complexity by making it sound cool and harder
to understand.
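
For readers who haven't seen it, the testability argument being dismissed here
usually looks like the following sketch (all names invented): the IO is pushed
behind an injected client object so a fake can stand in during tests.

```python
# A fake client records "sent" mail instead of doing IO.
class FakeClient:
    def __init__(self):
        self.sent = []

    def deliver(self, address, subject, message):
        self.sent.append((address, subject, message))
        return True

# send_mail does its IO through whatever client it is given, so tests
# inject FakeClient and production injects a real SMTP wrapper.
def send_mail(client, address, subject, message):
    return client.deliver(address, subject, message)
```

Whether that indirection is worth five extra objects is exactly the dispute in
this thread.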

~~~
nathanaldensr
Of all the patterns you could pick on, you pick on one of the most useful in
modern OOP; however, I think you _actually_ intend to pick on dependency
_inversion_ , not injection. Dependency injection is simply an automated way
to satisfy constructor dependencies in OOP languages where your intent is to
invert a dependency graph. I won't go into why dependency inversion (see:
Dependency Inversion Principle, the "D" in SOLID) is so valuable because much
has already been written on the topic.

As far as your email example, sometimes the language's basic type system
abstractions (e.g., string, integer, etc.) do not encapsulate a real-world
concept sufficiently. Encapsulation failures can frequently lead to bugs when
edge cases are hit. I may, indeed, need to create an Address class to replace
"string address" parameters if there is validation logic inherent to the
existence of an email address in my code. If there isn't, then the base type
of string would do just fine, of course.
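
A minimal sketch of that idea (the class name and regex are invented, and the
regex is deliberately naive): validation happens once, at construction, so
downstream code can trust the type.

```python
import re

class EmailAddress:
    # Deliberately naive check - real address validation is far hairier.
    _PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

    def __init__(self, raw: str):
        if not self._PATTERN.match(raw):
            raise ValueError(f"invalid email address: {raw!r}")
        self.value = raw
```

Any function taking an EmailAddress never needs to re-check the string; with a
bare str parameter, every caller has to remember to.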

~~~
crimsonalucard
>Of all the patterns you could pick on, you pick on one of the most useful in
modern OOP; however, I think you actually intend to pick on dependency
inversion, not injection. Dependency injection is simply an automated way to
satisfy constructor dependencies in OOP languages where your intent is to
invert a dependency graph. I won't go into why dependency inversion (see:
Dependency Inversion Principle, the "D" in SOLID) is so valuable because much
has already been written on the topic.

Both patterns are total failures. You are talking about the most misguided,
religiously followed pattern(s) in all of modern programming. An equal amount
has been written about how horrible both DI concepts are, if you care to look.
You think putting an acronym in your response and using big words like
injection proves a point? Not SOLID at all. It proves nothing.

You have literally ignored my main point. It baffles me how blind some people
are. The Email object requires 10x more code, 10x more abstractions and 10x
more unit test coverage to finish than a single function. You haven't even
addressed this fact yet.

>As far as your email example, sometimes the language's basic type system
abstractions (e.g., string, integer, etc.) do not encapsulate a real-world
concept sufficiently. Encapsulation failures can frequently lead to bugs when
edge cases are hit. I may, indeed, need to create an Address class to replace
"string address" parameters if there is validation logic inherent to the
existence of an email address in my code. If there isn't, then the base type
of string would do just fine, of course.

Do you seriously think that encapsulating an email address in an object makes
a difference? Here's some "validation logic" for you:

    
    
        //raises error if invalid
        validate_email_address(email: str)
    

I cannot make the example any clearer. If you want to send a VALID email, all
you need is a function that validates an email and a function that sends it.
You do not need to shove validation into an extra object, inject it into
another one, instantiate that object and then call the send method... Doing
this adds NOTHING beneficial to your code.

This one guy told me that by encapsulating the email address into an Address
object I am effectively eliminating a runtime exception from the system,
because the type checker is, in a way, doing the validation. Haha. I cannot
comprehend how this guy cannot see that all he is doing, by using an address
object rather than a validation function, is creating a class and moving the
validation function over to be a method of that class. The runtime error can
still occur... just in a different location, obscured by the useless
abstractions and code surrounding it.

Many OOP patterns are pointless abstractions. Many smart people hate OOP
altogether (including the founder of YC and this site). Maybe you should look
into why.

You are basically the topic of this article.

~~~
watwut
> 10x more unit test coverage to finish than a single function.

That is not true. Assuming that both have the same functionality, e.g. both
either validate address correctness or both don't, you need exactly the same
test cases. The only difference is that they will be structured differently -
in one case you validate the address as part of the `AddressObject` test and
in the other as part of `func email` or `func validate_email_address`.

The tests with valid and invalid addresses, however, need to exist in both cases.

Whether you go the functional or the object oriented way, validation should be
separated from the main function. It can be in another object or in another
function, but it definitely does not belong in the main function that also
does the sending. There is very little difference between the two; the only
one is whether you validate before hitting the send function or after.

> You do not need to shove validation into an extra object then inject it into
> another one. Doing this adds NOTHING beneficial to your code.

It is not a big deal either. It does not make it harder to understand; both
variants are dead simple. Both are immediately readable. Frankly, people who
are weak at abstract thinking, or who were too lazy to learn the basics of
OOP, hate it. It is the same with people who hate any other paradigm -
functional programming and what have you. Functional programming is too hard
to read unless you have put work into learning it.

There is such a thing as bad OOP, but when you pick a dead simple, easy to
read example and claim it is something complicated, then the issue is not with
OOP.

~~~
crimsonalucard
>That is not true. Assuming that both have the same functionality, e.g. both
either validate address correctness or both don't, you need exactly the same
test cases. The only difference is that they will be structured differently -
in one case you validate the address as part of the `AddressObject` test and
in the other as part of `func email` or `func validate_email_address`.

Typically when you make a class, you create additional methods under that
class. Those additional methods - getters, setters, constructors - all need to
be unit tested for full coverage. Like I said... cruft code.

>Whether you go the functional or the object oriented way, validation should
be separated from the main function. It can be in another object or in another
function, but it definitely does not belong in the main function that also
does the sending. There is very little difference between the two; the only
one is whether you validate before hitting the send function or after.

We're in agreement here except I never brought up functional programming.
Where in my post did I say it? I'm talking about procedural programming. Like
C.

>It is not a big deal either. It does not make it harder to understand; both
variants are dead simple. Both are immediately readable. Frankly, people who
are weak at abstract thinking, or who were too lazy to learn the basics of
OOP, hate it. It is the same with people who hate any other paradigm -
functional programming and what have you. Functional programming is too hard
to read unless you have put work into learning it.

This is beside the point. I don't care about how hard or how simple or how
lazy people are. I'm talking about pointless logic and unneeded abstractions.
This is what the article is talking about. Here is a way to make it even more
clear. Compare the two below:

    
    
        Class Email(AddressObject, SubjectObject, MessageObject, ConfigObject, ClientObject).send_mail(msg: str, subject: str, address: str)
    
        func send_mail(msg: str, subject: str, address: str)
    

Do you see how the send_mail method is IDENTICAL to the send_mail function?
The class and the dependencies getting injected into the class are a pointless
layer of abstractions when ALL you need to send an email is the send_mail
method. This is the core of the argument. Don't get into semantics about
people being lazy.

And also, again, I never mentioned functional programming. I'm not talking
about it. I'm talking about straight up procedural programs of just regular
functions that you see often in C code.

>There is such a thing as bad OOP, but when you pick a dead simple, easy to
read example and claim it is something complicated, then the issue is not with
OOP.

I picked something dead simple and easy so readers can understand what I'm
talking about. You'd rather I hand you some OO source code so you can parse
that insanity?

~~~
watwut
> Typically when you make a class, you create additional methods under that
> class. Those additional methods - getters, setters, constructors - all need
> to be unit tested for full coverage. Like I said.

That is a great argument for why full coverage is nonsense. Why would you test
getters, setters and constructors if they do so little?

> Compare the two below

I read and understand both just fine. Moreover, the second one must have a
place where the equivalents of the config and client objects go - but I don't
see where it is. Or you added do-nothing config and client objects to the OOP
example to make it look worse. The OOP example also has address twice, subject
twice and message twice.

Here is more equivalent code:

* Class Email(AddressObject, SubjectObject, MessageObject).send_mail()

* func send_mail(msg: str, subject: str, address: str)

> I picked something dead simple and easy so readers can understand what I'm
> talking about.

The trouble is that readers don't see it.

The point of bringing up functional was analogy.

~~~
crimsonalucard
>That is a great argument for why full coverage is nonsense. Why would you
test getters, setters and constructors if they do so little?

Errors and code changes can happen at any time, anywhere, and in places you
never look. You have full coverage for full safety.

>I read and understand both just fine. Moreover, the second one must have a
place where the equivalents of the config and client objects go - but I don't
see where it is. Or you added do-nothing config and client objects to the OOP
example to make it look worse. The OOP example also has address twice, subject
twice and message twice.

The function doesn't need a config object or a client object because those are
patterns and abstractions that are unnecessary. That is the point of the
article. Patterns that are blindly followed religiously and abstractions used
in a way that is excessive.

>The trouble is that readers don't see it.

It's not possible to speak about what other people see. You can only speak to
your own intelligence which is the only logical possibility here.

>The point of bringing up functional was analogy.

The point was that you think functional and OOP are just two different styles
of programming that are equally good. This is incorrect. One is better than
the other, but at the end of the day I wasn't even comparing those two styles.

~~~
watwut
Which pattern requires a config object and a client object for send-mail
functionality? I have read a lot about patterns and there is no such one. So,
unless you can name it, you are making stuff up.

It is not even about abstraction. I can write func send_mail(msg: str,
subject: str, address: str, from: str, server: str, port: num, <following 20
config variables>) and have procedural code with the same functionality.

Writing a test for every getter and setter will not give you safety at all.
This sort of test is completely pointless.

------
codedokode
Design Patterns are used very often; you probably just don't notice them. Most
of the patterns described by Fowler [1] can be found in large web application
frameworks.

For example, Ruby on Rails uses the Active Record, Page Controller and
Template View patterns. If you look at an ORM, for example Hibernate or
Doctrine, they use even more of the patterns described by Fowler.

But of course they are mostly used in frameworks. If you are writing a CRUD
app, just take a web framework and make everything as simple as possible.

[1]
[https://martinfowler.com/eaaCatalog/](https://martinfowler.com/eaaCatalog/)
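
As a hedged illustration of what Fowler's Active Record looks like under the
hood (table and class names invented here, sqlite3 standing in for a real
database): the row object carries its own persistence logic.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

class User:
    """Active Record sketch: one instance wraps one row and saves itself."""

    def __init__(self, name, id=None):
        self.id, self.name = id, name

    def save(self):
        if self.id is None:
            cur = conn.execute("INSERT INTO users (name) VALUES (?)",
                               (self.name,))
            self.id = cur.lastrowid
        else:
            conn.execute("UPDATE users SET name = ? WHERE id = ?",
                         (self.name, self.id))

    @classmethod
    def find(cls, id):
        row = conn.execute("SELECT id, name FROM users WHERE id = ?",
                           (id,)).fetchone()
        return cls(row[1], row[0]) if row else None
```

Rails' ActiveRecord is this shape plus migrations, associations and a great
deal of polish.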

------
CoolGuySteve
All these comments are terribly reactionary. I completely agree with the
author.

Before you write a design pattern, abstraction, or interface, ask yourself: Is
this the simplest possible thing that can work?

And then: Will I ever touch this code again?

And then: How hard would this be to rewrite if the requirements change?

Design patterns were meant to future proof code, but usually they just
introduce complexity and useless abstraction, or serve to paper over
limitations of your language. You probably can't predict the future as well as
you think you can.

Your program is a document. You can change it. You can make it do anything.

~~~
couchand
> Design patterns were meant to future proof code

That's a misguided but unfortunately common view. Best practice is to refactor
to patterns, not to try divining them from the start.

> paper over limitations of your language.

See
[http://wiki.c2.com/?DesignPatternsAreMissingLanguageFeatures](http://wiki.c2.com/?DesignPatternsAreMissingLanguageFeatures)

~~~
geezerjay
> That's a misguided but unfortunately common view. Best practice is to
> refactor to patterns, not to try divining them from the start.

I would add that, IIRC, Martin Fowler himself, a man who writes whole books
dedicated to design patterns and software design, is also the man behind the
"monolith first" mantra.

------
northwest65
I too misunderstood the value of design patterns when I was a lad. Dr Mike
Lance set me straight, and I'm forever grateful.

------
gopher2
Anyone have any good articles about design patterns that they've bookmarked or
feel like sharing?

------
jakelazaroff
Legit question: where does one learn new design patterns if not from articles?

~~~
jcelerier
... books by the authors of such patterns ?

------
codedokode
The problem arises when people use something just because they think it's cool
and someone wrote about it on Twitter, not because it has clear advantages
compared to other solutions. Do not blame design patterns if you don't
understand them. But you can blame book authors for not explaining them to you
properly; maybe they don't understand the patterns either.

Also I disagree about DB and normalization: in most cases the best choice will
be a SQL DBMS and normalized data. Please let me explain why:

\- normalization saves you from duplicating data; MongoDB will make you do the
opposite. Updating the data will be more difficult and you will spend more
time writing the code.

\- a fixed schema makes it easier for new developers to understand the DB
structure and helps you detect errors; with MongoDB all invalid data will go
straight into storage and you will spend additional time fixing it later.

\- foreign keys and transactions save you from saving inconsistent data into
the database; MongoDB won't help you with this.

Designing a database properly doesn't require much thinking, it is a natural
thing and you are not losing time.
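
The foreign-key point can be sketched concretely (sqlite3 here as a stand-in
for any SQL DBMS; table names invented): with constraints on, the database
rejects inconsistent rows instead of silently storing them.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this enabled explicitly
conn.execute("CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE posts (
    id INTEGER PRIMARY KEY,
    author_id INTEGER NOT NULL REFERENCES authors(id),
    body TEXT)""")

conn.execute("INSERT INTO authors (id, name) VALUES (1, 'alice')")
conn.execute("INSERT INTO posts (author_id, body) VALUES (1, 'ok')")  # fine

try:
    # Dangling reference: there is no author 99.
    conn.execute("INSERT INTO posts (author_id, body) VALUES (99, 'orphan')")
except sqlite3.IntegrityError:
    pass  # the inconsistent row never reaches storage
```

A schemaless document store would have happily accepted the orphan and left
the cleanup for later.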

But I understand what the author is talking about. Recently I was using two of
Google's products, DoubleClick for Publishers and Google Analytics. Their UI
is super slow, at least on my hardware. One Chromium tab with the DFP page
consumes over 700 MB of memory. When I decided to inspect it, I found that on
the DFP page they used Dart "apps" compiled to JS; each of them was no less
than 2.4 MB. Even the "changelog" app, which just displays information about
recent changes (yes, they did it as a separate app).

In the Analytics UI, they decided to use Material Design with a large number
of animations; it is very slow and the UI elements are too big for a 1280px-wide
display. It looks more like your legacy enterprise app written 10 years ago in
Java and IIS, rather than a Google product.

They could have just used something simple, like the Symfony Framework and
jQuery, and avoided most of those problems. I think the reason they didn't is
that Google employs smart people and gives them a boring job; to entertain
themselves they played with something new, like Dart, or made an Android-like
UI, but didn't have enough time to do it properly.

~~~
pjmlp
> It looks more like your legacy enterprise app written 10 years ago in Java
> and IIS

I would like to see this, given that IIS never had an official Java ISAPI
module.

~~~
codedokode
Then let's change it to "IIS and Java Applets".

------
ThomPete
I love design patterns because they tell me where my company shouldn't be
focusing.

One thing we did very early on was to go after clients in spaces where there
are no well-established design patterns.
------
dev360
I hate all articles about monads. There, I got it off my chest.

------
dev360
This is a really serious indictment, folks: “Also, they were doing some
unusual things, such as storing JSON strings in a text field in a PostGres
database”

~~~
Noumenon72
I can't tell whether you are being sarcastic. I can say that the giant blobs
of XML in our text fields are very annoying. I have to compare them with Kdiff
and too many of them don't fit in DBVisualizer's buffer.

~~~
steve_adams_86
I think sarcastic. Psql has great support for json - they were probably taking
advantage of it. If you've already got psql set up in your project and have
minimal json storage requirements, it can be all you need. You also gain the
possibility of retaining some normalization of the data if necessary, which is
a lot more complicated if you throw it all in an entirely different database.

~~~
grzm
Small nit: psql is the command line client bundled with Postgres (PostgreSQL).
And yeah, the JSON/JSONB support in Postgres is pretty nifty. Postgres 10
added full text search for JSON/JSONB columns.

~~~
steve_adams_86
Oh geez, you're right. It's really the only way I use postgres so I kind of
equated the names, but yeah, different tools.

------
rpdillon
> Perhaps I’m being unfair to Design Patterns, but those articles seem like
> the worst kind of tech writing, the most likely to inspire useless
> discussion that doesn’t help move a team forward.

Design patterns have their place, though I can see merit to arguments that
assert they have been emphasized out of proportion to their utility over the
years. But this is a strange critique, given the content of this particular
article.

------
chj
The issue I am having with design patterns is that they are full of big weird
words that don't mean anything to a new programmer.

------
_pmf_
If you have only done greenfield development and source-based extension
(instead of having to either provide 3rd party extension points or use them),
your opinion on modularity is invalid.

------
smadge
Patterns exist whether or not you are aware of them or give them a name. Most
nontrivial programs will end up recreating patterns whether the programmer is
conscious of them or not.

------
ncmncm
It's much easier to express what's wrong with design pattern chatter:

Each design pattern represents a failure of the host language to provide
facilities sufficient to abstract the pattern as a library component. When you
find yourself relying on design patterns, it is a sign that it is long past
time to switch to a more expressive language.
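
A hedged example of that claim (names invented): the Strategy pattern, which
in classic OO needs an interface plus one class per algorithm, is absorbed
entirely by a language where functions are ordinary values.

```python
# Strategy "pattern" as a plain function argument: no SortStrategy
# interface, no ByLengthStrategy class - the language provides it.
def top_items(items, score):
    return sorted(items, key=score, reverse=True)
```

Whether that means "switch languages" or just "know your language", the
pattern has become a library-level idiom rather than something to hand-expand
at every call site.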

~~~
blain_the_train
Indeed, I wrote a bit about that here:
[https://drewverlee.github.io/tags-output/Design%20Pattern/](https://drewverlee.github.io/tags-output/Design%20Pattern/).

