
Anti-Patterns - pplonski86
https://christianfindlay.com/2019/06/01/anti-patterns/
======
inflatableDodo
> _To qualify for the term “pattern”, an approach is not worthy until someone
> or some group has proven through some formal process that this approach is
> superior to all others, and qualifies as “best practice”._

Since when?

> _An approach that earns the title of “pattern” not only becomes one tool in
> the toolbox, it becomes dogma._

Well, this definition seems a wee bit dogmatic.

> _Everything that runs contrary to a given pattern becomes an “anti-
> pattern”._

This is nonsense.

edit - If I came up with a nicely thought-through set of integrated UI
elements that had not gone through any formal process or testing and asked
someone what they thought of it as a pattern, I might expect to be asked, "How
do you know it is a good pattern?"

However I would not expect anyone to tell me, "You cannot use the term
'pattern' as it hasn't been proven through a formal process that this approach
is superior to all others, and therefore qualifies as 'best practice'." If
anyone did tell me this, I'd be looking for someone else to get feedback from.

~~~
EliRivers
_This is nonsense._

Is that not the very point being made?

~~~
stinos
Well, yes, but is there such nonsense to begin with? I.e. where does the
original statement come from? Where is the canonical source claiming that
whatever is not a pattern is an anti-pattern?

~~~
EliRivers
I have certainly worked in places where, for example, code consistency is
rated so highly that writing bad code consistent with the existing codebase is
considered preferable to writing good code. Quite literally pattern and anti-
pattern.

~~~
mwfunk
Are you talking about formatting conventions or something else? Because if all
you’re talking about is coding style, then yeah, it’s infinitely better to
comply with the conventions of whatever you’re working on than to do your own
thing. For something as subjective as coding style, consistent vs. inconsistent
is way more important than which particular style is used. That almost
goes without saying.

~~~
EliRivers
No. I'm talking about things like endlessly copying horrifically monstrous
multiple inheritance patterns, and painful single-threaded state machines
instead of using threads to spin off work, and using shonky home-grown
reference counting (that's thread-dangerous) instead of standard library smart
pointers, and writing your own polymorphism instead of just using the
inheritance that the language comes with, and leaving 80% of CPUs idle while
trying to jam all the work into a single thread, and rewriting pieces of
industry-standard libraries which then makes upgrading impossible, and writing
horrifically heavy wrappers around simple DB access that turns modern DB
interaction into sludge for the purpose of solving problems that the _previous
product_ had and the current product doesn't.

Take all that, and then insist that all new code do that as well. That's what
I'm talking about.

------
skywhopper
Multiple misunderstandings here. Most importantly the author seems to have
misunderstood the quote from Wikipedia defining an anti-pattern. The
definition said an anti-pattern is a commonly seen solution which is net
detrimental AND for which a known net-positive solution exists.

But it’s important to note that anti-patterns _are_ still patterns, ie, they
are commonly seen solutions. They just happen to be poor ones. But your
innovative solution, whether it’s good or bad, can’t be an anti-pattern, no
more than it can be a pattern. Because it’s new. No one else is using it. It’s
not a pattern yet.

Likewise the author seems to have missed the AND part, and spends a good deal
of time ranting that just because a solution exists doesn’t mean any other
solution is an anti-pattern... well, yes, exactly!

------
kyberias
The author provides no examples of "anti-patterns" that should or should not
be called anti-patterns. He just claims the term is derogatory and critical of
people's opinions.

Quickly labeling some solution as an anti-pattern is useful because it's
efficient. We repeat the same mistakes, and it's usually not worth the effort
to make a detailed analysis of every single mistake developers make just so
their feelings don't get hurt by "derogatory" anti-pattern labeling.

~~~
convolvatron
the whole 'pattern' and 'anti-pattern' terminology robs technical discussion
of its meaning.

can't we just say things like 'maybe writing a bespoke build system isn't the
best use of time right now', or 'there is this really cool range data
structure I saw that seems really relevant, here's a link', or 'we keep having
consistency problems cleaning up objects to put them back in the pool, maybe
we should just always reinitialize them so that all that code is in one place'

trying to boil down all software to 50 simple shapes that we can assemble is
just way too reductionist. think about what you're doing. use your words.

~~~
mwfunk
Patterns aren’t reductionist, they’re just names for things that are common so
that it’s easier to talk/think/write about them when those are the things
being discussed. Surely no one has ever told you that all code you write has
to conform to some set of predefined design patterns. That would be an
engineering antipattern right there. Like most things in design, they’re
supposed to help you think more clearly and concisely about common problems,
they are absolutely 100% not intended to circumscribe your solutions.

~~~
mamon
It is funny how some programmers, especially from the Lisp/functional
programming world, equate the term "design patterns" with the GoF book and its
list of 23 specific patterns, and then proudly declare that "Lisp does not
have/need design patterns". Of course it has them, just different ones than
Java/C#.

------
c3534l
> The term “anti-pattern” is a derogatory term used to disparage software
> design approaches that a given developer, or group of developers may not
> like.

So right off the bat you have an author doing the exact same thing he accuses
others of doing: abusing the meaning of a term as a bludgeon against pet peeves.
And yet here he is, in the very first sentence of the article, lying about
what an anti-pattern is because it makes his pet peeve look more legitimate.
And it doesn't get that much better in the rest of the article.

The complete lack of self-awareness of some people is absolutely astonishing.

------
Fellshard
The exact logical misstep made in this article is at the point where it is
somehow assumed that saying a single solution is wrong (in this case by
calling it an antipattern) is only done because someone believes a single
solution is right (by calling it a pattern).

This is completely fallacious, as there may be many commonly executed good and
bad solutions. We would just like to differentiate between which ones nearly
always lead to dramatic costs later, and which ones nearly always lead to low
costs later.

------
legulere
Software Design Patterns are an Anti-Pattern.

They could be used as terminology for commonly recurring patterns. But ask
your colleagues about the difference between a facade, an adapter, and a
decorator and you will see that in reality they are useless for that. The
chosen names are just too generic to have a precise, memorable meaning.

People think they can use it as an abstraction, but for most patterns it’s not
abstracting in a way that programming languages support. The only thing I can
do with an adapter is add it to the class name and feel intelligent. People
tend to do that less to real abstractions (List instead of
SequentialIteratorFactory or IteratableMonad).

The most offensive thing people are doing with design patterns is trying to
use them like bricks to build code. You will end up with code that’s 90%
boilerplate and 10% actual logic strewn all over the place.

~~~
Nvorzula
> The most offensive thing people are doing with design patterns is trying to
> use them like bricks to build code. You will end up with code that’s 90%
> boilerplate and 10% actual logic strewn all over the place.

This is absolutely one of my greatest pet peeves with the developers I have
been interacting with the past several years. It is almost as though they
ignore all functional and non-functional requirements set forth before them,
instead placing a priority on having no fewer than four
`AbstractSingletonFactoryAdaptor`s. The result is complete abuse and misuse of
these patterns that is just impossible to follow and almost always
incorrect (Really? A singleton? Do you REALLY think that there will only ever
be one logged in user during the life of the JVM? Or did that just HAPPEN to
be the case while you were sitting at your desk pounding this out?).

------
jmull
“...Design patterns are formalized best practices...”

(From Wikipedia, apparently, not the author of the article.)

No idea where that idea came from, but this probably just means Wikipedia
should be updated. Design patterns don’t need to be “best practices”. Also,
that term is mostly useless in the definition or description of something,
since it’s a highly contextualized term.

It looks like “Formalized” should also be dropped. It should say something like
“defined and documented” instead. As it is, I think it can be confused (as the
author of the article has confused it) with referring to having gone through a
formal verification process.

------
falsedan
> _Declaring one of these as correct and all others to be incorrect is an
> oversimplification, and a logical fallacy._

I think the root of this black-and-white thinking (that there is a right answer
to “what pattern should I use?”) & the social disparaging of making a
mistake/being wrong both stem from logocentrism in Western critical thought.

The attitude of “being right trumps any objections” is reinforced by CS grads
trained by examinations where there _are_ correct answers, which doesn’t set
them up for the wide world where clients don’t want the technically correct
solution when it comes at the cost of delivery speed, brittleness,
inflexibility, etc. This is one place where Software Engineering is superior,
since it trains students in various techniques and the associated trade-offs.
Imagine civil engineers & architects insisting that, since _A Pattern Language_
says that four square walls and a peaked roof exemplify the timeless way of
building, that is the only way to construct a house.

The real value-add of patterns is giving a standard vocabulary/shorthand for
discussing designs without getting bogged down in the boring technical details
of the implementation. Uncritically cleaving to them as some sort of defence
of quality is pure cargo culting.

------
blackbrokkoli
IMO the author misses the aspect of scope.

The term "Anti-Pattern" is not used in a vacuum. Instead, it usually refers to
a specific domain, like >Usability< or >Static Websites<, for example. If I
refer to the use of fuzzy, thin fonts on mobile as an anti-pattern, it does not
mean every artistic collage with this feature is automatically shit. If I lived
in 1820 and told people not to use metal for horse riding gear, I would not be
judging the possibility of a superior means of transportation made out of
metal.

But for 99% of the people who just want to build something _with_ a
technology, not revolutionize it, the word is still a good metaphor to guide
to the right path in the given time and place.

Of course, as with most things, defining anti-patterns shouldn't escalate into
dogmatism, but I don't follow the conclusion that the metaphor itself is
destructive or bad...

edit: grammar

------
jupp0r
Anti-patterns, imho:

Singleton

Inheritance as a means of code sharing (vs polymorphism)

Premature optimization

Manual resource management (vs RAII)

Implicit ignoring of errors

------
zapzupnz
This article is an anti-pattern.

------
atian
This article is incoherent.

------
ncmncm
This article, like most, rides on the idea that Patterns are Good. But each
pattern really represents a failure in our languages and language ecosystems.
An anti-pattern is just a bigger failure.

We have a different name for the true successes: Libraries. When we can
encapsulate a solution to a recurring problem, and make a clean interface to
it that does not dictate the architecture of the whole system that uses it, we
put it in a library, document it, and maybe publish or even standardize it.

The merit of different languages for constructing systems (as opposed to
myriad other uses for languages!) turns on how effectively you can build,
deploy, and use libraries in those languages. Each place where no library can
be constructed or used, requiring deployment of a pattern instead, represents
a failure of the language design or ecosystem to enable encapsulating and
generalizing the idea. In a Better Language, we would use a core language
feature or library, and not need to code the pattern over, yet again.

When we propose extending a language with a new feature, our best arguments
for the feature arise from examples of libraries we _could_ write, that we
cannot, now. Or, more frequently, are of how it would enable better
encapsulation and generality for the kinds of library we already use.
(Previously impossible libraries are usually conceived later.)

This is the reason that C++ still dominates in system implementation, even
though its most devoted users are its worst critics. It originated the goal of
the zero-cost abstraction, has carried it farther, and is still moving.

As an example, the hash table is a common Pattern in C. C coders are always
writing custom hash tables and hash schemes, because you too often can't get
tolerable performance from a library version. In slow languages, the
performance overhead is considered negligible, and the built-in hash table
dominates. Meanwhile, in C++, it would be foolish to code a single-use hash
table. The std-library one is fast and forgiving; and better ones, deeply
analyzed and optimized with detailed tradeoffs (entry stability, modify vs
lookup speed, etc.) and thoroughly tested, are easily found and dropped in.

Sometimes a library can't displace a pattern, and you need a core language
feature. C++ has templates. Rust has its borrow checker. C++ and Rust are
getting "await". In languages with the right bent, these core features combine
and breed to provide encapsulation for ever more powerful libraries.

Powerful libraries are not just a convenience, saving coding time. They add
correctness, at scales large and small, not available from patterns. They
amortize and capture optimization effort far beyond what would be affordable
for single-use code.

Obligate garbage-collection is poison to a powerful-library ecosystem, because
the effects of tradeoffs needed to meet performance requirements reach deep
into the shared semantics libraries rely on. A powerful library for an
obligate-GC language would need to accommodate all the different ways GC might
work. In practice, nobody has time for that, so library dependencies in such
languages instead constrain the tradeoffs, and many libraries are likely not
to be usable in a system that must make them. With fewer uses to amortize
over, libraries get less attention, and are fewer.

C++ shows its age, but nothing to replace it as a powerful systems language is
above the horizon. Rust might get there, eventually, if its minders raised
their sights, and if development of C++ were to slacken.

Most coding, meanwhile, is done on smaller systems, in more forgiving
environments, where abstraction overhead is tolerated. There is little
pressure to enable replacing patterns with libraries. Patterns are still in
the vocabulary there.

