
From STUPID to SOLID Code! - couac
http://williamdurand.fr/2013/07/30/from-stupid-to-solid-code/
======
dljsjr
A lot of people love to try and make a case for Singletons. I have yet to see
anyone make a really, truly good case for a pure software singleton. They just
make it an absolute nightmare to be well informed -- as a developer -- about
state in complex/concurrent/distributed systems.

The only time I ever use them is when my hands are tied by some sort of
hardware restriction; a JNI wrapper around some native lib that talks to
proprietary hardware like a motor controller where instantiation of more than
one comms object would blow out a fuse or something. And even then, I'd make
the argument that the people who designed the hardware and its corresponding C
API should have just made a safer interface instead of saying "Don't call
`new` more than once, or else!". Seems lazy and less-than-appropriately fault
tolerant.

~~~
RogerL
I know what you are saying, and agree. But consider: I have state that I truly
need to be global - something I read from an INI file, the command line, or
internally calculated, it doesn't much matter. Is not the alternative to a
singleton having that state replicated all over the place? And then you
introduce model/view signals, perhaps have to deal with race conditions, and
so on. Then, just try to grep for the existence of that state in your code.

Like anything, it's a tradeoff. Let's say I have an app with 10 OpenGL
windows, and I want to simultaneously control the lighting model for all at
once. A singleton gives me an easy way to do that, but then the windows have
to poll in some manner (which is perhaps not an issue, since they are in a
render loop anyway). But this design is brittle - introduce the requirement that each
window handles light differently and the singleton is toast. OTOH, introducing
a signal/slot or model/view architecture may be overkill for what I am doing.
This might be a small corner of the code, and I just want clarity for the few
times I go in to read it. I find it extraordinarily hard to understand
messaging code - where did this message come from, what was the state of that
object that caused it to fire the message, and so on. I think complexity goes
up at least by 1 order of magnitude when you have to maintain state and time
in your mind to understand a program.

Perhaps not the best example; so consider command line arguments. It's read
only, set once, but perhaps needed to be accessed by many different parts of
my code. A singleton or five seems pretty darn clean to me. Again, not without
its tradeoffs, but the alternatives, to me, seem worse. You can pass, pass,
pass the data through methods and constructors until it gets to the code that
needs the info - in which case you have infected a lot of code with rather
superfluous (to them) knowledge about command line settings. That's brittle,
and you end up touching a lot of code when you add something to the command
line. You can do a signal/slot or model/view, which can be quite powerful, but
you may be forced into a specific design by your sw stack (they have a
model/view architecture which you are using for GUI or something). So your
code is brittle by being coded to a specific API, and again, it rather
obscures where this data comes from. Plus, this forces the command line
reading code to know all about your specific API, how to format the signal,
when to send it (command line reading is often done well before the whole
stack is started up and going, so you can't send a signal yet). That's very
brittle as well. Can you take that code, put it in a test harness, and
trivially write tests for it? Probably not.

Maybe I am missing some obvious technique. But a _few_ things in an app are in
fact global. Trying to obscure that fact, to my mind, just makes the sw more
brittle and a lot harder to understand and debug. Which is not in any way an
argument for making non-global things global for convenience.
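
For concreteness, the read-once command-line object described above might look roughly like this. This is a hypothetical sketch; `Settings`, `initialize`, and the `--key=value` parsing are all invented for illustration:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

// Hypothetical read-only settings singleton: parsed once from argv at
// startup, immutable afterwards, reachable from anywhere via getInstance().
final class Settings {
    private static Settings instance;
    private final Map<String, String> values;

    private Settings(Map<String, String> values) {
        this.values = Collections.unmodifiableMap(values);
    }

    // Called exactly once, at startup, after reading the command line.
    static void initialize(String[] args) {
        Map<String, String> parsed = new HashMap<>();
        for (String arg : args) {              // expects "--key" or "--key=value"
            String[] parts = arg.substring(2).split("=", 2);
            parsed.put(parts[0], parts.length > 1 ? parts[1] : "true");
        }
        instance = new Settings(parsed);
    }

    static Settings getInstance() {
        if (instance == null) throw new IllegalStateException("not initialized");
        return instance;
    }

    String get(String key) {
        return values.get(key);
    }
}
```

The tradeoff is exactly as described: any class can reach the settings, but nothing in its signature admits that it does.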

~~~
dragonwriter
> But a few things in an app are in fact global.

Sure, but for the most part, there is no reason for the components of the app
to _depend_ on those things being global (and a significant cost in
brittleness for them to do so), so, in general, they should be passed to the
objects that need them, even if they are globally consistent throughout the
app.

Obviously, there are optimization reasons why you might need to avoid doing
this in particular cases, but in general doing so up-front is going to be a
premature optimization.

~~~
skormos
"...there is no reason for the components of the app to depend on those things
being global..."

BINGO! This is the anti-pattern part. Not that the data is globally required,
but that your units are mutually dependent. This is why Interfaces and
Dependency Injection are so popular now: they facilitate easily testable,
singular units that allow for mocks, stubs, etc.

To illustrate, what happens if your global requirement happens to also be
external? Would you want to have to test your component with a live version of
your global and external resource? No, you would want the ability to test
(either as a single unit, or even the whole app), without requiring the
database, messaging, web service, etc, to be up.
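
A minimal sketch of that idea (the `UserStore`/`Greeter` names are invented for illustration): the unit depends on an interface, so a test can swap in a stub and never touch the live resource.

```java
// The component depends on an interface, not on a global or on the
// concrete external resource, so a test can substitute a stub without
// any live database being up.
interface UserStore {
    String findName(int id);
}

final class DatabaseUserStore implements UserStore {
    public String findName(int id) {
        // would talk to the real database here
        throw new UnsupportedOperationException("needs a live database");
    }
}

final class Greeter {
    private final UserStore store;   // injected dependency

    Greeter(UserStore store) {
        this.store = store;
    }

    String greet(int userId) {
        return "Hello, " + store.findName(userId) + "!";
    }
}
```

In a test, a lambda is enough to stand in for the database: `new Greeter(id -> "Ada")`.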

------
kazagistar
I have been questioning the "programming languages are for humans, use full
names" principle recently. Why do we have a tendency, as humans, to create
abbreviations so often, then? Mathematicians get upset if an operation uses
more than a single character or symbol to express it, and are willing to have
massive amounts of overloading to achieve it.

~~~
dragonwriter
> Mathematicians get upset if an operation uses more than a single character
> or symbol to express it, and are willing to have massive amounts of
> overloading to achieve it.

Because context-switching and then processing concise symbols in that context
is more efficient for most people's brains than processing prose.

~~~
kazagistar
So why would it be any different in programming?

------
dave1010uk
Anthony Ferrara did a similar presentation a while back:
[http://blog.ircmaxell.com/2012/05/dont-be-stupid-grasp-solid-slides.html](http://blog.ircmaxell.com/2012/05/dont-be-stupid-grasp-solid-slides.html)

~~~
couac
Yes, I linked his presentation in the introduction :)

------
MarkMc
_Always_ avoid singletons? Here's my situation: in various parts of my code I
need to get the current time. Normally I would call System.currentTimeMillis()
but in my test cases I need to ensure the time is a particular value. So I
have a Clock singleton class which allows me to get the current time but also
to 'stop the clock'.

Is it really better for me to pass the Clock instance all over the place,
rather than have a singleton instance that can be referenced anywhere?
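
Roughly the sort of thing I mean, as a simplified sketch (details invented):

```java
// A stoppable-clock singleton: production code reads the real time,
// while a test can freeze the clock at a known value.
final class Clock {
    private static final Clock INSTANCE = new Clock();
    private Long frozenMillis = null;   // null = use the real clock

    private Clock() {}

    static Clock getInstance() {
        return INSTANCE;
    }

    long currentTimeMillis() {
        return frozenMillis != null ? frozenMillis : System.currentTimeMillis();
    }

    void freezeAt(long millis) {   // called from tests only
        frozenMillis = millis;
    }

    void unfreeze() {
        frozenMillis = null;
    }
}
```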

~~~
skrebbel
If you really want to modify global state while running a test, then you might
as well use a dependency injection container and get similar behaviour,
entirely generalized.

You can then have two much simpler clock classes, one for real time and one
for fake time.

Personally, I prefer just passing around the clock, though - it makes
dependencies between classes crystal clear.
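
A minimal sketch of what that looks like (names invented for illustration):

```java
// Two trivial clock implementations behind one interface, so the time
// dependency is explicit in each consumer's constructor.
interface TimeSource {
    long currentTimeMillis();
}

final class SystemTimeSource implements TimeSource {
    public long currentTimeMillis() {
        return System.currentTimeMillis();
    }
}

final class FixedTimeSource implements TimeSource {
    private final long millis;

    FixedTimeSource(long millis) {
        this.millis = millis;
    }

    public long currentTimeMillis() {
        return millis;   // always the same instant, for deterministic tests
    }
}

// Example consumer: its dependency on time is visible in its constructor.
final class Stopwatch {
    private final TimeSource time;
    private long startedAt;

    Stopwatch(TimeSource time) {
        this.time = time;
    }

    void start() {
        startedAt = time.currentTimeMillis();
    }

    long elapsedMillis() {
        return time.currentTimeMillis() - startedAt;
    }
}
```

Production code hands consumers a `SystemTimeSource`; tests hand them a `FixedTimeSource`.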

~~~
ExCodeCowboy
You get the added benefit of not relying on global state, which you have to
avoid anyway if you want to parallelize your test suite execution.

------
skrebbel
I've always felt that the SOLID rules are very difficult to comprehend, yet
pretty basic to apply. A junior dev that I recently coached, who had grown up
on modern OO languages (Python, C#, etc.), understood all these rules in his
gut but would never be able to explain them.

I believe that we should be able to come up with a set of (slightly different)
rules that are simpler to explain and yield the same good designs.

~~~
couac
I think that it is a matter of vocabulary, no? The LSP definition in Liskov's
paper is quite hard to understand, but expressed in simpler words it becomes
much more approachable.

------
luiz-pv9
About the square/rectangle example, what would be the solution? The Square
subclass should not verify equality with width and height or it should not
even exist? Or something else?

-- edit. From Wikipedia: If Square and Rectangle had only getter methods
(i.e., they were immutable objects), then no violation of LSP could occur.
~~~
dragonwriter
> If Square and Rectangle had only getter methods (i.e., they were immutable
> objects), then no violation of LSP could occur.

Yeah, this is the fundamental problem: _mutable_ objects tend to result in
LSP violations when class hierarchies follow common-sense is-a relationships,
but mutating operations are not limited to those (if any exist) that preserve
identity.

A corollary to this is that the use of default constructors very often
accompanies LSP violations, as it usually implies _all_ of the object's state
is subject to mutating operations.
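
A minimal sketch of the immutable version (illustrative only): with no setters, every observable Rectangle property also holds for Square, so substitution is safe.

```java
// Immutable Rectangle/Square: since nothing can mutate width or height
// after construction, a Square behaves as a Rectangle everywhere, and
// the classic setWidth/setHeight LSP trap cannot arise.
class Rectangle {
    private final int width;
    private final int height;

    Rectangle(int width, int height) {
        this.width = width;
        this.height = height;
    }

    int getWidth()  { return width; }
    int getHeight() { return height; }
    int area()      { return width * height; }
}

final class Square extends Rectangle {
    Square(int side) {
        super(side, side);   // width == height, fixed forever
    }
}
```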

------
nathell
I tend to perceive tight coupling as a virtue rather than as a symptom of
stupidity. On the other hand, indescriptive naming is a real, and severe,
problem in my code.

Also, both these acronyms seem to lean toward the object-oriented paradigm.
I'd like to see something more universal.

------
ExpiredLink
'Dependency Inversion' _increases_ coupling. More and more I'm inclined to
consider 'Dependency Inversion' an Anti-Pattern.

~~~
dragonwriter
> 'Dependency Inversion' _increases_ coupling.

This is an interesting claim. Could you explain further _how_ this is the case
(or provide references which explain it?)

------
winkerVSbecks
Someone needs to fix the font on that website. Ultra-thin font on a white
background that isn't rendering properly - highly unreadable!

~~~
couac
Oh I am really sorry. Could you give me more details (OS, browser) so that I
can fix the font?

~~~
winkerVSbecks
chrome on os x

[http://cl.ly/image/2H4115120p0t](http://cl.ly/image/2H4115120p0t)

