
Don't overload #nil? in Ruby - prosa
http://paul.rosania.org/writings/2011/11/09/dont-overload-nil/
======
bluesnowmonkey
Having recently worked with a Ruby codebase where anything could be nil at
any time, I consider even references to nil to be a code smell. Developers are
inherently prone to ignoring error handling, so they'll happily ignore the
fact that Order#customer really means Order#customer_or_nil. And when they see
errors in production, they'll slap an andand on it and call it fixed. The code
becomes extremely difficult to reason about.

I prefer that functions throw an exception when unable to do what they
promised, rather than return nil or a null object. The try-catch block serves
as documentation that the function might fail. If someone forgets to catch the
exception, it will shut everything down rather than leave the program in an
unanticipated state that could lead to an error elsewhere.
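The raise-instead-of-return-nil preference above could be sketched like this (Order, CustomerNotFound, and the accessor are hypothetical names, not from any real codebase):

```ruby
# Hypothetical sketch: a lookup that raises instead of returning nil,
# so a missing customer can't silently propagate through the program.
class CustomerNotFound < StandardError; end

class Order
  def initialize(customer = nil)
    @customer = customer
  end

  # Order#customer really means "customer, or fail loudly".
  def customer
    @customer or raise CustomerNotFound, "order has no customer"
  end
end

order = Order.new
begin
  order.customer
rescue CustomerNotFound => e
  # The rescue here documents, at the call site, that the lookup can fail.
  e.message   # => "order has no customer"
end
```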

~~~
DanielRibeiro
Nils are a very bad code smell. They come from C's null, which is a billion
dollar mistake[1], according to its creator, Tony Hoare. Especially now that
we have monads[2, 3].

Scala's standard library provides very helpful information on how to replace
null with its Maybe class (called Option in Scala). Just take a peek into
their collections library[4], and search for Option.
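The Some/None names below are Scala's; this minimal Option-style wrapper in Ruby is my own sketch of the idea, not Scala's actual API:

```ruby
# A tiny Maybe/Option sketch: Some wraps a present value, None is empty.
class Option
  def self.of(value)
    value.nil? ? NONE : Some.new(value)
  end
end

class Some < Option
  def initialize(value)
    @value = value
  end

  def map
    Some.new(yield(@value))
  end

  def get_or_else(_default)
    @value
  end
end

class None < Option
  def map
    self   # mapping over nothing stays nothing; the block never runs
  end

  def get_or_else(default)
    default
  end
end

NONE = None.new

# Chained calls never blow up on a missing value:
Option.of(nil).map { |s| s.upcase }.get_or_else("anonymous")  # => "anonymous"
```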

[1] [http://www.infoq.com/presentations/Null-References-The-
Billi...](http://www.infoq.com/presentations/Null-References-The-Billion-
Dollar-Mistake-Tony-Hoare)

[2]
[http://moonbase.rydia.net/mental/writings/programming/monads...](http://moonbase.rydia.net/mental/writings/programming/monads-
in-ruby/00introduction.html)

[3] <http://andand.rubyforge.org/>

[4] [http://www.scala-
lang.org/api/current/scala/collection/immut...](http://www.scala-
lang.org/api/current/scala/collection/immutable/Iterable.html)

~~~
aphyr
_Nils are a very bad code smell. They come from C's null, which is a billion
dollar mistake[1], according to its creator, Tony Hoare. Especially now that
we have monads[2, 3]._

Do you have a source for this? I was under the impression nil was directly
taken from Smalltalk, which derived it from Lisp.

~~~
DanielRibeiro
Wikipedia[1] has the refs:

 _The null reference was invented by C.A.R. Hoare in 1965 as part of the Algol
W language. Hoare later (2009) described his invention as a "billion-dollar
mistake":[10][11]_

[1] <http://en.wikipedia.org/wiki/Null_pointer#Null_pointer>

Where:

[10]
[http://qconlondon.com/london-2009/presentation/Null+Referenc...](http://qconlondon.com/london-2009/presentation/Null+References:+The+Billion+Dollar+Mistake)

[11] [http://www.infoq.com/presentations/Null-References-The-
Billi...](http://www.infoq.com/presentations/Null-References-The-Billion-
Dollar-Mistake-Tony-Hoare)

Yes, the same video I linked above.

~~~
aphyr
Nil was a part of Lisp 1, which is why I was confused. It predates Algol W by
seven years. As null references and nil are somewhat different beasts, I'm not
sure the criticism "billion dollar mistake" applies fully.

------
mcobrien
I'm not sure I like it, but this works:

    
    
        def nil.method_missing(*args, &block); end
    

So now you can call nil.whatever.you.like and get nil. It's also still falsy,
but now every nil value in your app silently works and doesn't throw an error.

It's both impressive and scary that Ruby lets you (or that new contractor
you're not sure about) change the fundamentals of the language in one line of
code.
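To make the blast radius concrete, here's what that one line does to every nil in the process:

```ruby
# One line silences every NoMethodError on nil, process-wide:
def nil.method_missing(*args, &block); end

# Chained calls now return nil instead of raising...
result = nil.whatever.you.like   # => nil, no exception

# ...and nil is still falsy, so branches quietly take the false path
# instead of crashing where the real bug is:
"no result" unless result
```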

~~~
bodhi
I seem to recall that this was the default behaviour in Smalltalk, but I can't
find a reference, so I am very likely incorrect.

In Objective-C nil will absorb any messages too, see the section "Sending
Messages to nil" in
[http://developer.apple.com/library/ios/#DOCUMENTATION/Cocoa/...](http://developer.apple.com/library/ios/#DOCUMENTATION/Cocoa/Conceptual/ObjectiveC/Chapters/ocObjectsClasses.html)

~~~
spooneybarger
It isn't the default behaviour in Smalltalk. nil does not silently absorb
messages in Smalltalk.

------
regularfry
I've said this in a comment at rosania.org, but I think it bears repeating
here: this is why we need a #blank? or #null? protocol that user classes can
participate in as part of core. #nil? and an inextensible FalseClass just
aren't enough to do the sorts of thing that Avdi was trying to do in his
original post. My tentative proposal is languishing here:
<http://redmine.ruby-lang.org/issues/5372>
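A user-participating protocol along those lines might look like the following. This is my sketch in the spirit of ActiveSupport's Object#blank?, not the actual text of the Redmine proposal:

```ruby
# Sketch of a core #blank? protocol: user classes participate by
# defining #empty? (or overriding #blank? directly).
class Object
  def blank?
    respond_to?(:empty?) ? !!empty? : !self
  end
end

nil.blank?     # => true  (nil is falsy)
false.blank?   # => true
"".blank?      # => true  (via String#empty?)
"hi".blank?    # => false
0.blank?       # => false (0 is truthy in Ruby)

# A hypothetical user-defined null object can now opt in:
class NullCustomer
  def empty?
    true
  end
end

NullCustomer.new.blank?  # => true
```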

------
jasiek
I guess that overriding the behavior of built-in classes in Ruby is not a
great idea, as most gems rely on that unmodified behavior.

~~~
DanielRibeiro
As a gem creator, it is terrible.

As an application creator, it is only bad when/if you open source part of it.
Otherwise, you can always run the gems' tests against your modified
environment.

------
Confusion
Conceptually, nil != false. You shouldn't use 'nil' to mean 'false'. You
should always test explicitly for nil-ness using #nil?. If you write

    
    
      if finished?
        do_something
      end
    

then finished? is not expected to, and may not, return nil.

If you expect _some_value_ to be able to be nil (which, for boolean values, is
hardly ever), you should write

    
    
      if !some_value.nil? && some_value
        do_something
      end
    

In the end, these 'convenient' coercions that allow you to use any value as a
boolean only come back to bite you, because unexpected stuff happens when
stuff is unexpectedly nil. If it makes my coworkers cry that an object is nil
and true at the same time, then they are at fault, not me.

Edit: and more importantly (I forgot to point this out explicitly) a non-nil
value != true. That any string or integer is 'true' can lead to very hard-to-
find bugs.
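For instance, habits carried over from languages where 0 or "" is falsy produce exactly this kind of bug (a sketch with hypothetical variable names):

```ruby
# In Ruby only nil and false are falsy; 0 and "" are truthy,
# which trips up intuitions imported from C or JavaScript.
remaining = 0
remaining ? "items left" : "done"       # => "items left", not "done"

flag = "false"   # e.g. read from ENV, so it is always a String
flag ? "enabled" : "disabled"           # => "enabled"!

# Explicit comparisons say what you actually mean:
remaining > 0     ? "items left" : "done"      # => "done"
flag == "true"    ? "enabled"    : "disabled"  # => "disabled"
```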

~~~
necubi
Such code is everywhere in the Ruby community and explicitly endorsed by many.
I have never seen code like your second example in the wild, and it's far more
complicated than the idiomatic way of writing it:

    
    
       if some_value
         do_something
       end
    

It's a core expectation of the language that !!nil_thing == false, so why
violate it?

~~~
Confusion
Because when you write

    
    
      if !some_value
         do_something
      end
    

you often don't actually want it to be executed when some_value is nil instead
of false. And when it is, you go scratching your head and searching where
_some_value_ came from, to discover some silly typo.

In some ways, the reverse is even worse: when you write

    
    
      if some_value
        do_something
      end
    

and you change some_value from nil to some sensible default like '', you will
forget to update this clause and you will not understand why the code is being
executed. If you use

    
    
      if !some_value.nil?
        do_something
      end
    

you don't have that problem, because the nil-test sticks out like a sore
thumb.

~~~
csallen
I agree with aphyr's response to you. This has never been a problem for me. I
think your issue is that you consider an empty string to be a sensible default
representing false or nil.

~~~
Confusion
I fully agree that those sane defaults indeed aren't right for representing
false or nil. What I'm arguing is rather the reverse: nil is often initially
used as an (insane?) default and that is later changed without updating all
related checks (because they don't stick out and demand attention, which makes
them easy to overlook or 'overthink'), which causes bugs.

------
derefr
Isn't this a flaw in Ruby, though? It also means that you can't create a
delegate/decorator/proxy object for a false object and have it be false as
well, which goes against the general "everything is determined by sending
messages" vibe Ruby has going on.

I know the reasoning (it's much faster to do math on object IDs than it is to
call a method), but there are workarounds for this (e.g. only allowing frozen
objects to be boolean-false, and having the object's truth-ness/false-ness
represented by the return value of its #false? method _at the time of
freezing_, thus allowing the interpreter to locate it in a semantically-meaningful
part of object-ID space that can later be masked for in a TEST
instruction.)

~~~
adestefan
The Ruby people will tell you that Ruby's x, where x is power, flexibility,
etc., comes from the ability to shoot yourself in the foot, so just don't
shoot yourself in the foot.

~~~
raganwald
Speaking as a Ruby user, I think nil and the entire behaviour around
truthiness and falseness are wrong. I should be able to create my own nil
objects, and I especially would love to be able to create false objects that
carry more information.

For example (and there are code smells in this, but it gets the idea across):

    
    
        if account = Account.create(params[:account])
          ...
        else
          complainAbout account.errors
        end
    

Truthiness should not be reserved for whether an object exists, it could also
be used for whether it is valid, complete, or anything else.

My particular example might not be a great one, but I think framework and
library developers ought to be able to work a consistent use for truthiness
and falseness into their creations, e.g. a true object has been saved, a true
object is valid, a true object is complete, a true object represents the
current state of the world instead of the past or future or a wrong state, and
so forth.

~~~
masukomi
To me this isn't a problem with nil. This is a problem with the create method.
create either returns you the object or something ambiguous (nil). It COULD
return you the object or something useful, like an error class, or an error
code, or whatever....

If you care enough about the state of what's being returned then it would be
trivial to test if the returned object was of the class you were expecting
(Account) and if not handle what it did return (some error indication) as
appropriate.
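Branching on the returned class might look like this (Account, Errors, and create_account are stand-ins for the thread's hypothetical example, not a real API):

```ruby
# Hypothetical sketch: create returns either the expected object
# or an error object, never nil, so callers branch on the class.
Account = Struct.new(:name)
Errors  = Struct.new(:messages)

def create_account(attrs)
  attrs[:name] ? Account.new(attrs[:name]) : Errors.new(["name is required"])
end

def handle(result)
  case result
  when Account then "created #{result.name}"
  when Errors  then "failed: #{result.messages.join(', ')}"
  else raise "unexpected return value: #{result.inspect}"
  end
end

handle(create_account(name: "bob"))  # => "created bob"
handle(create_account({}))           # => "failed: name is required"
```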

~~~
raganwald
I'm ok with that. BUT if that's how we want to go, why not ditch truthiness
for non-nil objects altogether? My complaint is that truthiness is baked into
the language in such an inflexible way.

~~~
masukomi
I'm not sure how I feel about that. It just seems so useful to have it, BUT I
think there's definitely a strong argument to be made for your proposition.

Alternately, why not just switch to a functional language where, it seems,
ambiguity and other related problems rarely make it past the bouncer at the
front door.

~~~
raganwald
I think this is a design issue that is orthogonal to functional languages or
static typing. We could easily create a language where writing "if foo" causes
an error when foo doesn't resolve to an instance of Boolean. If we like, we
could add coercion to boolean through a #to_b method (although I have
issues with implicit coercion).

At a deep level, I wonder if my issue is with if statements and boolean
operators being magic outside of the object system. Smalltalk gets this right.
Scheme gets this right. If "if" were defined in the standard library rather
than being a magic syntactic construct, we could write our own control
primitives:

    
    
        provided account = Account.create(params[:account])
          ...
        end
    

:-)
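In Ruby as it stands, one can approximate a user-defined conditional with a block-taking method. This is a sketch of the shape only; it still can't hook into what counts as truthy:

```ruby
# A user-defined control construct: run the block only when the
# value is truthy, passing the value along.
def provided(value)
  yield value if value
end

provided "bob" do |name|
  "hello, #{name}"   # runs: "bob" is truthy
end

provided nil do |name|
  "never reached"    # skipped: nil is falsy
end
```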

~~~
sleight42
That added flexibility comes at the cost of standardization.

There are practical reasons that we're not all coding in Io. Allowing for the
redefinition of control primitives is a slippery slope. In short order, we may
have apps that read well to us but aren't clear to others.

