
Tony Hoare, null's creator, regrets its invention:

“I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.”

https://en.wikipedia.org/wiki/Tony_Hoare#Apologies_and_retra...

https://www.infoq.com/presentations/Null-References-The-Bill...




Including "null" is one of the most unfortunate things about Go. I'm glad to see that other modern languages (Swift, Rust, and so on) are avoiding it.


I too find it annoying in Go, though I'm not sure what the default value of a reference in a struct would be otherwise.

However, I do see the value of NULL in a database context even though it makes database interfaces harder -- especially in Go, where the standard marshaling paradigm means anything NULLable has to be a reference and thus have a nil-check every time it's used.
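To make that concrete, here's roughly the shape it takes (a minimal sketch; the struct and column names are made up):

  package model

  type User struct {
      ID       int64
      Nickname *string // NULLable column, so it has to be a pointer
  }

  // Every use of the field pays the nil-check tax.
  func displayName(u User) string {
      if u.Nickname == nil {
          return "anonymous"
      }
      return *u.Nickname
  }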

The conceptual match is so awkward that when I write anything database-related for Go, if I have the option I make everything in the database NOT NULL at the same time, even though that screws with the database design.

Ah, NULL. When I think about the pain it causes, balanced against its utility, I sometimes wish I'd never heard of it.

And I'm sure learned people said the same thing about Zero, once upon a time.


Go's wholesale embrace of null is kinda jarring in this day and age, since it doesn't even have safe null dereferencing operators like Groovy, C#, Kotlin et al. It's like Java all over again.

Considering that Rob Pike is a huge fan of Tony Hoare, and considering the inevitable mountain of pain caused by null references, that's kinda surprising.

But I guess "Worse is Better" in the sense of "simplicity of implementation is valued over all else" is still the guiding principle of Go. As Tony Hoare himself said: "But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement". It seems like we're doomed to repeating this mistake again and again.


> I too find it annoying in Go, though I'm not sure what the default value of a reference in a struct would be otherwise.

The way it works in Rust is that you can't have a reference without the thing it refers to. There is no default value, so it's a compile-time error to try to create a reference without something to refer to.

The way Rust handles this is with a type called Option (I think this is a monad in Haskell?) that lets you say a value can be None or Some, so you have to explicitly handle the None case whenever you use it (either by panicking, matching, or some other method).


You’re thinking of Maybe[1]:

  data Maybe a = Nothing | Just a
      deriving (Eq, Ord)
Yes, it is a monad[2].

  return :: a -> Maybe a
  return x  = Just x

  (>>=)  :: Maybe a -> (a -> Maybe b) -> Maybe b
  (>>=) m g = case m of
                 Nothing -> Nothing
                 Just x  -> g x
[1]: https://wiki.haskell.org/Maybe

[2]: https://en.wikibooks.org/wiki/Haskell/Understanding_monads/M...


It's a monad in Rust too (with Some being return, and and_then[1] being >>=). Rust's generics just aren't yet[2] flexible enough to abstract over monads within the language.

[1] https://doc.rust-lang.org/std/option/enum.Option.html#method...

[2] https://github.com/rust-lang/rfcs/issues/324


For anyone who has the same reaction I did when first hearing of this: "But isn't that just like a nullable value?" (Don't worry, this post will not contain any monad analogies.)

Yes. Ish. In languages that have a null/nil/None, almost anything passed to any function could be null. Every function you write could have a null passed into it, and you either have to do checks on everything or trust that other people will never pass those values in.

That's pretty OK until something unexpectedly receives one and hands it off to another function that doesn't deal with it gracefully.

In Haskell (and I assume others) this can only happen to Maybe types; they're the only ones that can have the value Nothing. So the compiler knows which things can and cannot be Nothing, and can therefore throw an error at compile time if you try to pass a Maybe String (something that is either a "Just String", from which you can get a String, or a Nothing) into a function which only understands how to process a String.

This feels like it might be restrictive, but there are generic functions (fmap, for example) that take a function which only understands how to process a String and turn it into one that handles a Maybe String.

It's quite a nice system, even though I get flashbacks to Java complaining I haven't dealt with all the exceptions.

Haskell doesn't completely stop you from shooting yourself in the foot though, you can still ask for an item from an empty list and break things. There are interesting approaches to problems like that however: http://goto.ucsd.edu/~rjhala/liquid/haskell/blog/about/

Finally, it's worth pointing out that Maybe and Nothing and Just and all that aren't built into the language; they're defined using the code that gbacon wrote. So in a way, Haskell doesn't have a Maybe type: people have written types that work really well for this kind of problem, and everyone uses them because they're so useful.

[disclaimer: I've probably written 'function', 'type', 'value' and other terms with quite specific meanings in a very general way. Apologies if this hurts the understanding, and I would appreciate corrections if that's the case, but just assume I'm not being too precise if things don't make sense]


> Finally, it's worth pointing out that Maybe and Nothing and Just and all that aren't built into the language; they're defined using the code that gbacon wrote. So in a way, Haskell doesn't have a Maybe type: people have written types that work really well for this kind of problem, and everyone uses them because they're so useful.

Yep, and I believe it's the same with Rust: it's been put into the standard library, but it's not some special thing that only the compiler can make.

And like you said, the whole idea is that the normal case is that you can't have null values. If you need them for something you declare that need explicitly and have to handle it explicitly or it's a compile error. That way it can be statically checked that you've handled things.


Option (and/or option) is the usual name of Maybe in ML-style languages like SML, OCaml, etc.


What's wrong with using an Option/Maybe type? Can't Go do that?


You could (just use an interface). However, it would be a pain to use because Go doesn't have generics (if a type X implements the Maybe interface, it is not true that []X can be cast to []Maybe). So you would always have to make temporary slices to copy all of your Xs into.
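To make the pain concrete, here's roughly what the interface route might look like (all names hypothetical):

  package model

  type Maybe interface{ isMaybe() }

  type Just struct{ Value interface{} }
  type Nothing struct{}

  func (Just) isMaybe()    {}
  func (Nothing) isMaybe() {}

  // []string is not assignable to []Maybe, so every slice has to be
  // copied element by element into a temporary.
  func wrapAll(xs []string) []Maybe {
      ms := make([]Maybe, len(xs))
      for i, x := range xs {
          ms[i] = Just{x}
      }
      return ms
  }

And on the way back out, the caller has to type-assert Just's Value from interface{} to get a usable string again.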


Go effectively has Option types for database query results: https://golang.org/pkg/database/sql/#NullBool
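Usage looks something like this (a sketch; the table and column names are invented):

  package store

  import (
      "database/sql"
      "fmt"
  )

  func printAdmin(db *sql.DB, id int64) error {
      var admin sql.NullBool
      // Scan sets Valid to false instead of failing when the column is NULL.
      err := db.QueryRow("SELECT admin FROM users WHERE id = ?", id).Scan(&admin)
      if err != nil {
          return err
      }
      if admin.Valid {
          fmt.Println("admin:", admin.Bool)
      } else {
          fmt.Println("admin is NULL")
      }
      return nil
  }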


This is not a generic option type, just a tri-state bool (in effect, an Option<bool>). Go has no user-defined generics, so you can't define an Option type yourself. It does have built-in "magical" generics, namely arrays, slices, maps, and channels, but no option/maybe. Language-level option types are not unheard of (C#, Swift and Kotlin all have a notion of this sort, although they all support proper user-defined generics as well).


> Language-level option types are not unheard of (C#, Swift and Kotlin all have a notion of this sort)

Swift's Optional is a library type: https://github.com/apple/swift/blob/master/stdlib/public/cor...

Though the compiler is aware of it and it does have language-level support (nil, !, ?, if let)


Swift most definitely has a null: nil. The difference is that, unlike in Objective-C, you can declare variables as non-optional (and that's the default), so they can never hold nil.


nil is just shorthand for Optional.None, which is Just Another Value. (Sort of: you can actually make nil translate to any type, but please don't.)


Fair but I don't think the original claim was intended that pedantically! :)


The point of the original claim was that modern languages (like Swift or Rust) tend not to make null part of every (reference) type. That Swift has a shorthand for Optional.None doesn't change that: nil isn't a valid value for e.g. a String (as it would be in Java or Go), only for a String? (which is shorthand for Optional<String>).


What you are describing is no different than modern Java, where variables are marked @Nullable and it's a compiler or linter error to dereference them without being inside a null-check conditional. If you don't use this in Java it is just the same as having String? as your type everywhere.


> What you are describing is no different than modern Java

It is in fact quite different. String/String? is not an optional annotation requiring the use of a linter and integration within an entire ecosystem which mostly doesn't use or care for it; it's a core part of the language's type system.

> If you don't use this in Java it is just the same as having String? as your type everywhere.

Except that using String? everywhere is less convenient than using String everywhere, whereas not using @Nullable and a linter is much easier than using them.


String isn't a reference type in Swift, but yes.


Well, while Rust does have pointer types[1], it doesn't allow them to be used in typical code (i.e. dereferenced), except within "unsafe" blocks. A null pointer does exist[2]. I believe this is needed for such things as interoperability with C code.

[1] https://doc.rust-lang.org/std/primitive.pointer.html

[2] https://doc.rust-lang.org/std/ptr/fn.null.html


The only use case I've had for these personally is having a global config "constant" which can be reloaded behind a mutex. unsafe + Box::new() + mem::transmute() on a static mutable pointer. I believe I copied this from Servo based on a suggestion on IRC.

IIRC this was pre-1.0 Rust, so there's probably a better way to do it now.


I've recently played with using nil pointer references as a form of default overloading. The process in the function block was something like:

1) assign default value to variable

2) check if pointer parameter is nil

3) if pointer is not nil assign pointer value to variable

4) do work with variable

Am I overlooking a simpler way to do this? I feel like lib.LibraryFunction(nil, nil) is nicer than lib.LibraryFunction(lib.LibraryDefaultOne, lib.LibraryDefaultTwo), though admittedly the explicitness of the second option is appealing.
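For anyone following along, a minimal sketch of the pattern (hypothetical names; the comments match the numbered steps above):

  package lib

  import "time"

  // LibraryFunction treats a nil timeout as "use the default".
  func LibraryFunction(timeout *time.Duration) {
      t := 30 * time.Second // 1) assign default value
      if timeout != nil {   // 2) check if pointer parameter is nil
          t = *timeout      // 3) if not nil, take the caller's value
      }
      _ = t // 4) do work with t
  }

Callers then write lib.LibraryFunction(nil) to get the default.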


Could you just create well-named functions which call LibraryFunction() internally rather than relying on the caller to specify 1 or more default arguments?
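Something like this, say (a sketch; names invented):

  package lib

  import "time"

  func LibraryFunction(timeout time.Duration, retries int) { /* ... */ }

  // LibraryFunctionWithDefaults bakes the defaults in, so callers
  // never pass nil or sentinel values.
  func LibraryFunctionWithDefaults() {
      LibraryFunction(30*time.Second, 3)
  }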


Yeah. It's funny, I'm currently working on two Go libraries: one uses the convention you listed [1] and the other [2] uses nil pointers.

[1]: https://github.com/b3ntly/go-sms/blob/master/client.go

[2]: https://github.com/b3ntly/mLock/blob/master/lock.go

Disclaimer: Both libraries are in the pre-alpha stages and are extremely light on tests and/or broken.


Hiding it in an implementation detail won't get rid of it.

While you can do pointer arithmetic there will be null pointers, and while memory is addressable there will be pointer arithmetic.

And while I'm a programmer, I want access to all the functionality the CPU offers.


> While you can do pointer arithmetic there will be null pointers

There is no reason why the concept of a null pointer has to exist. If there were no special null pointers it would be perfectly okay for the operating system to actually allocate memory at 0x00000000. With unrestricted pointer arithmetic you can of course make invalid pointers, but a reasonable restriction is to only allow `pointer+integer -> pointer` and `pointer-pointer -> integer`. You can't make a null pointer with just those.


It was meant to be there; it's a feature. Going for option/maybe types would have been too much, according to its designers.


I hear the "too complicated" argument a lot from defenders of null, but it doesn't quite make sense to me.

It is generally agreed that good code always checks whether function inputs or the results of function calls are null (if they are nullable). Why not make it a compile-time error if you don't check, rather than hoping that the fallible programmer is vigilant about checking and delaying potential crashes until runtime?

Go is extremely pedantic about a number of things like unused library imports, trailing commas, etc. which have absolutely no bearing on the actual compiled code, but it intentionally leaves things like this up to programmers who have shown that they can't be trusted to deal with it properly.

Having to manually deal with null is much more complicated than having an Option/Optional type in my opinion. We've also seen that it's far less safe.


> Go is extremely pedantic about a number of things like unused library imports, trailing commas, etc. which have absolutely no bearing on the actual compiled code

Trailing commas, agreed. But unused library imports have the (probably unintended) side effect that their `func init()` will execute. Which is also why there is an idiomatic way to import a package without giving it an identifier, just to have this side effect.
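The canonical example is registering a database driver:

  package main

  import (
      "database/sql"

      // Imported only for its side effect: the package's init()
      // registers the "postgres" driver with database/sql.
      _ "github.com/lib/pq"
  )

  func main() {
      db, err := sql.Open("postgres", "dbname=example sslmode=disable")
      if err != nil {
          panic(err)
      }
      defer db.Close()
  }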


Good point. And there can actually be more than one init() function per package!
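For example (a contrived sketch):

  package config

  var defaults = map[string]string{}

  func init() { defaults["host"] = "localhost" }

  // A second init() in the same file is legal too; they run in the
  // order they appear in the source.
  func init() { defaults["port"] = "5432" }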


Yes, but I guess that they're just concatenated at compile-time.


I have to disagree with Hoare sensei there. The number is nowhere near a billion. I personally know of one case that totaled to a billion.


Would love to learn more about this single billion dollar mistake


SQL summation without coalescing null addenda to zero. The sum result was a null. This had been going on for years. Oops!

The lesson/mitigation was to add a NOT NULL constraint on top of a DEFAULT 0.
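In Go terms, the query-side half of that mitigation might look roughly like this (the table and column names are invented):

  package store

  import "database/sql"

  func totalFees(db *sql.DB, id int64) (float64, error) {
      var total float64
      // A single NULL addend turns the whole expression NULL, so each
      // one is coalesced to zero before adding.
      err := db.QueryRow(
          `SELECT COALESCE(base, 0) + COALESCE(surcharge, 0)
             FROM fees WHERE id = ?`, id,
      ).Scan(&total)
      return total, err
  }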


Hoare was talking about null references specifically. Or, to be even more precise, about making references nullable by default and allowing all operations on them in the type system, with undefined behavior or runtime errors when a null is actually dereferenced.

NULL in SQL is a very different and largely unrelated concept. Though probably a mistake just as bad - not the concept itself even, but the name. Why did they ever think that "NULL" was a smart name to give to an unknown value? For developers, by the time SQL was a thing, "null" was already a fairly established way to describe null pointers/references. For non-developers, whom SQL presumably targeted, "null" just means zero, which is emphatically not the same thing as "unknown". They really should have called it "UNKNOWN" - then its semantics would make sense, and people wouldn't use it to denote a value that is missing (but known).


Sounds like this could have also been mitigated by either the SQL server warning about SUM() operating over a nullable column or a "where foo is not null" clause. Your solution is best, though.



