
The most important thing to remember with assertions vis-a-vis optimising compilers removing assertion calls: you should never have assertions that cause side effects.


assert InitialiseStuff() != False, 'Initialise failed'

If the optimising compiler is set to eliminate assertions, the InitialiseStuff function won't get called! This will (subtly, or not so subtly) break your program!
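The fix is to keep the side effect out of the assert statement. A minimal sketch, using a hypothetical initialise_stuff function standing in for InitialiseStuff above:

```python
# Hypothetical initialiser, for illustration only.
def initialise_stuff():
    # ... real setup work would happen here ...
    return True

# Unsafe: under `python -O` this whole statement, call included, is stripped.
# assert initialise_stuff(), 'Initialise failed'

# Safe: the side-effecting call always runs; -O only strips the check itself.
ok = initialise_stuff()
assert ok, 'Initialise failed'
```

Separating the call from the check costs one temporary variable, but it means optimised and unoptimised runs execute the same setup code.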

wow. That's a significant gotcha.

I guess there's no reasonable way of changing the language to prevent that from happening, say, only allowing asserts on variables? (I mean obviously because it would break python, but also because it would be an inconsistent implementation.)


In the general case, this is hard because you can easily disallow a lot of useful cases that don't cause problems.

For example, say you want to assert that a particular property of your object contains something:

    assert obj.whatever != None
Whoops, unless your compiler is sophisticated enough to be able to follow the call chain and ensure that the getter doesn't cause any side effects, this is now no longer allowed. You'd have to use a temporary variable, which is unnatural and prevents the very optimization you're trying to pull off.
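To see why this is hard to check statically, here's a small sketch (the class and property are hypothetical) where a plain-looking attribute access is actually a getter with a side effect:

```python
class Cache(object):
    """Illustrative only: attribute access that secretly mutates state."""
    def __init__(self):
        self.hits = 0

    @property
    def whatever(self):
        self.hits += 1   # side effect hidden behind attribute syntax
        return 'value'

obj = Cache()
# Looks like a pure read, but it mutates obj.hits -- and under
# `python -O` the whole line vanishes, so hits stays 0.
assert obj.whatever is not None
```

Syntactically, `obj.whatever` is indistinguishable from a plain attribute read, so any rule like "only allow asserts on variables and attributes" would still let side effects through.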


The reason Python devs may not know this is that almost nobody runs python with -O or -OO -- beyond stripping asserts (and, with -OO, docstrings), they're mostly no-ops with barely any real optimisations. At least not in CPython.

Most developers who work with executable compilers tend to know about this sort of thing already; no doubt because for some of them they've done this very thing by accident and gotten burnt by it at some point or another.


Which optimizing compiler does this? PyPy?


No idea what pypy does but cpython with the -O flag will.

  a = 0
  def gadd(b):
      global a
      a = a + b
      return a
  assert gadd(1) == 1, 'a != 1'
  print a

  $python assert.py
  1
  $python -O assert.py
  0


This sounds like an incorrect compiler.


I disagree, because side effects in an assert would always be a code smell, but placing slow code in there is useful behavior. The line could have easily been:

assert EnsureUnique( obj )

Which runs through the program's data structures to ensure that nothing else matches a property of obj in some way.

Hacky, slow, but very useful to keep around if you have a constraint like that. But if you run it in release mode with production sized data sets, it'll slow to a crawl if you don't cut out the entire check.
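A sketch of what such a check might look like (ensure_unique and the record layout are hypothetical, standing in for EnsureUnique above):

```python
def ensure_unique(obj, all_objects):
    """O(n) scan over the program's data: true iff no other record
    shares obj's key. Cheap in tests, crippling on production data."""
    return sum(1 for o in all_objects if o['key'] == obj['key']) <= 1

records = [{'key': 'a'}, {'key': 'b'}, {'key': 'c'}]
assert ensure_unique(records[0], records), 'duplicate key detected'
```

Run under -O, the whole scan disappears along with the assert, which is exactly the behaviour being defended here.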


I wouldn't rely on assert code for this; better to say "if the dataset is huge, don't check".


When it optimizes out a statement that the spec explicitly says can be optimized out? A correct compiler is one that conforms to the spec. Does using a statement for something outside its intended purpose make the compiler incorrect?


Sorry, I'm not deeply familiar with the Python spec. Outside of C or a lisp I would expect assert to behave more like a function which discards its arguments when disabled and less like a macro which stops evaluating its arguments when disabled. This expectation would obviously be incorrect in the case of Python, which is fairly explicit as to the meaning of assert[0]:

  if __debug__:
      if not expression: raise AssertionError

My mistake!

[0]: http://docs.python.org/2/reference/simple_stmts.html#the-ass...


Agreed, a compiler should replace "assert(FOO)" with "FOO", and then remove "FOO" if it has no side effects, like any statement.


This sounds nice in theory. In practice, the compiler cannot eliminate any log statements (writing to a log file is a side effect). Thus, the decision to not execute "FOO" as for example Java does when asserts are disabled, is the correct one.


I think the GP is referring to when NDEBUG is defined.

