

Assert(int+100 > int) optimized away - mhyee
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=30475

======
rspeer
Wow. That was a pretty stark example of what not to do when checking for
integer overflow, _and_ what not to do when reporting a bug.

~~~
zaroth
I think the point is not "this is the best/right way to write the check", but
rather, this way of writing the check used to work, and is pervasive, and now
it gets optimized away, and that seems like a Bad Thing.

The bug report appears to contain a lot of investigative work by the submitter
to try to substantiate their claims.

I don't have a problem with the back-and-forth, people are allowed to get
passionate, especially if they are doing the leg work to try to substantiate
their position.

~~~
tempestn
There's a difference between passion and just being rude though. Not only is
the latter unpleasant, it removes _any_ chance you might have had of
convincing anyone of anything. I mean, his third post ended with, "THIS IS NOT
A JOKE. FIX THIS! NOW!" Can you imagine anyone reacting positively to that?

------
jloughry
The issue is six years old (2007) but the problem is still relevant in 2013:
see the brand new paper by Wang, et al. 'Towards Optimization-Safe Systems:
Analyzing the Impact of Undefined Behavior' to appear in SOSP'13 next week.
The authors define 'optimisation-unstable code' unusually; what I think they
mean is that compilers are permitted by the C language standard to do
_anything_ with undefined code, but the real problem is that any change of
compiler, environment, flags, or compiler version might _change the behaviour_
of existing code, including opening up security vulnerabilities that weren't
there last week.

 _ETA:_ here is the link to the paper, in PDF:

[http://pdos.csail.mit.edu/papers/stack:sosp13.pdf](http://pdos.csail.mit.edu/papers/stack:sosp13.pdf)

------
zaroth
As very much not a C programmer, I did sort of grasp the argument: while the
_overflow_ behavior may be undefined, once the overflow has happened and the
end result is that the sign goes negative, how can the compiler reasonably
skip the negative branch?

    
    
      #include <limits.h>  /* for INT_MAX */
    
      int a, b, c;
      a = INT_MAX;   /* statement ONE */
      b = a + 2;     /* statement TWO: signed overflow, undefined behavior */
      c = (b > a);   /* statement THREE: the optimizer may fold this to 1 */

~~~
MaulingMonkey
Undefined behavior means _anything_ is allowed.

If you have a buffer overflow which overwrites your x86 instructions to
completely change the behavior of the program, that's "OK" because buffer
overflows are "undefined behavior". Simple cause and effect.

The authors of compiler optimizers noted this basically gave them free rein
to optimize based on the assumption that undefined behavior does not happen,
because if it DID happen, whatever the program ends up doing in that case is
not a concern; whatever it is, it's "OK".

An example of this: if I dereference a null pointer and assign three through
it, and then after that dereference check whether the pointer is null, the
compiler can optimize away that check.

On a typical x86 system, this sounds reasonable: "It'll crash anyways, right?"
...however, the optimizer is allowed to do this even if there's no guarantee
there will be an access violation, because I'm using an unprotected memory
model, or the member is far enough into the containing struct to land in a
separate mapped page, etc.

Taken to a further extreme, this applies to signed integer overflow as well.
"You're not allowed to do it within a well defined program, ergo, we assume
you don't do it and optimize away based on that assumption."

