Inter-Procedural Optimization and Derefinement (playingwithpointers.com)
45 points by adamnemecek on July 17, 2016 | 11 comments



It seems much more sensible to me that the linker shouldn't merge different implementations of a function.

On the other hand, maybe the compiler should rename the function when it does inter-procedural optimization on it.


The C/C++ standards require that two pointers to the same inline function are equal, even across translation unit boundaries. Therefore, in the worst case you need to pick a single implementation of the function in the linker. Of course, you could use different implementations in translation units where the address of the function is insignificant.
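A minimal sketch of what that guarantee means in practice (the file split and names are mine, purely for illustration):

  // a.cpp
  inline int twice(int x) { return 2 * x; }
  int (*addr_from_a())(int) { return &twice; }

  // b.cpp -- identical definition, as the ODR requires for inline functions
  inline int twice(int x) { return 2 * x; }
  int (*addr_from_b())(int) { return &twice; }

  // main.cpp -- must print 1: the linker has to keep a single "twice"
  #include <cstdio>
  int (*addr_from_a())(int);
  int (*addr_from_b())(int);
  int main() { std::printf("%d\n", addr_from_a() == addr_from_b()); }

If one TU's copy had been refined under local assumptions and left addressable, calls through the merged pointer could land in the refined copy, which is (as I understand it) the problem the article is about.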

A similar issue affects the C++ template compilation model used in most compilers. If template instantiations weren't merged at link time, C++ binaries would be incredibly bloated (and C++ binary size is already a problem).
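The template version of the same story, again just a sketch with made-up names:

  // square.h
  template <typename T>
  T square(T x) { return x * x; }

  // a.cpp
  #include "square.h"
  int a_val() { return square(3); }   // this TU emits its own square<int>

  // b.cpp
  #include "square.h"
  int b_val() { return square(4); }   // and so does this one

  // At link time the two square<int> instantiations (weak/COMDAT symbols)
  // are merged, so the binary ends up with a single copy instead of one
  // per translation unit that happened to use the template.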


"The C/C++ standards require that two pointers to the same inline function are equal, even across translation unit boundaries."

Sigh. I presume this is because you can pass inline functions as callbacks, and the 6 people in the world who do

if (a == &f::foo) {do something}

want it to work no matter what?

One solution, of course, is to privatize the function you call, but still hand the "other" version to any code paths that escape pointers or take the address (which is, I presume, what you meant).
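Roughly this split, written out by hand the way a compiler might arrange it (the names are made up, and real compilers would do this at the IR level rather than in source):

  // Canonical copy: the one whose address is taken, and the one the
  // linker may merge with the copies emitted by other translation units.
  bool g(bool b) { return b; }

  // Private clone refined for this TU, supposing the optimizer proved
  // every direct caller here passes true; its address never escapes,
  // so the refinement can't be observed through the merged symbol.
  static bool g_refined(bool) { return true; }

  bool caller(bool (*escaped)(bool)) {
    bool direct = g_refined(true);   // direct call sites use the clone
    bool same = (escaped == &g);     // pointer identity always sees &g
    return direct && same;
  }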


I don't understand the example

  bool g(bool b) {
    if (b)
      return true;
    int unused = 1 / 0;
    return false;
  }
Shouldn't it be optimized to

  bool g(bool b) {
    return b;
  }
instead of

  bool g(bool b) {
    return true;
  }


  int unused = 1 / 0;
is undefined behaviour. The compiler can replace it with anything it wants, for instance with a return true. Compilers usually (ab-)use this to remove branches.
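A common shape of that branch removal, as a sketch of my own rather than anything from the article:

  int first(int *p) {
    int v = *p;          // if p were null, this dereference is UB...
    if (p == nullptr)    // ...so the compiler may assume p is non-null
      return 0;          //    and delete this branch entirely
    return v;
  }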


No need for parentheses around "ab" in abuse. This is just bad behaviour from compilers trying to optimise for benchmarks.

The right thing to do is to issue an error or warning, and then not make absurd assumptions about other variables. A good explanation of all this can be found here: http://c9x.me/compile/bib/ubc.pdf


I guess the problem with issuing warnings is that dead code can also contain undefined behaviour, so the compiler would probably produce lots of false positives.
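A hedged example of the kind of false positive I have in mind (my own, not from the thread):

  template <int Divisor>
  int scaled(int x) {
    if (Divisor == 0)
      return 0;
    return x / Divisor;   // for scaled<0> this line is unreachable, yet it
                          // textually divides by zero; a blanket "this
                          // statement is UB" warning would still flag it
  }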


Division by zero is undefined behavior, so the compiler is free to assume that the b==false branch is unreachable.


It would also be reasonable to remove the declaration as the variable is "unused".


I see.


I've always wondered: is there a time when assuming no undefined behavior occurs (and none actually does) allows the compiler to do an optimization it couldn't otherwise?
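(For concreteness, a sketch of the kind of case I mean, with names of my own: does assuming the signed counter never overflows let the compiler compute the trip count up front, which wrapping semantics would forbid?)

  long sum_upto(int n) {
    long s = 0;
    // With wrapping semantics, n == INT_MAX would make this loop infinite;
    // assuming signed overflow never happens, the compiler may treat it as
    // running exactly n + 1 times, which enables trip-count-based
    // optimizations it otherwise couldn't prove safe.
    for (int i = 0; i <= n; ++i)
      s += i;
    return s;
  }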



