Hacker News

Common misconception; see Proebsting's Law and the talk "The Death of Optimizing Compilers":


as well as M. Anton Ertl's "What every compiler writer should know about programmers, or 'Optimization' based on undefined behaviour hurts performance":


Those are the slides that go with the talk, which aren't my favorite format on their own. Does anyone have a link to the talk itself?

The second paper is so biased it hurts. It hardly attempts to hide this bias: on the second page it starts referring to one group of people as "clueless" and never justifies that label by describing what being clued in would look like.

The second paper also rests on a strong assumption that compilers should somehow preserve their current handling of undefined behavior going forward. It is almost as though the paper's author thinks a compiler can divine what the programmer wants without referring to some pre-agreed document, such as the standard for the language.

The second paper also talks only about performance, not about any other real-world concern such as maintainability, reliability, or portability.

The paper sets up straw men when it trots out code with bugs (that loop on page 4) and then shows a pre-release version of a compiler doing something unexpected with it. Of course non-conforming code breaks when compiled; of course pre-release compilers are buggy.

The paper's author wants code to work the same on all systems even when the code conveys unclear semantics. That is unreasonable.

Why even write a loop like that? It's effectively a no-op.

To give the paper's author some credit: that no-op is part of the SPEC benchmark suite, and the author feels that code in that benchmark is being treated as privileged by compiler authors.

Even though I disagree with the author, I try to understand some of his perspective.

There's a gap between "humans can't write assembly better than the compilers" and "there's nothing humans can do to help the compiler write better code".
