
Dynamic Pessimization: Shaggy Dogs and SpiderMonkey Unwinders - DonHopkins
http://tromey.com/blog/?p=927
======
DonHopkins
The pioneering prototype-based object-oriented language Self [1] solved the
problem of debugging JITted code way back in 1992 with a technique called
"Dynamic Deoptimization".

[1] Self:
[https://en.wikipedia.org/wiki/Self_(programming_language)](https://en.wikipedia.org/wiki/Self_\(programming_language\))

That made it possible to set source level breakpoints in optimized code and
examine the stack frames that would have been generated had the code not been
optimized.

It also made it possible to handle programming changes during debugging: you
can change a running program and immediately observe the effects of the change.

Dynamic Deoptimization is also useful for purposes not related to debugging,
like throwing away code with invalid optimistic optimizations.

Self led the way to Sun's HotSpot JIT Java compiler, which led the way to
JavaScript's JIT compilers, but some of Self's important original ideas were
lost along the way, and are now being rediscovered.

My hope is that this time around they'll call it "Dynamic Pessimization".

[2] Debugging Optimized Code with Dynamic Deoptimization:
[http://bibliography.selflanguage.org/dynamic-deoptimization.html](http://bibliography.selflanguage.org/dynamic-deoptimization.html)

[3] PDF:
[http://bibliography.selflanguage.org/_static/dynamic-deoptimization.pdf](http://bibliography.selflanguage.org/_static/dynamic-deoptimization.pdf)

By Urs Hölzle (Stanford University), Craig Chambers (University of Washington)
and David Ungar (Sun Microsystems Labs).

Proceedings of the ACM SIGPLAN ‘92 Conference on Programming Language Design
and Implementation, pp. 32-43, San Francisco, June, 1992.

Published as SIGPLAN Notices 27(7), July, 1992.

Abstract: Self’s debugging system provides complete source-level debugging
(expected behavior) with globally optimized code. It shields the debugger from
optimizations performed by the compiler by dynamically deoptimizing code on
demand. Deoptimization only affects the procedure activations that are
actively being debugged; all other code runs at full speed. Deoptimization
requires the compiler to supply debugging information at discrete interrupt
points; the compiler can still perform extensive optimizations between
interrupt points without affecting debuggability. At the same time, the
inability to interrupt between interrupt points is invisible to the user. Our
debugging system also handles programming changes during debugging. Again, the
system provides expected behavior: it is possible to change a running program
and immediately observe the effects of the change. Dynamic deoptimization
transforms old compiled code (which may contain inlined copies of the old
version of the changed procedure) into new versions reflecting the current
source-level state. To the best of our knowledge, Self is the first practical
system providing full expected behavior with globally optimized code.

Conclusions: Global optimization need not impair source-level debugging. The
SELF system increases programmer productivity by providing full source-level
debugging of globally optimized code. To the best of our knowledge, SELF is
the first system to do so; other systems either compromise on debugging
functionality or severely restrict the kinds of optimizations that can be
performed. In SELF, the compiler can perform optimizations such as constant
folding, common subexpression elimination, dead code elimination, procedure
integration, code motion, and instruction scheduling without affecting
debuggability.

Two techniques make this possible: lazy deoptimization and interrupt points.
The optimizations performed by the compiler are hidden from the debugger by
deoptimizing code whenever necessary. Deoptimization supports single-stepping,
running a method to completion, replacing an inlined method, and other
operations, but only affects the procedure activations which are actively
being debugged; all other code runs at full speed. Debugging information is
only needed at relatively widely-spaced interrupt points, so that the compiler
can perform extensive optimizations between interrupt points without affecting
debuggability. Our debugging techniques are not specific to SELF and could be
applied to other programming languages as well.
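The lazy-deoptimization idea above can be sketched as a toy model (this is illustrative Python, not Self's actual data structures; the `OptimizedFrame` class, its fields, and the rollback-to-interrupt-point logic are all hypothetical):

```python
# Toy sketch: an "optimized" activation records source-level debugging
# metadata only at discrete interrupt points. A frame is rewritten into
# a source-level view lazily, only when the debugger touches it; all
# untouched frames keep running optimized code.

from dataclasses import dataclass, field

@dataclass
class OptimizedFrame:
    pc: int                                    # machine-level position
    # interrupt-point pc -> map of source-level variables valid there
    interrupt_points: dict = field(default_factory=dict)
    deoptimized: bool = False
    source_vars: dict = field(default_factory=dict)

    def deoptimize(self):
        """Lazily rebuild a source-level frame, on demand only."""
        if self.deoptimized:
            return self.source_vars
        # Advance to the nearest interrupt point at or after pc, where
        # the compiler guaranteed a consistent source-level state; in
        # between, it was free to optimize without debugging constraints.
        point = min(p for p in self.interrupt_points if p >= self.pc)
        self.source_vars = dict(self.interrupt_points[point])
        self.deoptimized = True
        return self.source_vars

# Metadata exists only at interrupt points 0 and 10; pc 4 lies between.
frame = OptimizedFrame(pc=4,
                       interrupt_points={0: {"x": 1}, 10: {"x": 2, "y": 3}})
print(frame.deoptimize())   # → {'x': 2, 'y': 3}
```

The point of the sketch is the laziness: the cost of reconstructing a source-level frame is paid only for the activation actually being inspected.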

[5] About the dynamic de-optimization of HotSpot:
[http://stackoverflow.com/questions/20522870/about-the-dynamic-de-optimization-of-hotspot](http://stackoverflow.com/questions/20522870/about-the-dynamic-de-optimization-of-hotspot)

[6] HotSpot Performance Techniques:
[https://wiki.openjdk.java.net/display/HotSpot/PerformanceTechniques](https://wiki.openjdk.java.net/display/HotSpot/PerformanceTechniques)

Deoptimization is the process of changing an optimized stack frame to an
unoptimized one. With respect to compiled methods, it is also the process of
throwing away code with invalid optimistic optimizations, and replacing it by
less-optimized, more robust code. A method may in principle be deoptimized
dozens of times.

1. The compiler may stub out an untaken branch and deoptimize if it is ever
taken.

2. Similarly for low-level safety checks that have historically never failed.

3. If a call site or cast encounters an unexpected type, the compiler
deoptimizes.

4. If a class is loaded that invalidates an earlier class hierarchy analysis,
any affected method activations, in any thread, are forced to a safepoint and
deoptimized.

5. Such indirect deoptimization is mediated by the dependency system. If the
compiler makes an unchecked assumption, it must register a checkable
dependency (e.g., that class Foo has no subclasses, or that method Foo.bar has
no overrides).

