
“What the Hardware Does” Is Not What Your Program Does: Uninitialized Memory - pornel
https://www.ralfj.de/blog/2019/07/14/uninit.html
======
userbinator
When K&R invented C, and when ISO/ANSI standardised C, I don't think this is
at all what they had in mind for UB. Whenever discussions like this come up, I
like to quote the C standard itself on its definition:

"NOTE: Possible undefined behavior ranges from [...] to behaving during
translation or program execution _in a documented manner characteristic of the
environment_ "

The whole point of C, as K&R and ISO conceived it, is to let you do "what the hardware does".
They left parts of the standard purposefully undefined, so it could remain
applicable to a wide variety of implementations. The intention was definitely
NOT "screw the programmer" as a lot of the compiler writers seem to have
interpreted it, but to allow them to do something that makes sense for the
environment.

Now we have people, mostly academics from what I've noticed, taking the
language farther and farther away from reality, focusing only on something
uselessly abstract and completely ignoring the practical consequences of what
they're doing. Compilers are becoming increasingly hostile to programmers. A
programming language that had humble and very practical uses, with
straightforward and easily understood behaviour, has been perverted into a
theoretical quagmire of uselessness.

Something very very odd is going on, and I don't like it one bit. It's very
WTF-inducing.

(I don't know much about Rust, hence why I didn't say anything about it. But
I've been using C since the late 80s.)

~~~
lmm
> Now we have people, mostly academics from what I've noticed, taking the
> language farther and farther away from reality, focusing only on something
> uselessly abstract and completely ignoring the practical consequences of
> what they're doing. Compilers are becoming increasingly hostile to
> programmers.

I don't think it's the academics, it's the compiler implementers - driven by
wanting to win at benchmarks. Our industry is absurdly focused on
"performance" over all else (even correctness). But then again, those who care
about other things moved on to non-C languages years or decades ago.

~~~
roca
Indeed, it's compiler implementors, not academics.

However, the performance impact of optimizations that take advantage of UB is
not known, and is potentially very large. It would be a very interesting
experiment to modify a C/C++ compiler so that every C/C++ program has defined
semantics in terms of a simple array-of-bytes abstract machine, and see how
much slower the generated code is compared to the regular compiler.

~~~
userbinator
_It would be a very interesting experiment to modify a C/C++ compiler so that
every C/C++ program has defined semantics in terms of a simple array-of-bytes
abstract machine, and see how much slower the generated code is compared to
the regular compiler._

Alternatively, look at compilers like Intel's ICC --- it has historically been
one of the best at code generation, yet it's not known for having anywhere
near the same level of UB-craziness as Clang (or GCC, to a lesser extent). The
same has been my experience with MSVC, at least the earlier versions.

~~~
pcwalton
On the contrary, ICC _doesn't even implement IEEE 754 correctly_ by default
[1], for performance. This is way more aggressive than GCC or Clang.

[1]: [https://hal.archives-ouvertes.fr/hal-00128124v5/document](https://hal.archives-ouvertes.fr/hal-00128124v5/document)

------
an_d_rew
> the Rust program you wrote does not run on your hardware. It runs on the
> Rust abstract machine

Excellent point, and very well made… Thank you for writing this!

There is a constant back-and-forth here on Hacker News about whether or not
“undefined behavior” is the root of all evil or the root of all real-world
optimizations… And your article does a great job of explaining, in real-world
terms, what UB really (in part) is.

~~~
twic
I don't like this formulation much. I would rather say that the program does
indeed run on your hardware, but the program that runs is not the program you
wrote. It's the program into which the program you wrote has been transformed
by the compiler; better yet, it could be any one of the programs into which
the compiler is allowed to transform the program you wrote.

I prefer this because it is more open about the fact that all the weird and
counterintuitive behaviour happens because the language specifiers and
compiler implementers decided to make it that way (usually, with good
reason!). It's not some unavoidable property of the universe, or the machine.

The idea of an abstract machine is still really useful, because it lets you
reason directly about the code you are writing, rather than having to express
and reason about what the compiler might do with it. But I think we should be
clear that it's a tool for thinking, not a truth.

The idea that you can ignore the real hardware is particularly unhelpful in
Rust, because it's a great fit to low-level problems where the real hardware
is a big deal. For example, at work, we have a Rust program where we routinely
need to think about NUMA placement and cache coherency protocols. Those don't
exist in the Rust abstract machine at all!

~~~
anp
> The idea that you can ignore the real hardware is particularly unhelpful in
> Rust, because it's a great fit to low-level problems where the real hardware
> is a big deal. For example, at work, we have a Rust program where we
> routinely need to think about NUMA placement and cache coherency protocols.
> Those don't exist in the Rust abstract machine at all!

If I understand Ralf's overall model here, I think they'd argue that this sort
of reasoning must be done within the context of the abstract machine's
behavior for std::ptr::{read,write}_volatile, no?

~~~
twic
The kind of reasoning we do is "if thread A writes to this location, then next
time thread B writes to this location, it will have to take ownership of the
cache line, which will take N cycles, so let's not do that". I don't think
that kind of performance reasoning maps on to anything in Rust.

------
rkagerer
Would have liked to see the machine code generated by the example function,
and a deeper dive mapping compiler choices to the unintuitive results.

The article (indeed the point of it) abstracts that all away behind "undefined
behavior" and a mental model sitting between your code and its resulting
executable. Which is fine, but it leaves a loose end which fails to sate my
curiosity.

~~~
archgoon
It depends on whether you generate it using rustc 1.28 or rustc 1.36, and
whether you're compiling with or without optimizations. This does not crash in
unoptimized Rust (either 1.36 or 1.28), but it will crash in optimized Rust
1.36.

[https://godbolt.org/z/8Yxl2c](https://godbolt.org/z/8Yxl2c)

I think, though (as your question indicates), that the author misses the
point of why people care about "what the hardware does". At the end of the
day, assembly code is going to execute, and that assembly code is going to
(despite the author's protestations to the contrary) have well-defined memory
of one value or another. The moment you start saying "Rust has a third value
of uninitialized", the question comes up: "How is that abstraction enforced by
the hardware?" This is valuable information for understanding how the language
works.

From the author's discussion, I was expecting some sort of sentinel value
being checked; instead, the uninitialized memory access is detected by the
compiler, and the program panics uniformly regardless of the actual memory
state.

The idea that one should only worry about the abstract virtual machine of Rust
seems like an encouragement of magical thinking. "Don't worry about how any of
this works, the compiler will just make it happen". This will not go over well
with many people who are curious about learning Rust.

However, if the author is arguing "Don't let the behavior of a naive
enforcement of a Rust safety construct dictate how the optimized version
should work" this seems like a more interesting position; but it's not clear
that is the argument being made here.

~~~
Rusky
> However, if the author is arguing "Don't let the behavior of a naive
> enforcement of a Rust safety construct dictate how the optimized version
> should work" this seems like a more interesting position; but it's not clear
> that is the argument being made here.

This is exactly the point the author is arguing. The focus of all their work
on UB is to make sure safe Rust can do all the optimizations we would like, by
careful design of the abstract machine.

The immediately visible outcome of this work is a set of rules for what you
can do in _unsafe_ Rust, which taken together amount to this weird-looking
abstract machine with its extra "uninitialized" values: something that can be
implemented efficiently on real hardware _assuming no UB_.

The point here is that this abstract machine is a better, simpler, easier way
to convince yourself whether or not an unsafe Rust program is well-defined,
and that "what the hardware does" is too many layers removed to be a good tool
here. You can think about "what the hardware does" another time, for other
purposes, but trying to do so in this context is actively unhelpful.

~~~
archgoon
_shrug_. There's a difference between saying "This is a useful abstraction"
and saying "understanding what assembly is generated is irrelevant in
understanding what your program does so I will try to end all discussions
where it comes up".

I mean, the latter seems quite a bit more extreme, and is what the author
explicitly is calling for.

~~~
saagarjha
The issue is that there are no guarantees on the generated assembly, and what
your compiler of today may do is not necessarily what tomorrow's will.

~~~
archgoon
There are most certainly guarantees on the generated assembly. The assembly
has to enforce the abstract machine. I want to know how it does that. It can
change, that's fine, it can be improved, it can be made worse, but the idea
that the Rust program doesn't run on physical hardware, as explicitly stated
in the article, is pure bullshit.

~~~
ralfjung
> The assembly has to enforce the abstract machine.

The assembly has to implement the abstract machine _only if your program has
no UB_. The assembly never has to check if memory is "initialized" or not even
though that distinction is real on the abstract machine, because if the
difference mattered, your program would have UB.

To determine _if_ your program has UB, looking at the assembly is useless. The
only way is to consider the abstract machine.

~~~
archgoon
When you are in UB territory, it's even more interesting to ask what the
hardware actually does, because the standard will not specify anything.

~~~
saagarjha
No, because the compiler will not generate code that is consistent in this
case.

~~~
archgoon
So? What do you think I'm arguing for here?

~~~
saagarjha
There is no standard mapping between "your C code" and "what your computer
will do" if your code has undefined behavior. Your compiler will produce some
assembly, which you cannot rely on, and _that_ will be "what your hardware
does". If that's what you're trying to say I think we agree.

------
rini17
Is there an example of actually helpful and practical enabled-by-undefined-
behavior optimization in C/C++? All I can remember are discussions of its
pitfalls, like this.

~~~
steveklabnik
One classic helpful example that gets brought up is integer overflow and
loops. Because signed integer overflow is undefined in C and C++, the loop
doesn't need to check for it on each iteration.
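
To illustrate the kind of code this affects, here's a minimal sketch of my
own (not from the thread): a signed 32-bit induction variable on a 64-bit
target.

    /* Because signed overflow is UB, the compiler may assume i never
       wraps, widen it to a 64-bit register, and turn a[i] into simple
       incremented-pointer arithmetic. With an unsigned 32-bit i, the
       defined wraparound semantics can block that transformation. */
    long sum(const int *a, int n) {
        long s = 0;
        for (int i = 0; i < n; i++)
            s += a[i];
        return s;
    }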

~~~
mhh__
[https://d.godbolt.org/z/_r6C9N](https://d.godbolt.org/z/_r6C9N)

An example of an integer-overflow-based optimization.

~~~
andreareina
Wait, how does that work? Without overflow, how does the compiler prove that
eventually x == 7?

~~~
cryptonector
If x <= 7, then the answer will be 420. If x > 7, the behavior is undefined,
because x will overflow; x is signed, and signed integer overflow is UB in C,
so the compiler is free to, for example, conclude that blazeit() is never
called with x > 7, and thus it can prove that blazeit() always returns 420.
Besides, if the compiler chooses to implement signed integer overflow much
like unsigned integer overflow, then x will eventually wrap around to 7
anyway, so the answer must be 420.
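
The actual code is behind the godbolt link above; a hypothetical
reconstruction matching this description might look like:

    /* Hypothetical reconstruction, for illustration only. */
    int blazeit(int x) {
        if (x == 7)
            return 420;
        return blazeit(x + 1);  /* x only grows; incrementing past
                                   INT_MAX would be signed overflow, UB */
    }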

~~~
rini17
Then later, after some minor unrelated change, the compiler stops optimizing
and it suddenly starts overflowing the stack.

How can anyone put up with that?

~~~
cryptonector
I expect the compiler would optimize the tail recursion into a loop, so the
likely worst case is not that you blow the stack but that you spin for a
while.

~~~
rini17
Not really; any expectations here are unsupported by the C standard. The
compiler is, by spec, allowed to do anything here.

------
quotemstr
I disagree that bit-level and byte-level uninitialized models are equivalent.
Consider a program that uses one bit of a bitfield in a stack-allocated
struct. The compiler is free to preserve the value of that bit however it
pleases --- e.g., in the carry flag --- and randomize the rest of the bits if
you ever read the whole byte.
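
A sketch of the scenario being described (my construction; as the reply below
notes, the precise semantics of the bitfield write itself are part of what is
unclear):

    #include <string.h>

    struct s {
        unsigned flag : 1;   /* the only bit ever written */
        unsigned rest : 7;   /* stays uninitialized */
    };

    int example(void) {
        struct s v;
        v.flag = 1;              /* under a bit-level model: 1 defined bit */
        unsigned char byte;
        memcpy(&byte, &v, 1);    /* copies 7 undefined bits along with it */
        return byte & 1;         /* whether the other 7 bits are stable or
                                    "randomized" is where the models differ */
    }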

~~~
ralfjung
Hm, good point about the bitfields. The paper I cite [1] actually talks
specifically about bitfields as their precise semantics in the presence of
"poison"-style uninitialized memory is not entirely clear yet.

[1]: [http://www.cs.utah.edu/~regehr/papers/undef-pldi17.pdf](http://www.cs.utah.edu/~regehr/papers/undef-pldi17.pdf)

------
martinhath
Why aren't undefined bytes like these treated in the same way as I/O data?
That is, arbitrary but fixed data? This seems to align fairly well with how I
think about uninitialized data.

~~~
lmm
If you're asking why the C standard didn't originally define it that way, it's
because some architectures might use a trap/invalid representation (that traps
when accessed) and we want compilers to be free to reorder memory accesses.

~~~
martinhath
I think I'm mostly asking why this isn't a good solution for Rust, as I think
C and C++'s design decisions should be absolutely irrelevant to its
development. However, since rustc uses LLVM, this seems to be difficult :(

I suppose it could be enlightening to understand why it wasn't a good decision
for C or C++ at the time either.

> some architectures might use a trap/invalid representation

Traps on what? Access of an invalid representation? What if such
representations don't exist?

~~~
lmm
> Traps on what? Access of an invalid representation?

Yeah - certain bit-patterns are just "invalid" rather than representing any
given value. It's much nicer to debug, because you get an immediate failure
(at the point where your code tries to access the uninitialized variable)
rather than having a corrupt value propagate through your program.

> What if such representations don't exist?

Then you can't implement that strategy (other than by emulating it with
considerable overhead, e.g. by having an extra marker byte for each variable
in your program and checking it on every access). Hence why the C standard
doesn't require you to do this.
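
A minimal sketch of that marker-byte strategy (names are illustrative; this
is essentially what shadow-memory tools like Valgrind and MemorySanitizer do,
at much larger scale):

    #include <assert.h>
    #include <stdbool.h>

    /* Emulating a trap representation where the hardware has none:
       every variable carries a marker that is checked on each access. */
    typedef struct {
        int  value;
        bool initialized;
    } checked_int;

    static void checked_store(checked_int *v, int x) {
        v->value = x;
        v->initialized = true;
    }

    static int checked_load(const checked_int *v) {
        assert(v->initialized && "read of uninitialized variable");
        return v->value;
    }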

As originally intended, C left the behaviour undefined so that users on
platforms that did have trap representations would be able to take advantage
of them. (It's very hard to rigorously specify what accessing a trap
representation should do without impeding the compiler's ability to reorder
memory accesses). Unfortunately it's ended up being used to do the opposite by
modern compilers - not only do they not trap on access to uninitialized
values, they abuse the undefined behaviour rules to propagate unexpected
behaviour even further from the code that caused it.

~~~
Faark
But then this should be documented in the type's definition. I don't see this
"can also be something other than 0-255" in the type's documentation (which is
arguably not at all detailed).

We use types to restrain complexity. It was a mistake in C# to allow every
object to be null; a better type system would allow devs to make a contract
that easily disallows this, and they are trying to fix it. Now here we have a
blog post that seems to be fine with a function parameter of type u8 not
actually being in 0-255. That's a huge change from how I always understood the
type. Do I now have to implement a null-check equivalent?

Undefined behavior for unsafe code is fine. But there has to be a transition
where we go back to classical behavior. And in the blog post's example, this
should be somewhere in _main_. Certainly not the seemingly safe
_always_returns_true_.

~~~
lmm
> But then this should be documented in the type's definition. I don't see
> this "can also be something other than 0-255" in the type's documentation
> (which is arguably not at all detailed).

It's not a valid value of that type - it's not a value you'll ever see if
you're using the language in accordance with the spec (and, in the case of
Rust, not a value you can ever see in safe Rust). It's an uninitialised value.

> We use types to restrain complexity. It was a mistake in C# to allow every
> object to be null; a better type system would allow devs to make a contract
> that easily disallows this, and they are trying to fix it. Now here we have
> a blog post that seems to be fine with a function parameter of type u8 not
> actually being in 0-255. That's a huge change from how I always understood
> the type. Do I now have to implement a null-check equivalent?

The point is for the language to do the null-check equivalent for you. A trap
representation is null done better. Silently defaulting to a valid value is
even worse than silently defaulting to null, because the value propagates even
further from the point where it's wrong - imagine e.g. a Map implementation
that, rather than returning null for a key that isn't present, returned an
arbitrary value.

(Of course in the case of a Map, returning Maybe is better. But there's no way
to do an equivalent thing for uninitialized variables, unless we made every
single field of every single struct be Optional, and that's actually just
equivalent to reintroducing null - the advantage of using Optional is the
ability to have values that aren't Optional, at least in safe code).

> Undefined behavior for unsafe code is fine. But there has to be a transition
> where we go back to classical behavior.

Unfortunately no, that's not and has never been how undefined behaviour works.
Undefined behaviour anywhere in your program invalidates the whole program and
can lead to arbitrary behaviour anywhere else in your program (this has always
been true with or without trap representations).

Pragmatically, what you want in the blog post's example is to get an error
that tells you that the bug is that x was uninitialized, as soon and as close
as possible to the point where x is actually used uninitialized. Ideally that
would be on the "let x = ..." line (and if you didn't use "unsafe", that line
would already be an error), but given that you've made the mistake, you're
better off having an error as soon as you touch x (which happens in
always_returns_true). Then you can see what the problem is and what's caused
it. If always_returns_true runs "successfully", returning false, then you
don't actually find out there's a bug until later (potentially much later) in
your program, and have to do a lot of detective work to find out what went
wrong.

~~~
ralfjung
> Unfortunately no, that's not and has never been how undefined behaviour
> works. Undefined behaviour anywhere in your program invalidates the whole
> program and can lead to arbitrary behaviour anywhere else in your program
> (this has always been true with or without trap representations).

I even have a post about this. :D
[https://www.ralfj.de/blog/2016/01/09/the-scope-of-unsafe.html](https://www.ralfj.de/blog/2016/01/09/the-scope-of-unsafe.html)

------
saagarjha
> So each time an uninitialized variable gets used, we can just use any
> machine register—and for different uses, those can be different registers!
> So, one time we “look” at x it can be at least 150, and then when we look at
> it again it is less than 120, even though x did not change. x was just
> uninitialized all the time.

Is this actually true? I thought these would just be poisoned and then the
optimizer would just do whatever it liked in the presence of undefined
behavior (like optimize the function to return true).

~~~
cperciva
It could do that, but it doesn't have to. A common example is "a variable is
set inside a loop and used outside it"; if the loop runs zero times, the
variable will be used uninitialized, but the fastest compilation will assume
that the loop runs at least once and read the value from whatever
register/memory location the loop writes into.
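
In code, the pattern looks something like this (my sketch):

    /* "Set inside a loop, used outside it": if n == 0 the loop body
       never runs and `last` is read uninitialized. The fastest
       compilation simply reads whatever register or stack slot the
       loop would have written. */
    int last_element(const int *a, int n) {
        int last;                /* deliberately not initialized */
        for (int i = 0; i < n; i++)
            last = a[i];
        return last;             /* UB when n == 0 */
    }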

------
rthrowayay
This poison thing is interesting because it seems like one poison value could
have the capability of poisoning all the data in your program. You could
imagine that a faithful implementation of the Rust abstract machine would,
after hitting one such code path, cause your program to start generating
complete nonsense.

------
scoutt
I appreciate the attempt to explain low-level stuff, but I think this is a
high-level language programmer trying to work through an issue
herself/himself without a clear idea of what he/she is talking about.

>> The answer is that every byte in memory cannot just have a value in 0..256,

0..256?? Is this still 8-bit bytes??

>> it can also be “uninitialized”. Memory remembers if you initialized it.

This is plainly wrong.

>> So, one time we “look” at x it can be at least 150, and then when we look
at it again it is less than 120, even though x did not change. x was just
uninitialized all the time.

You might be dealing with a bug on a non-volatile variable. It has nothing to
do with allocated but uninitialized memory.

~~~
noobermin
The problem here (and I caught it too and sort of recoiled at it) is that
they were reasoning, with some unfortunate language and word choices, as if
the "abstract machine" were the actual thing we should think about when
reasoning about a programming language, before they went out and were
explicit that that's what they were doing. They saved the explicit thesis
until the end of the piece. They should have traded the writer's desire to be
witty and bury the lede for just leading with it upfront.

Of course you're right in part that their idea of "memory" is an abstraction,
but it isn't too wrong. A "variable" in C or any compiled language on a modern
machine is an abstraction that could refer to a register one moment, be a
place in the cache in another, be a place in memory in the next, and be on a
swapfile after. The "variables" are abstractions which lie in "memory" which
is another abstraction because it need not be in one place.

~~~
ralfjung
Good point, I should have at least mentioned that there is an "abstract
machine" when I introduce this strange kind of memory. Thanks for the
feedback!

------
IshKebab
I don't think this is really right. He claims that uninitialised memory is not
just random bytes, but it is!

It's just that there is a compile-time optimisation that allows the compiler
to assume you will never read from uninitialised memory.

It would be perfectly possible to make a language (or even a C++ compiler)
that didn't perform that optimisation.

~~~
ralfjung
> I don't think this is really right. He claims that uninitialised memory is
> not just random bytes, but it is!

No it's not. To describe the behavior of a program involving uninitialized
memory (like the example in my post), at no point in time do you need to talk
about arbitrarily chosen bytes. The "abstract machine" on which a Rust
program runs (of which your hardware is a fast implementation, but _only
accurate for UB-free programs_) does not "pick random bytes" when you
allocate new memory; it just fills it all with `None`.

You should not think in terms of optimizations when thinking about what your
program does. The optimizations the compiler performs can change from version
to version and are affected by seemingly random changes at the other end of
your program.

> It would be perfectly possible to make a language (or even a C++ compiler)
> that didn't perform that optimisation.

Sure. That would be a different language though, with a different abstract
machine. C/C++/Rust behave the way I described (and that behavior is not
defined by what any particular compiler does).

~~~
IshKebab
You missed my point. _Rust_ and _C_ may treat uninitialised memory in a
special way, but that doesn't change what uninitialised memory _actually is_.

Think about assembly. What is uninitialised memory there? It's just memory
with an unknown value.

------
pizlonator
Sounds like a lot of excuses for what is really a compiler bug that found its
way into the spec.

~~~
saagarjha
Undefined behavior is _not_ a compiler bug: it's a necessity, required to
have language constructs that give you a way of doing things for which there
is no good way to define the behavior. Your compiler optimizes your code every
day by concluding that you're not doing anything illegal: it would have a
rather miserable time if it couldn't make these assumptions.

------
verisimilitudes
The real issue is that C and C++ are horrible languages. They're too
high-level to correspond to any real machine, yet too low-level for such an
abstract machine to be useful. The C language leaves the precise lengths of
types up for grabs, as merely one example. As for Rust, I'd figure it's poor
as well, considering it follows in the footsteps of C++.

I can compile an Ada program that has an uninitialized variable and use it,
but I get a warning; there's also a _Valid_ attribute that acts as a predicate
for whether a scalar value has an acceptable value or not.

To @userbinator: you're mistaken to believe that C has much design behind it.
There are many things where one requires a certain range of values and C
forces the programmer to use a type that's much larger than necessary and cope
with unwanted values. The C language leaves details to the underlying machine,
so long as that machine is determined to pretend it's a PDP-11. Most languages
that have a standard expect the programmer to follow it; since most C
programmers don't know what the standard says, having been lied to about it
being a simple and base language, they're offended when they do something they
never should've done; they shouldn't be using C anyway, however.

Abstract language details are necessary for a high-level language and can work
quite well if the language is designed well; this then leaves high-level
features to be implemented in whichever way is best for the machine; the C
language doesn't do this well at all, however, and precisely specifies the
nature of irrelevant details and so hinders the machine and implementation
possibilities.

The C language doesn't even have true boolean values or arrays thereof. You're
expected to use an entire integer that's zero or not and you're left to your
own devices if you want an array of these values that isn't grotesque in its
wastefulness. Meanwhile, most proper languages have the concept of types that
only have two values and can easily use an underlying machine representation
for efficiently representing these, without involving the programmer.

In closing, you may argue that C is necessary because it permits specifying
these low-level details, albeit required in every case instead of only where
necessary. To that, I direct you to look at Ada, which permits the programmer
to ignore such details wherever unneeded, and so leave them to the compiler’s
discretion, but allows size, address, representation, bit-level organization,
and more to be specified in those cases where it's truly necessary.

Here's a link others may like for learning more about Ada and the deficiencies
of C:

[https://annexi-strayline.com/blog](https://annexi-strayline.com/blog)

~~~
kyllo
> Meanwhile, most proper languages have the concept of types that only have
> two values and can easily use an underlying machine representation for
> efficiently representing these, without involving the programmer.

Don't other languages still use an entire byte to represent a bool though,
since memory access is at the byte level? Having a bool type in the type
system is really a language usability concern; I don't think it's at all a
performance optimization. And stdbool.h exists now, so that concern has been
addressed. When you want a bitmap, you can just use an int of the appropriate
length and do bitwise operations on it, instead of wasting space with an array
of ints.
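
For instance, a sketch of that idiom in C (my example):

    #include <stdbool.h>
    #include <stdint.h>

    /* One uint32_t as a 32-entry bitmap, instead of an array of ints. */
    static inline void set_bit(uint32_t *b, unsigned i)   { *b |=  (UINT32_C(1) << i); }
    static inline void clear_bit(uint32_t *b, unsigned i) { *b &= ~(UINT32_C(1) << i); }
    static inline bool test_bit(uint32_t b, unsigned i)   { return (b >> i) & 1u; }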

~~~
verisimilitudes
>Don't other languages still use an entire byte to represent a bool though,
since memory access is at the byte level?

While at the discretion of the implementation, Common Lisp is a language that
can easily and transparently perform this optimization. Common Lisp even has a
specialized array type, BIT-VECTOR, which can only hold values of zero or one,
which is more likely to be optimized for size than other types. Ada allows the
programmer to specify data structures be optimized for size, which is nice.

Now, representing a lone true or false value is a different matter and I'd
expect it to consume an entire register or whatnot under most anything, since
you probably wouldn't be able to store anything else with the remaining space.

>Having a bool type in the type system is really a language usability concern,
I don't think it's at all a performance optimization.

Ada has a boolean type because there are clearly boolean situations, such as
predicates, and having a dedicated type reduces use errors. Programmers are
encouraged to define their own boolean types, though, such as (On, Off), say.

>And when you want a bitmap, you can just use an int of the appropriate length
and do bitwise operations on it.

That's what I was describing. Why should a high-level language have you making
your own arrays? Don't you agree that programs would benefit from a
specialized type for this that can more easily be optimized and specialized
for the particular machine and whatnot?

------
tomp
This is just terrible. I'm really sad that it's 2019, and not only are we
_still_ talking about undefined behaviour, but there are also blog posts
arguing _for_ undefined behaviour! I expect a good (programmer-friendly)
compiler to _at least_ warn the programmer in any case of provable, or
potential, undefined behaviour, or ideally, refuse to compile/use
_implementation-defined_ behaviour (i.e. exactly "what the hardware does").
Anything else is basically just inviting security bugs in your code.

But, for a counter-point: what is an example of a code/algorithm that not only
_uses_ undefined behaviour (i.e. relies on it in order to compile to fast,
optimized code), but also couldn't possibly be rewritten to eliminate
undefined behaviour (while keeping the same speed)?

~~~
nemetroid
> But, for a counter-point: what is an example of a code/algorithm that not
> only uses undefined behaviour (i.e. relies on it in order to compile to
> fast, optimized code), but also couldn't possibly be rewritten to eliminate
> undefined behaviour (while keeping the same speed)?

Any code that handles signed integers is going to assume that
overflow/underflow does not happen.

~~~
aarongolliver
Spent the week trying to figure out how to reimplement __builtin_add_overflow
(et al) on Windows and boy is it a chore. The previous implementer had
literally just used operator+ in a function called "safe_add" and I was
dumbstruck.
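
For reference, one portable way to write the check in standard C is to test
the bounds _before_ adding, since the overflowing addition itself would be UB
(a sketch; unlike the builtin, it leaves *out untouched on overflow):

    #include <limits.h>
    #include <stdbool.h>

    /* Portable stand-in for __builtin_add_overflow for ints:
       returns true if a + b would overflow. */
    static bool add_overflow(int a, int b, int *out) {
        if ((b > 0 && a > INT_MAX - b) ||
            (b < 0 && a < INT_MIN - b))
            return true;         /* would overflow; *out not written */
        *out = a + b;
        return false;
    }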

