It doesn't panic within the code you typed, but it absolutely still can panic on OOM. Which is sort of the problem with "no panic"-style code in any language: you eventually hit fundamental constructs that can't be treated as infallible.
> Basically every practical language has some form of "this should never happen" root.
99% of "unrecoverable failures" like this, in pretty much every language, are because we treat memory allocation as infallible when it actually isn't. It feels like there is room in the language design space for one that treats allocation as a first-class construct, with suitable error-handling behaviour...
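Rust's standard library already has a small foothold here: `Vec::try_reserve` (stable since 1.57) surfaces allocation failure as a `Result` instead of aborting the process. A minimal sketch of what first-class fallible allocation looks like (the function name and the size are just illustrative):

```rust
use std::collections::TryReserveError;

// Build a large buffer without aborting on OOM: `try_reserve`
// reports allocation failure as an error value the caller can
// handle, rather than killing the process.
fn read_huge(len: usize) -> Result<Vec<u8>, TryReserveError> {
    let mut buf = Vec::new();
    buf.try_reserve(len)?; // fails gracefully instead of aborting
    buf.resize(len, 0);    // capacity already reserved; no further allocation
    Ok(buf)
}

fn main() {
    // Absurdly large request; will fail on any real machine,
    // but the program keeps running and can recover.
    match read_huge(usize::MAX / 2) {
        Ok(buf) => println!("allocated {} bytes", buf.len()),
        Err(e) => eprintln!("allocation failed, recovering: {e}"),
    }
}
```

The catch is that this only covers code you write directly; anything in between (including most of the standard library) still takes the infallible path.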
Treating memory as infinite is a very reasonable assumption for the vast majority of programs. In the rare contexts where you do need to handle allocation failure, doing so comes at a great engineering cost.
That's not really what this is about, IMV. The vast majority of unrecoverable errors are simply bugs.
A context-free example many will be familiar with is a deadlock. The programmer's mental model of the program was incomplete, or they were otherwise mistaken about it. You can't statically eliminate deadlocks in an arbitrary program without introducing more expensive problems, so in practice programmers employ a variety of heuristics to avoid them and just fix the bugs when they are detected.
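For anyone who hasn't been bitten by one, here's a minimal Rust sketch of the classic lock-order inversion (variable names and timings are just illustrative). Note the failure mode: the program hangs, it doesn't panic.

```rust
use std::sync::{Arc, Mutex};
use std::thread;
use std::time::Duration;

// Classic lock-order inversion: each thread takes the two locks in
// the opposite order, so each can end up holding one lock while
// waiting forever for the other.
fn main() {
    let a = Arc::new(Mutex::new(0));
    let b = Arc::new(Mutex::new(0));

    let (a2, b2) = (Arc::clone(&a), Arc::clone(&b));
    let t = thread::spawn(move || {
        let _ga = a2.lock().unwrap();           // holds A...
        thread::sleep(Duration::from_millis(50));
        let _gb = b2.lock().unwrap();           // ...waits on B
    });

    let _gb = b.lock().unwrap();                // holds B...
    thread::sleep(Duration::from_millis(50));
    let _ga = a.lock().unwrap();                // ...waits on A

    t.join().unwrap(); // never reached: both threads are stuck
}
```

The usual heuristic fix is to impose a global lock ordering (always take A before B), which no compiler checks for you in the general case.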
Deadlocks also don’t result in panics in most environments. The problem isn’t so much bugs - those can be found and fixed. The problem is more that no_panic in most languages implies no_alloc, and that eliminates most useful code.