
Differentiable approximations to the min and max operators - aidanrocke
https://github.com/AidanRocke/analytic_min-max_operators
======
heartbeats
How does this stack up to the softmax[0] function, log(exp(x_1) + exp(x_2) +
... + exp(x_n))? Are they analytically equivalent?

0:
[https://en.wikipedia.org/wiki/LogSumExp](https://en.wikipedia.org/wiki/LogSumExp)
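LogSumExp is itself a smooth upper bound on the max: it always exceeds the true maximum, and scaling the inputs by a temperature sharpens the approximation. A minimal sketch in plain Python (the function names are mine, not from the linked repo):

```python
import math

def logsumexp(xs):
    # Naive LogSumExp: log(exp(x_1) + ... + exp(x_n)).
    return math.log(sum(math.exp(x) for x in xs))

xs = [1.0, 2.0, 3.0]

# LogSumExp upper-bounds max(xs) ...
approx = logsumexp(xs)        # ~3.41, slightly above max(xs) = 3

# ... and converges to it as the inputs are scaled up:
t = 10.0
sharper = logsumexp([t * x for x in xs]) / t  # ~3.00005
```

The bias never vanishes for finite inputs (log n in the worst case of n equal entries), which is part of why the approximation quality versus temperature trade-off matters.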

------
aidanrocke
tl;dr

1. Within the context of optimisation, differentiable approximations of the
min and max operators on R^n are very useful.

2. However, for these approximations to be useful they must also be
numerically stable.
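The standard way to get that stability is to shift by the maximum before exponentiating, so exp() never overflows; a smooth min then follows from the identity min(x) = -max(-x). A sketch of both (my own helper names, not necessarily the repo's implementation):

```python
import math

def stable_logsumexp(xs):
    # Shift by the max so every exponent is <= 0 and exp() cannot overflow.
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def smooth_min(xs):
    # Smooth min via the identity min(x) = -max(-x).
    return -stable_logsumexp([-x for x in xs])

# The naive form math.log(sum(math.exp(x) for x in xs)) overflows here,
# since exp(1000) exceeds the largest double; the shifted form does not:
big = [1000.0, 1001.0]
upper = stable_logsumexp(big)  # ~1001.31, slightly above max
lower = smooth_min(big)        # ~999.69, slightly below min
```

By the same symmetry the smooth min is a lower bound on the true min, just as LogSumExp is an upper bound on the true max.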

