
The Convex Geometry of Inverse Problems [video] - espeed
https://www.youtube.com/watch?v=pcIGP9X9E40
======
Shalhoub
Microsoft Research: 'Each year Microsoft Research hosts hundreds of
influential speakers from around the world including leading scientists,
renowned experts in technology, book authors, and leading academics and makes
videos of these lectures freely available.'

It's a pity Microsoft doesn't expend energy in figuring out how to make
opening an email attachment or clicking on a URL safe.

~~~
whorleater
Microsoft has over 100,000 employees. They're not pulling resources from other
teams for Microsoft Research.

------
digitalemble72
I think it's important to note that all of these methods have been largely
superseded by deep learning techniques. For example, we can now directly learn
algorithms such as gradient descent[1] and classical inverse problems like
superresolution are solvable with deep networks [2]. While there still may be
a role for tools like CVX, I anticipate all future progress will come from
end-to-end differentiable systems.

[1] [https://arxiv.org/abs/1606.04474](https://arxiv.org/abs/1606.04474) [2]
[https://arxiv.org/abs/1501.00092](https://arxiv.org/abs/1501.00092)

~~~
physPop
Largely superseded? This is a sarcastic comment, right?

In case it isn't: the assertion that these are superseded is categorically
false. For any reasonably computationally difficult problem, being able to
capture the structure of a problem is hugely powerful, rather than blindly
throwing deep learning algorithms at it. Just because you _can_ doesn't mean
you _should_.

For example, algorithms for convex problems in particular can be orders of
magnitude more efficient than naive nonlinear approaches. Also consider the
case where the problem has some (possibly sparse) structure, where custom
solvers can render otherwise computationally intractable problems trivial.
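To make the sparsity point concrete, here's a minimal numpy sketch (all names, sizes, and parameters are illustrative choices, not from the talk): recovering a sparse signal from far fewer measurements than unknowns using ISTA, a simple proximal-gradient method for the lasso. The convex formulation bakes the sparse structure into the objective, which is exactly the kind of structure-exploitation a generic black-box method gives up.

```python
import numpy as np

# Recover a k-sparse signal x from n < d linear measurements y = A @ x
# by solving the lasso: minimize 0.5*||A x - y||^2 + lam*||x||_1,
# via ISTA (proximal gradient descent with soft thresholding).
rng = np.random.default_rng(0)
n, d, k = 80, 200, 5                      # measurements, dimension, sparsity
A = rng.standard_normal((n, d)) / np.sqrt(n)
x_true = np.zeros(d)
support = rng.choice(d, size=k, replace=False)
x_true[support] = rng.choice([-1.0, 1.0], size=k) * rng.uniform(0.5, 2.0, size=k)
y = A @ x_true

lam = 0.01
L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the smooth part
x = np.zeros(d)
for _ in range(2000):
    z = x - (A.T @ (A @ x - y)) / L       # gradient step on 0.5*||Ax - y||^2
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

# The largest entries of the estimate sit on the true support.
recovered = np.argsort(-np.abs(x))[:k]
```

With 80 measurements in 200 dimensions, least squares alone is hopelessly underdetermined; the l1 penalty is what makes recovery possible here.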

~~~
scotty79
I read about an application of neural networks to fluid dynamics. It ended up
being faster than the usual approaches. At least some mathematical solutions
might eventually be superseded by pretrained NNs, at least in some contexts.

~~~
SmooL
I think the key was that the neural network didn't compute the fluid model
accurately so much as compute a simulation that looked accurate to humans.

