
Approximating a solution that doesn't exist (2009) - kawera
https://www.johndcook.com/blog/2009/08/11/approximating-a-solution-that-doesnt-exist/
======
kxyvr
I've posted this on HN before, but in the world of optimization, I really like
the book Convex Functional Analysis by Kurdila and Zabarankin. It's the only
book that I know of that gives a comprehensive treatment of when a solution to
an optimization problem exists and can be attained. Essentially, the entire
book builds up to what they call the Generalized Weierstrass Theorem, which
gives the appropriate conditions in a few different ways.

Anyway, I work on optimization solvers, and this is a big deal since there's
always a question of whether or not the problem we pose can really be solved.
Numerically, the solver can churn forever and maybe it's just hard, or maybe
there's no solution. Outside of LPs, it's really hard to tell. Even convexity
isn't enough to guarantee a solution. For example, min exp(-x) has no
minimizer even though exp(-x) is strictly convex; there's an infimum (zero),
but it's never attained.
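
A tiny pure-Python sketch of what this failure mode looks like in practice
(my addition, not from the comment): gradient descent on exp(-x) keeps
lowering the objective forever while the iterate drifts off toward infinity,
so there is no point at which it settles on a minimizer.

```python
import math

# Gradient descent on f(x) = exp(-x), which is strictly convex
# but has no minimizer: inf f = 0 is never attained.
def descend(steps, lr=1.0):
    x = 0.0
    for _ in range(steps):
        grad = -math.exp(-x)   # f'(x) = -exp(-x), always negative
        x -= lr * grad         # so x keeps increasing forever
    return x, math.exp(-x)

x, f = descend(10_000)
# The objective keeps shrinking toward the infimum 0, while the
# iterate drifts off roughly like log(n) -- it never converges.
print(x, f)
```

A solver watching only the objective sees steady progress here; only the
runaway iterate (or the theory) tells you nothing is ever going to be attained.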

Mostly, this is a way to say that I agree that the theory matters. I like the
book above for establishing these conditions in optimization.

Edit 1: Does anyone know a good, no-nonsense book for establishing similar
conditions for either ODEs or PDEs? In case it matters, I prefer looking at
things from a general functional analysis/operator point of view.

~~~
johndcook
re book recommendation: Lawrence Evans's book Partial Differential Equations.
Functional analysis etc.

------
conistonwater
> _Also, it’s not obvious from looking at the equation that there should be a
> problem at t = 1._

I found this curious, because it seemed obvious from the equation that there
was going to be a problem. If you think about it, u' = u^2 + 1 has solutions
of the form tan (up to some constants), which are singular. So a function
that satisfies y' = y^2 + t^2 will satisfy y' > u' for t > 1, and so should
also have a singularity; you just don't quite know where it's going to be.
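
A quick numeric sanity check of the comparison function (my addition): with
u(0) = 1 the constant works out to pi/4, so u(t) = tan(t + pi/4) solves
u' = u^2 + 1 and already blows up at t = pi/4, around 0.785.

```python
import math

# u(t) = tan(t + pi/4) solves u' = u^2 + 1 with u(0) = 1,
# since d/dt tan = sec^2 = 1 + tan^2. It is singular at
# t = pi/4 ~= 0.785, where the argument of tan hits pi/2.
def u(t):
    return math.tan(t + math.pi / 4)

# Check the ODE by central finite differences at a few points
# safely inside the interval of existence.
h = 1e-6
for t in (0.0, 0.3, 0.6):
    lhs = (u(t + h) - u(t - h)) / (2 * h)   # numerical u'(t)
    rhs = u(t) ** 2 + 1                     # right-hand side
    assert abs(lhs - rhs) / rhs < 1e-4

assert abs(u(0.0) - 1.0) < 1e-12            # initial condition
```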

~~~
monochromatic
Everything is obvious once you know about it. But be honest--if you looked at
that equation in the course of your daily work (rather than in a blog post
about solutions that don't exist), would you immediately think "well that's
gonna blow up real quick"? I sure wouldn't.

~~~
conistonwater
Yes, I would; that was the point I was trying to make: it _looks_ like it has
a singularity. In my comment I was trying to explain how one might be able to
spot it in advance.

~~~
monochromatic
I guess. It doesn't jump out at me the way that 1/x or tan(x) would though.
But I haven't had a lot of reason to mess with differential equations since
undergrad, so maybe I'm just rusty.

------
mehrdadn
> When I first saw this example, my conclusion was that it showed how
> important theory is.

He drew the exact _opposite_ of the conclusion I would draw here. This seems
like the perfect example for arguing why theory _isn't_ important. Anybody who
knows
nothing about the theory could still tell that there's no solution --
precisely _because_ of the very observation that lowering the step size (error
tolerance) doesn't stabilize the graph and keeps causing larger and larger
changes. If you had a solution, lowering the step size would eventually make a
vanishingly small difference, and it's clearly doing the opposite. You don't
need Picard–Lindelöf to tell you that.
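
The diagnostic described in this comment is easy to sketch (my addition):
forward Euler on y' = y^2 + t^2, y(0) = 1, evaluated at t = 1 with
successively halved steps, gives values that keep growing instead of settling
down, which is exactly the signature of a solution that blows up before t = 1.

```python
def euler_at_one(h):
    """Forward Euler for y' = y^2 + t^2, y(0) = 1, integrated to t = 1."""
    y, t = 1.0, 0.0
    for _ in range(round(1.0 / h)):
        y += h * (y * y + t * t)
        t += h
    return y

# For a problem with a genuine solution on [0, 1], halving the step
# size would make these estimates converge. Here each refinement
# gives a noticeably larger value instead.
estimates = [euler_at_one(h) for h in (0.1, 0.05, 0.025)]
print(estimates)
```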

------
tlb
If you get rid of the t^2 term, it still goes to infinity at t=1.

_dy/dt = y(t)^2; y(0)=1_

has the solution _y = 1/(1-x)_
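
A quick finite-difference check of this (my addition): 1/(1-t) does satisfy
the stripped-down equation, and since y^2 + t^2 >= y^2, the original equation
must blow up no later than this one does.

```python
# y(t) = 1/(1 - t) solves dy/dt = y^2 with y(0) = 1:
# d/dt (1 - t)^-1 = (1 - t)^-2 = y^2. Singular at t = 1.
def y(t):
    return 1.0 / (1.0 - t)

h = 1e-6
for t in (0.0, 0.5, 0.9):
    lhs = (y(t + h) - y(t - h)) / (2 * h)   # numerical y'(t)
    assert abs(lhs - y(t) ** 2) / y(t) ** 2 < 1e-6

assert y(0.0) == 1.0                        # initial condition
```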

~~~
nokcha
>has the solution y = 1/(1-x)

1/(1-t), not 1/(1-x), right?

~~~
tlb
Right

