Hacker News

Is Boyd still talking about convexity?

Convexity is an important concept, and the world is just awash in rock solid treatments.

So, for some positive integer n, consider n-dimensional Euclidean space. Suppose C is a closed, convex subset and point x in the space is not in C. Then there exists a closed half space that covers C and omits x. The boundary of this closed half space can be chosen to touch C, so that the hyperplane bounding the half space is 'supporting' for C. So, this is a 'separation' result. Yes, it generalizes past finite-dimensional Euclidean spaces! Yes, it's a darned useful theorem! E.g., can ask for the point in C closest to x -- the point has to exist since C is closed, and it is unique since C is convex. Then the hyperplane through this point and perpendicular to the line to x is supporting for C. So, can get a nice projection result and use it for optimization and best approximation.
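A quick sanity check in Python (my own toy example, not from any book: C is the closed Euclidean unit ball, x a point outside it). Project x onto C, then verify that the hyperplane through the projection, perpendicular to the line to x, separates x from sampled points of C:

```python
import math
import random

def project_to_ball(x):
    """Project x onto the closed unit ball C = {y : ||y|| <= 1} (Euclidean)."""
    n = math.sqrt(sum(t * t for t in x))
    if n <= 1.0:
        return list(x)
    return [t / n for t in x]

x = [3.0, 4.0]                   # a point outside C
p = project_to_ball(x)           # the closest point of C to x

dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

# Hyperplane through p with normal a = x - p:
#   a . y <= a . p  for all y in C,  while  a . x > a . p
a = [xi - pi for xi, pi in zip(x, p)]
b = dot(a, p)
assert dot(a, x) > b             # x is strictly on the far side

random.seed(0)
for _ in range(1000):            # sample points of C, check they stay on C's side
    y = [random.uniform(-1, 1), random.uniform(-1, 1)]
    n = math.sqrt(y[0] ** 2 + y[1] ** 2)
    if n > 1:
        y = [t / n for t in y]   # pull the sample onto the ball
    assert dot(a, y) <= b + 1e-9
print("separation verified")
```

Here p = (0.6, 0.8), the normal is a = (2.4, 3.2), and the bound b = 4.0 equals ||a||, so a . y <= ||a|| ||y|| <= b for every y in C -- the numeric check just confirms the Cauchy-Schwarz argument.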

Okay, my point: This result is one of the most important about convexity but is standard in 'analysis'. I first saw the result in the 'advanced calculus' book, Fleming, 'Functions of Several Variables'.

Bigger point: Convexity pops up frequently in analysis. E.g., there is Jensen's inequality. With it can easily knock off a nice list of otherwise difficult to prove inequalities. E.g., in the L^p spaces, we can use Jensen's inequality to get Hölder's and Minkowski's inequalities.
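A tiny numeric illustration (my own made-up numbers): for convex f, Jensen says f of a convex combination is at most the convex combination of the f values; applying it with f = exp to logarithms gives the AM-GM inequality as a special case:

```python
import math
import random

random.seed(1)
xs = [random.uniform(-2.0, 2.0) for _ in range(10)]
ws = [random.random() for _ in range(10)]
s = sum(ws)
ws = [w / s for w in ws]                       # convex weights, sum to 1

f = math.exp                                   # a convex function

lhs = f(sum(w * x for w, x in zip(ws, xs)))    # f(weighted average)
rhs = sum(w * f(x) for w, x in zip(ws, xs))    # weighted average of f
assert lhs <= rhs + 1e-12                      # Jensen's inequality

# AM-GM as a special case: apply Jensen with f = exp to the logs
a = [1.0, 4.0, 9.0]
geo = math.exp(sum(math.log(t) for t in a) / len(a))   # geometric mean
arith = sum(a) / len(a)                                # arithmetic mean
assert geo <= arith
print("Jensen:", lhs, "<=", rhs, " AM-GM:", geo, "<=", arith)
```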

When we start with optimization, we should start with convex sets and also convex functions. E.g., there is a nice list of theorems of the alternative that are separation results for cones and polyhedra that can be used to establish some otherwise tricky results about duality in linear programming and are also key to the Kuhn-Tucker conditions in non-linear programming. Of course, the feasible region in linear programming is a convex set.
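To make the duality part concrete, here's a toy LP I made up, with a dual-feasible point found by inspection. Weak duality -- every primal-feasible value is at most every dual-feasible value -- is checked by sampling:

```python
import random

random.seed(2)

# Primal:  max 3*x1 + 2*x2   s.t.  x1 + x2 <= 4,  2*x1 + x2 <= 5,  x >= 0
# Dual:    min 4*y1 + 5*y2   s.t.  y1 + 2*y2 >= 3,  y1 + y2 >= 2,  y >= 0
A = [[1, 1], [2, 1]]
b = [4, 5]
c = [3, 2]

y = [1.0, 1.0]                                   # dual feasible by inspection
dual_obj = sum(bi * yi for bi, yi in zip(b, y))  # = 9

best = 0.0
for _ in range(10000):
    x = [random.uniform(0, 4), random.uniform(0, 5)]
    if all(sum(aij * xj for aij, xj in zip(row, x)) <= bi
           for row, bi in zip(A, b)):
        val = sum(ci * xi for ci, xi in zip(c, x))
        assert val <= dual_obj + 1e-9            # weak duality
        best = max(best, val)
print("best sampled primal:", best, " dual bound:", dual_obj)
```

Here the primal optimum is 9 at x = (1, 3), which the chosen dual point also attains -- so on this example the bound is tight (strong duality), though the sampled values only approach it.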

In non-linear programming, the dual of a maximization problem is to minimize a convex function, and that dual can be the core of the powerful technique of Lagrangian relaxation.
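A sketch of that on a made-up 0-1 knapsack: relax the weight constraint with a multiplier, and the resulting dual function is convex (piecewise linear here) and bounds the true maximum from above for every multiplier value -- the leftover gap is the usual integrality gap:

```python
from itertools import product

v, w, W = [6, 10, 12], [1, 2, 3], 5   # toy 0-1 knapsack (made-up numbers)

# Brute-force optimum of:  max v.x  s.t.  w.x <= W,  x binary
opt = max(sum(vi * xi for vi, xi in zip(v, x))
          for x in product([0, 1], repeat=3)
          if sum(wi * xi for wi, xi in zip(w, x)) <= W)

# Lagrangian dual: relax the weight constraint with multiplier lam >= 0.
# g(lam) = lam*W + sum_i max(0, v_i - lam*w_i)  -- convex in lam, and
# g(lam) >= opt for every lam >= 0 (weak duality).
def g(lam):
    return lam * W + sum(max(0.0, vi - lam * wi) for vi, wi in zip(v, w))

for lam in [k * 0.25 for k in range(41)]:
    assert g(lam) >= opt - 1e-9
print("opt =", opt, " best dual bound =", min(g(k * 0.25) for k in range(41)))
```

On these numbers the brute-force optimum is 22 (items 2 and 3) while the best dual bound is 24 at lam = 4 -- the bound is cheap to compute and valid for every multiplier, which is what makes the relaxation useful on big problems.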

When we have a norm on a vector space, the set of all points at distance at most 1 from the origin -- the closed unit ball -- is convex, by the triangle inequality.
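Quick numeric check of that (p-norms are my arbitrary choice of norms): if ||a|| <= 1 and ||b|| <= 1 then ||t*a + (1-t)*b|| <= t*||a|| + (1-t)*||b|| <= 1, so convex combinations stay in the closed unit ball:

```python
import math
import random

random.seed(3)

def pnorm(x, p):
    """The p-norm of a vector, p >= 1."""
    return sum(abs(t) ** p for t in x) ** (1.0 / p)

for p in [1, 1.5, 2, 3]:
    for _ in range(2000):
        a = [random.uniform(-1, 1) for _ in range(3)]
        b = [random.uniform(-1, 1) for _ in range(3)]
        for x in (a, b):                       # scale each point into the ball
            n = pnorm(x, p)
            if n > 1:
                for i in range(3):
                    x[i] /= n
        t = random.random()
        mid = [t * ai + (1 - t) * bi for ai, bi in zip(a, b)]
        assert pnorm(mid, p) <= 1 + 1e-9       # the combination stays inside
print("unit balls convex for sampled p-norms")
```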

In facility location we can be minimizing a convex function and doing so by constructing a sequence of planes supporting the epigraph of the convex function.
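Here's a bare-bones Kelley-style cutting-plane sketch on a 1-D facility-location cost (the client positions are made-up numbers): each iteration adds the affine minorant from a subgradient -- a plane supporting the epigraph -- and then minimizes the piecewise-linear lower model:

```python
# Client positions along a line (made-up); cost = total distance to clients.
clients = [0.0, 1.0, 3.0, 7.0, 10.0]

def f(x):                                    # convex: a sum of absolute values
    return sum(abs(x - c) for c in clients)

def subgrad(x):                              # one valid subgradient of f at x
    return sum(1.0 if x > c else -1.0 for c in clients)

lo, hi = -20.0, 20.0
cuts = []                                    # affine minorants: (intercept, slope)
x = lo
for _ in range(12):
    fx, g = f(x), subgrad(x)
    cuts.append((fx - g * x, g))             # the cut t -> fx + g*(t - x)
    model = lambda t: max(a + s * t for a, s in cuts)   # lower model of f
    grid = [lo + i * (hi - lo) / 2000 for i in range(2001)]
    x = min(grid, key=model)                 # minimize the model (grid search)

print("approx minimizer:", x, " cost:", f(x))   # lands on the median, 3.0, cost 16.0
```

A real implementation would minimize the model with an LP instead of a grid, but the idea is the same: the supporting planes build an ever-better piecewise-linear under-approximation of the convex cost.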

Alas, I never heard a lecture from Boyd!

There are plenty of rock solid, highly polished books where the role of convexity is made clear!




Absolutely it is possible to get a good high-level picture of a field without taking any particular class. Moreover, things like "convex optimization" can be done in the footnotes of a serious maths course. But on the other hand, until there are again 300+ person functional analysis classes like in the old Soviet Union, this is perhaps the best there is.

-----


Yup, Kolmogorov and Fomin -- NICE book. I never learned from it, but I believe that eventually I got a copy, looked through it, and liked it.

If Boyd has some nice engineering applications, terrific!

-----



