
Convex functions aren't the only functions with a single local (and global) minimum - consider sqrt(|x|) for a simple 1d example.
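A quick numeric check makes this concrete (a sketch in Python; the sample points are my choice):

    import math

    def f(x):
        return math.sqrt(abs(x))

    # Convexity would require f((x + y) / 2) <= (f(x) + f(y)) / 2
    # for all x, y. One counterexample is enough to rule it out:
    x, y = 0.0, 1.0
    print(f((x + y) / 2))      # ~0.707
    print((f(x) + f(y)) / 2)   # 0.5 -> the midpoint value lies above the chord

    # Yet x = 0 is the unique local (and global) minimum, since f is
    # strictly increasing in |x|.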


Yes, that's true. But in optimization, "convexity" is often understood in terms of sets rather than the 2nd derivative of a function, because you may have search spaces where the objective function can't be differentiated at all. In those cases "convex" refers to a convex set.


Sure: using only the definition of a convex set, you can define convexity for a function in a way that agrees with the 2nd-derivative definition whenever the function is twice-differentiable.

Define the epigraph of a function to be the set given by {(x, t) | f(x) ≤ t}. Then, we say f is a convex function iff the epigraph is a convex set.

This is equivalent (exercise for the reader!) to the usual definition: f is convex iff f((1-λ)x + λy) ≤ (1-λ)f(x) + λf(y), for all 0 ≤ λ ≤ 1, with x, y in the domain of f.

Note that neither of these two definitions requires differentiability (or twice-differentiability), but both agree with the 2nd-derivative characterization when the function is twice-differentiable.[0]
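For a concrete sanity check, both definitions can be tested numerically on a known convex function (a sketch in Python; f(x) = x² and the sampling scheme are my choices):

    import random

    def f(x):
        return x * x  # a known convex function

    random.seed(0)
    for _ in range(1000):
        x, y = random.uniform(-5, 5), random.uniform(-5, 5)
        lam = random.uniform(0, 1)

        # Two points (x, s) and (y, u) in the epigraph {(x, t) | f(x) <= t} ...
        s = f(x) + random.uniform(0, 5)
        u = f(y) + random.uniform(0, 5)

        # ... whose convex combination stays in the epigraph
        # (the convex-set definition) ...
        assert f((1 - lam) * x + lam * y) <= (1 - lam) * s + lam * u

        # ... and the chord inequality holds directly (the usual definition).
        assert f((1 - lam) * x + lam * y) <= (1 - lam) * f(x) + lam * f(y)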

---

[0] For proofs of all of these statements, see Boyd & Vandenberghe's Convex Optimization.


When talking about "convex optimization" one nearly always means that both the function and the domain are convex.


No need to require the domain's convexity separately: it follows from the definition of a convex function, F(ax + (1-a)y) <= aF(x) + (1-a)F(y) for 0 <= a <= 1 and x, y in the domain of F. For the inequality even to make sense, ax + (1-a)y has to be in the domain. That means the domain is convex.
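A minimal sketch of that point (in Python; the non-convex domain is my choice):

    def f(x):
        # x^2 restricted to the non-convex domain [-2, -1] U [1, 2]
        if not (-2.0 <= x <= -1.0 or 1.0 <= x <= 2.0):
            raise ValueError(f"{x} is outside the domain")
        return x * x

    x, y, a = -1.0, 1.0, 0.5
    try:
        f(a * x + (1 - a) * y)   # f(0.0): the midpoint leaves the domain
    except ValueError as err:
        # The inequality F(ax + (1-a)y) <= aF(x) + (1-a)F(y) cannot even
        # be stated here, which is why the definition forces a convex domain.
        print(err)               # 0.0 is outside the domain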



