You are correct on both points. Early in my learning, I found both points to be important. And, for a beginning student, both points can be crucial: Without the second point, it is far too easy for a student to wander into some not very good material, or into some good material and, still, get lost.
Eventually I got away from your second point.
Still, there is a version of your second point that lasts: For research, seminars and conferences are good, fast ways to keep up, see the forest for the trees, pick new research directions, etc.
When my company is successful and I retire, I will return to mathematical physics and, maybe, attend research seminars in, say, Boston.
Convexity is an important concept, and the world is just awash in rock solid treatments.
So, for some positive integer n, consider n-dimensional Euclidean space. Suppose C is a closed, convex subset and point x in the space is not in C. Then there exists a closed half space that covers C and omits x. The boundary of this closed half space can share a point with the boundary of C, so that the bounding plane of the half space is 'supporting' for C. So, this is a 'separation' result. Yes, it generalizes past finite dimensional Euclidean spaces! Yes, it's a darned useful theorem! E.g., can ask for the point in C closest to x -- that point exists since C is closed and nonempty, and is unique since C is convex. Then the plane through this point and perpendicular to the line from this point to x is supporting for C. So, can get a nice projection result and use it for optimization and best approximation.
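Here's a little numeric sketch of that projection and separation story -- toy data, C taken to be the closed unit ball in R^2, nothing more:

```python
import numpy as np

# Toy illustration: C is the closed unit ball in R^2, x lies outside C.
x = np.array([3.0, 4.0])            # ||x|| = 5, so x is not in C
p = x / np.linalg.norm(x)           # the point of C closest to x (the projection)
n = x - p                           # normal to the supporting plane at p

# The plane {z : n . (z - p) = 0} supports C: sampled points of C land on
# one side, while x sits strictly on the other side.
samples = np.random.default_rng(0).uniform(-1, 1, size=(1000, 2))
in_ball = samples[np.linalg.norm(samples, axis=1) <= 1.0]
assert all(n @ (c - p) <= 1e-12 for c in in_ball)   # C on one side
assert n @ (x - p) > 0                              # x strictly on the other
print("separation checked on", len(in_ball), "sample points of C")
```

Of course for the ball the projection has a closed form; for a general closed convex C one computes it, but the supporting-plane picture is the same.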
Okay, my point: This result is one of the most important about convexity but is standard in 'analysis'. I first saw the result in the 'advanced calculus' book, Fleming, 'Functions of Several Variables'.
Bigger point: Convexity pops up frequently in analysis. E.g., there is Jensen's inequality. With it, we can easily knock off a nice list of otherwise difficult to prove inequalities. E.g., in the L^p spaces, we can use Jensen's inequality to get Hölder's and Minkowski's inequalities.
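Jensen's inequality says that for convex f, f of the mean is at most the mean of f. A quick numeric spot check, with f = exp and made-up data:

```python
import numpy as np

# Jensen's inequality for convex f: f(mean(X)) <= mean(f(X)).
# Here f = exp; the data are arbitrary samples.
rng = np.random.default_rng(1)
x = rng.normal(size=10_000)
lhs = np.exp(x.mean())      # f of the mean
rhs = np.exp(x).mean()      # mean of f
assert lhs <= rhs
print(f"exp(mean) = {lhs:.4f} <= mean(exp) = {rhs:.4f}")
```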
When we start with optimization, we should start with convex sets and also convex functions. E.g., there is a nice list of theorems of the alternative, which are separation results for cones and polyhedra. These can be used to establish some otherwise tricky results about duality in linear programming and are also key to the Kuhn-Tucker conditions in non-linear programming. Of course, the feasible region in linear programming is a convex set.
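The easiest piece of that duality to see is weak duality: any feasible dual solution bounds any feasible primal solution. A toy check, with LP data I just made up:

```python
import numpy as np

# Weak duality on a toy LP (the data here are made up):
#   primal:  min c.x   s.t. A x >= b, x >= 0
#   dual:    max b.y   s.t. A'y <= c, y >= 0
# For ANY feasible primal x and feasible dual y, b.y <= c.x.
A = np.array([[1.0, 1.0], [1.0, 3.0]])
b = np.array([2.0, 3.0])
c = np.array([2.0, 3.0])

x = np.array([2.0, 1.0])    # primal feasible: A @ x >= b, x >= 0
y = np.array([1.0, 0.5])    # dual feasible:   A.T @ y <= c, y >= 0
assert (A @ x >= b).all() and (x >= 0).all()
assert (A.T @ y <= c).all() and (y >= 0).all()
assert b @ y <= c @ x       # weak duality
print(f"dual bound {b @ y} <= primal cost {c @ x}")
```

Strong duality -- that the two optimal values actually meet -- is where the separation results for polyhedra earn their keep.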
In non-linear programming, a maximization problem has a dual that minimizes a convex function, and that duality can be the core of the powerful technique of Lagrangian relaxation.
When we have a norm on a vector space, the set of all points at distance at most 1 from the origin -- the closed unit ball -- is convex, straight from the triangle inequality.
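A tiny spot check of that, for a few p-norms on made-up vectors:

```python
import numpy as np

# The triangle inequality makes the closed unit ball convex:
# ||t*u + (1-t)*v|| <= t*||u|| + (1-t)*||v|| <= 1 when u, v are in the ball.
rng = np.random.default_rng(2)
for p in (1, 2, np.inf):
    u = rng.normal(size=3); u /= max(np.linalg.norm(u, p), 1.0)  # push into the ball
    v = rng.normal(size=3); v /= max(np.linalg.norm(v, p), 1.0)
    for t in np.linspace(0, 1, 11):
        assert np.linalg.norm(t * u + (1 - t) * v, p) <= 1.0 + 1e-12
print("convex combinations stay in the unit ball for p = 1, 2, inf")
```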
In facility location we can be minimizing a convex function and doing so by constructing a sequence of planes supporting the epigraph of the convex function.
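That cutting-plane idea can be sketched in a few lines -- my own toy version, a 1-D 'facility location' cost (total distance to some sites), not anyone's production method:

```python
import numpy as np

# Kelley-style cutting planes, toy version: minimize a convex f by
# accumulating supporting lines of its epigraph and minimizing the
# piecewise-linear model they form (here by brute force on a grid).
sites = np.array([-2.0, 0.5, 3.0])              # made-up site locations
f = lambda x: np.abs(x - sites).sum()           # total distance to sites
g = lambda x: np.sign(x - sites).sum()          # a subgradient of f at x

grid = np.linspace(-5, 5, 2001)
cuts = []                                       # each cut: f(xk) + g(xk)*(x - xk)
xk = -5.0
for _ in range(30):
    cuts.append((f(xk), g(xk), xk))
    model = np.max([fx + gx * (grid - x0) for fx, gx, x0 in cuts], axis=0)
    xk = grid[np.argmin(model)]                 # next trial point

print(f"cutting-plane minimizer ~ {xk:.3f}, f = {f(xk):.3f}")
```

For this f the minimizer is the median of the sites, 0.5, and the iterates settle there; each cut is exactly one of those supporting planes.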
Alas, I never heard a lecture from Boyd!
There are plenty of rock solid, highly polished books where the role of convexity is made clear!
If Boyd has some nice engineering applications, terrific!