The one thing I'll add to this is that, for devs like us, things like Coursera, Udacity, EdX, etc. are really valuable. When you're a mid-career professional who already has strong technical chops in at least one area, "on-demand education" like this helps in several ways:
1. It's much more accessible, since you don't have to sit in a classroom on campus every day. You can fit it around your existing life much more easily.
2. It can serve as a nice way to start "bridging" into a new area. For example, if you're already a skilled Java developer and you want to start moving into Data Science, MOOCs offer a nice way to pick up some additional credentials to help that transition. People argue about the value of certificates from Coursera and the like, and that's fair: if possible, you don't want a certificate like that to be the only credential you have. But taken as a complement to your existing credentials, experience, and skills, I believe these things can be very useful.
3. Related to (2) above, MOOCs can also be a nice way to add complementary education outside of tech altogether. If you're a developer who aspires to eventually move into management or whatever, consider taking business classes from Coursera/EdX/Udacity as well. There are some really nice offerings out there, including a complete (accredited) MBA program that you can do (partly) through Coursera in conjunction with the University of Illinois. Or maybe you're a Java developer (just an example, I don't mean to pick on Java people, as "I is one") who thinks that something to do with synthetic biology is going to be "the next big thing," but doesn't want to go back to school for a biology degree. Great: there's a ton of life sciences / biology / chemistry / etc. material that you can take online if you want to start positioning yourself for something like that.
Let me also add this: I totally agree that you don't need to go "all in" on every new tech that comes out and try to ride the hype wave for everything new. But I think it's smart to explore at least some of the trendy new stuff to a limited depth: dip your toes in the water, do the "Hello world of XXXX," where XXXX is Swift, or R, or Rust, or Scala, or Node, or Go, or whatever. Give yourself a fair chance to evaluate the new stuff, decide based on direct experience whether it's worth investing more in, and get a feel for the toolchain and whatnot. From there, you can monitor what's going on around you and decide if/when to go deeper with technology XXXX. Yes, doing all this takes some time, but that's part of the cost of keeping your value up.
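To make the toe-dip idea concrete, here's roughly what a "Hello world of Go" might look like (Go being one of the languages mentioned above; the `greet` helper is just my own illustration, not part of any official tutorial):

```go
package main

import "fmt"

// greet builds the classic first-program message for whatever
// language you happen to be trying out.
func greet(lang string) string {
	return "Hello, " + lang + "!"
}

func main() {
	// Even getting this far teaches you something about the toolchain:
	// installing the compiler, running `go run hello.go`, and so on.
	fmt.Println(greet("Go"))
}
```

The program itself is trivial; the value of the exercise is that compiling and running it forces you to touch the toolchain and get a first feel for the ecosystem.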
Of course, the truth is, most of this goes equally for the "under 40" crowd as well. But the whole "keep learning new stuff" thing probably becomes a little more important as you get older. There are exceptions, like the COBOL programmer who gets paid big bucks because nobody new is learning COBOL anymore, but I'd consider those situations rare.
On my first meaningfully paid job, the first actually useful thing I did was go into the corporate archives and find the documentation for Oracle 7i and Microstation 95, which I was supposed to be using. Somehow nobody else there had even thought of reading the docs. (The fact that I somehow automated myself out of this job is another story.)
I'm a big fan of the "teach yourself by reading the docs" thing and I've done a lot of that in my career. But since I started taking a lot of these online courses, I've found that they can be very beneficial. Having a little structure around the learning process, having "classmates" to discuss things with, etc. have helped me as I've been learning R, for example. Now, could I learn R by just reading books and tutorials and experimenting on my own? Sure, of course. But the courses are pretty cheap, add a little rigor to the process, and come with a minor credential from a well-known / highly regarded university (Johns Hopkins, specifically). All in all, I've found that this approach works really well.
Just to share one example of how the online classes have added value beyond what I'd get doing it entirely on my own: the "R ecosystem" has a LOT of stuff in it. If I had just started playing around with R, I'd have had little clue which libraries and other tools to try out. But the courses I've done so far have guided me toward things like ggplot, dplyr, and knitr. knitr in particular turns out to be really cool, and I'm glad I discovered it, but I don't think it's something I would have gone looking for on my own. Maybe I would have found it eventually anyway, but as it stands, I feel the course approach guided me in some useful directions early on.
YMMV, of course.