xyz will be a lucrative profession if and only if
1. xyz is gated
2. xyz is intrinsically hard
3. fewer people attempt to get into xyz over time
4. xyz yields higher wages per hour relative to other professions.
I don't see programming fitting any of these axes, let alone all of them.
I don't think the presented axes are correct. #3 isn't necessary, though there may be some other supply factor that is, and #4 is just a restatement of "currently lucrative", not a separate necessary condition. And programming certainly meets #4 today, and arguably #2, at least in some significant subfields.
I was a Production Manager at a medical device company here in the Bay Area for almost 3 years. I wanted a change of scenery, so I started helping Garry out.
From there, I started helping out with events and tech/ops at YC, and then officially joined the team to help out internally (Tara and co. have been doing so much already)! All I've got to say is I came for the experience and stayed for the community.
I don't know about the rest of their stack, but their Machine Learning team is pretty good. They were using PMML the last time I spoke to them. And a little bit of Scalding as well. Preventing fraud at scale is quite a challenging exercise.
The article itself refers to Boyd's Convex Optimization textbook, which is sort of a rite of passage for anybody doing anything math-related at Stanford these days - even their MS in Management course requires a semester of convex optimization.
One of the strange things about the software business is that software is an iterative process where the rate of iteration can go from linear to polynomial to exponential very fast. Another strange thing is that if you are a software guy, the whole world is your customer. The third strange thing is the constant reinvention of old wisdom, repackaged in new frameworks, languages, platforms.
These three strange things then combine in weird ways to take on a life of their own.
Had BCC been written in a prior era, it would have been some MFC code that ran only on Windows 95 boxes, shipped on a dozen floppies, and purchased by at most a few dozen people who bought a printed copy of Dr. Dobb's, looked at an ad for BCC, and said, "Hey, I need to make my own bingo cards."
But because he had a web app and priced it right and iterated upon feedback, the whole world came knocking on his door.
Luckily, I was at MLconf last week where Dr. Anandkumar spoke - they called her the "tensor lady" :) She's using tensors in machine learning for a bunch of things -
Latent Variable Models: Training LVMs using local-search methods like EM, gradient descent, variational Bayes, etc. has a bunch of problems - they get stuck in local minima, and the algorithms are hard to parallelize and converge poorly. In these cases, tensor methods yield guaranteed learning using embarrassingly parallel algorithms, so they converge faster and can be run on Spark.
Also saw a demo on training 2-layer nets for GMMs using tensors, and they learned the weights rather fast. So using tensors in deep learning shows promise, though the techniques are in their infancy.
One of the challenges the professor mentioned was the limited availability of open source libraries for tensor decomposition, which the above methods require.
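To make "tensor decomposition" concrete: the workhorse here is CP (CANDECOMP/PARAFAC) decomposition, which factors a tensor into a sum of rank-1 components, analogous to how SVD factors a matrix. Below is a minimal sketch of CP via alternating least squares in plain NumPy - my own toy illustration of what such a library would provide, not code from the talk. The function names (`khatri_rao`, `cp_als`) are just labels I picked.

```python
import numpy as np

def khatri_rao(a, b):
    """Column-wise Khatri-Rao product of two matrices with the same column count."""
    r = a.shape[1]
    return (a[:, None, :] * b[None, :, :]).reshape(-1, r)

def cp_als(t, rank, n_iters=100, seed=0):
    """Decompose a 3-way tensor t into `rank` rank-1 components via
    alternating least squares: fix two factor matrices, solve for the third."""
    rng = np.random.default_rng(seed)
    i, j, k = t.shape
    # Random initial factor matrices A (i x r), B (j x r), C (k x r).
    a = rng.standard_normal((i, rank))
    b = rng.standard_normal((j, rank))
    c = rng.standard_normal((k, rank))
    # Mode-n unfoldings: flatten t into a matrix along each mode.
    t0 = t.reshape(i, -1)                     # (i, j*k)
    t1 = np.moveaxis(t, 1, 0).reshape(j, -1)  # (j, i*k)
    t2 = np.moveaxis(t, 2, 0).reshape(k, -1)  # (k, i*j)
    for _ in range(n_iters):
        # Each update is an ordinary least-squares solve; the three solves
        # (and the rows within each) are independent, hence easy to parallelize.
        a = t0 @ np.linalg.pinv(khatri_rao(b, c).T)
        b = t1 @ np.linalg.pinv(khatri_rao(a, c).T)
        c = t2 @ np.linalg.pinv(khatri_rao(a, b).T)
    return a, b, c
```

In the method-of-moments approach she described, you'd build `t` from empirical third-order moments of the data, and the recovered factor columns correspond to the latent components - no EM, no local-minima worries under the method's assumptions.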