Dear Googlers, it would be interesting to know how computational resources are allocated to new ideas (e.g., Kurzweil's PRTM-based NLU system) at each stage, from prototype genesis to mature technology. What factors come into play?
Thanks jpdoctor. Could you elaborate on what you mean by a formal definition? Who defined it, and from which principles or goals? (I understand this might have multiple answers.)
ml-class.org does a phenomenal job of equipping you with the practical knowledge needed to apply machine learning tools to real problems.
There is no reason why learning to use these tools should be hard. If you want a challenge, there are plenty of problems in the world amenable to solution via machine learning, especially in today's data deluge.
If you want a deep mathematical appreciation of the algorithms and their derivations, you should take CS229, not CS229a.
How then do you decide which projects are worth trying on the large scale?