A firetruck is _____
Try typing this in Google and you'll get "red", "moving" and "made". During the course you build a network that learns next-word completions from arbitrary bodies of text. You can train it for hours, days or weeks... and it just gets better and better. Eventually you will max out the capacity of your network, but then you can fiddle with the number of nodes and other hyperparameters. In the end you're just training a "black-box" nonlinear function to best approximate an unknown function defined by the training data.
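To make the "black-box nonlinear function" point concrete, here's a toy sketch of my own (not the course's code): a one-hidden-layer numpy net trained to predict the next word from the current one. The corpus, hidden size and learning rate are invented toy values; the hidden size is exactly the kind of knob you end up fiddling with.

    import numpy as np

    # Toy corpus and hyperparameters -- invented for illustration, not from the course.
    corpus = "a firetruck is red a firetruck is loud a firetruck is fast".split()
    vocab = sorted(set(corpus))
    idx = {w: i for i, w in enumerate(vocab)}
    V = len(vocab)

    # Training pairs: (current word, next word).
    pairs = [(idx[a], idx[b]) for a, b in zip(corpus, corpus[1:])]

    rng = np.random.default_rng(0)
    H = 8                                   # number of hidden nodes -- the knob to fiddle with
    W1 = rng.normal(0, 0.1, (V, H))         # input -> hidden weights
    W2 = rng.normal(0, 0.1, (H, V))         # hidden -> output weights
    lr = 0.1

    def softmax(z):
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()

    # Train longer and it keeps improving, up to the capacity of the network.
    for epoch in range(500):
        for x_i, y_i in pairs:
            x = np.zeros(V); x[x_i] = 1.0   # one-hot encoding of the current word
            h = np.tanh(x @ W1)
            p = softmax(h @ W2)
            # Cross-entropy gradient, backpropagated by hand.
            dlogits = p.copy(); dlogits[y_i] -= 1.0
            dW2 = np.outer(h, dlogits)
            dh = (W2 @ dlogits) * (1 - h ** 2)
            dW1 = np.outer(x, dh)
            W1 -= lr * dW1
            W2 -= lr * dW2

    # Ask the trained "black box" what follows "is".
    x = np.zeros(V); x[idx["is"]] = 1.0
    probs = softmax(np.tanh(x @ W1) @ W2)
    print(sorted(zip(probs, vocab), reverse=True)[:3])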
I agree that the math can look complex, but I think the difficulty is more in the probability notation used to present the ideas than in the underlying concepts. Personally, I feel like the most advanced math used in NLP is the log function, along with working with big arrays of data and structures like Markov models and neural nets, which tend to be just arrays of numbers.
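As a concrete (invented) example of that point: a bigram Markov model really is just a table of counts, and the log only shows up so that multiplying lots of small probabilities becomes adding log probabilities, which doesn't underflow.

    import math
    from collections import defaultdict

    # Invented three-sentence corpus, purely for illustration.
    corpus = ["a firetruck is red", "a firetruck is moving", "the truck is red"]

    # The "Markov model" is just a nested table of bigram counts.
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        words = sentence.split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1

    def log_prob(sentence):
        """Sum of log P(next | prev) over the sentence's bigrams."""
        total = 0.0
        words = sentence.split()
        for prev, nxt in zip(words, words[1:]):
            total += math.log(counts[prev][nxt] / sum(counts[prev].values()))
        return total

    print(log_prob("a firetruck is red"))     # less negative = more probable under the model
    print(log_prob("the truck is moving"))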
In a general AI course, we had to write up contemporary AI articles, and one I found interesting was a model for summarizing text, including chapters, books, and other writing. The key idea was finding the most significant sentences in any given paragraph or unit and then using them verbatim.
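That's not their actual model (the cite is further down), but the verbatim-extraction idea itself is simple enough to sketch with a crude word-frequency score; the paragraph text and scoring here are just stand-ins.

    import re
    from collections import Counter

    # Stand-in paragraph; the scoring is a generic frequency heuristic,
    # not the method from the cited paper.
    paragraph = ("The firetruck raced down the hill. Children waved as the firetruck passed. "
                 "Nobody remembered the last real fire. The firetruck was the pride of the town.")

    sentences = re.split(r"(?<=[.!?])\s+", paragraph)
    freq = Counter(re.findall(r"[a-z]+", paragraph.lower()))

    def score(sentence):
        # Average frequency of the sentence's words: a crude "significance" measure.
        tokens = re.findall(r"[a-z]+", sentence.lower())
        return sum(freq[t] for t in tokens) / len(tokens)

    # The highest-scoring sentence is kept verbatim as the paragraph's summary.
    print(max(sentences, key=score))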
It might be interesting to take some of these simple ideas and flesh them out with some of these advanced AI methods. For example, finding a more complete meaning of a book chapter and rewriting the summary.
That's the kind of AI work that I think people expect and are looking for from the NLP field, and it's not necessarily out of reach currently.
Do you have a cite for that?
A. Kazantseva and S. Szpakowicz, "Summarizing Short Stories," Computational Linguistics, vol. 36, no. 1, pp. 71-109, Mar. 2010. [Online]. Available: http://www.mitpressjournals.org/doi/abs/10.1162/coli.2010.36...
There is a PDF available. It's about 40 pages long.
Instead, I did a simple project on searching using language processing and just read Foundations of Statistical Natural Language Processing, which is not too difficult, and Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics and Speech Recognition, which is a pretty heavy read but a great reference. I was able to find a used copy of the second book for $0.30.
I also put a bit of study into articulatory phonetics and speech recognition as part of a graduate study-abroad, which is an interesting field on its own, but I always wanted to come back to computational linguistics.
Here's another good one from the creator of Coursera (Stanford grad, I think).
It's Bengio's (a very well known deep learning researcher) upcoming textbook. I would highly recommend it to anyone interested in the deep learning / neural net subset of machine learning.
It uses a lot of the same concepts: recurrent nets and word embeddings. If you guys want to play around with it in a real-life scenario, head over there to check it out. Discussion here. Link here.
Edit: Update wrong link.
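If "word embeddings" sounds abstract: the idea is just that each word gets a vector, and words used in similar contexts end up with similar vectors. Here's a count-based toy version (SVD of a co-occurrence matrix, nothing like the recurrent-net setup mentioned above, and the corpus, window and dimensions are all invented):

    import numpy as np

    # Invented corpus; window size and embedding dimension are arbitrary toy choices.
    corpus = ("the loud firetruck passed the loud siren wailed "
              "the sweet apple fell the sweet fruit fell").split()
    vocab = sorted(set(corpus))
    idx = {w: i for i, w in enumerate(vocab)}
    V = len(vocab)

    # Count co-occurrences within a +/-1 word window.
    co = np.zeros((V, V))
    for i, w in enumerate(corpus):
        for j in (i - 1, i + 1):
            if 0 <= j < len(corpus):
                co[idx[w], idx[corpus[j]]] += 1

    # Truncated SVD of the co-occurrence matrix gives dense, low-dimensional vectors.
    U, S, _ = np.linalg.svd(co)
    dim = 3
    emb = U[:, :dim] * S[:dim]

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    vec = lambda w: emb[idx[w]]
    # Words that share contexts (the "loud" things) tend to score higher than unrelated pairs.
    print(cosine(vec("firetruck"), vec("siren")))
    print(cosine(vec("firetruck"), vec("apple")))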
Whereas deep learning is going down the creating consciousness route?
I found that talking about it along the lines of neural networks seems to be more accurate.
It's free to browse online if you're interested:
That chapter, however, is a nice intro to neural networks, and the previous chapter is a nice intro to genetic algorithms.