It's amazing that a lot of the basics were covered in textbooks from the '80s and even earlier, yet recent work by people such as Geoffrey Hinton has reinvigorated the field and actually enabled practical applications.
- (smart)mobile phones
- the web
- practical pattern recognition using neural nets, aka deep learning
- Cybernetics (incl. Szilard, Von Neumann)
- Information theory (incl. Shannon, Turing)
- Whatever you might call LISP
- Overlapping-window, BitBLT-based GUIs
Also: if you can find an ancient copy of Thomas' Calculus from when it was two volumes, buy the first volume. Solve all the odd problems; the solutions are in the back. That's how I learned!
A later edition of Thomas was a single big, thick, yellowish volume. It was very good; we used it as freshmen at MIT, and you could learn from it. I don't remember whether that version had answers in it, but solving problems and having answers to check against is key.