For all of the courses I've taken (ML, PGM, NLP, SNA) I've followed roughly the same pattern: follow the course for 50-70% of the material, doing all the homework; take what I've learned and go off to do some fun projects; once I hit a roadblock, return to the course and finish up (usually months after the course is 'officially' done), typically forgoing the homework since I've already created 'homework' for myself.
Compared to my classroom experiences, I've found that this method greatly enhances the benefit of the materials I learn later in the course, as well as allowing me to more strongly reinforce material learned earlier. This is also something that physical classes simply cannot emulate. I do find it a little funny that I've never officially completed a coursera course, despite having covered much of the material very thoroughly. Hopefully as these courses continue we'll start to see them adopt patterns that extend beyond replicating the limitations of a physical classroom online.
Maybe that's an area where online classes can innovate further: hard deadlines could be optional. People like me who want them could enable them when they start a course; people like you who don't, could disable them instead.
So there's no clear-cut answer. Making hard deadlines optional could very well cause more damage and less completion even if it's better for a few individuals. I hope that Coursera is collecting analytics on how people perform with different types of deadlines, so that they can come up with the best overall solution.
I just learned about the Neural Net class about a week ago and it's scheduled to end today. I decided not to let that stop me and went through all the lectures in a week. I'm going to have to watch a lot of them again, and I need some time to work with the concepts, but I enjoyed 'cramming' it all in one big gulp.
On the flip side, I stopped the Computational Finance course at week 8 of 10, but I'll go back and finish it up over the holidays.
And on the third hand, the Modern American Poetry class I did was best done following the timeline, so that I got to participate in the forums at the same time as everyone else. That was a very forum-heavy class, though, which is rare.
Anyway, I'm just glad all this stuff is out there now. I've learned a ridiculous amount in the last year.
In the case of edX, MIT's 6.00x course is strictly deadline based. CS50x allows you to take the course at your own pace as long as you finish everything before the 13th of April, 2013.
But those aren't the ones you're looking for. Udacity meets all your requirements. They have open enrollment, meaning you can join at any time. Furthermore, they have no deadlines. As long as you complete all the problem sets and take the final exam, you get a certificate.
Here are some Udacity courses that might interest you:
Statistics 101 - http://www.udacity.com/overview/Course/st101 - Taught by Sebastian Thrun
CS271 Introduction to Artificial Intelligence - http://www.udacity.com/overview/Course/cs271 - Taught by Sebastian Thrun and Peter Norvig
CS373 Artificial Intelligence for Robotics - http://www.udacity.com/overview/Course/cs373 - Taught by Sebastian Thrun
Hope that helps.
There's CS188.1x by Berkeley on edX - https://www.edx.org/courses/BerkeleyX/CS188.1x/2012_Fall/abo...
It's deadline based, but I don't know whether they will remove their material at the end of the course or not.
Obviously, you can avoid this by downloading all the videos and slides as soon as you have access.
I believe that all Udacity courses are now self-paced. You can go through the lectures/quizzes/assignments whenever you want to.
Is it possible to review old courses and take them at my own pace, or are they completely gone once they are done?
Academic honesty is probably the biggest hurdle that online education is going to face. Once grades for these courses start to matter (and at some point, they will) there will be people in the world who will try to cheat.
To beat a dead conversation past the point of anyone caring other than me: if you looked at the minute-by-minute activity, an honest effort to take a class should produce a fairly detectable signature. For example, someone answering the questions without having watched the whole video is unlikely to be making an honest effort. There are enough data points in there that detecting cheating should really be a question of mining the right ones.
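As a toy sketch of that idea (the event format and all field names here are made up for illustration, not any real MOOC platform's API): flag answers submitted by students who hadn't watched enough of the associated lecture video.

```python
def flag_suspicious(events, min_watch_ratio=0.8):
    """Return indices of events where the student answered after watching
    less than min_watch_ratio of the lecture video -- one crude example of
    a 'minute-by-minute signature' check."""
    flagged = []
    for i, e in enumerate(events):
        if e["seconds_watched"] < min_watch_ratio * e["video_length"]:
            flagged.append(i)
    return flagged

events = [
    {"video_length": 600, "seconds_watched": 590},  # plausible honest effort
    {"video_length": 600, "seconds_watched": 45},   # answered 10-min lecture's quiz after 45s
]
print(flag_suspicious(events))  # [1]
```

A real system would of course combine many such signals (answer timing, pause patterns, IP changes) rather than rely on one threshold.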
ML is the general introductory course.
PGM is more about applied knowledge.
NN is a specialized and very good one.
There is also Berkeley CS188.1 on edX, which covers search.
Of course, courses alone are not enough. You need textbooks: AIMA, PAIP, and the PGM book are highly recommended.
But most importantly, you need practice. One cannot learn to swim, ride a bicycle, or wrestle just by reading books and watching movies.
Very few regions of your brain learn something just by watching and listening. The regions that are active when you actually do something must be trained by doing. That is how you will form your own intuitions, timing, etc.; the non-verbal parts of the brain are doing most of the job. Neural networks need to be trained on real data to learn its behavior.
This simple idea is a cornerstone of MIT's approach to teaching CS. They do lots of projects and labs, working in groups. This is real training. The recent version of 6.01 is a good example.
So, start your own project or join an open-source one. Then you will get real practice, training, and feedback from other people. It is not watching that matters; it is practice.
Scala is just a buzzword.
Disagree. There's a strong case for static typing. I am not saying that it is "right" or "wrong", but this static/dynamic trade-off is not simple and has no all-purpose answer.
Also, you will never convince most engineering managers to let you write a production system in Common Lisp. They may be wrong in their prejudice, but they're focused on risk-reduction and would prefer all production software to be written in Java. Scala has a fighting chance of getting into the "for production" language space. In 15 years, most of the good Java engineers are going to have moved to Scala and prefer it.
Out of curiosity, have you tried Clojure? It's pretty neat. I actually like it better than I like Common Lisp.
I know only one way to reduce the risk of running software: choose appropriate hardware, and run as little software as possible.
You must know your hardware and your software. That means it must be compact, readable and easily modifiable, so that you can quickly adapt and fix it, on the go.
So, I would choose a well-defined, mostly functional, small, simple language, with a decent compiler directly to native machine code and a very thin FFI for using specialized OS services.
It could be a solution from http://scheme.com or Gambit-C, not anything that begins with J.
So, in my opinion, a native compiler for, say, Arc to x86_64 (written in Lisp) with UNIX integration (NO Windows support) would be a better solution. Unfortunately, it doesn't exist yet.
It seems like Go is taking the same approach: a very thin layer of abstraction on top of the OS (a port of the Plan 9 core libs), native compilers, and a comprehensible runtime. So this is a real, available, better solution.
There is also Erlang, built on almost the same underlying principles, but complicated by a VM.
In order to reduce risks you must have deep understanding of what you are running.
I'm investing my time in learning this language: I'm sure it will be more widely used in the coming years.
The problem with Coursera, IMO, is that despite the promise of free education for all, the sheer number of courses, spanning a broad spectrum of usefulness and difficulty, makes the catalog hard to navigate and can cost a student hours of wasted time watching videos far beneath or above their abilities. I would suggest Coursera create a pre-exam section for every course, so the student has a slightly better understanding of what they are signing up for. Hell, they are promoting themselves as a Machine Learning repository; can't they build a basic suggest-a-course section based on pre-exams?
With that said, I love the idea of Coursera, Udacity, EdX, etc. and what they promote. I am one of those people who, despite being American, had zero chance of ever getting into college due to the quality of my inner-city high school and financial circumstances. Of course, the issue is that if you do grow up in this circumstance, you probably won't be able to take advantage of the free education because you may not have a computer. Regardless, anyone who wishes to partake can join with little more investment than a $200 computer from Walmart, so the financial excuse isn't quite the barrier that it once was.
How much opportunity do any of these courses open up? How much do employers give a hoot about a silly certificate from a pseudo-school that rides on the backs of watered-down Ivy League classes? Employers still take a degree from DeVry more seriously than the scattered landscape of MOOCs, and until that changes, there will be no tangible proof of progress in this sphere.
I could tell employers all day that I read, and worked through, SICP and Cormen, but why should they care? They shouldn't, because there was no one around to tell me my code is wrong. I can show employers websites I built using exotic languages such as Clojure, but why should they care when I can't pass a basic whiteboard test? I could tell them that I can proficiently write in x, y, or z language, but they don't care, and they shouldn't. To all of those people who suggest that employers don't care about your background education and only care about the projects you worked on, I challenge you to point out how many of your coworkers don't have at least something that resembles a Formal Education (TM). Coursera is not a Formal Education (TM). The proof will come if, and only if, one day a self-taught Coursera student with a large portfolio of self-directed projects nails a job over a kid from MIT with no portfolio. No one, at this point, can seriously say this is happening at a convincing scale.
I think you overestimate the utility of formal qualifications for getting a programming job (assuming that's what you're talking about here). In fact, I would say that if you tell a potential employer that you've worked through SICP, you show them some projects you've done, and they reject you on the basis of not having a degree, you've dodged a bullet!
I'm completely self-taught. I haven't got a degree, nor have I worked through SICP. I'm in my mid 30s and I've been programming professionally for about 5-6 years. I work at a hedge fund in London; it's doable!
The winners were a mechanical engineering student, an actuary, and an insurance risk analyst. This is all neat, but I was referring specifically to students with no formal education using these courses as a replacement for college.
Yes, as an add-on, I give it a huge Plus 1, but as a replacement, not yet. I'm not saying there is no hope, I'm saying it's not quite there yet.
By whiteboard test, I assume you refer to a coding problem given at an interview. I can see that it can be very difficult to get an interview without formal credentials, but once there, wouldn't your experience allow you to shine?
To be clear, I wasn't exactly stating my own credentials in that post; I was simply stating that even with said credentials, there is little chance of being taken seriously. I never read Cormen, but the rest is fairly accurate.