There's a big difference between having an instructor and learning things from a book or video lectures. An instructor can correct your mistakes and give you feedback. A book or a video can't do that.
Some people presumably don't have the right kind of discipline to study on their own. This is why remote study programs can be a lot more difficult to complete than on-site programs.
EDIT: There's a known effect, discussed in the Coursera "Learning How to Learn" course, where you read through the material/watch a video and think that you've actually mastered or learnt it while in fact you have not. If you're aware of this and test your knowledge, you can fix that, but it's another reason why a structured program with an instructor can have an advantage...
I needed the structure and social interaction of the college environment. But oddly enough, it depended on what discipline I was studying. I would not have been able to learn math and physics on my own, and those were my majors. On the other hand, I easily taught myself programming and electronics.
A friend whose daughter got an online degree (while at home with a baby, living in a small rural town with no other options) said that the hardest thing about studying online was the sheer boredom of sitting at a computer all day with no one to interact with.
A particular advantage of going to college, for me, was that I learned how to teach by observing my professors in a variety of settings and imitating their methods. Today, a lot of my work is surprisingly similar to teaching: presenting ideas to people who are not experts, mentoring younger colleagues, etc.
Programming and electronics have clearer outcomes and feedback mechanisms than physics and math. Unless you have a lab setup, you can't test your physics knowledge to the same degree you can with electronics and programming. Math is similar: you can try to recreate proofs of various things to demonstrate a capacity for proving, or an understanding of the underlying system in which you're constructing proofs, but ultimately you're not creating anything novel for quite a while.
With programming and electronics, self-study offers the opportunity to make things, both novel and derivative, and immediately (or near enough) see whether or not they function. It's similar to setting up an experiment in a physics lab, but again, who has a home physics lab? Programming can be done on a cheap (< $200) laptop, and a large portion of electronics is accessible with a similar investment in (mostly reusable) parts.
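To make the feedback-loop point concrete, here's a minimal sketch of the write-run-check cycle I mean (the function and the test are invented for illustration; nothing here is from a particular course or library):

    # A tiny "home lab" experiment: write something, run it, and see at once whether it works.

    def resistor_divider(v_in, r1, r2):
        """Output voltage of an ideal two-resistor voltage divider."""
        return v_in * r2 / (r1 + r2)

    # Immediate feedback: the assertion either passes silently or fails loudly.
    assert abs(resistor_divider(9.0, 10_000, 10_000) - 4.5) < 1e-9
    print("divider behaves as expected")

The whole experiment costs nothing beyond the laptop, which is exactly the kind of cheap, instant feedback physics self-study can't offer.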
One other possible difference is that, at least for a time, the subject matter itself was getting easier to learn. For instance the inventors of programming languages have tried to figure out what makes a language good for teaching. (I studied Pascal, which originated as a teaching language).
For electronic components, "easy to learn" meant things like designing op amps and logic chips that behaved more like their textbook symbolic representations without weird tricks and pitfalls.
In contrast, you can't change electromagnetics just because you want something that's easier to learn. Nature won't approve. Progress has been made towards explaining it better; for instance, today's notation is simpler than what Maxwell had to grapple with. But such progress comes slowly.
My undergrad and grad degrees were in the liberal arts. After earning my PhD, I took an MPA online, as I was bored and able to pay for it out of pocket easily enough. One of the things that stood out for me, at least, is that I really missed just browsing the rare book collections and all the various microfilms and archives the university library had. The digital PDF articles weren't really anything special. I probably learned more from the stacks than I did even while working on my dissertation. In fact, one of my professors once brought me to the archives and taught me in detail all the numerous odd tricks of researching old paper documents. That has been an important lesson, as I research a lot of older non-digitized archives in my personal time for side projects.
In hindsight, I have something of a conflict of views here. If automation goes the way some fear, then physical institutions will provide a place for people to go instead of rioting in the streets. To build on Salman Khan's thoughts about online education taking on the Aristotle/Plato model while the physical campus acts as a place to build projects and so forth, this could be one way to go [0].
At the same time, I've read a number of books about the atmosphere of campuses pre-1960s. Personally I feel we protest too much these days, so perhaps a focus on online campuses, while limiting the physical campuses to fewer people, could allow the protests to die down and perhaps bring a return to the pre-1960s era. That said, I have my doubts that people will be content sitting at home en masse to learn, so perhaps Salman Khan's ideal is what will emerge. Then again, it would also be ideal not to throw out the old way completely. It would be ideal if some smaller liberal arts colleges stayed dead set in their ways, teaching just the old liberal arts, ideally without computers, so as to encourage actually reading texts in whole and producing critical thinking. That would probably be for the elite, given the cost of such an education.
My bias leans toward the hope that the majority of jobs are destroyed by automation and that people have nothing else to do but get PhDs in the subjects they are interested in. Perhaps, though, access to education will depend on the level of talent and knowledge: for the masses, a Khan/Coursera-style approach; for the mid-level, a hybrid of the Khan/Plato model and physical interaction for project development. The upper end, the elite, would be primarily in person: small classes on cohesive campuses, without the protests that disturb the open learning environment. Of course MOOCs would be available to these students, but the primary purpose would be the older form, to develop better critical thinking skills. The MOOCs, even if PhD programs are developed, will suffer the hive-mind problems one can see on many social media platforms (Reddit, Facebook, Twitter, etc.).
The problem is that at many of the top CS universities today, the lecturer is just a top researcher who's forced to read off slides for a couple of hours a week and doesn't have the time, the desire, the skills, or the incentives to be a great pedagogue to the hundred or more students sitting in his or her lecture.
In many universities it does feel like you're basically paying for a very expensive real-life recording of something you could have just watched online. Sure, there are TA office hours, but is waiting in line to talk to a TA for 10 minutes a week worth tens of thousands a year?
IMO we should focus on teaching meta-skills as soon as possible, teach people HOW to learn, and then let them be autodidacts with opportunities to practice their craft, surrounded by other people struggling to be great at the same skill.
This obviously doesn't apply to high-stakes professions such as surgery, where perhaps the traditional approach is the best we can do until computers can vet our skills.
I studied computer/electrical engineering at my institution. A year or two after I graduated, the CS department lost its accreditation. I wasn't surprised when I heard the news. Of the CS classes I was required to take, one was ostensibly on operating systems; it was taught in C, but the grad-student lecturer only knew Java. In another instance, we were studying MIPS architecture. The professor was moonlighting at a startup that was going under and couldn't be bothered to show up to lecture, and the TAs he sent were clueless. The EE/CPE students who had already studied the material stepped up and taught the class because the TA was inept and the prof never showed. I was super pissed when I got a D on my project over a minor technicality, despite having basically taught the class in the absence of the prof and a competent TA.
What institution was this, if you don't mind me asking? I can't say that I much enjoyed the CS courses I had to take as part of the computer engineering program myself, as they were mostly taught from PowerPoint and the lecture material was mostly useless for the projects and such. MIPS architecture, though, was really fun, and the instructor was great, if kind of a jerk at times.
If the OU's dropout rate is 87% with 100,000 enrollees, while a conventional university graduates 90% of 5,000 enrollees, the Open University still wins on absolute numbers. And then you have to take into account how many people enrolled in the OU because of the lower barrier to entry rather than out of any real level of commitment.
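Worked through with those figures (reading the 90% as the conventional university's completion rate, as above), the back-of-the-envelope arithmetic looks like this:

    # Rough comparison of absolute graduate numbers from the figures above
    ou_enrolled, ou_dropout = 100_000, 0.87
    uni_enrolled, uni_completion = 5_000, 0.90

    ou_graduates = round(ou_enrolled * (1 - ou_dropout))   # 13000 complete despite the 87% dropout
    uni_graduates = round(uni_enrolled * uni_completion)   # 4500 complete at the smaller intake
    print(ou_graduates, uni_graduates)                     # 13000 4500

Even with the far worse completion rate, the OU turns out roughly three times as many graduates.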
As an OU graduate (late 90s), my first thought was that a lot of people just try the OU, while others just like to take some classes without sticking to a program and with no intention of getting a degree...
> the cost for an equivalent course for a resident of England is £2,786
The OU used to be about giving everybody a chance at an education. This is sad...
To be fair, OU courses are eligible for student loans in the same way that going to university is, but the price tag certainly put me off when I was looking into doing some courses recently.
I completed an OU Maths degree three years ago (at 49). I'm not surprised the rate is high; there were a lot of people who started without any idea of how much work, or what sort of work, was entailed. I remember one student in the first tutorial asking whether we would be learning Excel.
It does sadden me to see them struggling with the new fees. I'm not sure I would have done it at the current rates; I certainly wouldn't have taken a loan for it.
If that dropout rate is the fraction of people who take at least one OU course but don't get an OU degree, it seems kinda misleading; some people may take OU courses with no intention of getting a degree.
(I followed the THES link but didn't find anything about the OU dropout rate at the other end of it.)
> There's a big difference between having an instructor and learning things from a book or video lectures.
Absolutely. Anders Ericsson makes this point very clear in his research and in the book "Peak".
In order to become an expert, you need feedback from other experts. I used to think, "Well, who broke through in the field first? Who became the first expert?" and to that I say: that is how human knowledge works. It's built on top of what came before, whether we think about that consciously when trying to learn a new concept in a new field, or use it subconsciously in the form of an abstraction.
As an anecdote: at my first "real" job after getting my CS degree, I worked at a place for about two years. For a year and a half of that, I felt like I was barely getting by, not meeting my knowledge goals. Then we decided to get crazy and try this agile and pair programming thing. For three months I worked under one of the most productive developers we had at the time. Five years later, I still contend I learned more in those short three months under his guidance than I have in my whole career on my own. I have been searching for that experience ever since.
When I was in school, I struggled heavily with certain concepts. I had one really smart math teacher who couldn't have cared less about average math students (like me), and another math teacher who was not that knowledgeable about math. So after struggling for years, I stopped advancing in that direction. Keep in mind this was a long time ago, pre-YouTube.
But now? I can take an online course taught by people who actually know how to teach. And if I don't understand something, I can rewind and play the lecture over again. Or I can take it at a slower pace. Or, most importantly, I can go on YouTube and find other teachers who explain a concept differently. It is highly improbable that any single teacher can explain everything perfectly to everyone, so it's quite amazing that you can basically go on YouTube and find alternative explanations that make more sense to you.
In fact, if you look at highly rated videos on, say, math concepts, you'll find plenty of students leaving comments like, "Wow, you explained this better than my teacher did" or "I wish you were my teacher." Students MAY have the capacity for the concepts while the teachers are simply poor at teaching. How many students blame themselves for being "too stupid" or "bored" and give up because they don't feel empowered by their paid teacher to close the knowledge gap?
No doubt there are clear advantages in having an innovator or brilliant person guide you in your studies. And certainly if one were to do a Masters or PhD, having an advisor is essential and invaluable. That said, for a lot of undergrads/students, I would be surprised if the majority of them had a strong relationship with their professors/teachers.
I believe that online courses and college should both exist out there in the world, as there are advantages to each. But because the feedback portion is so dependent on the quality of the teacher, I think it's not a guaranteed advantage.
I agree. The physical interaction between a teacher and a student is irreplaceable by MOOCs. There are all of these externalities -- instant feedback, relationships with peers and faculty, pressure to keep up with the course. That being said, the cost of a college education is too high right now, and attending college is not right for everyone.
> There's a known effect, discussed in the Coursera "Learning How to Learn" course, where you read through the material/watch a video and think that you've actually mastered or learnt it while in fact you have not.
This explains a lot of people I've worked with. "Oh, I read a tutorial/watched YouTube videos on it, let me do it."
I've found the illusion of understanding particularly pernicious where coding is concerned. As I learn, I read tutorials and do little exercises and think I understand. But then I have an idea for something practical, sit down to code it, and realise I haven't a clue.
I've learned that I can't say I understand something unless I can explain it to someone else and apply it to solve a practical problem.
Having someone to talk to on a regular basis is kind of the idea behind teaching assistants, at least for many types of work. There is definitely room to innovate between paying the cost of a full blown degree and 100% self study. Effective or not, I think dev bootcamps became popular for this reason.
I attended Tealeaf Academy, now LaunchSchool, to kick-start my own self-learning, and I thought it was an excellent investment. Their model is (or at least was) to provide a structured program, with email and chat support, and a few hours per week of live interaction. I also liked how they framed the program as the entry point to a path that requires a lot of ongoing learning, not as "be an employed developer in 90 days." It worked well for me.
> Some people presumably don't have the right kind of discipline to study on their own.
Won't these people just "drop out" early on in their careers anyway? Someone who can't do any self-directed study will not last long in almost any IT role.
I can't comment on this specific MOOC, but I've done MOOCs where there were a lot of quizzes, active forums, extra-curricular activities, and involvement from the course staff (e.g. Dan Ariely's on Coursera). Those are certainly better than just reading a book or watching a video, and I think there was still a very high dropout rate. Having any sort of online community certainly helps as well.
I've learnt to play the guitar online. I did a couple of music-related MOOCs, and I use YouTube resources like JustinGuitar and Lick'n'Riff. But I still felt like I needed a real teacher, and I went and got one. Even though this teacher is probably not at the same level as some of the YouTube "gods" :) I still got a lot out of that.
In general, I've found the forums on MOOCs pretty useless other than for dealing with specific technical issues with the course itself. They just don't scale well and there's a huge disparity in the level of the participants a lot of the time. In programming courses, you've got some people asking about some nuance of an algorithm while others aren't sure how to install a text editor. Any course built around having meaningful discussions in the forums has been a tire fire in my experience.
Automated code checkers are definitely useful but, other than that, I'm not sure I've found MOOCs much different from watching YouTube videos. Nothing wrong with that--especially with top-flight lecturers--but they're hardly revolutionary. Videos of lectures have pretty much been a solved problem since VCRs went mainstream.
Honestly, at this point I find Reddit a much nicer place to go than Stack Overflow for technical questions.
SO users demand that your question be "correct". Note, I don't mean that it should be clear what your problem is; it's just that if you ask a "noob question", ask for an explanation, or ask something five people think can't be done, you'll get downvoted and closed in a second.
It's sad because other (smaller) SE sites are quite nice.
This is incorrect. SO is polluted with "noob" questions and answers, which is why SO is so useless. Google and now even MSDN search results are also polluted with SO hits which are mostly just noise. If anything SO needs more strict moderation.
I completely disagree. I'm an experienced C++ dev learning Android/Java development. Literally 90% of what I need is on SO (the rest usually from mkyong); thanks to Google, some of the answers even bubble up to the search results page, so I don't even have to click through. I would be much worse off without SO.
That says more about you as a developer than SO. In just about any SO answer I find numerous things that are either wrong or bad advice. The difference between me and a lot of SO users, especially novices, is that they don't know a good answer from a bad one.
A lot of people on SO seem to be answering just for points. Half the time, or more, they don't actually know what they are talking about at all, or have no experience with what they are explaining; they have often literally just "researched" the question and condensed a couple of sources into an answer, without a proper understanding of the subject.
SO is bad for the whole industry. And there is not much that is worse than a "developer" copying an answer off SO and putting it into a production code base, without even understanding the answer they just copied.
Not going to argue the general case, but for Android in particular the documentation is also full of bad advice and half-explanations. Plus it's not uncommon to hit a long-outstanding SDK bug. SO is great for Android development to fill in those gaps, provided you can sort wheat from chaff.