"The results of the University of Texas at Austin’s first full-semester foray into massive open online courses, or MOOCs, are in."
"Professor Michael Webber’s “Energy 101,” which had an enrollment that peaked at around 44,000 students, had 5,000 receive a certificate of completion — about 13 percent of the roughly 38,000 students who ultimately participated."
So let's unpack this a bit. Professor Webber created a class called "Energy 101" and took 5,000 students through it to completion. Your typical 100-level undergraduate class might have anywhere from 50 to 200 students in it.
UT Austin this year had 8,690 freshmen total.
So assuming the largest possible class of 200 students, this professor taught the equivalent of 25 such classes in a single semester.
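A quick back-of-envelope check of those numbers (the figures are the ones quoted above, not independently verified):

```python
# Scale comparison using the figures quoted in the article and above.
certificates = 5000      # completions in one MOOC run
largest_section = 200    # largest assumed 100-level class
freshmen = 8690          # UT Austin's freshman intake this year

print(certificates / largest_section)  # 25.0 equivalent 200-student sections
print(certificates / freshmen)         # ~0.58 of an entire freshman class
```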
Why should we care that 32,000 people signed up and then said "Woah, really don't have the time to commit to this right now?"
It is one thing to move to a campus dorm and start studying some topic. It is quite another to click a few links on a web page and say, "Yeah, that could be an interesting class."
@turadg recounts how Coursera created statistics for partners:

> The completion "rate" does not matter because the denominator is a very noisy signal. It's a count of enrollments, and what does "enrollment" signal in an online course, when it takes two clicks from a tweet?
People learning from non-conventional sources are used to taking bits and pieces from different online sources and building up their own courses and their own paths.
This fascination with completion rates is simply our propensity to value what we measure instead of measuring what we value.
It's interesting that so many people have trouble even considering that there might be other ways to approach education.
I've used Microsoft's edX courses to learn Bootstrap and their Azure Machine Learning platform, and they were the best MOOCs I've ever taken.
Education is not intuitive, even if there are many paths. If it were, we would just be leaving kids in libraries.
Zuckerberg received an honorary degree, which suggests the University approved of what he learnt.
If you're pointing at education as the cause of the problem for these two, structured education bears at least some of the blame, because it should have been teaching the tools for further learning.
Considering MOOC signups are completely free, a percentage alone tells you nothing. 4% of 100 signups would be appalling. 4% of 60,000 signups is still thousands of people successfully completing courses, and is actually a success story.
Perhaps they need a box on the signup or viewing form, to indicate "just checking it out to see if I'm interested", or "just here to watch part of the course".
Here in Australia, when students finish high school and enter university, it is common to hear them given the advice that it is "okay to change" after a year of your course. Get a feel for it, decide if it is actually for you, calibrate against the teaching/learning style, measure the enjoyment vs. effort, etc. I imagine signing up for a MOOC (provide an email address) contrasts starkly with the complexity of signing up for a university degree.
Who knows, maybe students even form opinions by partially completing MOOCs in topics they might want to study at a tertiary institution!
Imagine you ran a class with 90% casual users, wherein 80% of serious users finished and 10% of casuals did. Out of 1,000 users, 170 would finish, or 17%.
Charging increasing amounts of money would drive away increasingly large numbers of casual users while increasing completion rates.
By the time you got rid of all your casual users, you would have an 80% completion rate, while roughly halving the number of people who take the class to completion and decreasing the number who learn something by 90%.
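The toy model above can be sketched in a few lines of Python (the populations and rates are the comment's hypotheticals, not real data):

```python
# 1,000 enrollees: 90% casual, 10% serious.
casual, serious = 900, 100
casual_rate, serious_rate = 0.10, 0.80

free_completions = casual * casual_rate + serious * serious_rate
print(free_completions, free_completions / (casual + serious))  # 170 completions, 17% rate

# Price out every casual user so only serious students remain:
paid_completions = serious * serious_rate
print(paid_completions, paid_completions / serious)  # 80 completions, 80% rate
# Completions fall from 170 to 80, and enrollment falls 90% (1000 -> 100).
```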
Maybe we can measure the relative preparation of incoming students who participated in MOOCs vs. those who did not, to better understand how much people benefit?
The completion “rate” does not matter because the denominator is a very noisy signal. It’s a count of enrollments and what does “enrollment” signal in an online course, when it takes two clicks from a tweet.
There are other complexities but completion rate in the “are MOOCs worth offering” question is silly and tired.
What matters is people getting value they otherwise wouldn’t.
It seems like we would want to compare what is learned; that's how I value it. But I'm not sure how you would test that.
There are no certifications with these. You can't get an accounting, law, or medical degree, a bachelor's or associate certificate, or anything you'd need professional accreditation for.
Maybe the completion rate is similar for people who buy books. I'd end up getting a book thinking it would be a good investment for learning a subject. The next day, I didn't really want to spend 12 straight hours learning Visual Basic.
Also these aren't like auditing a class and listening in on a real lecture either. Just from the list for UT Austin (a very good school): https://www.edx.org/school/utaustinx
There are university lecturers people listen to all the time though, through the Great Courses / Teaching Company. Those sort of fit the leisure-learner path and don't require logging into some system for it.
There's an even lower bar to entry: all the MOOCs I've participated in have been free.
I don't think it's hidden that you have to study in order to learn.
Whenever somebody asks me "how do people read on the web?", I say: they don't. People rarely read web pages word by word.
Instead, they scan for information with keywords, uninvested and disinterested: lowered attention span, less commitment.
This behavior translates directly into poor results on everything, and MOOCs aren't any different.
Most course creators do absolutely no video editing. Sure they slap in an intro but that's it.
So you'll get a lot of umms and silence while the creator gathers their thoughts.
There's also a lot of filler content. So a 2 hour course might have 1.5 hours of history, reasons for the course, setting things up and 30 minutes of useful stuff.
So I watch most videos at 1.5x speed or more. I also download Udemy videos for offline viewing, so Udemy doesn't know which courses I've left untouched, started, abandoned halfway, or completed.
I’ve done a number of them over the last couple of years and the fact that I can’t preview has caused me to abandon so many of them. If I don’t like my professor, the way the course is organized, or whatever, I’ll abandon the course and go find some other source for the same information which does a better job resonating with me.
And I know I'm not alone in doing this; off the top of my head, I know at least three workmates and a number of friends who do the same thing.
The completion rate would shoot up much higher were they to let people know what they're getting into ahead of time. And strangely, these companies have a much better ability to do this than traditional schools.
How about doing live events at which you simply play -- on an overhead screen -- the very best videos or courseware from the web on selected technical topics. You get the best possible presentation and the chance to mingle with people who are keen on the same subject. You'd still need a person to act as a moderator or host, but there would be no official speakers. Would you go to such a thing?
This idea came about because I was at a seminar recently with multiple presenters on technical topics. At least 90% of what was said is available on the web. And the average presentation was awful. Yet, the auditorium was packed. Two hundred people showed up to hear publicly available information presented by mediocre speakers, none of whom were experts in the field (just people interested in the topic).
This is a regularly held seminar, and people come. A lot of people like attending a talk even if they could have stayed home and watched a better video on the same technical topic. I think part of it is that some people absorb material better in a classroom setting; the other aspect is the chance to socialize and talk with like-minded people.
There was a dedicated volunteer who prepared a 1-2 page summary of the lecture before the meeting, and a note taker for the discussion.
The lectures in the playlist are by the top people in the field (various viewpoints represented) so it was a great learning experience. Groups were 10-15 people and discussions were all worthwhile. Not sure how this would scale to 15+, but the watch-best-lecture-available-and-then-discuss-in-groups is definitely a good model. [at least one data point]
Getting a degree from a good school (I go to UT Austin actually) is different. That's why people go to college.
Let's assume the CS degree has on average 1 out of 5 courses per semester that directly involves practicing programming, rather than just filling requirements in liberal arts, math, etc. 13 weeks * (3 hours of lectures per week + 5 hours of homework per week) * 2 semesters per year * 4 years = 832 hours.
So a full-time developer bootcamp would need to be at least 5 months long to match.
Most are only 2.5 to 3 months long, but there are some 6-month developer bootcamps. The short bootcamps might not offer as much value, but they're still a good minimum start in development.
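A sanity check of the hour counts above (the per-week figures are the comment's assumptions; the 40-hour bootcamp week is mine):

```python
weeks_per_semester = 13
hours_per_week = 3 + 5   # lectures + homework for the one programming-heavy course
semesters = 2 * 4        # two semesters a year for four years

degree_hours = weeks_per_semester * hours_per_week * semesters
print(degree_hours)      # 832

bootcamp_weeks = degree_hours / 40   # assuming ~40 hours/week full-time
print(bootcamp_weeks)                # 20.8 weeks, i.e. roughly 5 months
```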
This. Also: to be seen in person, and to show on social media that you are attending. To feel special. It is not a criticism; it is a human thing.
I think this has to do with a conflict of interest. Any platform that creates its own educational content necessarily becomes bad at content _curation_. For example, 3Blue1Brown has amazing series on linear algebra and calculus, but we don't find Coursera pointing to them.
I have signed up for many MOOCs but haven't completed even one to date. But I got the knowledge I needed and got the task done. To me, a MOOC is an organized Google search, or a detailed Stack Overflow/Wikipedia.
Plus, my extreme aversion to "exams" is also a reason. I never found an incentive to get a paper that says I know something. And despite having certificates from school and university, I don't remember the stuff I learnt. I forgot most of it after writing the exams; sadly, some of it even before writing the exam.
> While many believe that there is a huge difference between online and on-campus completion rates, [Russell Poulin]’s research suggests the difference is slight. Based on a survey completed by more than 200 North American school officials in 2013, Poulin found that course completion rates averaged three to five percent better for on-campus courses than for online courses.
> Today, 2U reports completion rates of up to 88 percent for their online degree programs. Harvard Business School’s online programs claim similar success, with completion rates of 85 percent.
> At Acumen, where I design online courses, we’ve also been offering selective cohort-based programs for the past year that achieve completion rates of 85 percent. That’s a far cry from five years ago, when only 5 percent of the students were finishing the MOOCs I was designing.
> How have instructional designers collectively moved the needle so dramatically on completion rates? Unsurprisingly, some of the biggest drivers of these improved metrics include making people pay for online programs, increasing the selectivity of courses, and adding program managers and teaching assistants to follow up with learners.
They're nerds of whatever persuasion. Normal people put in their three hours of TV a day; nerds learn about Nabokov or the Assyrians, or decide that it would be more fun to work through a modern introduction to algebra than Euclid.
Like any degree, Georgia Tech's program has a permanent academic record, rules for grade quality to stay off academic probation, etc.
If you don't have the incentives in place, then many people will find excuses to not commit. MOOCs are lacking those incentives.
I haven't researched this, but I would guess the $6,600+ cost has more to do with the high completion rate than any other factor. Most MOOCs seem to be pretty cheap, if not free.
Not to mention, completing the Georgia Tech online program results in a degree. MOOCs typically just offer a certificate of completion, which carries little weight.
I can see why Coursera/edx etc stopped giving out these certificates if you don't pay for them, but it has caused me to basically stop completing free online classes.
That is what I thought as well. People who pay a handsome sum for an online course are not going to let their money go to waste. Something about paying up simply ensures that you turn up and actually attend/finish a course.
My completion rate was not stellar.
You really get a feel for how to turn an equation into vectorized code, which the video lessons can't give you.
I've completed the first Ng course, the later five-course specialization, and a few other courses (Odersky's Scala course among them). Still, 4% doesn't sound unreasonable: I too have enrolled in more courses than I completed.
I think it would be good to design course content for the medium. Professional adults do not have unlimited spare time, especially if they have other commitments that require time, such as family and friends. So maybe it would be wise to divide the course content into smaller chunks and reward people for completing each chunk. This would make MOOC design very similar to game level design, which already works brilliantly online: divide and conquer, and reward people for completing levels.
Turns out taking a Stanford class (or anything) on the introduction to logic isn't most people's idea of fun.
I appreciate that these courses have been made available to anyone who wants to learn, but my point is concerning the meagre completion rate.
I remember a time when/where this kind of knowledge was hard to find and devoured whenever it was made available to you.
I'd assume the completion rate for something you actually paid for would be way higher.
I've been following the MOOC phenomenon since its early days in 2010-2011, and taken a lot of moocs in the meantime.
Sometimes the MOOC turns out to be low quality, and I quit because I can get better information from books and/or by actually messing with the technology (xkcd.com/519).
Other times, I quit after the interesting part, or don't care to go through every last detail just for a worthless recognition ("a website says I've watched all the lessons").
Other times I pay for a course, go through all the lessons, do all the labs, and get the certificate of completion.
It really varies a lot, and most people ask the wrong question: it's not "how do we measure success" but "how do we measure success with respect to the original goals?"
Very few people care about an automated system somewhere recognizing their 100% completion of the course materials. But I (and, possibly, employers) care about actual competencies. It's the same thing all over again: in real life, pieces of paper don't matter; what matters is the capability to solve problems and handle situations.