I'm currently taking courses with Udacity and Coursera, and I've noticed one huge difference between the two that I hope edX learns from: whereas the Coursera class is structured like a traditional class online, Udacity's course designers seem to better understand and take advantage of the fact that the course is running in a web browser.
The difference is a bit difficult to explain. Both have videos, forums, and wikis. Udacity courses are set up as short videos punctuated with many questions and mini assignments (running in an in-browser Python IDE), along with larger homework projects. Also, the forums are continually monitored and new videos are added to clarify concepts that students are struggling with.
In contrast, the Coursera course I'm taking (AI) has longer videos (6-20 minutes) of the instructor mumbling as he draws over and over on increasingly confusing PowerPoint slides. Sometimes a video will have one multiple-choice question; other times it will have no questions at all. The worst part is that only once has the video gone on to explain the question, so if a student has trouble understanding it, they have to resort to the forums. There's no follow-up, unlike the questions on Udacity. At the end of each section (about an hour's worth of videos) students can take a five-question quiz. Granted, the feedback on the quizzes is a lot better -- but it's a lot to expect an hour of instruction to be reinforced by a mere five questions.
Basically, the Coursera course is taught as if I were sitting in a class watching an instructor draw on PowerPoint slides -- the fact that it's running in a web browser and could support a different method of teaching seems to be lost on the instructor.
Granted, this might be a critique of the instructor more than of Coursera itself -- I'm only taking a single course from them, whereas I'm taking two on Udacity. But Udacity seems to understand that you can't just take the experience of sitting in a classroom and put it online: you have to understand that this is a new medium that allows new methods of teaching.
To conclude this rambling post (sorry, I didn't know how to explain what I'm feeling as a student more concisely), if these online course ventures that are popping up all over the place are going to succeed,
they are going to have to use the medium of the browser to its fullest, and in so doing, I think they will have to compete with traditional universities. That's what worries me when edX says the online classes will supplement the in-college experience: I think you're going to have to beat the college experience to succeed in this market.
I very strongly agree. Udacity courses give you the feeling that they stepped back and tried to reinvent what an online course would be if it were invented today, taking advantage of the tools we have now. Coursera, meanwhile, feels like they took existing courses, threw them online, and hoped for the best. For example, I have the feeling that Python was chosen for Udacity (among other reasons) specifically because it could have an online interpreter, while Coursera feels like they just reused whatever language and tools they were using before, just because. If I try to consolidate the Udacity advantages, they would be:
- Short videos: These have many different advantages and work well with the course's other properties. They let you set your own pace instead of following a long video's pace. It's easier to separate topics into distinct parts you can easily click back to. When necessary, it's easier to pause between topics (since the video pauses automatically) to take a step back and think about what you just learned for a minute. It matches the interpreter nicely: learn something new, try it, learn something else, try it. And it lets the instructor separate links to appropriate material in the description, by topic.
- Online interpreter and automated tools: I suppose this is one of the reasons they chose Python. Udacity works like Codecademy, except with video instruction instead of text, and this is genius. The videos take into account that you have an interpreter with you, and everything is integrated. The lessons, quizzes, and homework are all automatically hooked into the interpreter to make sure you try out what's being taught, and the class can give you instant feedback on whether you're doing it right.
- Production value: Not only do the videos look better, they are edited to be more efficient. Unnecessary parts are cut out for brevity, and the video and sound quality make sure everything is easy to see and hear. Some other online courses display a low-resolution screen capture of the IDE, which makes the code unreadable. On Udacity, the code is always readable and the audio always easy to listen to. The better UI makes it easier to navigate, and of course the superior visual quality just makes the whole thing more pleasant to consume.
I wish every online course would learn these three characteristics from Udacity and implement them. It's important to note how the three are glued together and help each other: short videos give you breaks to use the interpreter, higher visual quality makes the code easier to read and then try in the interpreter, and so on. Udacity just feels like a consistent and smart bunch of ideas put together to make learning easier. Hopefully, others will evolve with them.
It's barely 6 months into this whole thing, and Udacity and Coursera are younger even than that. The first gen videos for Thrun's AI class were awful. Change has already happened fast, and there's plenty more to come.
I wonder if Coursera's great number of professors and classes will slow them down in offering a drastically different format. Udacity has been exclusively Python (with snippets of HTML in the web class), and their IDE is shared across all the classes. Coursera's many languages and professors might make that degree of focus much more difficult.
Agreed, the first AI class was horrible, while Andrew Ng's Machine Learning class on Coursera was top notch. I think both platforms have advantages and things to improve, and I'm glad they are both actively working on it.
Not sure we need yet another platform with edX, but let's see. So far it seems more of a West versus East type of thing.
Well, I prefer video lectures (like the MIT Python ones) to Udacity and Coursera; they feel more real and I understand them better. I also prefer long videos (about an hour) to those 5-minute videos from Udacity.
I gave up on both Udacity and Coursera (though I liked Coursera more), and still love those MIT video lectures.
I liked Udacity best at first, but after a while found myself spending a lot more time with Coursera. Mainly for two reasons:
- Udacity sometimes belabors some really basic points. Coursera never gives me that "dumbed-down" feeling. It's more like a real, and challenging, college course.
- Coursera lets me speed up the lecture, as much as 2x, and every time I go to a new lecture it remembers that speed. I managed to speed up Udacity lectures by switching to the HTML5 player on YouTube, but I had to reset it every time I started a new 2-minute segment.
"Udacity sometimes belabors some really basic points. Coursera never gives me that 'dumbed-down' feeling. It's more like a real, and challenging, college course."
I think this actually may be Udacity's biggest strength. It's a lot more hands-on than a college course, which makes it somewhat easier. But do you actually learn less? After thinking about it, I really don't think so.
With your average college CS class they spend a few minutes talking about some problem, and then you spend a week on your own working on it. Whereas with Udacity they actually teach you all the concepts you need to solve the problem, and then you work through it together step by step.
The strength of the traditional college approach is that it teaches you to be resourceful to some extent. But this comes at an enormous cost. You basically spend all week writing shitty code and trying to make it compile just to learn at best one or two concepts. Whereas with Udacity they are actually teaching you how to write quality code, but you still have to do the most challenging parts of the algorithm yourself. (And these aren't trivial, they often take several hours.)
Clearly at some point you do need to learn to fly yourself, but with Udacity at least you can wait until you're up to speed on the basic tools rather than being shoved out of the nest on day one. Even if it seems dumbed down, I'm completely convinced that you're actually learning much more this way than what you learn in the traditional CS classroom. And after all, the measure of a class shouldn't be how hard or easy it is, but rather how much you learn from it.
I like the short cycles and quick bits of programming, especially given that they've also got the weekly homeworks. I'm just saying that sometimes they move too slowly for my taste. Let me run the videos at high speed without hassle, and that'll help. I can slow down when I need to, and speed through the parts I already get. (I've found that at high speed, having to concentrate a bit just to understand the words helps keep me focused.)
Ultimately, maybe we'll end up with classes that have several levels of explanation, letting you pick which you think best suits you for that particular piece...maybe for one part a quick presentation of an equation is sufficient, and for a less familiar part you want a step-by-step walkthrough with lots of intuitive explanation.
I'm going through the Coursera machine learning class right now and I have to say that the professor glosses over several details and often makes comments like "if you're not familiar with calculus..." and "if you're not familiar with statistics...", which caught me off guard at first. I really doubt that actual Stanford students enrolled in a machine learning course would be thrown by the incredibly basic operations (e.g., taking the partial derivative of a polynomial function) he is using.
Also, there has been no acknowledgement of how contrived the exercises are. For instance: exercise one gives a data set of the profitability of a company's existing stores versus the population of the city in which each store is located (in units of $10,000 and 10,000 people, respectively). The population data range from 5 to 23, with most values concentrated below 10. We fit a straight line to the data using least squares, then use that line to predict the profitability of two new locations -- in cities of population 35 and 75. I understand that this is an intro course, but there is not a word about how ridiculous this extrapolation is.
I don't mean to be overly negative. I am enjoying the course, but I am a bit surprised by how basic it is. Let me say that I do like the course's approach to ML, which is to formulate a parameterized cost function and then minimize it by some general method, rather than the typical statistics-course approach of solving ordinary least squares directly, which gives an "exact solution" (given the data) but does not generalize to more complex models.
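To make the contrast concrete, here is a minimal sketch of the two routes in Python with NumPy. The data values are invented for illustration; only the 5-23 population range, the units, and the 35/75 prediction points come from the exercise described above, and the learning rate and iteration count are arbitrary choices, not the course's.

```python
import numpy as np

# Invented toy data in the spirit of the exercise: city population
# (units of 10,000 people) vs. store profit (units of $10,000).
pop = np.array([5.1, 5.8, 6.2, 7.0, 8.4, 9.1, 13.5, 22.2])
profit = np.array([1.5, 2.0, 1.1, 3.3, 4.0, 3.5, 7.0, 12.0])

# Design matrix with an intercept column: model is profit ~ t0 + t1 * pop.
A = np.column_stack([np.ones_like(pop), pop])

# Route 1 (typical statistics course): solve ordinary least squares directly.
theta_exact, *_ = np.linalg.lstsq(A, profit, rcond=None)

# Route 2 (the course's approach): write down a squared-error cost,
#   J(theta) = (1 / 2m) * ||A @ theta - profit||^2,
# and minimize it with plain gradient descent. Less direct here, but the
# same recipe works for models that have no closed-form solution.
theta = np.zeros(2)
alpha = 0.01  # learning rate
for _ in range(100_000):
    grad = A.T @ (A @ theta - profit) / len(profit)
    theta -= alpha * grad

# Both routes land on essentially the same line.
assert np.allclose(theta, theta_exact, atol=1e-4)

# The exercise then asks for predictions at populations 35 and 75 --
# numerically trivial, but far outside the 5-23 range of the data.
pred_35 = theta_exact[0] + theta_exact[1] * 35
```

For a linear model the gradient-descent route is pure overhead, but it is the part that carries over once the cost function stops having a closed-form minimizer, which is presumably why the course leads with it.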
I know this is foundational material and overall, I am impressed by the approach of the course, but I would expect more comments on the weakness of the naïve methods we are employing at this early stage and how they will eventually be improved. I find it very helpful when professors at least reference more advanced methods or provide references for further reading by the interested student. Admittedly, that is more frequently a feature of graduate courses, but encouraging students to go beyond the material is an important aspect of good teaching. I have watched the videos for several other online courses, and I do appreciate the fact that Coursera is allowing me to hand in assignments for grading, which vastly increases my engagement with the material. This, in fact, is the most valuable resource offered by the program. The lectures themselves are fine -- if a bit dry -- but a good book or a set of well-prepared notes (not slides) would probably suffice just as well if accompanied by the assignment grader.
All in all, this is great. The more people who know about machine learning (and have access to higher education in general), the better.
IIRC from a Stanford student's comment on HN, Stanford offers two versions of machine learning: one that is more math-focused and a more applied one designed for all majors. The ML course offered through Coursera is the latter one.
Completely agree. I'm currently taking 4 Udacity courses, and they are phenomenal. There is no question, in my mind at least, that Udacity, or at least the Udacity approach, is the future of the space.
Their advantage isn't just their technology platform, it's that they seem to have impeccable taste and judgment. They aren't just a bunch of MBAs who are 'licensing content' as part of an 'online distribution play' or whatever.
I actually think that Udacity is quite similar to YC circa 2005/2006 in a lot of respects.
I think there's a fair bit of variation between the Coursera courses.
I'm currently taking their Machine Learning and Computer Vision courses. The ML course is quite like what you described for Udacity: there are a lot of short lecture videos with questions embedded in them, online quizzes and automatically graded programming assignments. It's a very engaging way to learn and I'm getting a lot out of it. The CV course, on the other hand, has so far just been videos of the professor talking & no supplementary material.
I suspect that Coursera provides the platform and leaves how the material is presented on it up to the professor. The ML course has been run once already, so they presumably have the benefit of experience; whereas it's the first time the CV course has been run so it's quite possible that they just haven't figured out how to take full advantage of the platform yet.
>"It's a very engaging way to learn and I'm getting a lot out of it. The CV course, on the other hand, has so far just been videos of the professor talking & no supplementary material.
I suspect that Coursera provides the platform and leaves how the material is presented on it up to the professor."
It's probably this. I'm taking Model Thinking on Coursera and there are at least 2 PDFs for each of the 20 sections -- some from the professor's book, some links on the web, and some scientific articles.
I've even stopped reading them all because it's time-consuming, but I'm happy they're offered anyway.
As for the style, well, it fits the content: programming stuff is better on Udacity; academic/theoretical stuff, on Coursera.
I agree strongly with these observations, and have found it to hold true over Udacity's Robotics AI, Web Apps, Programming Languages, and Design courses, and over portions of Coursera's Machine learning and NLP classes.
Edit: and I also agree with Brunov, Udacity courses are not mathematically rigorous.
I don't know enough about programming to make a comment on the quality, and I should have mentioned that in my initial post. I was trying to convey my experience of the teaching in each course, not the quality of the material.
But yes, from what I've taken in the AI class so far they are definitely teaching Computer Science. Udacity so far seems to be "applied programming" with a bit of helpful CS here and there.
Have you taken either of the CS101 courses? Udacity's has already wrapped up and started a new 'hexamester', though Coursera's has just started. I prefer the Coursera lectures generally, but with CS101 at least I think Udacity was much more engaging and challenging for beginners. So far the Coursera CS101 assignments feel pretty lackluster compared to the other classes (unless I've become unwittingly jaded, in which case I'd still like to hear what others think).
I've seen examples of both sorts of courses from both Udacity and Coursera, so I think it really just comes down to the extent to which the instructors are willing to reinvent their material. And for those who aren't doing so much reinvention, I'm still glad they're doing what they are - I'd rather have an online class based on recycled lecture videos, than no online class.
I've noticed this, too. Udacity's courses are clearly designed from the start to work well on the web, whereas many Coursera courses seem to be existing lectures broken into smaller chunks. This really sets Udacity apart, but I think it's actually the result of a much bigger advantage: Udacity isn't affiliated with a meatspace university.
It might not seem like an advantage now, since they're competing with more prestigious options from Stanford and MIT, but I think the decision to detach Udacity from Stanford will prove to be a very smart choice. I think we'll see real universities take a two-faced approach to online learning: they will want the attention it draws, but will hold back for fear of diluting the prestige value of their brands. (See the protracted wrangling over the certificates in the original AI course. And can you blame them? The brand is probably the most valuable part of a modern diploma.)
This is Udacity's real advantage: they don't have to worry about prerequisites, four-year graduation rates, department budgets, or the physical constraints of offline universities. They are already much more nimble and much less risk-averse than some of the other initiatives, and it will be difficult for the programs still affiliated with universities (institutions hanging on to practices invented in, like, the 13th century) to detach themselves from offline constraints and adapt to the web.
I'm not sure that beating the traditional university is the objective or that it's even necessary. Remember that the essence of a disruptive product is that it's a little bit worse and (at least) a little bit cheaper than the incumbent product. The question, then, for disruptive education plays (of which my company is one) is: which part of the traditional experience are you going to take away in order to achieve scale? Most .edu startups are taking away instructor interaction and emphasizing lectures, which is exactly the opposite of what most students actually need.
Can I just say this is super exciting. In Australia, where I live, there was a very brief period where university education was free. It was before my time, but I remember the feeling I had as a child when I first heard about it - not amazement or disbelief, something closer to "oh yeah, that makes sense". I figured that education would have to be free because otherwise poor people would stay poor forever. Not the most airtight reasoning, but I think my heart was in the right place. I hadn't really considered the resources that would need to go into it, and who would inevitably have to foot the bill.
But this is the best kind of regression. Maybe the childish dream of "hey, let's just teach everyone" isn't so ridiculous now that we have the right technology. I find it easy to get frustrated sometimes thinking of how much power we have at our disposal, and how much of it goes into more efficient cat sharing and other electronic distractions (some of which, in fairness, I like quite a bit). This is a pretty cool example of how much what we do can mean to people: not just entertaining them but radically improving their lives.
The best part, though, is that this isn't even news. This has all already happened. Between the Khan Academy, OCW, Coursera, Udacity and edX it's actually a crowded field now. Great! Large universities move slowly, and I'm sure there are a lot of people who've been pushing for years just to get things to this point. Now things look like they're starting to snowball. It's easy to ignore one university or a crazy startup, but when someone says "hey, uh, half of the Ivy League's on this thing" it gets attention. I'm really looking forward to whatever comes out of edX, but even more to the inevitable answers to edX. It's a great time to need an education.
We still face a problem, which is inevitable when you can teach the masses. Some people, especially the poor, don't understand the importance of education. In poor countries, teenagers don't have time to study, because they have to make a living.
But I'm happy we are advancing, I believe education is the foundation of everything, so we sure are making some big steps.
Well, let's be honest with ourselves. If they won't be free from the original source, they will eventually be free from other sources. The important thing is that the material will exist and it will be available for anyone to learn from.
That said, I really doubt that these classes will remain free. (Especially with the likes of Harvard and MIT involved.)
University-level education is still free here in Uruguay.
However, it has several big downsides, which result in artificial barriers like absurdly difficult exams (a calculus exam I took had only 2 people out of 1,200 passing), quotas on classes and specializations (sometimes decided by scholarship and other times just by random drawing), and some really bad teachers and overcrowded classes (there are good teachers too, and things pick up after the 2nd or 3rd year, once most people have dropped out).
I ended up getting a degree from a private university.
Having all of these online courses and virtual university education does sound really great :)
You may not believe this, but Family First's Steve Fielding proposed, some time close to the last election, that a free university be established to provide its services over the NBN -- as a kind of justification for it.
Most of the press release is about funding and bureaucracy, as evidenced by the number of name drops and shout-outs, culminating in which old-school administrator will be granted the honor of being the first president of the new initiative, and going straight for the brand name of "MIT+Harvard". It reads like a press release out of the 1890s: the last gasp of a struggling, suddenly irrelevant, fabulously costly brick-and-mortar institution of a bygone era.
Conspicuously missing are any specific details about the operation or its value, beyond the brand name and the instantly heavy bureaucracy. That's likely because they have no specific details yet, and the negotiations to date have been about who gets the biggest seats on the board, what compensation packages they can negotiate, and who will win the most prestigious titles and positions in behind-the-scenes political wrangling.
As far as the software that runs it, the old canard of making it open source and having other people build it is just tossed out as if that is a magical solution to design.
Nothing about this smells agile. It smells very industrial and slow. Compare to Udacity and Coursera who each are happily running dozens of classes to hundreds of thousands of students each, responding quickly to feedback, and demonstrating clearly they are up to the modern speed of doing things.
This is pretty silly criticism, considering the 3 month delays Coursera had this spring.
Coursera's delays were because it is much more closely affiliated with Stanford than Udacity is. In the fall they didn't really have the Coursera name settled upon yet and the certificate printouts they sent to people who completed the courses had Stanford's name on them.
Discussions involving university reputations are always going to be long and dragged out. Udacity avoided them by having a clear separation between the website and the institutions of its instructors from the outset. Coursera acquired that separation over time.
Harvard and MIT's reputations are most of what separate them from FullSail and the University of Phoenix. It is important that they protect them. It is encouraging that more universities are following Stanford into this space.
Courses with broadcast lectures and server-based practice/homework/exams will need the (hu)manpower that universities currently command in order to grow quickly.
> In the fall the certificate printouts they sent had Stanford's name on them.
The certificates they sent had a paragraph disclaimer at the end pointing out that the certificate had nothing at all to do with Stanford; this was the only mention of Stanford. The clause was only added because Stanford legal requested it, since the classes were taught by Stanford professors who set up the system on their own initiative -- it was definitely not something initiated from the bureaucratic side of the institution. Your post suggests that the certificates indicated they were granted, approved, or validated in some way by Stanford. That is not the case at all; it is the opposite. I recommend you track down one of these certificates and examine it to your satisfaction.
As far as Coursera, it didn't exist in Fall. It was created in response to Stanford lawyers and bureaucrats going apeshit and shutting the venture down once they saw what a threat it was since the classes were as good as what they were charging for. This caused several of their best professors to resign and leave Stanford.
Stanford is about their reputation, which comes from their top notch professors. With professors leaving, the reputation is worth less than before.
Stanford intentionally isolated itself from this venture and tried to punish those who pursued it. That shows how committed its administration and legal staff are to the future of education: not at all, and in direct conflict with its most progressive and talented professors.
The future clearly belongs to the rogues who are leaving the inefficient and ineffective old system behind to join the ground-level grassroots work of modernizing education. This is something institutions are showing themselves incapable of doing, and it scares them. They won't go down easily. They will fight this, attack the new paradigms, and try their damnedest to retain an economy based on buggy whips that supports their institutional power, wealth, and obsolete practices. Many on the forefront, such as Salman Khan, do not have any teaching credentials or background, and that is how it must be, for the old practices do not work.
>This caused several of their best professors to resign and leave Stanford.
Just to clarify, this was one professor, Sebastian Thrun. The Coursera professors still teach at Stanford. In fact, the Coursera effort has been to integrate on-campus and off-campus efforts from the beginning.
The off-campus students of the DB, ML, and AI classes were given access to interactive lectures, exercises, and exams, which was revolutionary. For their part, the on-campus students were freed from the lecture problem all other college students face: when you attend most college lectures, you might as well be watching a video (to most professors' chagrin).
Despite professors' pleading, in 2012 the best way to get lectures to students AND have an interactive experience is to separate the lectures out entirely and then simply have interactive, lab-ish sessions when on-campus students are in class. This is what they did in the DB, ML, and AI classes; the Stanford students, for their money, enjoyed more intimate professor access and these extra learning modules. On top of that, it must have been a big relief to be part of a class run that way. At other schools, if a professor has videos from previous years or even slides posted online, many kids just don't go -- there isn't much point.
>Your post suggests that the certificates indicated they were granted or approved or validated in some way by Stanford.
My point was that the relationship with the Stanford name was strained and it created legalistic issues. It seems like we're in agreement about that.
>As far as Coursera, it didn't exist in Fall. It was created in response to Stanford lawyers and bureaucrats going apeshit and shutting the venture down once they saw what a threat it was since the classes were as good as what they were charging for.
This is confused. Coursera didn't announce its presence until very recently; it existed as a stealth-mode startup for a while.
I have tried a lot of these new online courses, but I still think they have missed the point that Khan Academy got right: I don't want to plan my life around weekly assignments. I keep getting emails about assignment deadlines, causing unneeded anxiety, which puts a damper on the whole learning experience.
Some of the courses on Udacity no longer have assignment deadlines. I'm not certain what I think about this yet: on the one hand having hard deadlines means that each "semester" of students stays on the same page and the instructors can respond to the most pressing difficulties they are having. It also allows them to iteratively refine the courses without disrupting current students.
On the other hand, it can make things difficult for those of us in the working world who have to balance education with our work life! I'm taking three courses now, which has turned out to be overly ambitious (I should probably spend time with my wife!), so I'll probably drop one.
I thought their revision the first time around was a good way of doing it: setting homework deadlines, but at the end grading by the exam alone if that gave a better score. That way you get credit for staying on pace, but none is taken away if you go at your own pace.
This is why I dropped the MITx course. I'm responsible and motivated, but I also work full-time and have a family that includes a 2-year old. This is the 21st Century for Pete's sake. I should be able to level-up on my own time frame and not Prof. Agarwal's.
> First one to award real degree credit per course wins
This is a complex issue. A real degree means ID checks, accreditation costs, and testing in person. That costs money. As Sebastian Thrun pointed out recently, those things would prevent most people in the world from taking these classes, which is why he chose not to go that route. His first AI class had 160,000 students spanning every single country on earth except North Korea. In one quarter, he may have taught AI principles to more students than all previous professors in history combined, certainly more than anyone at his university had. He also gave the exact same tests and assignments to students at Stanford for real credit, and they did no better than random people in faraway lands who were struggling to find electricity and an internet connection. Forcing a credit model locks people outside the first world out of the system.
What does credit mean? I have taught high school and college classes, and elementary school students. Now I do software and hardware design, and I hire developers. A candidate having a degree has proven to mean absolutely nothing at all as far as technical capabilities go, and neither do their grades! About the only things I know for sure are that students from Stanford and MIT are in general more capable than those from a community college, and that people with degrees from online for-profit colleges like U. Phoenix, or CIS degrees from anywhere, can be relied on not to know enough to perform.
So I have to check people's body of work, their projects and interests, and we do proactive recruitment as well. This results in finding talented people. Not "resources" but people.
Someone who has taken a full set of courses at Coursera and Udacity is going to know just as much as someone with a "legit" credential from a university. So what is that legit credential worth? Nothing! Rather than promote more useless credentials, it would be more useful and productive to promote fewer. We should even consider getting rid of degree credentials altogether, since they don't prove anyone knows anything at all. Whether someone can perform is not correlated with whether they have a degree.
> About the only thing I know for sure is students from Stanford and MIT are in general more capable than those from a community college, people with degrees from online for-profit colleges like U. Phoenix and CIS degrees from anywhere can be relied on to not know enough to perform.
Yet this statement completely undermines your point. I graduated from community college instead of going to high school, and now wherever I go I get dirty looks for having some random "fake" school next to a top-ranked one. I've been flat out rejected by people who stated that "[they] know what community college portfolios look like", without knowing the whole story.
As much as I wish credits and degrees didn't matter, that is not universally true in practice. In fact, the less they matter on their own, the more accessible they should become! Making them accessible doesn't force anyone to lower their standards. That is why, sooner or later, the credits will matter.
He didn't lump community college in with the for-profit fake colleges ("people who can be relied on not to know enough to perform"); he just said that students from Stanford and MIT are generally more capable.
You went to community college, like I did (for the entire first half of my university career - Oakton Community College represent!), so you can't think that's a completely unreasonable statement. OCC certainly wasn't MIT, but it was a place where you could learn a lot of your undergrad math.
Not to say that you won't find people who will dismiss you out of hand, but you'll run into those whether you're black, or from out of state, or went to Yale instead of Harvard (or Harvard instead of Yale), depending on the person you're talking to. Some people have really horrible quality heuristics :) But he's not doing that.
It could be 100% accurate but I still don't think it does anything to help the argument that a degree/credits shouldn't matter. I've gotten used to people requiring an explanation for a degree from TXCC, but frankly some of those top-school grads could probably do some explaining too after seeing some of the portfolios that come in...
I wish I could compare to MIT, but since I have no acquaintances there, I can only draw comparisons between a few programs in art and design (philosophy might be another story). --This is the fourth time I've tried to rewrite what I want to say before hitting reply, and it just isn't working out. It's a long topic, and abbreviating it simply turns it into a rant. If you really believe that degrees/credits don't matter, then you should be careful when and where you stray from that principle. If the choice is between two people with no experience and no portfolio, but a degree and maybe an interview, then leaning on the degree is virtually unavoidable. But if they have work to back themselves up, then why even look at their degree? Even if you have certain feelings about certain places (like UPhoenix), you're better off getting what you need to know from an interview, since you never know the circumstances that led people there anyway (and there are many... visa requirements, money combined with being ill-informed about other options, etc).
I'd say that if you're thinking about doing Phoenix, save the money and just learn from all of the random resources online - and whether you went the autodidact route instead of Phoenix or after learning nothing at Phoenix, you had better lead with your portfolio.
Probably better to just leave out Phoenix altogether from your resume as an expensive mistake. And, if people won't look at you even though you can show them a pile of good work that you've done - that probably wouldn't be the greatest place to work anyway.
Phoenix is no better than no credential at all, and also implies a general ignorance of the tech industry. In addition it costs twice as much per year as state school did per degree, and dozens of times as much as the internet and old editions of books from http://used.addall.com :)
I think you misunderstood his point. He's saying that Coursera, Udacity, and edX need to choose between making their courses broadly available and getting them accredited; they can't do both. Clearly they have made their choice.
I am addressing the second part only so far as it affects his observations. Otherwise I agree with his post (especially the complex issue part, heh).
But in response to your paraphrase, my hope is that they don't end up locked into that duality. A degree may be an ideal endgame, but the next best thing would be to bypass the accreditation steps and work on a sort of transfer credit treaty. It would sit somewhere between an accredited course and a CLEP subject test.
If you get bored of The Open University courses, I've been enjoying the Oxford Software Engineering MSc (which doesn't require an undergrad degree)... For the Compilers course on Coursera (which I'm also doing), I found http://www.cs.ox.ac.uk/softeng/subjects/SEM.html (which you can do just by itself if you like) invaluable.
Possibly. The cachet of Harvard/MIT definitely puts it in good shape to be the victor.
But, increasingly, and especially in technical fields, degrees are being supplanted by other talent or qualification indicators. Open source work is a great example. There are significant numbers of folks without CS degrees (or even degrees at all) who have found gainful employment as coders based on their open source work alone. Whether it will be accepted by the Fortune 500 crowd remains to be seen though, and certainly the "show me what you can do" idea doesn't really work for non-technical jobs.
At least in the short term offering a degree for these online courses might help.
I wonder if, for the non-technical crowd who can't show what they've learned, an online degree will be thought of in the same light as a traditional brick-and-mortar one. It isn't so far, but that will almost certainly change if the degree comes from some place like MIT or Harvard.
Just like real-world universities, there may be no "victor," but rather many groups catering to individual needs. If it were a mere matter of location, we'd only have one university per city.
For that matter, Western Governors University is still quite large. They could pivot much more easily than Harvard and integrate this into their current system, and they have the advantage of accreditation.
edX is clearly planning to compete on quality. I expect them to be pretty resistant to watering down their course challenge level.
Coursera is likely to head mostly the same way.
Udacity, though, looks like they might lean more towards reducing the challenge to increase the audience. That feels a little dangerous as they're then open to considerably more competition from other schools. The floodgates are about to open.
What Udacity has in the short run to mitigate that possibility is a start-up agility mindset. But they'll calcify in the long run, everyone does, and then what?
What will interest me most is to see how many people can take on the challenge level of edX. I suspect it will be many more than anyone would have thought.
The people in real trouble are those offering mediocre education for high prices. 5-10 years from now those people will be in a new line of work.
> First one to award real degree credit per course wins, I suspect
With GitHub and Stack Overflow, do we really need a degree? You only want a degree as confirmation that you know what you know, so employers will hire you. But why not just link to your GitHub and Stack Overflow accounts? These online courses could incentivize you to build awesome stuff and put it online for that purpose.
Then the whole cycle is complete. You have a place to learn, and means to show what you've learned to get yourself employed. What else do you need?
(well, maybe the social part of universities is missing, but that can be fixed with hackathons I guess)
> With GitHub and Stack Overflow, do we really need a degree? You only want a degree as confirmation that you know what you know, so employers will hire you. But why not just link to your GitHub and Stack Overflow accounts? These online courses could incentivize you to build awesome stuff and put it online for that purpose.
GitHub and a degree show different things.
GitHub shows that the person can write code. It doesn't show that he knows how to decide what code to write. Did he consider alternative algorithms? Did he choose based on an understanding of the strengths and weaknesses of those algorithms? If you see an O(N^2) algorithm used somewhere instead of an O(N log N) algorithm, is it because he didn't know better, or is it because he determined that for the inputs in this particular project the O(N^2) version is actually faster?
A degree in the appropriate field from a good school goes a long way toward showing that the person can do that kind of analysis.
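To make that O(N^2)-vs-O(N log N) question concrete, here is a minimal Python sketch (my own illustration, not from the thread): insertion sort does O(N^2) comparisons in the worst case but has very low per-element overhead, which is exactly why a developer might deliberately prefer it for tiny inputs over an asymptotically better mergesort.

```python
def insertion_sort(xs):
    """O(n^2) worst case, but very low constant overhead -- often
    the right choice for small inputs (many library sorts use it
    as a base case for exactly this reason)."""
    xs = list(xs)
    for i in range(1, len(xs)):
        key = xs[i]
        j = i - 1
        while j >= 0 and xs[j] > key:
            xs[j + 1] = xs[j]  # shift larger elements right
            j -= 1
        xs[j + 1] = key
    return xs

def merge_sort(xs):
    """O(n log n), but each element pays for recursion, slicing,
    and merging -- overhead that dominates when n is small."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

# Both produce the same result; which is faster depends on the input size.
print(insertion_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```

Being able to articulate that tradeoff (and to measure the crossover point with a profiler or `timeit` for the actual inputs at hand) is the kind of analysis a repository of finished code doesn't directly reveal.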
I'd say for most developers a degree shows that you can stick with something, and that you were at least smart enough to get through a CS program.
However, most of the people I know have forgotten just about everything but the very basics from their algorithms class within a few years.
When I was teaching myself to program, I knew 3 guys who were seniors in CS at Georgia Tech. With only about 3 years experience myself, I could code circles around all of them.
At the time they may have had more theoretical knowledge than I did (definitely not true now though since I've caught up on the theoretical side), but I would have been a much better hire for 95% of programming jobs.
Not because I was some kind of rockstar, but because I'd had more practical experience actually writing real programs. Sure if I was working at Facebook, or trying to scale something truly massive (or working with resource limited embedded systems), they would have had an edge on me, but the vast majority of programming that happens each day doesn't require that level of computer science chops.
There are definitely jobs that require a "Computer Scientist", and if you want to work on interesting problems a CS degree is extremely beneficial. However, a CS degree is neither necessary nor sufficient for most programming jobs.
(That being said, I'm currently in the process of finishing a math degree.)
Even if you can demonstrate your skills to an employer, what are your chances of an admissions committee even being willing to click the link? I have an A.S. in graphic design, a few years into a B.Arch, and am finishing my B.A. in Philosophy, Art History within the next two semesters. Unless there's some sort of scheduling miracle, I won't even be able to get minor credit towards mathematics or engineering– outside of architecture and philosophy, graduate schools wouldn't want to touch me with a ten foot pole.
My primary goal over the next semester is to compile all of my notes and certificates from these courses and scrap like hell to get credit for them. Otherwise I'll be 21 with three degrees and still need to go back to school for two years, just to get my engineering degree so I can go to graduate school for CS/eng.
--typed quickly from my iPod, so apologies if my argument got funky and rushed
I would argue that the focus on certificates, degrees, and grades is a competitive disadvantage. Rather than providing useful analytics, they introduce friction into the learning process and degrade the user experience.
An educational system that finds alternatives to these artifacts will have an advantage over those that rely on them.
I realize I represent only one of several types of students, but I couldn't care less about degree credit. I've got a BS in CS, and that's enough degree credit for me.
But I would happily do all the coursework equivalent to a degree in math for my own education. I just don't care if anyone recognizes it as a degree or not. I'm not willing to pay degree prices for it, but I'd pay some token amount.
Since it's possible to earn a degree online today, I'm not sure this is the case. Top-tier universities are faced with a significant innovator's dilemma problem here: they want to enforce their duopoly on the granting of degrees and they don't want online programs (which can deliver comparable content to students in a more scalable way at a fraction of the cost) to cut into their tuition revenue.
I tried, believe me I tried hard, but video is not my thing. Over 20 years I've learnt all I know by reading, surfing, and browsing the web, not by watching videos. I can digest/absorb/ignore a whole page of text in ten seconds, instead of being forced to watch a boring 10-minute video that offers only one minute of really interesting content.
So the spot is open for an education tool where text is king, like Wikipedia, but with a syllabus.
It's built into Coursera's video player - there's a menu/button to change it somewhere on the toolbar under the video.
I like to watch at 1.5x (or 2x if it's something I'm really sure I completely know from previous courses or experience) most of the time, and then if something starts to be especially confusing I'll slow down to 1x.
It's great that MIT and Harvard are combining forces. It completely makes sense: offering similar courses individually to the same Internet audience is a waste of resources.
By the way, all of these courses from Udacity, Coursera, and MITx lack one thing: the videos can't reproduce the passion of a teacher in a live classroom. In that respect they are a little boring. While they are excellent resources, kind of like manuals for learning stuff, to actually 'improve the experience' they need to pump passion into the video lectures.
More than these video lectures, I like the actual recorded classroom lectures that are kept online for the public, like cs50.net and Tom Mitchell's Machine Learning.
Does anyone else feel this 'passion deficiency' in these courses, like me?
Just to offer an anecdotal counterpoint, I thought Professor Widom was incredible.
Had you taken a distance learning class before? If you just stick a random professor in front of a camera, odds are pretty good that they're less interesting to watch and listen to than the "early adopters" that have been a part of this new wave in education thus far.
Yes, Professor Widom was incredible in terms of teaching concepts clearly. But it's very difficult for anyone to project to a passive camera the enthusiasm one would have in a live classroom full of students. What I am complaining about is not the quality of the knowledge they impart, but the passion and enthusiasm that they just can't convey to a student viewing these videos.
But again, there are professors in these courses who put in a lot of effort to produce a sense of surprise when they arrive at something important, like in a classroom (in this particular course the professor teaches as if someone is listening to him: http://www.udacity.com/overview/Course/cs262/CourseRev/apr20...). But this is an exception and can't be expected of every professor.
So I think there should be some improvement in this regard.
The interesting thing is that there won't be all that many "slots" for professors in this market. So in the medium term we should start to see more and more interesting lectures, as the dull ones are simply beaten out by the ones who aren't.
That's not always going to be a good thing, but it seems inevitable.
Actually, I've found many live classes incredibly boring and the lectures uninspiring. Passion is a function of the instructor, not the medium. The really good teachers would do well in both a live classroom and an online setting.
Let me just tell you all how awesomely happy I am reading this. So many mixed emotions, all positive I must say. Made my eyes all watery.
I don't expect everyone to understand; most of you here have had some sort of formal higher education. Where I live, high schools only offer Business or Science with Biology, Chemistry, and Physics. That is it. Being fascinated with computers from an early age, that is what I wanted to study. But in order to get any higher formal education, one must go abroad, which means a lot of money, more than I or my family could afford. So until very recently I had been getting my education through books, articles, tutorials, et al. This was OK, but I always felt I was missing something; it all felt a bit fragmented. I had pieces from here and there but never the complete thing. Then along came MITx, Coursera, and Udacity. So I started watching all these lectures, and boy did things fall into place. You have no idea how great it feels to actually know that you know something, after a long period of uneasiness. It brought some completeness to my life.
Of course I'm nowhere near where I want to be. But this feels like the end of an era: I can't help but smile to see traditional education systems come to an end, and to see it all unfold in my lifetime.
"EdX will release its learning platform as open-source software so it can be used by other universities and organizations that wish to host the platform themselves. Because the learning technology will be available as open-source software, other universities and individuals will be able to help edX improve and add features to the technology."
Is anyone else worried that this will be a one-sided, "we released the damn source in a zip file" style of open source? When administration has such a big stake in a project like this, I hope they will allow community-style open source. It's harder to justify each design decision you make to a bunch of whining, disagreeing third parties on a mailing list, but ultimately I think it's for the best.
Look at who is actually doing the work, not the press releases. The first class is co-taught by Anant Agarwal, the director of CSAIL, Jerry Sussman, who is a founding director of the FSF, and Piotr Mitros, who has been active in the free software community since the nineties.
Until recently, I would have agreed that other fields are different.
But I interviewed knowledge workers from a variety of fields (e.g. a variety of engineering fields, management consulting) about hiring for a project I was considering. I was surprised how small a role resumes/credentials played in most hires.
A lot of job contacts are made by word of mouth, hiring decisions are largely based on the candidate's ability to talk intelligently about their previous projects, and HR asks for a resume after the deal is almost done.
I'm sure there are situations where traditional resumes are still very important. But it appears less common than I'd assumed.
I also struggle with this, but it's difficult to convey the value or extent of your education to most employers without some sort of credential attached to it.
If I read a pile of books on international development and the economics of foreign aid, participate in local group discussions, go to lectures, and contribute to online communities, how do I communicate that base level of knowledge to an employer?
If I could enroll in an EdX international development program that consists of a series of classes and projects and results in me receiving an EdX International Development Certificate then I have a short one line item that I can stick on a resume if I want to apply to work with a business in that realm.
Not always, and less so in technical fields, but there is a huge difference between a course like "Leveraging Sharepoint for HR Management" (job training) and "Philosophy of the Mind" (pure education). You can only put the former on a resume, but I find the latter to be more meaningful.
To what degree do these initiatives incorporate the latest e-learning knowledge/ideas?
Also, to what degree do these and other e-learning curriculum or processes align with rapid change in relevant technology knowledge and skills?
I think that these types of programs can call into question or clarify the distinction between academic and vocational knowledge/experience.
What good is a programmer who can't recognize the difference between an algorithm that takes exponential time or memory and one that is linear? On the other hand, what good is a programmer who wastes his time optimizing an algorithm because he didn't know how to use the profiler, or worse, was just using an outdated library or technology platform?
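As a concrete illustration of the exponential-versus-linear distinction (my own example, not from the comment), the classic case is Fibonacci: the naive recursive version recomputes the same subproblems over and over, while a simple iterative rewrite runs in linear time and constant memory.

```python
def fib_exponential(n):
    """Naive recursion: the call tree roughly doubles at each level,
    so this takes exponential time -- fine for n=10, hopeless for n=50."""
    if n < 2:
        return n
    return fib_exponential(n - 1) + fib_exponential(n - 2)

def fib_linear(n):
    """Iterative version: O(n) time, O(1) memory -- same answer."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_exponential(10), fib_linear(10))  # 55 55
```

Recognizing that both functions compute the same value but scale completely differently is exactly the kind of judgment the comment is asking for; whether a given programmer learned it in a degree program or on their own is a separate question.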
Does it say anywhere how Harvard and MIT will price their courses? Will they be free? I should read the article myself and find the answer, but the webpage design, the font, and the rambling text are uninspiring. Like an academic paper! (sorry)
This will be the end-game business model for edX, Coursera and Udacity - once they gain critical mass by establishing a large user base and a good reputation beyond their parents' reputations.
I see them creating two tracks based on the same open content: 1) a free not-for-credit track for informal continuing education and 2) a credit-based track with more stringent exit requirements (traditional final assessments and assignments).
For the latter, they will be able to charge quite a bit, if it's commensurate with a proper degree from an accredited institution.
The billion-dollar question: how are they going to administer assessment for distance learners? The ability to securely and reliably administer tests remotely is the final piece of the puzzle for accreditable distance learning courses.
"EdX will be available to anyone in the world with an Internet connection, and in general, there will not be an admissions process. For a modest fee — and as determined by the edX board, MIT and Harvard — credentials will be granted only to students who earn them by demonstrating mastery of the material of a subject."
Although your credentials won't say Harvard or MIT.
I wouldn't take part in this for "credentials". I would do it for the sake of learning. Some of that acquired knowledge might be applied for commercial gain, some might not. Does it matter? The cost is an internet connection and the time invested in learning. Hopefully what they produce will closely mirror the course contents and requirements of their undergraduate courses.
I have taken numerous online courses, from high school on up. The classes I truly enjoyed were hybrid courses, where the class would meet every so often. I loathe 100% online courses; I like having people around to ask questions, see how they did something, etc. Posting on a message board just doesn't give the same vibe.
This is great to see. For the last week or so I've had this uneasy feeling that much of the potential of this wave of elearning was going to get strangled by patents, and other IP concerns. This looks to me to be a big chunk of openness and prior art being set free. That won't stop the trolls, but it's a start.
Online education platforms are the future. I am not saying that classroom education is going away, but the two can complement each other. I am taking Udacity classes (one class at a time, as it is very easy to get overwhelmed if you join 3-4 classes and don't complete any) and I like it.