But this is the best kind of regression. Maybe the childish dream of "hey, let's just teach everyone" isn't so ridiculous now that we have the right technology. I find it easy to get frustrated sometimes thinking of how much power we have at our disposal, and how much of it goes into more efficient cat sharing and other electronic distractions (some of which, in fairness, I like quite a bit). This is a pretty cool example of how much what we do can mean to people: not just entertaining them but radically improving their lives.
The best part, though, is that this isn't even news. This has all already happened. Between the Khan Academy, OCW, Coursera, Udacity and edX it's actually a crowded field now. Great! Large universities move slowly, and I'm sure there are a lot of people who've been pushing for years just to get things to this point. Now things look like they're starting to snowball. It's easy to ignore one university or a crazy startup, but when someone says "hey, uh, half of the Ivy League's on this thing" it gets attention. I'm really looking forward to whatever comes out of edX, but even more to the inevitable answers to edX. It's a great time to need an education.
This is even more true in corrupt countries. The corrupt have an incentive to keep people from knowing their rights, and a large part of knowing your rights comes from education.
- Bread & Circuses
- Plato's Cave
- Hegelian Dialectic
But I'm happy we are advancing, I believe education is the foundation of everything, so we sure are making some big steps.
That said, I really doubt that these classes will remain free. (Especially with the likes of Harvard and MIT.)
However, it has several big downsides, which result in artificial barriers like absurdly difficult exams (a calculus exam I took had only 2 people out of 1,200 passing), quotas on classes or specializations (sometimes decided by scholarship and other times just by random drawing), and some really bad teachers and overcrowded classes (there are good teachers too, and things pick up after the 2nd or 3rd year, once most people have dropped out).
I ended up getting a degree from a private university.
Having all of these online courses and virtual university education does sound really great :)
The difference is a bit difficult to explain. Both have videos, forums, and wikis. Udacity courses are set up as short videos punctuated with many questions and mini assignments (running in an in-browser Python IDE), along with larger homework projects. Also, the forums are continually monitored and new videos are added to clarify concepts that students are struggling with.
In contrast, the Coursera course I'm taking (AI) has longer videos (6-20 minutes) of the instructor mumbling as he draws over and over on increasingly confusing PowerPoint slides. Sometimes a video will have one multiple-choice question; other times it will have no questions at all. The worst part is that only once has a video gone on to explain the question, so if a student has trouble understanding it, they have to resort to the forums. There's no follow-up, unlike the questions on Udacity. At the end of each section (about an hour's worth of videos) students can take a five-question quiz. Granted, the feedback on the quizzes is a lot better -- but it's a lot to expect an hour of instruction to be reinforced by a mere five questions.
Basically, the Coursera course is taught as if I were sitting in a class watching an instructor draw on a PowerPoint -- the fact that it's running in a web browser and could support a different method of teaching seems to be lost on the instructor.
Granted, this might be a critique of the instructor more than of Coursera itself -- I'm only taking a single course from them, whereas I'm taking two on Udacity. But Udacity seems to understand that you can't just take the experience of sitting in a classroom and put it online: you have to understand that this is a new medium that allows new methods of teaching.
To conclude this rambling post (sorry, I didn't know how to explain what I'm feeling as a student more concisely), if these online course ventures that are popping up all over the place are going to succeed,
they are going to have to use the medium of the browser to its fullest: and in so doing, I think they will have to compete with traditional universities. That's what worries me when edX says the online classes will supplement the in-college experience: I think that you're going to have to beat the college experience to succeed in this market.
- Udacity sometimes belabors some really basic points. Coursera never gives me that "dumbed-down" feeling. It's more like a real, and challenging, college course.
- Coursera lets me speed up the lecture, as much as 2x, and every time I go to a new lecture it remembers that speed. I managed to speed up Udacity lectures by going to html5 in youtube, but I had to reset it every time I started a new 2-minute segment.
I think this actually may be Udacity's biggest strength. It's a lot more hands on than a college course, which makes it somewhat easier. But do you actually learn less? After thinking about it, I really don't think so.
With your average college CS class they spend a few minutes talking about some problem, and then you spend a week on your own working on it. Whereas with Udacity they actually teach you all the concepts you need to solve the problem, and then you work through it together step by step.
The strength of the traditional college approach is that it teaches you to be resourceful to some extent. But this comes at an enormous cost. You basically spend all week writing shitty code and trying to make it compile just to learn at best one or two concepts. Whereas with Udacity they are actually teaching you how to write quality code, but you still have to do the most challenging parts of the algorithm yourself. (And these aren't trivial, they often take several hours.)
Clearly at some point you do need to learn to fly yourself, but with Udacity at least you can wait until you're up to speed on the basic tools rather than being shoved out of the nest on day one. Even if it seems dumbed down, I'm completely convinced that you're actually learning much more this way than what you learn in the traditional CS classroom. And after all, the measure of a class shouldn't be how hard or easy it is, but rather how much you learn from it.
Ultimately, maybe we'll end up with classes that have several levels of explanation, letting you pick which you think best suits you for that particular piece...maybe for one part a quick presentation of an equation is sufficient, and for a less familiar part you want a step-by-step walkthrough with lots of intuitive explanation.
Also, there has been no acknowledgement of how contrived the exercises are. For instance, exercise one gives a data set of the profitability of a company's existing stores versus the population of the city in which each store is located (in units of 10,000 dollars and people, respectively). The range of the data is 5-23 (population), with most of it concentrated below 10. We fit a straight line to the data using least squares, then use that line to predict the profitability of two new locations -- in cities of population 35 and 75. I understand that this is an intro course, but there is not a word about how ridiculous this is.
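To make the extrapolation problem concrete, here's a minimal sketch in plain Python. The numbers are made up in the same spirit as the exercise (they are not the actual course data): population stays in the observed 5-23 range, and then we ask the fitted line about cities at 35 and 75.

```python
# Hypothetical numbers in the spirit of the exercise (not the actual
# course data): city population vs. store profit, both in units of 10,000.
pop = [5.0, 6.2, 7.1, 8.0, 9.5, 23.0]
profit = [1.8, 2.5, 3.1, 3.6, 4.4, 11.0]

# Ordinary least squares for a line y = a*x + b, done by hand.
n = len(pop)
mean_x = sum(pop) / n
mean_y = sum(profit) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(pop, profit)) \
    / sum((x - mean_x) ** 2 for x in pop)
b = mean_y - a * mean_x

def predict(x):
    return a * x + b

# Predicting at population 35 or 75 extrapolates far beyond the
# observed range (5-23), where the fitted line has no support at all.
print(predict(35.0), predict(75.0))
```

The fit itself is fine; the problem is that nothing in the data says the linear trend continues out to three times the largest observed city, which is exactly the caveat the course skips.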
I don't mean to be overly negative. I am enjoying the course, but I am surprised a bit by how basic it is. Let me say that I do like the approach of the course to ML, which is to formulate a parameterized cost function and then minimize it by some general method, rather than the typical statistics course approach which is to solve ordinary least squares directly, which gives an "exact solution" (given the data) but does not generalize to more general models.
I know this is foundational material and, overall, I am impressed by the approach of the course, but I would expect more comments on the weaknesses of the naïve methods we are employing at this early stage and how they will eventually be improved. I find it very helpful when professors at least reference more advanced methods or provide references for further reading by the interested student. Admittedly, that is more frequently a feature of graduate courses, but encouraging students to go beyond the material is an important aspect of good teaching. I have watched the videos for several other online courses, and I appreciate the fact that Coursera is allowing me to hand in assignments for grading, which vastly increases my engagement with the material. This, in fact, is the most valuable resource offered by the program. The lectures themselves are fine -- if a bit dry -- but a good book or a set of well-prepared notes (not slides) would probably suffice just as well if accompanied by the assignment grader.
All in all, this is great. The more people who know about machine learning (and have access to higher education in general), the better.
You might be interested in this version at Caltech:
Their advantage isn't just their technology platform, it's that they seem to have impeccable taste and judgment. They aren't just a bunch of MBAs who are 'licensing content' as part of an 'online distribution play' or whatever.
I actually think that Udacity is quite similar to YC circa 2005/2006 in a lot of respects.
- Short videos: These have many advantages and work well with the format's other properties. They let you set your own pace instead of following a long video's pace. It's easier to separate topics into parts you can click back to. When necessary, it's easier to pause between topics (since the video pauses automatically) to take a step back and think about what you just learned for a minute. It matches the interpreter nicely: learn something new, try it, learn something else, try it. And it lets the instructor organize links to relevant material in the description by topic.
- Online interpreter and automated tools: I suppose this is one of the reasons they chose Python. Udacity works like Codecademy plus video instructions instead of text, and this is genius. The videos take into account that you have an interpreter with you, and everything is integrated. The lessons, quizzes, and homeworks are all automatically tied into the interpreter to make sure you try out what's being taught, and the class can give you instant feedback on whether you're doing it right.
- Production value: Not only do the videos look better, they are edited to be more efficient. Unnecessary parts are cut out for brevity, and the video and sound quality make sure everything is easy to see and hear. Other online courses often display a low-resolution screen capture of the IDE, which makes the code unreadable. On Udacity, the code is always readable and the audio always easy to listen to. The better UI makes it easier to navigate, and of course the superior visual quality just makes the whole thing more pleasant to consume.
I wish every other online course would learn these 3 characteristics from Udacity and implement them. It's important to note how the 3 are glued together and reinforce each other: short videos give you breaks to use the interpreter, higher visual quality makes the code easier to read and then try in the interpreter, and so on. Udacity just feels like a consistent, smart bunch of ideas put together to make sure we learn more easily. Hopefully, others will evolve with them.
Not sure we need yet another platform with edX, but let's see. So far it seems more of a West-versus-East type of thing.
I gave up on both Udacity and Coursera (though I liked Coursera more), and still love those MIT video lectures.
I think it's a matter of preference.
I'm currently taking their Machine Learning and Computer Vision courses. The ML course is quite like what you described for Udacity: there are a lot of short lecture videos with questions embedded in them, online quizzes and automatically graded programming assignments. It's a very engaging way to learn and I'm getting a lot out of it. The CV course, on the other hand, has so far just been videos of the professor talking & no supplementary material.
I suspect that Coursera provides the platform and leaves how the material is presented on it up to the professor. The ML course has been run once already, so they presumably have the benefit of experience; whereas it's the first time the CV course has been run so it's quite possible that they just haven't figured out how to take full advantage of the platform yet.
It's probably this. I'm taking Model Thinking on Coursera and there are at least 2 PDFs for each section (20 sections) -- some of them from the professor's book, some links on the web, and some scientific articles.
I even stopped reading them all because it's time-consuming, but I'm happy they're offered anyway.
As for style, it fits the content: programming stuff is better on Udacity; academic/theoretical stuff, on Coursera.
Edit: and I also agree with Brunov that Udacity courses are not mathematically rigorous.
I agree though, the Udacity lectures are more engaging.
But yes, from what I've taken in the AI class so far they are definitely teaching Computer Science. Udacity so far seems to be "applied programming" with a bit of helpful CS here and there.
It might not seem like an advantage now, since they're competing with more prestigious options from Stanford and MIT, but I think the decision to detach Udacity from Stanford will prove to be a very smart choice. I think we'll see real universities take a two-faced approach to online learning: they will want the attention it draws, but will hold back for fear of diluting the prestige value of their brands. (See the protracted wrangling over the certificates in the original AI course. And can you blame them? The brand is probably the most valuable part of a modern diploma.)
This is Udacity's real advantage: they don't have to worry about prerequisites, four-year graduation rates, department budgets, or the physical constraints of offline universities. They are already much more nimble and much less risk-averse than some of the other initiatives, and it will be difficult for the programs still affiliated with universities (institutions hanging on to practices invented in, like, the 13th century) to detach themselves from offline constraints and adapt to the web.
I just took it and loved it. You do build a search engine, but the real focus is learning programming. It's an intro course.
Conspicuously missing are any specific details about the operation or its value, other than the brand name and the instantly heavy bureaucracy. Likely this is because they have no specific details yet, and the negotiations to date have been about who gets the biggest seats on the board, what compensation packages they can negotiate, and who will win the most prestigious titles and positions in behind-the-scenes political wrangling.
As far as the software that runs it, the old canard of making it open source and having other people build it is just tossed out as if that is a magical solution to design.
Nothing about this smells agile. It smells very industrial and slow. Compare to Udacity and Coursera who each are happily running dozens of classes to hundreds of thousands of students each, responding quickly to feedback, and demonstrating clearly they are up to the modern speed of doing things.
Coursera's delays were because it is much more closely affiliated with Stanford than Udacity is. In the fall they didn't really have the Coursera name settled upon yet and the certificate printouts they sent to people who completed the courses had Stanford's name on them.
Discussions involving university reputations are always going to be long and dragged out. Udacity avoided them by having a clear separation between the website and the institutions of its instructors from the outset. Coursera acquired that separation over time.
Harvard and MIT's reputations are most of what separate them from FullSail and the University of Phoenix. It is important that they protect them. It is encouraging that more universities are following Stanford into this space.
Courses with broadcast lectures and server-based practice/homework/exams will need the (hu)manpower that universities currently command in order to grow quickly.
The certificates they sent had a paragraph disclaimer at the end pointing out that the certificate had nothing at all to do with Stanford. This was the only mention of Stanford, and the clause was only added because Stanford legal requested it: the classes were taught by the Stanford professors who set up the system on their own initiative, definitely not something initiated from the bureaucratic side of the institution. Your post suggests that the certificates indicated they were granted, approved, or validated in some way by Stanford. This is not the case at all; it is the opposite. I recommend you track down one of these certificates and examine it to your satisfaction.
As for Coursera, it didn't exist in the fall. It was created in response to Stanford lawyers and bureaucrats going apeshit and shutting the venture down once they saw what a threat it was, since the classes were as good as what they were charging for. This caused several of their best professors to resign and leave Stanford.
Stanford is about their reputation, which comes from their top notch professors. With professors leaving, the reputation is worth less than before.
Stanford intentionally isolated themselves from this venture and tried to punish those who pursued it. This shows how committed their administration and legal staff are to the future of education: not at all, in direct conflict with their most progressive and talented professors.
The future clearly belongs to the rogues who are leaving the inefficient and ineffective old system behind to join the ground-level, grass-roots work of modernizing education. This is something the institutions are showing themselves incapable of doing, and it scares them. They won't go down easily. They will fight this, attack the new paradigms, and try their damnedest to retain a buggy-whip economy that supports their institutional power, wealth, and obsolete practices. Many at the forefront, such as Salman Khan, do not have any teaching credentials or background, and that is how it must be, for the old practices do not work.
Just to clarify, this was one professor, Sebastian Thrun. The Coursera professors still teach at Stanford. In fact, the Coursera effort has been to integrate on-campus and off-campus efforts from the beginning.
The off-campus students of the DB, ML, and AI classes were given access to interactive lectures, exercises, and exams, which was revolutionary. For their part, the on-campus students were freed from the lecture problem all other college students face: when you attend most college lectures, you might as well be watching a video (to most professors' chagrin).
Despite professors' pleading, in 2012 the best way to get lectures to students AND have an interactive experience is to separate the lectures out entirely and then hold interactive, lab-ish sessions when on-campus students are in class. This is what they did in the DB, ML, and AI classes: the Stanford students, for their money, enjoyed more intimate professor access and these extra learning modules. On top of that, it must have been a big relief to be part of a class run that way. At other schools, if a professor has videos from previous years or even slides posted online, many kids just don't go; there isn't much point.
>Your post suggests that the certificates indicated they were granted or approved or validated in some way by Stanford.
My point was that the relationship with the Stanford name was strained and it created legalistic issues. It seems like we're in agreement about that.
This is confused. Coursera didn't announce its presence until very recently; it existed as a stealth-mode startup for a while.
On the other hand, it can make things difficult for those of us in the working world who have to balance education with work life! I'm taking three courses now, which turned out to be overly ambitious (I should probably spend time with my wife!), so I'll probably drop one.
First one to award real degree credit per course wins, I suspect. The Open University already offers this, so it can be done.
This is a complex issue. A real degree means ID checks, accreditation costs, and testing in person. That costs money. As Sebastian Thrun pointed out recently, those things would prevent most people in the world from taking these classes, which is why he chose not to go that route. For his first AI class he had 160,000 students spanning every single country on earth except North Korea. In one quarter, he may have taught AI principles to more students than all previous professors in history combined -- certainly more than anyone at his university had. He also gave the exact same tests and assignments to students at Stanford for real credit, and they did no better than random people in far-away lands who were struggling to find electricity and an internet connection. Forcing a credit model locks people outside the first world out of the system.
What does credit mean? I have taught high school and college classes, and elementary school students. Now I do software and hardware design, and I hire developers. A candidate having a degree has proven to mean absolutely nothing at all as far as technical capabilities go, nor do their grades! About the only thing I know for sure is students from Stanford and MIT are in general more capable than those from a community college, and people with degrees from online for-profit colleges like U. Phoenix and CIS degrees from anywhere can be relied on to not know enough to perform.
So I have to check people's body of work, their projects and interests, and we do proactive recruitment as well. This results in finding talented people. Not "resources" but people.
Someone who has taken a full set of courses at Coursera and Udacity is going to know just as much as someone with a "legit" credential from a university. So what is that legit credential worth? Nothing! Rather than promote more useless credentials, it would be more useful and productive to promote fewer. We should even consider getting rid of degree credentials altogether, since they don't prove anyone knows anything at all. Whether someone can perform is not correlated with whether they have a degree.
Yet this statement completely undermines your point. I graduated from community college instead of going to high school, and now wherever I go I get dirty looks for having some random "fake" school next to a top-ranked one. I've been flat out rejected by people who stated that "[they] know what community college portfolios look like", without knowing the whole story.
As much as I wish credits and degrees didn't matter, this is not universally true in practice. In fact, the less they matter on their own, the more accessible they should become! That doesn't force anyone to lower their standards. That is why, sooner or later, the credits will matter.
You went to community college, like I did (for the entire first half of my university career - Oakton Community College represent!), so you can't think that's a completely unreasonable statement. OCC certainly wasn't MIT, but it was a place where you could learn a lot of your undergrad math.
Not to say that you won't find people who will dismiss you out of hand -- but people do the same if you're black, or from out of state, or went to Yale instead of Harvard (or Harvard instead of Yale, depending on the person you're talking to). Some people have really horrible quality heuristics :) But he's not doing that.
I wish I could compare to MIT but since I have no acquaintances there, I can only draw comparisons between a few programs in art and design (philosophy might be another story). --This is the fourth time I've tried to rewrite what I want to say before hitting reply and it just isn't working out. It's a long topic, and abbreviating it simply turns it into a rant. If you really believe that degrees/credits don't matter, then you should be careful when and where you stray from that principle. If the choice is between two people with no experience, no portfolio, but a degree and maybe an interview, then it's virtually unavoidable. But, if they have work to back themselves up then why even look at their degree? Even if you have certain feelings about certain places (like UPhoenix), you're better off getting what you need to know from an interview since you never know the circumstances that led people there anyway (and there are many... visa requirements, money combined with being ill-informed about other options, etc).
Probably better to just leave out Phoenix altogether from your resume as an expensive mistake. And, if people won't look at you even though you can show them a pile of good work that you've done - that probably wouldn't be the greatest place to work anyway.
Phoenix is no better than no credential at all, and also implies a general ignorance of the tech industry. In addition it costs twice as much per year as state school did per degree, and dozens of times as much as the internet and old editions of books from http://used.addall.com :)
But in response to your paraphrase, my hope is that they don't end up locked into that duality. A degree may be the ideal endgame, but the next best thing would be to bypass the accreditation steps and work on a sort of transfer-credit treaty -- something in between an accredited course and a CLEP subject test.
I've earned 150/330 points at L1/L2 towards a Bachelor's in Software Engineering with the Open University UK. I pretty much just took the exams and wrote the required essays in order to pass.
NOTE: the level 3 courses have programming projects connected to them.
Right now I'm following Compilers and Finite Automata with Coursera. I've learnt more in a week than I've learnt so far with The Open University.
With Coursera, I actually have to work to pass the class.
So yeah, I hope that Coursera have a plan to turn credits into real degrees. They can charge real money for it.
The one thing I see standing in the way of Coursera et al. is identity verification: the OU requires you to physically sit exams and provide proof of ID. I can't see how Coursera can get past that.
Since I moved to the Netherlands and started learning Dutch (now fluent), the quality of my English spelling and grammar has plummeted :(
But, increasingly, and especially in technical fields, degrees are being supplanted by other talent or qualification indicators. Open source work is a great example. There are significant numbers of folks without CS degrees (or even degrees at all) who have found gainful employment as coders based on their open source work alone. Whether it will be accepted by the Fortune 500 crowd remains to be seen though, and certainly the "show me what you can do" idea doesn't really work for non-technical jobs.
At least in the short term offering a degree for these online courses might help.
I wonder if for the non-technical crowd who can't show what they've learned an online degree will be thought of in the same light as a traditional brick and mortar one. It isn't so far, but that will almost certainly change if that degree comes from some place like MIT or Harvard.
For that matter, Western Governors University is still quite large. They could pivot much more easily than Harvard and integrate this into their current system, and they have the advantage of accreditation.
Coursera is likely to head mostly the same way.
Udacity, though, looks like they might lean more towards reducing the challenge to increase the audience. That feels a little dangerous as they're then open to considerably more competition from other schools. The floodgates are about to open.
What Udacity has in the short run to mitigate that possibility is a start-up agility mindset. But they'll calcify in the long run, everyone does, and then what?
What will interest me most is to see how many people can take on the challenge level of edX. I suspect it will be many more than anyone would have thought.
The people in real trouble are those offering mediocre education for high prices. 5-10 years from now those people will be in a new line of work.
With GitHub and Stack Overflow, do we really need a degree? You only want a degree as confirmation that you know what you know, so employers will hire you. But why not just link to your GitHub and Stack Overflow accounts? These online courses could incentivize you to build awesome stuff and put it online for exactly that purpose.
Then the whole cycle is complete. You have a place to learn, and means to show what you've learned to get yourself employed. What else do you need?
(well, maybe the social part of universities is missing, but that can be fixed with hackathons, I guess)
Github and a degree show different things.
Github shows that the person can write code. It doesn't show that he knows how to decide what code to write. Did he consider alternative algorithms? Did he choose based on an understanding of the strengths and weaknesses of those algorithms? If you see an O(N^2) algorithm used somewhere instead of an O(N log N) algorithm, is it because he didn't know better, or because he determined that for the inputs in this particular project the O(N^2) version is actually faster?
A degree in the appropriate field from a good school goes a long way toward showing that the person can do that kind of analysis.
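As a concrete illustration of that judgment call (my own toy example, not from anyone's repository): two ways to detect a duplicate in a list, where the quadratic version can genuinely be the right choice for tiny inputs.

```python
def has_duplicate_quadratic(items):
    # O(N^2): compare every pair. No sorting, no extra memory,
    # and for very small N the constant factors often win.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_sorted(items):
    # O(N log N): sort a copy, then any duplicates end up adjacent.
    s = sorted(items)
    return any(a == b for a, b in zip(s, s[1:]))

# Both give the same answer; which one you'd commit depends on the
# input sizes the project actually sees -- exactly the reasoning a
# repository alone doesn't reveal.
print(has_duplicate_quadratic([3, 1, 4, 1]))  # True
print(has_duplicate_sorted([3, 1, 4]))        # False
```

Seeing either function in a repo tells you nothing about whether the author weighed this tradeoff, which is the gap a good algorithms course (or degree) is supposed to fill.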
However, most of the people I know have forgotten just about everything but the very basics from their algorithms class within a few years.
When I was teaching myself to program, I knew 3 guys who were seniors in CS at Georgia Tech. With only about 3 years experience myself, I could code circles around all of them.
At the time they may have had more theoretical knowledge than I did (definitely not true now though since I've caught up on the theoretical side), but I would have been a much better hire for 95% of programming jobs.
Not because I was some kind of rockstar, but because I'd had more practical experience actually writing real programs. Sure if I was working at Facebook, or trying to scale something truly massive (or working with resource limited embedded systems), they would have had an edge on me, but the vast majority of programming that happens each day doesn't require that level of computer science chops.
There are definitely jobs that require a "Computer Scientist", and if you want to work on interesting problems a CS degree is extremely beneficial. However, a CS degree is neither necessary nor sufficient for most programming jobs.
(That being said, I'm currently in the process of finishing a math degree.)
I can't speak to the effectiveness of github and stackoverflow as signals of programmer ability, but I know most other disciplines do not have such services, nor will they anytime soon.
My primary goal over the next semester is to compile all of my notes and certificates from these courses and scrap like hell to get credit for them. Otherwise I'll be 21 with three degrees and still need to go back to school for two years, just to get my engineering degree so I can go to graduate school for CS/engineering.
--typed quickly from my ipod so apologies if my argument got funky and rushed
An educational system that finds alternatives to these artifacts will have an advantage over those that rely on them.
But I would happily do all the coursework equivalent to a degree in math for my own education. I just don't care if anyone recognizes it as a degree or not. I'm not willing to pay degree prices for it, but I'd pay some token amount.
So, the spot is open for an education tool where text is king, like wikipedia, but with a syllabus.
I like to watch at 1.5x (or 2x if it's something I'm really sure I completely know from previous courses or experience) most of the time, and then if something starts to be especially confusing I'll slow down to 1x.
Not many takers so far, though. Any feedback (especially the harsh and constructive kind) would be appreciated.
With side notes, explanations, graphs, live tests, a short video if needed to explain a complex topic (only when required) and much more. So, a book is a good start, just make it rich.
This makes it bigger than Coursera and Udacity.
I think that a useful feature would be to allow students to comment on / review the courses. This will probably become more important as institutions start to offer similar courses.
By the way, all of these courses from Udacity, Coursera, and MITx lack one thing: the videos can't reproduce the passion of the teacher in a live classroom. In that respect they are a little boring. While they are excellent resources, a kind of manual for learning the material, to actually improve the experience they need to pump passion into the video lectures.
More than these video lectures, I like actual recorded classroom lectures that are kept online for the public, like cs50.net and Tom Mitchell's Machine Learning.
Does anyone else feel this 'passion deficiency' in these courses, like me?
I guess it's a personal problem then!
Had you taken a distance learning class before? If you just stick a random professor in front of a camera, odds are pretty good that they're less interesting to watch and listen to than the "early adopters" that have been a part of this new wave in education thus far.
But again, there are professors in these courses who put in a lot of effort to produce a sense of surprise when they arrive at something important, just as in a classroom (in this particular course the professor teaches as if someone is listening to him: http://www.udacity.com/overview/Course/cs262/CourseRev/apr20...). But this is the exception and can't be expected of every professor.
So I think there is room for improvement in this regard.
That's not always going to be a good thing, but it seems inevitable.
I don't expect everyone to understand; most of you here have had some sort of formal higher education. Where I live, high schools only offer Business, or Science with Biology, Chemistry and Physics. That is it. Being fascinated with computers from an early age, that is what I wanted to study. But to get any higher formal education one must go abroad, which means a lot of money, more than I or my family could afford. So until very recently I had been getting my education through books, articles, tutorials and the like. This was OK, but I always felt I was missing something; it all felt a bit fragmented. I had pieces from here and there but never the complete thing. Then along came MITx, Coursera and Udacity. So I started watching all these lectures, and boy did things fall into place. You have no idea how great it feels to actually know that you know something after a long time of uneasiness. It brought some completeness to my life.
Of course I'm nowhere near where I want to be. This feels like the end of an era. I can't help but smile to see traditional education systems come to an end and to see it all unfold in my lifetime.
Is anyone else worried that this'll be a one-sided "we released the damn source in a zip file" style of open source? When administration has such a big stake in a project like this, I hope they will allow community-style open source. It's harder to justify each design decision you make to a bunch of whining, disagreeing third parties on a mailing list, but ultimately I think it's for the best.
Though certificates look good on a resume, that makes them tomorrow's solution to yesterday's problem.
But I interviewed knowledge workers from a variety of fields (e.g. a variety of engineering fields, management consulting) about hiring for a project I was considering. I was surprised how small a role resumes/credentials played in most hires.
A lot of job contacts are made by word of mouth, hiring decisions were largely based on the candidate's ability to talk intelligently about their previous projects, and HR asked for a resume after the deal was almost done.
I'm sure there are situations where traditional resumes are still very important. But it appears less common than I'd assumed.
If I read a pile of books on international development and the economics of foreign aid, participate in local group discussions, go to lectures, and contribute to online communities, how do I communicate that base level of knowledge to an employer?
If I could enroll in an EdX international development program that consists of a series of classes and projects and results in me receiving an EdX International Development Certificate then I have a short one line item that I can stick on a resume if I want to apply to work with a business in that realm.
Also, to what degree do these and other e-learning curriculum or processes align with rapid change in relevant technology knowledge and skills?
I think that these types of programs can call into question or clarify the distinction between academic and vocational knowledge/experience.
What good is a programmer who isn't able to recognize the difference between an algorithm which takes exponential time or memory versus one that is linear? On the other hand, what good is a programmer who wastes his time optimizing an algorithm because he didn't know how to use the profiler or worse, was just using an outdated library or technology platform?
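To make the exponential-versus-linear distinction concrete, here is a minimal sketch in Python (not from the original comment): two functions that compute the same Fibonacci numbers, where the naive recursive version takes exponential time and the iterative one is linear.

```python
def fib_exponential(n):
    # Naive recursion: each call spawns two more calls, so the call
    # tree grows exponentially with n (roughly O(1.6^n) calls).
    if n < 2:
        return n
    return fib_exponential(n - 1) + fib_exponential(n - 2)

def fib_linear(n):
    # Iterative version: one pass, constant memory, O(n) time.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

Both return identical values, but the recursive version becomes unusably slow around n = 40 while the iterative one stays instant; a few minutes with a profiler makes that difference obvious.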
The goal with that is simply to make edX self-supporting, not to make a profit.
I see them creating two tracks based on the same open content: 1) a free not-for-credit track for informal continuing education and 2) a credit-based track with more stringent exit requirements (traditional final assessments and assignments).
For the latter, they will be able to charge quite a bit, if it's commensurate with a proper degree from an accredited institution.
The billion dollar question: how are they going to administer assessment for distance learners? The ability to securely and reliably administer tests by remote is the final piece of the puzzle to enable accreditable distance learning courses.
"Will there be an admissions process?
EdX will be available to anyone in the world with an Internet connection, and in general, there will not be an admissions process. For a modest fee — and as determined by the edX board, MIT and Harvard — credentials will be granted only to students who earn them by demonstrating mastery of the material of a subject."
Although your credentials won't say Harvard or MIT.
Even if all I get is access to the videos and assignments, that's fair enough.
- This will certainly be at least part of the future of education