Surprising Things About School (threadreaderapp.com)
277 points by shubhamjain 8 months ago | 183 comments



> "Nearly everyone recognizes that MOOCs are by and large a failure with ~2% completion rates, but they make us feel good because now it's the students' fault that they're not learning, not the school's"

I understand the context the author is approaching this from - but it discounts the idea that MOOCs can be used the same way we use books.

I've utilized about a dozen different MOOCs and I've _never_ completed a single one of them. Instead I learned the concepts I needed to, maybe completed some test material if it was freely available, and moved on.

Some of those MOOCs were gateways into my career as a developer - so they were absolutely vital to me. This again seems to be an issue with measurement/metrics - I don't believe completion rates are worth that much in MOOCs, just like they aren't worth that much in technical books.


Not knocking your own choices, but it's more the backlash to the 2013 "MOOCs are the greatest thing ever! KhanAcademy/EdX/Coursera! Wooo!" hype. Personally, I love them and would love to work with those companies. In fact, when I started teaching, I used many of the aspects I saw MOOCs using in my traditional classrooms.

However, there is truth to the statement. Almost no one finishes a MOOC (some never even log in to take MOOCs they register for), and predictive models can determine whether someone will complete the MOOC as early as the 2nd week (if not sooner). Some people do grab what they want and leave, but that is not the case for everyone. When we then have to justify the effort, it's a little hard to warrant hosting costs, etc. if you aren't seeing anything come from it. If I ran a MOOC with a 95% fail rate, how could I label it successful?

We are unable to do meaningful research on MOOC strategy effectiveness because we are too busy trying to understand why people never log in in the first place.
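To illustrate the kind of second-week predictive model mentioned above, here is a toy sketch. The features and coefficients are entirely invented for illustration (not taken from any published model): early activity counts are pushed through a logistic function to score completion likelihood.

```python
import math

# Toy sketch of early dropout prediction. The features and coefficients
# below are invented for illustration; real models are fit to log data.
def completion_probability(logins, videos_watched, exercises_submitted):
    # Hand-picked weights: submitting exercises is the strongest signal.
    z = -3.0 + 0.4 * logins + 0.15 * videos_watched + 0.9 * exercises_submitted
    return 1.0 / (1.0 + math.exp(-z))  # logistic link -> probability in (0, 1)

# Score a learner who engaged in week one vs. one who barely logged in.
engaged = completion_probability(logins=5, videos_watched=10, exercises_submitted=3)
lurker = completion_probability(logins=1, videos_watched=1, exercises_submitted=0)
print(f"engaged: {engaged:.2f}, lurker: {lurker:.2f}")
```

In practice the weights would be fit (e.g. with logistic regression) on clickstream logs, but even a crude score like this separates the two profiles.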


> some never even log in to take MOOCs they register for

I've done this many times. Why? On some platforms (like Coursera) you retain access to the content of a course you have registered for, but that content isn't available for people who didn't register once the class is over. If I think I might ever want to view that content, my best bet is to register (there's no cost) and then have that content available when I want it.

I've also done this on platforms like Udacity where that's not the issue, because by registering I can create a curated list of courses I'm interested in possibly looking at in the future.

In my experience, the big failing of MOOCs is that they try to copy university classes, and university classes just aren't that good a lot of the time. Many people don't find lectures useful at all, but the vast majority of MOOCs focus on them. After the fifth time I'm stuck waiting for the teacher to finish a long personal anecdote or humorous story, I usually give up on the MOOC and go looking for a good textbook.

I don't think I've ever seen a MOOC that had written content anywhere close to the quality of Dive Into Python or Learn C the Hard Way.

Also, MOOC efforts at community building are (or at least were) focused on a particular class rather than a particular subject. This doesn't lend itself well to long-term community building, especially when people are going through these courses at very different paces.

There seems to be lots of discussion about how MOOCs can be improved, but not much effort into actually trying different approaches.


> the big failings of MOOCs is that they try to copy university classes, and university classes just aren't that good a lot of the time

We are still trying to understand what makes for better "learning". If I practice the piano every day, I will no doubt have a better ability to play, but I may not understand music theory. As a shameless plug, I am trying to make practice a core part of learning Computer Science; a link to my research platform is in my profile.

This is the same separation that runs from vocational schools and coding bootcamps to graduate schools. The traditional 4-year school sits in the middle ground between application and theory, which is probably why we have such difficulty defining "success".

> to finish their long personal anecdote or humorous story I usually give up on the MOOC and go looking for a good textbook

This is ultimately student preference. Some students like a more affable teacher; some don't want a sociable instructor (so long as it's not interfering). I will say that your post is confusing: you say you don't want instructor anecdotes, but then say MOOCs should be doing more for larger-scope community building.

I will argue that larger-scope community building is difficult, and that MOOCs struggle with it precisely because of their open availability. It is harder to build community when the individuals come from such diverse backgrounds. Many factors can make it difficult to build bonds (social, cultural, temporal, etc.). Furthermore, you cannot control why a student takes a MOOC. They may not want to give the additional effort to community building for a subject they perceive as a lower priority than other life decisions.

I would disagree with "not much effort into actually trying different approaches". This is being done, and academic journals are trying to study it (the deadline for the Journal of Educational Data Mining just passed). There are also non-academic approaches like Duolingo and Khan Academy, and they are trying different techniques as well.

Ultimately, there is no panacea for learning. We know engagement can lead to learning, and therefore if we can maintain engagement, we might be able to teach. Community building can be that engagement; self-motivation can be too. Beyond that, we need to research these things more to know for sure.


> This is ultimately student preference.

Sure. But the last time I checked, almost all MOOCs were lecture focused. Like I said, I've yet to find any with a text component as good as Dive Into Python or Learn C the Hard Way or other free online textbooks. Perhaps there are a few out there, but every time I've browsed MOOCs I've found just about every one to be lecture focused. Out of the dozens I've looked at (on multiple platforms) I can only think of one that wasn't, and that was eventually removed.

Plenty of people use the internet to learn plenty of things. I think it's time to consider how much of the "failure" of MOOCs is due to the failure of online learning, and how much of it is due to the fact that university classes aren't a great way to teach people things (at least for a large chunk of the population).

> I will say that your post is confusing as you say you don't want instructor anecdotes but then MOOCs should be doing more for larger-scope community building.

Instructor anecdotes were an example of why I don't like lectures. With a book you can scan over content that is superfluous and get to the information you need. You can re-read, or take slowly, the parts you have trouble with, and quickly skip over the parts you already know. All of this is much, much harder to do with lectures.

I'm confused as to why you think professor anecdotes are related to large-scope community building? They're quite different things, and if people view them as serving the same purpose then there's even more of a misunderstanding when it comes to education than I had previously believed.

As for the difficulty of community building, I think you should broaden your horizons a bit. For instance, if you're a bit late on a Coursera course the forums are pretty much dead, and you're better off discussing things or asking questions on another site. Likewise with Udacity - the course might still be active, but everyone that has completed the course has moved on and won't see your message. This isn't because of a difficulty in community building, but because of a conscious decision that most MOOCs take to segregate their forums by class.


(I will ask if you could define "community building" a little more. You did not in your original post, and it seems my assumption was incorrect. Before trying to address better, I'd like to have a more concrete definition)

While I cannot speak for all MOOCs, I have not had the same experience. MIT's CS 6.00x course on EdX was work heavy, Udacity's Web Development course was work heavy, and even Coursera's Design of Everyday Things had participation components. The only (barely a) MOOC that was video only was predominantly a "follow along with me" coding process.

For instructor lectures, I get more positive responses than negative about my anecdotes. At the end of the day, I'm human and I'm trying to enjoy my job (and students can tell if you don't). Development of tutoring systems is still in the research phase as we identify knowledge components for different subjects. A history course operates differently than a computer science course.

I will say EdX has produced research on instructor lecture videos. Users prefer 3-5 minute videos, and so instructors should design courses accordingly. Instructors that simply upload a classroom lecture are not appropriately transitioning their material for online use. This addresses your being able to flip from topic to topic.

I will end by noting that the Dunning-Kruger effect suggests students are not the best at assessing what they know, and that a professional instructor has a better idea of what "knowing" something means. This again gets back to my discussion of vocational vs. graduate school; and again, humans are flawed, imperfect beings. While an intelligent tutoring system could alleviate this issue, humans will still be building them for the foreseeable future.


> Users prefer 3-5 minute videos, and so instructors should design courses accordingly. Instructors that simply upload a classroom lecture are not appropriately transitioning their material for online use. This addresses your being able to flip from topic to topic.

In my personal experience, it really doesn't. Breaking something into segments can make things a bit easier, but it still leads to a lot of wasted time if there's a minute and a half of useful information within an 8-minute block. You can't scan through it the way you can with a book, and it's much more difficult to review a difficult piece of information you just received (it's easy to re-read a sentence slowly, whereas rewinding a video to the beginning of the last sentence is more cumbersome, and you're going to be watching it at the same speed).

Text also tends to be much more succinct, whereas lectures are often repetitive and meandering.

I think the assumption that multiple video segments solve the problem is instructive. Might not a large part of the problem be instructors saying "We've done X, which has solved Y issue" instead of saying "We've done X; let's look at whether or not it has solved Y issue"? I appreciate the fact that people are attempting to solve these issues, but if there's no differentiation between attempting to solve something and successfully solving something, then it shouldn't be a surprise that the problems persist.

> the Dunning-Kruger effect suggests that students are not the best at assessing what they know and that a professional instructor has a better idea of what "knowing" something means

This is a pretty big assumption, and one that I don't think is accurate (based on personal experience, and my experience talking with both students and professors). The best way I've found to test one's ability is to actually apply it to a task, where it usually becomes quickly clear to the individual where the holes in their understanding are.


I searched for The Design of Everyday Things but I could only find a course with this name on Udacity, not Coursera.

Can you check and confirm?


Ah, it was Udacity, not Coursera; my apologies


I still miss the excellent Dive into Python. Last commit was 7 years ago.


> If we then look to justify why bother, it's a little hard to warrant hosting costs, etc. if you aren't seeing anything come from it

> If I ran a MOOC with a 95% fail rate, how can I label it successful?

How is the completion rate a better measure of value added than the absolute number of finishers?


Both are important. There are multiple questions to be asked.

For extremely self-motivated people, MOOCs are a win, no doubt. Again, MOOCs are definitely better than no MOOCs.

But the promise was that MOOCs can just replace college to a large degree, because the content you get in college will be online. In that instance the question becomes "do MOOCs work for most people?" For what we'll call the "average" person, studies have shown that just isn't the case. Most people don't learn well from MOOCs, unfortunately.

Therefore what we're learning is that a non-trivial amount of what makes a college education successful is some combination of the following things that MOOCs don't have: external pressure, scheduled courses, due dates, a community of learners, a physical campus, etc.

As such, I view the next step as figuring out which of those missing aspects are actually vital to helping average people learn the things they need to know. Traditional education just says all of them are necessary, but I'm not convinced that's the case.


Do most people learn well from traditional methods though?

If dropping out of a MOOC involved a big financial hit, social humiliation, and the loss of a great deal of personal freedom, the completion rates would skyrocket. Completely different incentives.

The reasoning that traditional education is successful because it has high completion rates because of external pressures is a bit circular, isn't it?


It also works the other way. It's socially/professionally acceptable to stop working for 3-4 years to go to university, somewhat less so to stop working to take a bunch of MOOCs. Therefore, most people taking MOOCs are also working full time, studying, or looking for a job, and they're also likely to be older and have more life commitments than most students. So MOOCs are naturally going to take a back seat to everything else going on in a person's life. They also don't carry the same recognition as having a degree.

MOOCs also come with a number of unique advantages compared to traditional universities:

- They widen access to education to people that aren't able to attend university

- People can take courses they know they might struggle with without fear of flunking them and harming their academic record

- Mature students don't need to make life sacrifices in order to take a MOOC. They don't have the dilemma of whether to up roots and move to attend a prestigious university


Perhaps. The question is which of those things really move the needle; can you have some, not all? For example, our completion rates are 90%+ - better than universities, but there is no social humiliation, loss of personal freedom, or financial hit.


What is the typical demographic of your cohorts? How do you compare to college programs that have similar cohort demographics?

I think all educational outcomes should be measured with respect to the answers to these two questions.

K12 schools in extremely wealthy areas can compare themselves to like peers, but comparing themselves to high-poverty areas is a useless comparison. Neither institution would do well under the other's constraints.

For-profit higher ed is definitely the same way.


We're very spread out, generally lower income, more black/latinx than white, so it's not a case of us just grabbing the rich kids.


Could you cite the studies that say "MOOCs don't work for the average person"? While it is certainly an intuitive statement, I find it hard to empirically verify because the population of people who take MOOCs is very different from the population that attends colleges.

For example, those who take MOOCs are probably working full / part-time, trying to pick up some knowledge on the side. They will obviously have less time for learning than full-time students.


While "average person" is not the term I'd use, the question you're asking is a bit loaded. What measure is used to say a MOOC "works"? If we mean completion, then high dropout rates are an issue. If we mean high interactivity among those that don't drop out, that is where the current field of MOOC research is at. While I do not study it myself, colleagues in my lab do.

My point is that quantifying the effectiveness of a MOOC is an ill-defined domain. Instead, we acknowledge the attrition rate and do analysis on those that pursue the course. Current research looks at system interactions, the social networks of MOOC forums, etc. to identify student behaviors that show higher "gains", be it course completion or grade.

If a student leaves a MOOC, there is no way to identify why they left. Likewise, unless asked, it is hard to identify why a student joins a MOOC. Students enrolled in a MOOC can come from a variety of cultural, geographical, sociological, motivational, etc. backgrounds. As such, assuming traditional student behaviors is ill advised. However, access to material is not enough to learn the material; additional effort is needed by the student to build the necessary mental models in their head. If a student drops out of a MOOC, it is hard for the runners/analysts of the MOOC to say with any confidence that the student learned the material.

As austenallred mentions, current education research is looking at the motivation of learning, as well as what constitutes motivation (and learning for that matter).



> some never even log in to take MOOCs they register for

So what? How is this at all relevant?

If we look at how many people have walked into a bookstore or library, have browsed through a book but never bothered to actually buy and read it, would we say books are a failure? Would you say books have a 95% fail rate? This is such a pointless train of thought.


Why did you decide that MOOCs should be held to the same standard as libraries or bookstores?

MOOCs are marketed as courses, and sometimes as an alternative form of higher education. Furthermore, I doubt the people who go to the trouble of designing curricula for MOOCs expect students to drop out partway through. Comparing MOOCs to college courses seems like a better match to the image the MOOC companies themselves have promoted. And if we make that comparison, completion definitely matters.


The fundamental difference is that MOOCs are recorded rather than performed live. This makes both the distribution and access patterns more similar to books than to college courses.

Of course any particular MOOC can be made more or less like a normal class by putting the content behind a hefty paywall, enforcing strict due dates, and adding perverse performance incentives.


> Some people do grab what they want and leave, but that is not the case for everyone.

Are there metrics that have measured this? What percentage of users find a MOOC useful even if it wasn't completed?

What I was trying to get at is that by looking at completion rates we might not be looking at the whole picture. Especially if the question we're revolving around is whether the course is worth the creation/hosting costs.

> some never even log in to take MOOCs they register for

[...]

> we are too busy trying to understand why people never log in in the first place

So you have at least 4 classes of 'user':

- did not register

- registered but never logged in again

- logged in at least once

- completed course
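That funnel could be tallied with a few lines of code; the record fields here are hypothetical, for illustration only:

```python
from collections import Counter

# Bucket each registrant record into one of the four funnel classes above.
# The dict fields ("registered", "logins", "completed") are invented.
def funnel_class(user):
    if not user["registered"]:
        return "did not register"
    if user["logins"] == 0:
        return "registered but never logged in"
    if user["completed"]:
        return "completed course"
    return "logged in at least once"

users = [
    {"registered": False, "logins": 0, "completed": False},
    {"registered": True, "logins": 0, "completed": False},
    {"registered": True, "logins": 7, "completed": False},
    {"registered": True, "logins": 30, "completed": True},
]
print(Counter(funnel_class(u) for u in users))
```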

It seems to me that there's a different way to categorize the users in measuring course effectiveness:

- did not find the course useful/helpful

- did find the course useful/helpful

So if you are able to craft a course that has magically high engagement, but people don't find it particularly useful - you've still failed.

Maybe the value that course-makers hope to deliver through completion of the course simply isn't high enough?

If a user completes 70% of a course but doesn't see the need for the rest of it, then stopping at that point is the logical move. What sets college/university apart here is the degree, which has very high social impact. The fact remains that MOOC achievements are still widely looked down upon as somehow lesser than their college equivalents.

This whole discourse has encouraged me to revisit the courses I found most effective and complete them. At the very least I can help those courses' stats and provide some further narrative of their utility.


A major issue with using "useful/helpful-ness" as our metric for success is quantifying the term. A self-report Likert scale is susceptible to users just rating 1 or 7, or saying it's useful even when it was not (https://en.wikipedia.org/wiki/Self-report_study#Disadvantage..., not the greatest citation, but I am short on time).

The issue then becomes what counts as "success" in a MOOC. If the goal is just to have videos online, I'll just watch YouTube (as I have my own lecture series there). However, observing is considered one of the lower levels of learning, and frameworks like Bloom's taxonomy point out that there needs to be some type of interaction for better learning gains. These interactions require the student to take a more active role in the learning process. If they are not interacting with the system (logging in, watching videos, completing exercises, etc.), then they are not taking this needed active role. This is where "drop out" begins to be quantified, and where we can then measure what worked and what didn't.

To address your example: if the user stops at 70%, researchers will ask "why?" From there, analysis of student behaviors, effectiveness of interface/instruction/material, etc. will arise. If the ability to study these things is confounded by the fact that the vast majority quit before completion, it becomes harder to answer the "why" and "how do we fix it" questions.

Again, what we quantify as "successful" is still up for debate; but if a student drops out, it becomes harder to tell whether they learned from the course, and whether it was the student or the material that drove that decision.


I suspect something like Udacity (on which classes are not free and there are more structured 'degree' programs) has far higher completion rates, since it's not reasonable to just impulsively sign up and then stop. That is certainly true for something like Georgia Tech's OMSCS. I still think the hype was justified, in that stuff like OMSCS is legitimately the future of education.


Given the nearly non-existent barrier to entry, I'm not sure that completion/engagement percentages mean much. MOOC content can be useful for learning certain types of things. But so can books, YouTube, etc. "Useful resources for (mostly) continuing professional education" has to be seen as a very large step back from "changing the future of education for underserved populations".


Why would you complete a MOOC "cover to cover" when it offers no accreditation? People just pick up what they need and move on. If MOOCs offered complete programs and degrees, millions would complete them. MOOCs are amazing, but a simple VOD system can't replace the education system on its own.


> some never even log in to take MOOCs they register for

> If we then look to justify why bother, it's a little hard to warrant hosting costs, etc.

Presumably the people who never log in aren't imposing any load on the system, no? Just rescale to 2% of the size your user-metrics say you need.


I can also confirm I've had this experience. I learned a lot from MOOCs I never completed. That's why I like Khan Academy's model for educational content - aligned in paths, and with some gamification around completing, but ultimately made to be searchable, and relatively easy to pick up in the middle. And that's how I've used it as well - I probably have only a module or two completed, but I've watched many videos just trying to get a better grasp on one concept that wasn't well enough explained in a book I was using, or just to refresh my recollection.


The biggest, most idiotic, and most frustrating problem IMO is that most MOOCs are inflexible in their schedule. The model is perfect for self-paced learning, yet very few offer it.

Plus, there are instructors and materials that aren't great, and are more useful as a basic introduction to concepts in the first few lessons, which can then enable self-learning that's more effective than the poorly taught meat & potatoes of the course.

(I've finished probably 3-4 MOOCs and didn't finish about the same amount.)


I'm not surprised the drop out rate is so high, it's a reflection of the low cost of entry. I'm not sure it means anything.


I wonder whether MOOCs' 2% in absolute numbers exceeds the number of students in traditional universities, e.g., Coursera had "over 22 million enrollments" (2014?) http://tdlc.ucsd.edu/about/about-Learning-How-To-Learn.html

2% of several million is more than the number of students enrolled in any UK university https://en.wikipedia.org/wiki/List_of_universities_in_the_Un...
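A back-of-the-envelope check, taking the quoted 22 million enrollments and the ~2% completion rate at face value:

```python
# Rough scale check: 2% completion of Coursera's quoted 22M enrollments.
enrollments = 22_000_000
completion_rate = 0.02
completers = int(enrollments * completion_rate)
print(completers)  # 440000 completions
```

440,000 completions would indeed exceed the enrollment of any single UK university.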


It's a currently studied phenomenon (https://link.springer.com/chapter/10.1007/978-3-319-10671-7_...)

My personal thought is it's the same reason people quit gyms, fail at New Year's resolutions, or even quit free-to-play games. It is difficult to create and maintain a habit.


Maybe the problem is in measuring them as sort of an all-or-nothing experience, like nothing happened if you didn't see it all the way through. Perhaps we should be judging progress on the basis of individual modules: did you master the material in this one? Or equivalently, the courses should be smaller and better linked with each other: "this course requires you to have passed one of these other courses...".

I think what stops us from using that model is that each course might define its terms differently and assume slightly different pre-requisites such that it's infeasible to teach just one small thing in a way that it can be directly applied to someone else's minicourse.


Exactly. I completed 5 out of ~50 courses I have ever enrolled in. One of the courses I did not complete was instrumental in getting me into graduate school in a field unrelated to my formal background. Focusing on the completion rates misses the point.


Similar experiences here. I get what I need from a couple of videos and never look back.

I wish that lecturers would move to YouTube and structure each lecture as its own unit instead of having the burden of fitting them into a course. For example, if I don't have the appropriate prerequisite knowledge to understand what I'm currently watching, I should be able to search for and learn it quickly instead of having to browse through intermediate lectures.


"17. People rely mainly on their parents for advice on what to do about education, and that has to be among the worst places to go for educational advice"

This one certainly hits me hard. With a mother who dropped out of university and a father who didn't go, my sister and I both have good degrees (she's a newly qualified vet and I'm a startup's birth and death out of uni), and we had nowhere to get decent advice.

I know a lot of advice I did get was wrong, my professors basically said they don't have good advice for me, and I know I'm not making the most of the experience I have behind me.

Searching for relevant mentorship is a very very hard problem.


I started writing a comment to share my similar story of a blue-collar upbringing, but realised that, other than a few chance encounters, most of the relevant mentorship I have gotten came as a professional or was read in articles/books. The best things in life I've learned through tough lessons; the trick is don't fail too hard.

The important stuff I learned before finishing my bachelor's was things like:

1) It's better to be a small fish in the big pond (my career counsellor told me this when I was 16; I think I disagreed with it in my first 3 years of uni, but now I totally agree.)

2) You don't owe anyone anything and you aren't tied to any place on earth (Mum; we grew up moving a lot, so when I had an opportunity to attend a more prestigious university my mum gave me a lot of support)

3) Always swap jobs every few years (my uncle worked for the same company for 18 years before being laid off; he has done well, but nearly everyone he started working with is an exec now)

Most related to your comment is something I was told a few years after university, when I was pushing work to fast-track my professional development as much as possible. A senior manager told me there is only so much other people can teach you; in the end you need to do the time and build the experience.


Small fish in a big pond? I've heard completely opposite advice, stating that "in a big pond" there's much competition, and dominating a small niche yields better results. What is the rationale behind being a small fish in a big pond?


It's happened twice in my career.

Once, by choosing to do more difficult courses at a more prestigious university. I don't think this was great for my well-being in the short term (you always feel like the idiot in class), but in the long term being associated with the brand has helped A LOT. So it was probably more beneficial to get worse grades at a "better" uni.

Professionally, rather than being a gas data scientist in a small city, moving to the more competitive, more established field of web analytics/marketing (with many more peers) in a larger city has led to a much higher salary.


I can't see how either choice could be always correct. A small pond can have great rewards, but also great risk. Like if it dries up totally.


I've been in both ponds, and I can honestly say that the small pond was more interesting and personally satisfying. That was, until the pond did in fact dry up.

The big pond on the other hand, pays a bit more but is much less interesting. The big pond is almost certainly never going to dry up though.


I take away something different from both of you. To me, the question is about whether it's better to be the most impressive person in a shithole town or a nobody in an amazing place. I'm not even sure I know the answer for myself, and it probably varies a lot based on personality.


There are probably a couple ways to interpret the metaphor depending on your context. There's more competition, yes, but there are also more opportunities to grow.

A similar analogy is it's better to be the worst player in the best band than the best player in an awful band. You're going to learn less and learn it more slowly in the latter situation. I think this is more what the author was going for.


> It's better to be a small fish in the big pond

I'd actually recommend that young professionals try a variety of situations as they switch jobs. I learned much different lessons from different situations at different stages of my personal development. Though starting that growth path with a position at a large company isn't a bad idea if it's an option.


My single working mother had no bandwidth to help me with getting into college. Neither of my parents had a dime to give me. Despite both my parents being college educated and emphasizing its importance, they had little practical advice or assistance to give. I knew nothing. I ended up applying to one state university, in-state, commuting long distance, not taking any loan, and working 30 hours a week to afford it. I dropped out after a semester from the unmitigated strain and sleep deprivation. I owe my quite exceptional career to our industry being lax about degrees.


Most advice from parents, or anyone who went to college before, say, the mid-90s, is fairly useless anyway, unless it's study advice. The job market, the cost of college, and many other factors are incredibly different now.

Just as an example, I had a friend in your situation whose parents encouraged him to go to an expensive school and not worry too much about what to study. You can probably guess what would've happened if he hadn't ended up doing his own research instead.


Like anything else, they should be able to teach you to fish. I.e. find out where to get the info you need.


Most middle class parents who aren't first or second generation immigrants won't even know the answer to this. To them, the answer is "high school counselor" or another similarly uninformed high school individual. Maybe millennial parents will be the first to recommend internet forums or similar.

Again, for them, it was a very different world.


Industries use degrees as a point of comparison when there is nothing else to compare. Through luck and skill you may have ended up in an incomparable position, but many white-collar workers in the industry end up better off and more respected just on the merit of having a doctorate in a field.


For me it was the reverse. Parents that grew up poor and saw college as the way out but didn’t understand dynamics beyond that, so it was, “Go to the best (most expensive) college you can, study anything at all, and you’ve got it made” which is either OK advice or terrible advice depending on what college that is and what you happen to study.


I got similar advice from my parents and followed it, somewhat bitterly. My dad regrets dropping out of college, but he also paid $400/semester for tuition, not $20,000 like I did. I feel like "any degree == $$$" was much more true in the 70s and 80s than it was when I graduated.


A much smaller fraction of the population earned a college degree 30-40 years ago so it was more of a differentiator.

It's still pretty clear that a (4 year) degree changes access to jobs, I guess it's less clear that it will matter for a given person.


A 4-year still matters a lot. The issue is that a 4-year from a state school has almost as much value as a 4-year from an Ivy League. A prestigious university on your resume will open a few extra doors, but in most disciplines it’s just not worth the extra cost. The University of Washington is $5.5k/semester (resident tuition). If you’re spending 20k/semester, you better be getting a degree from somewhere very prestigious and you better have a plan to leverage the network you build there. Otherwise you’re just wasting ~90k on your 4-year degree.

And of course the less in demand your degree is, the more the math favors cheaper schools.


For people where the $15K difference matters a great deal, the tuition at the elite school is increasingly likely to be greatly reduced or free.

Such policy is even trickling down to public schools with big endowments:

https://goblueguarantee.umich.edu/

The place where people get killed is when they go to a private school that is expensive and not particularly an academic standout (of course there's many more mediocre private schools than elite private schools).


> For people where the $15K difference matters a great deal, the tuition at the elite school is increasingly likely to be greatly reduced or free.

This isn’t entirely true. Tuition aid for most students is based on the parents’ income. My parents are (and were when I went to school) upper middle class on paper. They have never been great with money, though, so they couldn’t cover tuition out of pocket and hadn’t set aside funding in advance. They definitely helped with living costs, but tuition was covered by loans (and partially by scholarships). So I left with loans covering virtually all of my tuition.

If your parents are middle class and can’t cover your tuition, you’re probably in a similar situation. Combined with a low-demand degree, this could easily result in crippling debt.

> The place where people get killed is when they go to a private school that is expensive and not particularly an academic standout

My sister did this. A couple of years at a private religious school before she transferred to a public school. The student loans from those couple of years are absurd and dwarf the rest of her loans. She’s still paying them off a decade later.


Exactly the same experience here -- my mother got a bachelor's right before my brothers and I went to college, which just ended up penalizing us because it looked like we should have a lot more savings than we really had. I think there's a very very small portion of the population who gets their tuition reduced enough to be comfortable. Loans end up covering the rest, so yeah, your average student really does pay close to $20k a semester at most schools if you account for room & board + tuition.


The real issue is that we want "means testing" for tuition but almost no students have significant "means". So you either make (at least public) universities free, or you make the rather strange assumption that a bunch of adults will be handed tens of thousands of dollars from their parents. In pretty much no other case do we determine an adult's eligibility for a social program by means testing different adults.


This is a new thing (not the means testing but the idea behind it) AFAICT.

It was just assumed that if your parents were above the "means" they would help out any way they could because that's what was expected of them as parents.

College was also seen as an enabler and not an expectation for my generation.

Or...if your parents spent all the college fund sending your sister to UCLA you could join the army and emerge as an "adult" so your parents' income wasn't taken into account and get grants and loans on top of the G.I. Bill.


Of course when college was a stretch but not median-income-barely-covers-tuition expensive, the assumption that parents would cover it were more reasonable. And it was less of an issue if they didn’t since the debt load wasn’t insane.


True enough...

I don't think my parents could've afforded to send my sister to Stanford back then which is roughly the equivalent to your average state school today (completely guessing here). They probably would've found a way though.

My point still stands though, means testing was intended to get the best and brightest into college even if they're poor, couldn't even imagine Reagan/Bush entertaining the idea of giving middle-class kids a free ride -- in fact I can even picture Reagan on the TV saying "Uncle Sam ain't your baby daddy!"


Or we could do loans with repayment based on post-education income. There are various ways of doing that, of course, with their own sets of problems.

Note that the UK does exactly this, last I checked.


> Or we could do loans with repayment based on post-education income

The US already does this for federal loans, though it's not the default option, and the servicer (Navient) has been accused of actively steering borrowers away from the option, because it hurts the value of the loan-backed securities that Navient sells.


Ah, I did not know this had started happening. Thank you for the pointer!


When I started looking for answers on what to do, even when I found good ones, it wasn't always easy to do what they said. And I know I failed at this a few times. Just having the right advice isn't always enough; you need to be able to do something with it, and often that's what parents help with too.

Edit: my advice to my kids, get and stay educated (don't care how, college is one way), work hard and treat people decently. Don't need much more than that to succeed.

Edit: Dave Chappelle said in an interview that his dad warned him not to go to hollywood, "you might not make it." Dave said "..that depends on what making it is, dad." And summed up by saying that if he could "make a teachers salary as a comedian", he's made it. That's just awesome.

Worth watching this bit here: https://youtu.be/SyZsxCyoARM?t=1268

NOTE: He convinces his dad to support him going into comedy. It's brilliant. "I was the first person in the family 'not' to go to college, that had not been a slave." - Dave Chappelle


> Searching for relevant mentorship is a very very hard problem.

I completely agree. As someone just reaching a bit over 3 years out of a bachelor's program, I had no one that I could talk to for a solid 20 years of my life about computers, let alone what I wanted to do. Like a bunch of people here, computers were my "thing" but no one around me seemed to take much interest in them. When it came time to go through the college hunt in high school, my mother had no idea what to tell me since although she had a background in math, she worked as a lab tech, social worker, and teacher. I ended up railroaded onto a liberal arts education until I reached my community college, where I encountered people - not just professors - who knew that the magic hardware box could do other things besides send and receive email. I didn't even know it was POSSIBLE to get a degree in computer science until community college, and later interacting with people I met from video game communities. I still couldn't actually get my bachelor's in computer science for financial reasons (have a LibArts BA), and even with knowing about CS as a path, I didn't have anyone who could help me or even just talk to about what to do. Professors at my university didn't know what to do with me, and the university career center tried to shove me into roles like inner-city teacher, advertising, and camp counselor.

It wasn't until two years ago that I became good friends with someone who had been through a lot from his college to several jobs in industry, and he helped me get on track to doing something I wanted to do. Went from a dead-end job doing forced cowboy coding and data entry at a small insurance company to QA automation at a nice software company that works in the transport industry. Even now, I wouldn't really consider him my mentor because I know he's got a lot more work responsibility than I do and I don't like asking him questions that remind him of work, especially since he's my friend first. I firmly believe that if I had someone LIKE my friend as an actual mentor earlier in my life, I'd have been studying things I liked doing way earlier.

EDIT: Something that occurred to me after I hit submit is that I still feel that I don't have an actual mentor, yet many articles/blogs and people I've spoken to insist that it's easy and almost natural to find and have one, be it at the workplace or somewhere else; any articles I've read or listened to (an example of the latter being the Hello World Podcast) about people who have done it without mentors were about people who grew up with computers/programming texts in the home. I'm still not sure if this is a reality or something that happens to maybe 5% of the programmer population.


> I ended up railroaded onto a liberal arts education until I reached my community college, where I encountered people - not just professors - who knew that the magic hardware box could do other things besides send and receive email.

True story: When I was in college, a girl came to me with a question: If she put Borland C++ on her laptop, could she compile programs on it just as easily as on her desktop? I replied that I don't see why not.

Then she asked: "But isn't there a chip in my desktop that does the compiling? Does my laptop have one, too?"

This was a CS major. And an A student. At a technical university.

I feel your pain, man.

And yes, for undergrad education, community or even state college may be more suitable to your interests. A hard-learned lesson for me. The goal of Ivy League undergrad, it seems, is to mold you into a "certain kind of man". "Harvard men", for instance, look and behave a certain way, and know and say the right things. So that when the Harvard men at the helm of power see you, they recognize you as "one of their own" and "good people".

Some of us already have some idea what sort of person we are, and don't want to become anything else. The good news is, we can get all the education we need for pretty cheap, even in the USA. The bad news is, Google is looking for "Stanford men"...

(I should have replied, "Yes, it's called the CPU. And yes, every computer has one.")


I was helping a math undergrad (senior, graduating with honors) from a top 50 university study for the actuarial exam. They were stuck on a question that amounted to "what's the volume of this circular cylinder?"

As I walked them through that, it dawned on me that the problem wasn't recalling or applying the formula, but that they had no idea what x = cos theta, y = sin theta, theta ranges 0 to 2*pi, z ranges 0 to 2 would graph. No idea and no idea how to even begin...
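For anyone who wants to sanity-check that question, the parametrization is just the unit cylinder, so the volume falls straight out of the standard formula:

```latex
% x = \cos\theta,\; y = \sin\theta for 0 \le \theta \le 2\pi traces the
% unit circle; sweeping z from 0 to 2 extrudes it into a cylinder
% of radius r = 1 and height h = 2.
V = \pi r^2 h = \pi \cdot 1^2 \cdot 2 = 2\pi
```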


Err, this is level 2 technician stuff.


I agree. I also think that knowing cosine and sine are unit circle X and Y coordinates is at a similar level for university math majors.


Maybe it wasn't clear: I meant this is L2 plane mechanics tech, i.e. year 2 of a 3-year BTEC, in no way university-level maths.


Same here, only the opposite way. Everyone in my family has had a degree for at least 3 generations (the majority had 2 degrees in different fields, and some pursued academic careers as well). Dropping out of university to do what I love is still something I feel guilty about; even 10 years later, being pretty successful in the field I chose, I still feel like I'm a failure just because I don't have this damn piece of paper.


You can always get the paper later -- they are for sale. The main issue is just deciding to spend the time they demand.

Go to school when you feel you need something they can teach you, and not before.


I would gladly invest time in some graduate course, but they require an undergrad degree, and getting an undergrad CS degree after programming my whole life, studying a lot of theory, and making personal hobby projects in almost every field (from OSes to DBs to compilers to 3D renderers to ML) seems like a total waste.


My advice to my kids "Find a career in something you enjoy. If you love it, it is not work".

What did they do instead? Both went into CS!


That depends on your parents' class. If you're a bright working-class kid whose parents didn't go to uni, you don't get told the unspoken rules.

You also don't have access to the behind-the-scenes influence, e.g. being a legacy at Harvard or similar. I was surprised to hear my mum saying that if we had stayed in Brummagem they would have used my grandfather's influence to try to get me into King Edward's.


The failure of MOOCs #10 is a failure of metric measurement.

In the old days if you wanted to learn something, often you'd pay $20 to $50 for a book. For example, Amazon order search claims that somewhere I have a copy of "Paradigms of Artificial Intelligence Programming" by Norvig, which is supposed to be a decent book in the field, cool. Ironically maybe that IS the textbook for the famous AI MOOC class, I don't know, maybe it should be if it isn't. However I never finished the class, only glossed thru the book, and barely remember either.

I was curious about the topic, not trying to meet someone else's arbitrary goalpost, and the "graduates" metric isn't a useful number.

Kinda the difference between running because it's a beautiful day out and I need exercise, and running a competitive marathon against live athletes in person solely for the goal of getting a number when I run through a gate. I would guess in the general public the former outnumbers the latter 50 to 1, so the MOOC stats aren't unusual at all.

Another interesting analogy: "The Mona Lisa" is a failure if you measure its worth solely by the number of PhD students graduating who credit it for their selecting "Art History" as a major. That doesn't mean it needs to be burned as a failure.


OP here (shocked to see this on HN):

The context is missing but is important. When I say MOOCs have failed I mean something very specific: asynchronous online learning has proven to be dramatically less effective than synchronous, guided learning. Better than nothing, but we’re learning that only a small subset of people are able to learn effectively from them.

In other words, take a random sample of 100 people with the same level of motivation. 90% will complete a live, guided course. 5% will complete a MOOC of the same quality. Which is unfortunate because the live/synchronous part is what is expensive.

There was a time when we thought education was just a bunch of video lectures, and that if we got a bunch of professors to record their lectures, college would be free.

We now know that doesn’t work, at least for the vast, vast majority of people, both because we’re unable to discipline ourselves and because most true learning happens in a multi-way environment with accountability measures.


How are you measuring the same level of motivation?

From my anecdotal experience, the people taking the "live, guided course" have committed to:

  - synchronous attendance at a particular time, probably at a particular place
  - probably a dollar investment
  - face-to-face contact with instructor
  - near-traditional learning environment
People taking a MOOC have committed to:

  - browsing on an electronic device
There's generally a really high barrier to entry to the synchronous, guided learning, so it seems like comparing students who have paid tuition and enrolled in a class vs prospective students expressing interest on a university recruitment page.


There have been quite a few studies that have done this in a scientific manner.

Internally we have taken sets of students that apply to our free “mini code bootcamp” and sent half to an archived version published each night, and sent the other half a live link that becomes an archive each night. So we didn’t control scientifically, just an A/B test, but the shift has been breathtaking.

We can also take students who are struggling to learn from our MOOCs, and they finish the live course almost every time. I know the latter is unscientific, but it becomes obvious over time.


This Harvard study & Atlantic article by the study authors give the same 5% completion rate you cited, but give a completely opposite interpretation: basically, that the MOOC broadened access, allowing more people to get what they wanted out of the course.

https://www.theatlantic.com/education/archive/2014/01/the-tr...

https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2381263

"In these course[sic], “dropping out” is not a breach of expectations but the natural result of an open, free, and asynchronous registration process, where students get just as much as they wish out of a course and registering for a course does not imply a commitment to completing it."

...

Also the discussion of the "Colbert Bump", where increased publicity to less-motivated (just curious) students doubled completion but tripled registration (thus reducing completion rate).

Also the point about whether to allow or disallow registration after the course begins (late registrants are ineligible for certification, so they are automatically "failed").

"HarvardX could boost its certification rate by closing those courses to new registrants or restrict courses to those most likely to complete them. Instead, it keeps courses open to maximize the number of students who are learning something new."

I absolutely agree that (at least some) users can be helped by live instruction or supplemented online instruction, but I disagree that commonly cited, low completion rates measure students with the "same level of motivation".


'Mini code bootcamp' suggests that your target audience is people looking for careers? What are the end employment results like for all the participants?

For instance, are MOOC students more likely to not complete the course because they have already found work in the meantime? Perhaps because they have more freedom to look for work during business hours, for example, rather than sitting in on a live class during the optimal employment search time?

If the goal is to find work, there is no incentive to keep going in the course after you have met your goal.


The "mini code bootcamp" doesn't prepare people for jobs, it is a "miniature" version of code bootcamps. It actually only prepares you to begin our full computer science academy, which is six months long.

If you want the stats for our full computer science academy I can give you those; they're good. 50% hired within one month, median $90k salary.


Thanks for the clarification. I'm still most interested in the results of the 'failures'. What did the people who failed to complete the MOOCs end up doing? How about those who succeeded with the mini live classes but did not continue into your full program?


I don’t have a good sense for what the “failures” went on to do, they fall off our radar.


That is unfortunate, although I understand the practical reasons for why that is the case. However, how do we resolve the bias of only looking at the success stories? What if all of the 'failures' also went on to $90k average development jobs within the first month of quitting the MOOC? $90k developer jobs are not exactly difficult to come by in the current market to anyone who has expressed an interest in programming. The fact that they were willing to try a code bootcamp MOOC puts them miles ahead of the general population.


As I stated above, we've A/B tested it with random samples, but it's hard to get any data on "failure to complete" populations because of obvious sampling bias


Classes for this crowd (older non-traditional students) tend to be in the evening or on weekends. Because they have jobs, education is not something they do while unemployed with the intention of dropping it the moment they find a job. Though the major reason they drop out is that handling work, family, and school is harder, more demotivating, and more tiresome than handling just school.

Beyond broadening access, if you have to be somewhere in person you have to make it a priority at that time, and you are not interrupted while you are there. In-home study is very easily interrupted and easy to postpone due to other very real duties until it is too late.

The motivation hit from meeting other students is also missing.


The student handbook for the author's school suggests a graduate should expect to spend three to six months applying for five jobs a day. That's about 500 to 1,000 applications following six months of 45 hours a week at their online school.


Five per day is a pretty low application rate for what it's worth. You can get through twenty in an hour if you're organized and motivated.



As long as which ones you send to each link is determined entirely randomly, you have indeed done a "scientific" (i.e., statistically valid) control.


That's one element.

This first couple of things I'd check would be:

  - Double-blind: What if you tell students it is live, but it is actually pre-recorded? 
  - Are the live+archive and archive-only videos the same? 
  - Are the live+archive students actually watching the video when it is live, or are they using the archive feature?
  - Do archive-only students who consistently watch at the same time do better? 
  - Does time/day of watching affect results? (archive-only students get the link at least [duration of class] later.)
You also need to control other interactions with the participants, track the results faithfully, avoid bias because instructors are more familiar with students from "live" interactions, ensure cohort size and result size are large enough to be meaningful, that group composition isn't biased by confounding factors, etc.

But random assignment is a good start!
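For what it's worth, once assignment is random, the headline completion-rate gap can be checked with a standard two-proportion z-test. Here's a minimal stdlib-only sketch; the cohort counts are made-up illustrative numbers, not results from the experiment described in this thread:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z-statistic for H0: both groups share a single completion rate."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical cohorts: 90/100 live students complete vs 5/100 archive-only.
z = two_proportion_z(90, 100, 5, 100)
print(z > 1.96)  # True: the gap is significant at the 5% level
```

With a gap that large the statistic is enormous; the harder problems are the confounders listed above, not the arithmetic.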


What studies would you cite for someone interested in learning more about the field?


I'll dig some of them up later today, but it will be a bit as I need to finish some stuff.


I think the time commitment is the biggest issue. Let's assume the costs are the same to me (perhaps due to employer subsidy):

When I take a MOOC, I'm pledging at least 5 hours a week. That covers 2 hours of videos, 2 hours of homework and 1 hour of additional research / review.

If I pledge 5 hours /week to a traditional class, I don't get nearly as much learning. I have to commute to the class, which is at least 30 minutes each way. Now my 5 hours buys me 2 hours of class, 2 hours of commuting and 1 hour of homework. I've lost an hour of homework and an hour of research.

I know which one is better for me.


Ah, OK, interesting. I think I get what you are saying: given an authority figure's specification, the student in a class will be far more likely to succeed at reaching that arbitrary, externally imposed spec than a MOOC subscriber.

With the side issue that 100% of my 20-person automata theory class back in ninety-something is a much smaller number than 2% of the 24,000 people signing up for Ullman's automata theory MOOC.

The fundamental problem is authority along with excessive broadness. If I want to learn Y and the MOOC teaches X, Y, and Z, I will fail the MOOC. There are also Skinner-box conditioning issues: given an assignment on something like using the Bootstrap framework, I'll be scratching my own itch on a personal project or doing some project at work, such that again I'll fail the MOOC because I'll refuse the assignment; I already have a more ambitious self-assigned assignment and am not doing both my assignment and someone else's much less interesting assignment. Whereas if I sit in class I'll have peer pressure and financial pressure to do the class assignment regardless of work or hobby interests.

Sort of: a traditional in-person or online class is a goal in and of itself; for most, MOOCs are a means to an end for work or hobby.


> 90% will complete a live, guided course. 5% will complete a MOOC of the same quality.

What does complete mean here? Complete within a given timeframe? Doesn't that miss the entire gain that MOOCs bring to the table? Which is that education becomes a lifelong endeavour instead of something you try and fit into a few years and, for most, never return to again?


Major counterproof: The Open University.

Sounds like MOOCs need to learn from educational institutions that actually have a track record in distance learning.


The big difference is you have to pay for the OU, there is a commitment up front that filters out a lot of people.

Saying MOOCs are a failure because only 2% complete the course is like saying the OU is a failure because only 2% of their web site visitors sign up.


How well are you controlling for the sheepskin effect? Students are highly motivated to do what it takes to get a degree from a university, and much less motivated to actually learn the topics nominally offered.

I suspect that a MOOC offered with the same incentives and structure as a live class - ie, as part of a degree-granting program with mandatory tests at set dates - would have significantly higher completion rates.


What is the purpose of a MOOC? Is it to make training available, or to provide academic success?

If it is the former, then it is definitely a success. If it is the latter, then I think that not only percentage rates of success are important, but also the nominal completion rate itself.

A MOOC supplies the opportunity for someone getting a degree in communications, business, or some other not-entirely-technical field, who might wonder what a computer science or high-level mathematics course is like, to go and find out. Does their failure in such a course mean that the class itself is a failure, or can the fact that they had access and were able to put a toe in the water mark success?

I would suggest the latter.


Right. They're only a failure inasmuch as the initial promise was "these will basically replace college but for free!" which is now clearly not true.


> Nearly everyone recognizes that MOOCs are by and large a failure with ~2% completion rates

I don't see how that is a failure? MOOCs are a platform for "just in time" learning. If you run into a problem at work you're not going to sign up for a four year university degree program to figure it out, but there is a good chance you will sign up for a MOOC and drill down into the specific parts that you need to close the gap to solve the specific problem you have. And when the next problem arises, you may continue further into the course. There is no reason to cover everything at once. This flawed opinion about failure may be directly related to #3.

> Economically we vastly undervalue education.

Because it hasn't shown economic value. The percentage of the working age population who have attained a postsecondary education has skyrocketed over the past 50 years, but incomes have remained stagnant. If the more and more people attaining a postsecondary education really were adding $10k to their yearly income like the article suggests, incomes would be rising substantially.

Those who have a degree, on average, make $10k more than those who don't. But since this simply measures high-achieving people against low-achieving people, income conclusions cannot be meaningfully drawn from it. It is important to remember that universities and colleges reject the lowest-achieving people at enrolment time, preventing them from signing up for class entirely. Medium-achievers who make it past the initial filter are put through academic rigour that sees them drop or fail out. That leaves only the highest achieving people able to graduate.

Nobody should be surprised that high-achieving people are able to make more money than low-achieving people. This would remain true even if colleges and universities never existed. Finding something that correlates with high achievers does not tell us anything other than that this group of people is more likely to be high achieving.

Allowing more and more medium-achievers to graduate from university still leaves them as medium-achievers in life and they will still end up in the medium-achiever work that they always would have (after all, someone has to do it). All while incomes remain stagnant.


Incomes have remained stagnant because of productivity capture, not because of education.

The whole point of OP is that education does not correlate with achievement, because there are any number of showstopper issues - mostly lack of up-front cash, and/or debt, often in really quite small amounts - that keep talented people from achieving a much more productive and rewarding life.

The bigger issue is that social class determines "achievement" far more than either education or talent do. And this is incredibly toxic and destructive, especially over the longer term, because it creates a poverty economy where the entire economy runs at some small fraction of its true potential.

The fact that a few people are super-rich doesn't alter this. It's about the opportunity cost of not recognising, nurturing, and rewarding talent and ability on an industrial scale - while using narratives like yours to pretend that the problem is individual personal failure, and not systemic economic self-harm.


> Incomes have remained stagnant because of productivity capture, not because of education.

Not because of education. The education is simply irrelevant and has no bearing on income. The market doesn't value that output. Nobody goes to the grocery store and pays more for produce that is certified as grown by university educated farmers. Why would they?


General consumers are not the customers of education, employers are. And you can bet your bottom dollar that when HR is hiring at Purdue they take the ag science BS grads over the high school grads.

Not necessarily because they know what they are doing, but because they have that option - there is no practical reason to not hire a college graduate for almost any job nowadays because there are so many of them looking for work at any rate beyond part time minimum wage.

Even if their degree is wholly inappropriate for your job, and even if your job has no reason to expect a college degree, just the fact they have one means two things - A. they can put up with rote bullshit while having the choice not to for years, and B. they are almost always indebted and thus are going to be more loyal to the paycheck.


> you can bet your bottom dollar that when HR is hiring at Purdue they take the ag science BS grads over the high school grads.

And I bet that is true. The person who graduated with a BS in agriculture science is able to complete the BS in agriculture science for the same reason that they will do well in the workplace. The fact that the person was selected by the school to enrol in an ag science program means that the person has already been determined to be better suited to the workplace than someone from the general population. They were already the best hire before they even enrolled to study agriculture science.

I mean, if you handed an agriculture science BS to a homeless person with a drug addiction, I bet the high school graduate would start looking far more appealing in comparison. However, someone who is homeless with a drug addiction is going to be immediately escorted off a college campus even if he tried to get a degree. The degree isn't the appealing part to employers. It's the type of person who is able to attain a degree.

> there is no practical reason to not hire a college graduate for almost any job nowadays because there are so many of them looking for work at any rate beyond part time minimum wage.

I am not sure what you mean. Only about 30% of the workforce has a college degree. Employers don't have the option of hiring only college graduates, even if they wanted to. Although your claim that a large portion of that 30% is struggling to find work is fascinating and, since income is a result of supply and demand, supports the idea that incomes are not rising with attainment.

> Even if their degree is wholly inappropriate for your job, and even if your job has no reason to expect a college degree, just the fact they have one means two things - A. they can put up with rote bullshit while having the choice not to for years, and B. they are almost always indebted and thus are going to be more loyal to the paycheck.

Which is all well and good, but the fact remains that the income bonus isn't showing up in the data. These people are making the same amount as they would have had they not gone to college. They are making more than the drug addicted homeless person, yes, but that is not because of school. They would still be making more than the drug addicted homeless person even without school. And the Bill Gates of the world would still be making more money than you even if school wasn't a thing. Income diversity across the population has been around as long as we have been able to measure income and it was never a result of who went to school and who didn't.


MOOCs were promoted as an alternative to traditional higher education. Not just something educated people use to add to their education, but something an uneducated person with a high school diploma can use to get the equivalent of college. I remember reading those optimistic articles.

At that, they failed. They are not cheaper education for the non-traditional, non-full-time student who lacks money. Such students drop out as fast as, or faster than, non-traditional students at colleges (who drop out a lot too).

They are a great, free, easy-to-use option for the educated professional, though. But that was not the original sell.


> Not just something educated people use to add to their education, but something an uneducated person with high school diploma can use to get equivalent of college.

It seems premature to say that it hasn't provided that. You have your entire life to gain a college-level education. The primary reason that brick and mortar schooling is provided up front, in a relatively short span of time, is because it is impractical to visit that brick and mortar school on a regular basis throughout your entire life. Like the article points out, a major hurdle for a large segment of the population in accessing brick and mortar schooling is proximity to those brick and mortar schools.

Online learning has no such constraints. You can be out in the middle of nowhere and still pop into class whenever you feel like it. If it takes you 80 years to complete your college-level studies, great. To say that is a failure, only a few years after becoming available, seems to miss the point of what alternative means. If a MOOC had to be exactly like brick and mortar schooling, it wouldn't be an alternative, it would be the same thing.

Although I concede that a fatal flaw of MOOCs is that there are an infinite number of options online. While it may be impractical to visit 100 different brick and mortar schools to see who teaches a specific topic the best, online it is easy to do so and can be done in an instant. With that, there is no incentive to go through your studies only in one spot that can be tracked by a single entity. This makes it challenging from a business perspective, but not an education perspective.


>The primary reason that brick and mortar schooling is provided up front, in a relatively short span of time, is because it is impractical to visit that brick and mortar school on a regular basis throughout your entire life.

I would rather say it's because neuroplasticity is high while you're still relatively young. In later years it takes more effort to learn and retain.


17% of college students are already over the age of 35 and that number is growing[1]. This does not seem to be any kind of real impediment.

[1] https://www.nbcnews.com/business/business-news/back-school-o...


> They drop out faster or equally fast then non traditional students on colleges (who drop out a a lot too).

However, they don't drop out owing tens of thousands of dollars in student loans. That in itself would seem to be a major advantage.


>The percentage of the working age population who have attained a postsecondary education has skyrocketed over the past 50 years, but incomes have remained stagnant.

Your numbers are out of proportion with every single statistic that measures salary of a college graduate vs salary of a high school graduate.

Maybe incomes have remained stagnant because the overall income is dropping nationwide and college graduates are the ones keeping the average stable.

Suffice to say your statement is not accurate. Education shows massive economic value.

However, EVEN if it didn't show massive economic value, exposing young people to a variety of ideas and perspectives before they enter the work world, giving them a solid foundation to make decisions and think critically about life, is massive value in itself.

Human beings are limited by the information in their heads and don't typically seek out new information voluntarily. Education gives them new information to reason about the world with.


> Your numbers are out of proportion with every single statistic that measures salary of a college graduate vs salary of a high school graduate.

The original article gives the $10k figure and a quick Google search suggests other sources agree with it. But really, the exact figure is irrelevant. The fatal flaws with that way of measuring income remain. The numbers don't really matter here.

> Maybe incomes have remained stagnant because the overall income is dropping nationwide and college graduates are the ones keeping the average stable.

I don't think the data supports this. Incomes are stagnant for even the lowest income group. Unless you are suggesting that college graduates in the lowest income group are responsible for keeping that lowest income group afloat? If that is the case, what should we take from that? Why are college graduates in the lowest income group in the first place if college attainment leads to higher incomes?


>If the more and more people attaining a postsecondary education really were adding $10k to their yearly income like the article suggests, incomes would be rising substantially.

Adding $10k to your yearly income does not add $10k to the aggregate yearly income. In large part, educational credentials are a positional good - if we gave out twice as many law degrees, we would not have twice as many practicing lawyers. In other words, completing a degree often helps you get a job at the expense of others who do not hold degrees.


> (after all, someone has to do it)

Until it gets automated.

I think you're over-assuming that people's "achievement level" is a completely general, fixed number, when there are a lot of suggestions that it's more situational.

There is some suggestion that the supply of graduates isn't well matched to job market needs, which usually manifests in people being opposed to anything that isn't STEM. But that's not the same thing as an argument about achievement.


As a museum educator turned classroom teacher, this resonates with me.

Readings that may be relevant to those who find this Twitter thread intriguing:

- "On the Wildness of Children," on what kind of educational experiences we might imagine for our children if we consider what children are like first: http://carolblack.org/on-the-wildness-of-children/

- "Deschooling Society" (on my reading list): https://archive.org/embed/DeschoolingSociety


You should also check out "Free to Learn" and "Free at Last" books, and Sudbury school model. As a parent, it's something I've been contemplating a lot, but haven't quite committed to yet.


Some other surprising things about school:

- parents care more about extracurricular and athletic pursuits than academic. They'll help their kids play a sport, but rather than try to help their children learn, they'll complain that the teachers are not doing their jobs

- children react to changes in environment the same way adults do, but this isn't considered by schools because they know the students have no choice but to comply

- many children still go hungry at school, making learning difficult to impossible

- one in ten children are diagnosed with ADHD; somewhere between 1 in 20 and 1 in 10 may be on stimulants, antipsychotics, antidepressants, etc

- private schools may pay half the salaries of public school teachers, and their curriculums are not as strict (virtually no required curriculum)

- children are still told today (by teachers) that they will be destitute and homeless if they do not go to college, but the schools do not provide any extra assistance either, making students either extra stressed or likely to give up

- teachers' pay, pensions, and overall school budgets are still cut whenever politicians need to trim spending, while other areas of those budgets are increased

- children in 8th grade often have a 3rd grade reading level in many parts of the country, but especially major cities with a history of segregation and a lack of access to jobs

- some kids are still booted from one school to the next and used as scapegoats if anything negative is on their transcript


I can help my children learn a sport and I can help them learn math. I cannot help them learn subjects I have long forgotten or was never good at in the first place.

I don't expect teachers to teach my kids sports, but I do expect them to teach math and history. Not sure what is wrong with that.


I think what's wrong with that is assuming teachers even can teach your child things that even you weren't good at or can't even remember. Parents seem to think teachers can do things they can't do. But they're not magicians. They're normal adults with textbooks. Many times they're not even experienced in the subjects they're teaching.

Teachers have to teach 30 kids at a time, who all have different backgrounds and lives. They can't possibly give them all enough individual attention and time. And somehow they're supposed to get them all to surpass all the factors holding back each specific kid in a myriad of ways, to succeed at the same material, at the same time, at the same pace.

They have to be babysitters, disciplinarians, counselors, role models, and educators. And then they get yelled at by parents or lectured by administrators for not keeping up to this standard. It's crazy. It's one of the reasons so many teachers quit within five years. It's one of the reasons it's so hard to find good teachers. And of course, they never pay them accordingly.

If it weren't for the fact that teachers are idealistic and want to help children learn, we'd have a nation of morons, because only desperate people would subject themselves to this career.


> I think what's wrong with that is you assume that teachers even can teach your child things that even you weren't good at or can't even remember.

Teachers taught me things my parents were not good at and did not remember. Right now, teachers are teaching my kids things I was never good at. A math teacher I know (a friend) is teaching kids math their parents don't know - right now. And I have met those kids; they do know that stuff.

So yes, teachers can teach kids what their parents don't know.

In your attempt to defend teachers, you made them sound to be completely useless. Some are, but plenty of them are not.


They're not useless, they just have a job which is far more difficult than people give them credit for. When some kids get bad grades, people point at the teacher, even though other kids are getting good grades. Or the same teacher teaching the same curriculum for 20 years will suddenly find their whole class failing - and they point at the teacher. But the teacher never changed.


- There are large numbers of people who think, at least in the US, that paying taxes so that other people's kids might get ahead is a dumb idea.


Pullout: "We had one student on the edge of homelessness, was $400 short on bills and almost had to quit because of that. I personally loaned him the money, and his income moved from $10/hr to $70k+/yr. It only took $400, but he didn’t have anywhere to get that from. Insane."


I don't think it's that surprising. Anecdote: Right after I barely graduated from High School after my family clawed its way out of homelessness, I spent a few years working minimum wage jobs with long commutes in an old car. I could barely afford gas and groceries let alone decent housing and college.

After about 3 years of work, I managed to scrape together just enough to enroll in my local community college for two semesters, one class per semester. Total cost? At that time about $400 including books. And that $400 could have vaporized with one flat tire, or one visit to the doctor.

An acquaintance of my parents heard that I was going to college, suddenly took an interest and hired me sight unseen to come work at his startup in an entry-level developer job at 3x my then current pay. I worked there for a couple years and when I was close to finishing college, another acquaintance hired me and doubled my pay. After that I was attached to a growing career and haven't looked back.

But it all started with getting that tiny little bit of money together.


It is incredibly surprising.

You aren't surprised because you have experienced this. But any economist would look at this and say something like this:

> You claim that there are people where an investment of $400 would quickly (within a couple of years) give them a salary increase in the range of $20,000 to $50,000 per year. That is obviously untrue. If it WERE true, then at least ONE person in the country would have spent $4,000 to seed a 10-person program which granted the $400 in exchange for a promise to contribute $8,000 back in a couple of years. A 20x replication factor would easily swamp administrative costs and participants who refuse to pay afterward, and the program would still be growing exponentially.

I think the source of the fallacy is that while there ARE people for whom $400 is a difficult-to-surmount barrier beyond which no other major impediments stand between them and middle-class success, they are an impossible-to-find minority in a sea of people who are merely willing to ASK for $400, and of people for whom $400 is a major impediment but further impediments would prevent them from succeeding.

Alternately, we can look at the literally thousands of charities (including schools themselves) that offer merit-based and need-based scholarships, and conclude that the economist is partly right and the system DOES exist, but isn't meeting all of the need.
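The exponential-growth claim in the economist's hypothetical is easy to sanity-check with a toy model. Every parameter below (repayment rate, admin cost per loan, cycle count) is a made-up illustrative assumption, not real data:

```python
# Toy revolving-fund model for the hypothetical $400-loan program above.
# All parameters are illustrative assumptions.

def run_fund(seed=4_000, loan=400, repayment=8_000,
             repay_rate=0.5, admin_cost=100, cycles=3):
    """Each cycle, lend out the fund in $400 chunks (plus $100 assumed
    admin overhead per loan); a fraction of borrowers repay $8,000."""
    fund = seed
    for _ in range(cycles):
        loans_made = fund // (loan + admin_cost)  # people we can fund this cycle
        fund -= loans_made * (loan + admin_cost)
        fund += int(loans_made * repay_rate) * repayment
    return fund

# Even if only half of borrowers ever repay, the fund multiplies
# roughly 8x per cycle under these assumptions.
print(run_fund())  # 2048000
```

Which is exactly the point: the math of the thought experiment works trivially well on paper, so the binding constraint must be elsewhere - namely, reliably finding the borrowers for whom $400 really is the sole remaining barrier.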


It's self-contradictory - he did have somewhere to get it from :)

I'm only half-joking; I think the problem here is lack of information. That the student was likely to move from $10/h to $70k/y is something that the teachers might know, but random lenders probably don't.

On the other hand, I don't know if I want to see financial companies partnering with schools to provide loans - seems ripe for abuse.


You're describing one of the other points he makes in this: as bad as you think for profit education is, it's worse.

Austen has been railing against places like University of Phoenix, etc for a while as what's been discovered about them is just nuts. The stat that stood out for me was that at one point they had a substantial number of recruiters (thousands) but exactly zero job placement staff.


The school should partner with one of those "we invest in you in return for 1% of your life earnings" hedge funds.

That way it'd have easy access to short-term capital needs like this.


OP here. We are a school that is completely free and takes a percentage of income (though for two years, not for life). Even still the $400 to make it through the last month was a barrier.

We’re working on having a fund available for living expenses, but “I’m going to pay you cash so you can go to school” is a different risk profile than “This School is free until you’re hired.”


That's awesome. The only reason the school should take the burden is to distribute the risk pool and also theyre the most capable of providing some value framework on whether this $400 or that $400 is a better investment.

I personally would loan money for this sort of thing if there was a reasonable ROI and someone vetting the students.


Lenders aren’t very good at quantifying future changes in income. They’re trying, but it’s a hard problem to solve.


Why couldn't that student take out a student loan, like most students?


>Nearly everyone recognizes that MOOCs are by and large a failure with ~2% completion rates, but they make us feel good because now it's the students' fault that they're not learning, not the school's

Before: simply getting familiar with a subject required me to jump through bureaucratic admission hoops, go through arbitrary prerequisites, pay tons of money, bend my life around some class schedule and then get mediocre lecturing from some random guy who happened to teach the subject at the nearby university at that time.

After: I can watch lectures from world-class professors and universities at any time, starting with any difficulty level, on a wide variety of subjects. I can also get free help from other people if I need to.

How is this a failure? MOOCs are one of the greatest thing that ever happened to the Internet and education.


I think they are great if the alternative is nothing. I don't think they are so wonderful if, as many have proposed, they start replacing regular classes in universities people are paying to attend.


I would be ecstatic if I had an option to replace some of my university classes with MOOCs. The leap in quality of lecturing would be gigantic. MOOCs made me realize that most of the subjects in college I found "complicated" really weren't. I just had horrendous professors and bad books.

I'm currently in the process of "fixing" my college education by watching lectures on fundamental subjects like biology (which I never had in college for some reason), chemistry (which was taught really badly) and linear algebra (taught awfully). It's astounding how many new things I'm picking up.

Plus, there are classes like AI, Cryptography, Model Thinking and so on. Nothing of this sort was available to me in college at all.


I think that 3 is a consequence of 10 - a prototype of a working online school has not been created yet.

On 1: The "is this required" question is a direct result of students being goal-oriented with their education and time. They do it because they want a job or because of social pressure. It sounds depressing, until you realize the alternative is not caring about a job when choosing what you are going to learn, which is criticized in other points.

We can't have it both ways. Either it is for the purpose of a job and they will ask that question, or it is a hobby and they will care less about utility (e.g. a job).


In the UK at least, 2 isn't true; loads of universities do follow-ups to find out where you are and what you earn, and then advertise their post-degree employment rates.

I, perhaps cynically, think some universities have even started "hacking" that process. In Nottingham you can hire graduates from either of our 2 big universities where the University pays part of their salary! [1]

[1] https://www.nottingham.ac.uk/careers/employers/vacancy-adver...


My hunch is that these surveys would suffer from

1. selection bias (unemployed or low-income graduates would feel embarrassed to respond)

2. the mean of the (biased) sample being much higher than the median (due to outliers)


Very interesting thread. I want to know how to measure effectiveness, beyond "got a job". I think there is more to it than that, and getting a job doesn't work as a yardstick in many contexts (e.g. most commercial training where everyone already has a job.)

"99% of people, when left long blocks of time alone to work on something without anyone to be accountable to, will watch Netflix." I have to agree with this, and yet the appeal remains a deep mystery to me.

It's also worth noting that a number of those points are specific to the US system. Basically anything involving money. The maximum tuition LambdaSchool (OP's school) charges for six months is almost enough for three years of tuition here in the UK, or about 50 years in Germany.


Effectiveness is measured by intent. Those who go to University with the intent of getting 'a job' measure their success by getting 'a job'. Those who go to get a deep education on a specific topic...likely measure their success based off their understanding of the topic when they leave.

In this day and age I would hazard a guess that over 90% of university students are there for a job and that's it.


OP here, also happen to be a YC founder (S17 - https://lambdaschool.com). Shocked to see this on HN.

A lot of people are asking me about number 7:

> 7. We had one student on the edge of homelessness, was $400 short on bills and almost had to quit because of that. I personally loaned him the money, and his income moved from $10/hr to $70k+/yr. It only took $400, but he didn’t have anywhere to get that from. Insane.

We train people to be software engineers for free in exchange for a portion of their future income for two years (we only get paid if they make above a certain salary threshold doing what we train them to, otherwise we don't get paid). So as a school we focus on eliminating the risk and upfront cost to the student, which lets many brilliant people access a world-class education who otherwise might not afford to. We're also incentivized to make the perfect school that will cause you to be successful in your career, and those incentives make an enormous difference.

The average stats from our first classes are remarkable: our average student increased his or her annual income by over $40,000/yr, 50% were hired within weeks, and the median salary was $90,000, including a lot of low-cost-of-living areas.

Since we're paying for all the other costs of running a world-class school without upfront revenue (we're spending over $300,000/month all-in right now), we're not in the place to pay for living expenses as well, and currently students need to cover their own living expenses for six months. Usually they can live with family, some have part-time jobs, we do have a part-time program, etc. but it's never easy.

This kind of small discrepancy forcing people to quit is something we see all. the. time. I probably see a similar situation with different small dollar amounts weekly. There are times when I am fairly confident I can swing someone's future earning power by several million dollars, but I don't have the thousand dollars it takes to do that. It's incredibly frustrating.

We can partner with lending companies, but if they're paying for students' living expenses they also require students to pay their tuition upfront and be saddled with debt, which goes against our mission, and we think there has to be a better way.

I'm looking into raising some sort of non-profit, living-expenses-only fund/endowment to lend during these sorts of shortfalls, but I'm new to raising nonprofit dollars. If anyone has any suggestions I'm happy to hear them. I don't have a good sense of what the interest rates would need to be for this to work in a for-profit world, so we'll see.


I wonder if there is a model whereby you can have many investors donate small amounts in crypto and earn deferred interest on their investment.

Like how people purchase a mining contract for BitcoinCash .. except you get a delayed payout, and your investment directly helps educate people. Some sort of variant on Patreon.


With respect to #15 and 16, how is LambdaSchool not the same AND what happens if your numbers do not go as projected? How will you avoid becoming these points in the future if LambdaSchool does not gain traction?


We don't get paid unless our students get a job (we're $0 until you're making $50k), so we would go out of business if that didn't happen.

We have 3x as much coursework as code bootcamps, spend a lot of time in lower-level fundamentals, spend time in CS theory, write JavaScript, Python, and C. It's a pretty big difference from "build your first javascript app!"


" 99% of people, when left long blocks of time alone to work on something without anyone to be accountable to, will watch Netflix."

Do people on here agree with this? I think that is far too cynical... education (even this guy's education) tends to be misaligned from a person's actual interests, and so most would rather not do it in their free time. But I do think people want to do work/productive things and not just watch Netflix.


It depends. Most people are exhausted at the end of the workday. It isn't the time that is the key component; it's the energy.

Give me a few days to a week of pure relaxation, and my creative juices start to flow.


We've been rewired purely for extrinsic reward and have forgotten how to learn entirely. The most frequently asked questions we get are "is this required?" and "do we get a certificate for this?" It's sad

If the students had their own projects that they were working on, would you know about it? Offhand I'd guess they have things they'd rather be working on than your assigned coursework, so if there's no benefit, why do it?


It's a delicate balance. If we assign personal projects in the beginning many just say, "I don't know how to do this, I don't know what you want," and will flail because they're expecting more structure than a personal project can possibly give. You have to slowly train them to the point where they will try things on their own and can abandon your training wheels. It's like they're in a mental cage, I don't know how else to describe it.


See note 9.


Note 9 is why I gave up trying to mentor developers.

Relevant background: I've held a career that is a dream of high profile projects: game studio owner at age 17, beta developer for original Mac in '83; on Mandelbrot & DeVanney's original Fractal research team, 3D graphics research community during the 80's; engineer on the 3DO and then original PlayStation OS teams; 15 years as a lead game developer of major release titles; 5 years VFX developer and artist and financial analysis for 9 major release VFX heavy feature films; and now 10 years in machine learning and facial recognition.

I love to mentor developers. I ran a free coworking space for 3 years where I tried mentoring many, many starting-their-career developers. I also have both undergrad and graduate school teaching experience.

Out of perhaps 3 dozen mentoring situations, only 2 people actually took the opportunity seriously. Both of them went from nearly minimum wage shit coding jobs to real development companies making in excess of $70K. The other 34 people I tried to mentor never even finished the first "let's see if this person is serious" task I gave them.


You should chat with us. Filtering for interest is important, but is very possible. We have 300+ very motivated students enrolled right now from thousands of interested applicants.


Lambda School - I'll contact your organization... which contact should I use? I can only find prospective student contacts.


careers@lambdaschool.com


got it.


What was the "let's see if this person is serious" task?


Something very basic, such as "comment what you're working on so I can help" or "just write me a brief email outlining what you want to learn" or "you are aware learning to be a developer takes more than 2 or 3 months from zero knowledge, right?"


A 6% success rate does not seem to be that bad a result, honestly. Good work I guess :). It sounds like a good preliminary test, a sort of primary check of whether that person is suitable for this kind of mentoring.

I remember myself testing the waters or getting interested/curious in multiple things, dropping out of most and keeping only some. Even within the same general area, sometimes I was motivated to continue and other times not.


I disagree with #6. I always felt like getting a degree is such a huge waste of time, but I need to jump through this hoop to get a job for which I didn't require much more knowledge than the programming knowledge I had after high school (self-taught).

But then again I went to school in Europe with cheap tuition, so time is the biggest investment here.


"I always felt like getting a degree is such a huge waste of time" is a humblebrag.

There, I said it.


I don't think its a humblebrag. I take it as someone who wants to get on and code but the system requires a hoop to be jumped through.


It really is. It's kind of "I'm so awesome that I learned nothing of value in 4 years at a university". Which is a sad waste of time if true.


That's not what I said though. I learned many interesting things, but most of them aren't directly of value to work. I don't regret having studied, I saw a lot of cool stuff. But it's a very long time investment which doesn't lead to an equal ROI in terms of work output if you ask me. It leads to a good ROI mostly in terms of employment offers and salaries, because it has signaling value.


> I learned many interesting things, but most of them aren't directly of value to work.

What did you study, and what do you do for work?


Electrical and computer engineering, and I'm working on embedded systems software in engine control units.


I find it interesting that you felt that programming had hoops to jump through. It has always seemed like a very accessible career in my experience. I started working as a programmer while I was still in high school and just kept with it. I have yet to meet an employer who has even asked about any particular credentials.


I think this is more my country (Germany) than a thing that's everywhere. I'm working with barely anybody who doesn't have a Master's degree. German employers value credentials a lot.


About that 2% completion rate.

Organisation A runs a traditional class that takes in 100 students. 50 complete it.

Organisation B runs a MOOC that, with a bit of advertising, gets 2500 people to sign up for free. 50 complete it.

Maybe my made-up numbers are off by orders of magnitude. But "2% completion rate" on its own doesn't sound so terrible if it reaches a much wider audience to start with - which both "free" and "online" could contribute to.
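To make the comparison concrete (the numbers are the made-up ones from above, not real statistics):

```python
# Completion counts vs. completion rates for the two hypothetical
# organisations. Same absolute output, very different denominators.
orgs = {
    "A (traditional class)": {"enrolled": 100, "completed": 50},
    "B (free MOOC)": {"enrolled": 2500, "completed": 50},
}

for name, o in orgs.items():
    rate = o["completed"] / o["enrolled"]
    print(f"{name}: {o['completed']} completions, {rate:.0%} completion rate")
# Both organisations produced exactly 50 finishers; only the number
# of people who could afford to sign up in the first place differs.
```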

I have signed up to many MOOCs and completed a couple. I consider that a net benefit. MOOCs are not going to replace university education any time soon in my opinion, but for people who don't have the university option in the first place or want to take a single course every now and then they're a great thing.


It sounds fine in theory -- people are being helped -- but in practice, I don't think it's so simple.

People _register_ for a course, presumably, with the goal of finishing it. (Otherwise, why not do a google/wiki search for topic X? Find that specific topic on youtube/MIT OCW?)

When the vast majority of people fail to complete the goal, you decrease their confidence in the process. Unless someone starts with strong academic confidence, they may be demoralized by giving up/failing a course. We shouldn't be happy that 5% of people enjoy high school math class if 95% of people graduate hating it, and never touch the subject again.

We can move goalposts and now pretend that MOOCs primarily exist as a random buffet of knowledge and not replacements for actual courses, but that's not how they were positioned.

"That pill I sold you, turns out it's no better than placebo... but placebos are 30% effective, so it all worked out, right?"


> People _register_ for a course, presumably, with the goal of finishing it.

Most of the free MOOCs I've registered for I've not really had a particularly strong intent to finish. (Paid MOOCs are a different story.)

> (Otherwise, why not do a google/wiki search for topic X? Find that specific topic on youtube/MIT OCW?)

Because of a preference for interactive vs. static content, for one thing.

> We can move goalposts and now pretend that MOOCs primarily exist as a random buffet of knowledge and not replacements for actual courses, but that's not how they were positioned.

Actually, being low-barrier-to-entry so as to support more experimental, low-commitment exploration—which logically implies a lower completion rate—is a big part of how free MOOCs were positioned.


> 1. Students’ brains are broken by our existing system. We’ve been reworked purely for extrinsic reward and forgotten how to learn entirely. The most frequently asked questions we get are “is this required?” and “do we get a certificate for this?” It’s sad

I suppose this is sad for academics, who teach as though they expect all of their students to become academics. But the students are just being realistic -- for almost all of them, an education is preparation for a career. I was in academia for a little while and it was amazing how many professors just didn't understand that.


A new book just came out by economist Bryan Caplan, called "The Case Against Education".

The basic thesis is that school isn't mostly about teaching/learning - it's mostly about signaling to future employers that you are smart enough (and conformist enough) to get through school. He guesses it's about 80/20 between signaling and actual education.

It's interesting to read most discussions of education with that idea in mind - his assumptions solve a lot of puzzles of how/why education works the way that it works.


"1. Students’ brains are broken by our existing system. We’ve been reworked purely for extrinsic reward and forgotten how to learn entirely. The most frequently asked questions we get are “is this required?” and “do we get a certificate for this?” It’s sad"

That's because they are responding to the incentives that have been placed before them. Learning, when it happens, happens in pursuit of some form of achievement/validation (grades, test scores, etc.). Yes, properly evaluating learning capacity and knowledge accumulation is difficult, as evidenced by the popularity of algorithm/fizzbuzz whiteboard problems. However, the major issue is that culturally there is a lack of passion for accepting ignorance and trying to learn ("We'll fix the problem with common sense!"), as well as open hostility by some towards those who are educated.

"9. 99% of people, when left long blocks of time alone to work on something without anyone to be accountable to, will watch Netflix."

This is caused partly (if not mostly) by issue number 1: self-direction and self-control are absent from schooling. When no one is around to tell you what to do and yell at you when you don't do it, it's no surprise that people lack the interest and discipline to pursue self-education when it is available to them.


> Everyone knows too many people attending universities don’t consider the financial burden, but NO ONE thinks about the time. Four years is a LONG time, but it doesn’t enter into peoples’ thinking generally when deciding if they should get a degree

You're right, I hadn't thought about the time. From 18 to 65 is 47 years; four years is a significant chunk of that. It's still a reasonable chunk if you come out of it with a degree that helps you get a good job (or, to worry less about credentials, with training that makes you more productive, or a mental framework that improves your life). But if you spend the four years, don't get a degree (or don't get one that gets you anywhere), and don't learn anything life-changing, four years is a lot of time.


> "Most universities, surprisingly, have no way to measure their effectiveness, and most don’t try. If you can’t measure effectiveness you can’t fail."

I don't know the facts (none are presented in the tweet), but I do have a friend whose very job is to run studies on the effectiveness of programs at a university. I am sure this person is not alone.

Even if "most" (asserted) universities don't have a way to measure their effectiveness, the fact is that many of the best do, and they all tend to benchmark each other. I would be really curious what the author is basing his assertion on, because I think it is likely schools research a lot of things (like their effectiveness) but don't release that information publicly, for many reasons.


I see billboards and the buses that shuttle people between the different ASU campuses going on about their effectiveness all the time.

#1 school for this metric.

In the Top 10 for some other metric.

Even the small California state school I went to goes on and on about the things they're good at: environmental stuff and getting grads into the Peace Corps.


Right, and they know a lot about things they may have reasons not to share. Like which majors people are likely to drop out of, or which groups of accepted students are least likely to finish.


Anecdotally, I think I signed up for a MOOC back when that's what you had to do to hear Hinton's explanation of RMSProp. Of course I didn't complete it; I just fished out the bit I needed. I'm deeply dubious of these completion metrics...
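
For readers who never fished that bit out: a minimal sketch of the RMSProp idea (scale each parameter update by a running root-mean-square of recent gradients). The hyperparameter values here are illustrative defaults, not Hinton's.

```python
import math

def rmsprop_step(param, grad, cache, lr=0.01, decay=0.9, eps=1e-8):
    """One RMSProp update for a scalar parameter.

    cache holds an exponential moving average of squared gradients;
    dividing by its square root normalizes the step size.
    """
    cache = decay * cache + (1 - decay) * grad ** 2
    param = param - lr * grad / (math.sqrt(cache) + eps)
    return param, cache

# Toy usage: minimize f(x) = x^2, whose gradient is 2x, from x = 5.0.
x, cache = 5.0, 0.0
for _ in range(2000):
    x, cache = rmsprop_step(x, 2 * x, cache)
print(x)  # ends up near the minimum at 0
```

Because the step is divided by the gradient's RMS, progress per step is roughly `lr` regardless of the gradient's raw scale, which is the property Hinton's lecture motivates.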


If MOOCs are an even better measure of conscientiousness than live courses then the market should eventually price that in.


> Lack of access to a computer almost kept some of our best students from being able to attend. Those aren’t expensive.

This is a bias of the lower-middle to upper classes. A computer without some knowledge or support to maintain it (more costs) makes that price rise.



