I understand the context the author is approaching this from - but it discounts the idea that MOOCs can be used the same way we use books.
I've used about a dozen different MOOCs and I've _never_ completed a single one of them. Instead I learned the concepts I needed, maybe completed some test material if it was freely available, and moved on.
Some of those MOOCs were gateways into my career as a developer - so they were absolutely vital to me. This again seems to be an issue with measurement/metrics - I don't believe completion rates are worth that much in MOOCs, just like they aren't worth that much in technical books.
However, there is a truth to the statement. Almost no one finishes a MOOC (some never even log in to take MOOCs they register for), and predictive models can determine whether someone will complete the MOOC as early as the second week (if not sooner). Some people do grab what they want and leave, but that is not the case for everyone. If we then look to justify why bother, it's a little hard to warrant hosting costs, etc. if you aren't seeing anything come from it. If I ran a MOOC with a 95% fail rate, how can I label it successful?
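The week-two prediction claim is easy to picture: even a toy model over early engagement counts will separate likely finishers from likely dropouts. A minimal sketch, where every feature name and weight is invented for illustration and not taken from any published model:

```python
import math

# Toy logistic model over week-2 engagement counts.
# Feature names and weights are purely illustrative assumptions,
# not fitted to real MOOC data.
def completion_probability(logins, videos_watched, exercises_done):
    score = 0.4 * logins + 0.3 * videos_watched + 0.8 * exercises_done - 5.0
    return 1.0 / (1.0 + math.exp(-score))

# An engaged registrant scores well above 0.5; an idle one, well below.
engaged = completion_probability(logins=6, videos_watched=8, exercises_done=4)
idle = completion_probability(logins=1, videos_watched=0, exercises_done=0)
```

Real systems use richer features (forum posts, watch time, quiz attempts) and fitted weights, but the shape of the signal is the same: by week two, activity counts already tell most of the story.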
We are unable to do meaningful research on MOOC strategy effectiveness because we are too busy trying to understand why people never log in in the first place.
I've done this many times. Why? On some platforms (like Coursera) you retain access to the content of a course you have registered for, but that content isn't available for people who didn't register once the class is over. If I think I might ever want to view that content, my best bet is to register (there's no cost) and then have that content available when I want it.
I've also done this on platforms like Udacity where that's not the issue, because by registering I can create a curated list of courses I'm interested in possibly looking at in the future.
In my experience, the big failing of MOOCs is that they try to copy university classes, and university classes just aren't that good a lot of the time. Many people don't find lectures to be useful at all, but the vast majority of MOOCs focus on them. After the fifth time I'm stuck waiting for the teacher to finish their long personal anecdote or humorous story I usually give up on the MOOC and go looking for a good textbook.
I don't think I've ever seen a MOOC that had written content anywhere close to the quality of Dive Into Python or Learn C the Hard Way.
Also, MOOC efforts for community building are (at least were) focused on a particular class rather than a particular subject. This doesn't lend itself well to long-term community building, especially when people are going through these courses at very different paces.
There seems to be lots of discussion about how MOOCs can be improved, but not much effort into actually trying different approaches.
We are still trying to understand what makes for better "learning". If I practice the piano every day, I will no doubt have a better ability to play, but I may not understand music theory. As a shameless plug, I am trying to make practice a core part of learning Computer Science; a link to my research platform is in my profile.
This is the same separation of vocational schools and coding bootcamps from graduate schools. The traditional 4-year school sits in the middle ground between application and theory, which is probably why we have such difficulty defining "success".
> to finish their long personal anecdote or humorous story I usually give up on the MOOC and go looking for a good textbook
This is ultimately student preference. Some students like a more affable teacher; some don't want a sociable instructor (so long as it's not interfering). I will say that your post is confusing, as you say you don't want instructor anecdotes but then say MOOCs should be doing more for larger-scope community building.
I will argue larger-scope community building is difficult and MOOCs struggle with it, I would say because of their open availability. It is harder to build community when the individuals come from such diverse backgrounds. Many factors can make it difficult to build bonds (social, cultural, temporal, etc.). Furthermore, you cannot control why a student wants to take a MOOC. They may not want to give that additional effort to community building for a subject they do not perceive as having higher priority than other life commitments.
I would disagree with "actually trying different approaches". This is being done and academic journals are trying to study them (the deadline for the Journal for Education Data Mining just passed). There are also non-academic approaches like Duolingo and Khan Academy, and they are trying different techniques as well.
Ultimately, there is no panacea for learning. We know engagement can lead to learning and therefore if we can maintain engagement, we might be able to teach. Community-building can be that engagement. Self motivation can be. At that point we need to research these things more to know for sure.
Sure. But the last time I checked, almost all MOOCs were lecture focused. Like I said, I've yet to find any with a text component as good as Dive Into Python or Learn C the Hard Way or other free online textbooks. Perhaps there are a few out there, but every time I've browsed MOOCs I've found just about every one to be lecture focused. Out of the dozens I've looked at (on multiple platforms) I can only think of one that wasn't, and that was eventually removed.
Plenty of people use the internet to learn plenty of things. I think it's time to consider how much of the "failure" of MOOCs is due to the failure of online learning, and how much of it is due to the fact that university classes aren't a great way to teach people things (at least for a large chunk of the population).
> I will say that your post is confusing as you say you don't want instructor anecdotes but then MOOCs should be doing more for larger-scope community building.
Instructor anecdotes were an example of why I don't like lectures. With a book you can scan over content that is superfluous and get to the information you need. You can re-read or take slowly the parts you have trouble with, and quickly skip over the parts you already know. All of this is much, much harder to do with lectures.
I'm confused as to why you think professor anecdotes are related to large-scope community building? They're quite different things, and if people view them as serving the same purpose then there's even more of a misunderstanding when it comes to education than I had previously believed.
As for the difficulty of community building, I think you should broaden your horizons a bit. For instance, if you're a bit late on a Coursera course the forums are pretty much dead, and you're better off discussing things or asking questions on another site. Likewise with Udacity - the course might still be active, but everyone that has completed the course has moved on and won't see your message. This isn't because of a difficulty in community building, but because of a conscious decision that most MOOCs take to segregate their forums by class.
While I cannot speak for all MOOCs, I have not had the same experience. MIT's CS 6.00x course on EdX was work heavy, Udacity's Web Development course was work heavy, and even Coursera's Design of Everyday Things had participation components. In the only (barely a) MOOC that was video only, it was predominantly a "follow along with me" coding process.
For instructor lectures, I get more positive responses than negative about my anecdotes. At the end of the day, I'm human and I'm trying to enjoy my job (and students can tell if you don't). Development of tutoring systems is still in the research phase as we identify knowledge components for different subjects. A history course operates differently than a computer science course.
I will say EdX has produced research on instructor lecture videos. Users prefer short videos (on the order of 3-5 minutes), so instructors should design courses accordingly. Instructors who simply upload a classroom lecture are not appropriately transitioning their material for online use. This addresses your point about being able to flip from topic to topic.
I will end by noting that the Dunning-Kruger effect suggests that students are not the best at assessing what they know, and that a professional instructor has a better idea of what "knowing" something means. This again gets back to my discussion of vocational vs. graduate school, and again, humans are flawed, imperfect beings. While an intelligent tutoring system could alleviate this issue, humans will still be building them for the foreseeable future.
In my personal experience, it really doesn't. Breaking something into segments can make things a bit easier, but it still leads to a lot of wasted time if there's a minute and a half of useful information within an 8-minute block. You can't scan through it the way you can with a book, and it's much more difficult to review a difficult piece of information you just received (it's easy to re-read a sentence slowly, whereas rewinding a video to the beginning of the last sentence is more cumbersome, and you're going to be watching it at the same speed).
Text also tends to be much more succinct, whereas lectures are often repetitive and meandering.
I think the assumption that multiple video segments solves the problem is instructive. Might not a large part of the problem be instructors saying "We've done X, which has solved Y issue" instead of saying "We've done X, let's look at whether or not it has solved Y issue"? I appreciate the fact that people are attempting to solve these issues, but if there's no differentiation between attempting to solve something and successfully solving something, then it shouldn't be a surprise when the problem persists.
> I will end by noting that the Dunning-Kruger effect suggests that students are not the best at assessing what they know and that a professional instructor has a better idea of what "knowing" something means.
This is a pretty big assumption, and one that I don't think is accurate (based on personal experience, and my experience talking with both students and professors). The best way I've found to test one's ability is to actually apply it to a task, where it usually becomes quickly clear to the individual where the holes in their understanding are.
Can you check and confirm?
> If I ran a MOOC with a 95% fail rate, how can I label it successful?
How is the completion rate a better measure of value added than the absolute number of finishers?
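To make that contrast concrete, here's a quick back-of-the-envelope comparison. Every enrollment figure below is made up for illustration; the point is only that a dismal completion *rate* can coexist with a large absolute number of finishers:

```python
# Hypothetical numbers: a MOOC's low completion rate versus
# its absolute count of finishers, compared with a campus class.
mooc_registered, mooc_completion_rate = 100_000, 0.05
class_enrolled, class_completion_rate = 30, 0.90

mooc_finishers = round(mooc_registered * mooc_completion_rate)
class_finishers = round(class_enrolled * class_completion_rate)

# The MOOC "fails" 95% of registrants yet still produces
# far more finishers in absolute terms than the classroom.
```

Whether value added scales with the rate or with the absolute count is exactly the measurement question under dispute here.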
For extremely self motivated people, MOOCS are a win, no doubt. Again, MOOCs are definitely better than no MOOCs.
But the promise was that MOOCs can just replace college to a large degree, because the content you get in college will be online. In that instance the question becomes "do MOOCs work for most people?" For what we'll call the "average" person, studies have shown that just isn't the case. Most people don't learn well from MOOCs, unfortunately.
Therefore what we're learning is that a non-trivial amount of what makes a college education successful is some combination of the following things that MOOCs don't have: external pressure, scheduled courses, due dates, a community of learners, a physical campus, etc.
As such, I view part of the next step as figuring out which of those missing aspects are vital to the mix, in order to help average people learn the things they need to know. Traditional education just says all of them are necessary, but I'm not convinced that's the case.
If dropping out of a MOOC involved a big financial hit, social humiliation, and the loss of a great deal of personal freedom, the completion rates would skyrocket. Completely different incentives.
The reasoning that traditional education is successful because it has high completion rates because of external pressures is a bit circular, isn't it?
MOOCs also come with a number of unique advantages compared to traditional universities:
- They widen access to education to people that aren't able to attend university
- People can take courses they know they might struggle with without fear of flunking them and harming their academic record
- Mature students don't need to make life sacrifices in order to take a MOOC. They don't face the dilemma of whether to uproot and move to attend a prestigious university
I think all educational outcomes should be measured with respect to the answers to these two questions.
K12 schools in extremely wealthy areas can compare themselves to like peers, but comparing themselves to high-poverty areas is a useless comparison. Neither institution would do well under the other's constraints.
For-profit higher ed is definitely the same way.
For example, those who take MOOCs are probably working full / part-time, trying to pick up some knowledge on the side. They will obviously have less time for learning than full-time students.
My point is that quantifying the effectiveness of a MOOC is an ill-defined domain. Instead, we acknowledge the attrition rate and do analysis on those that pursue the course. Current research looks at system interactions, the social networks of MOOC forums, etc. to identify student behaviors that show higher "gains", be it course completion or grade.
If a student leaves a MOOC, there is no way to identify why they leave. Likewise, unless asked, it is hard to identify why a student joins a MOOC. Students enrolled in a MOOC can come from a variety of cultural, geographical, sociological, motivational, etc. backgrounds. As such, assuming traditional student behaviors is ill advised. However, in the process of learning, access to material is not enough to learn the material; additional effort is needed by the student to build the necessary mental models. If a student drops out of a MOOC, it is hard for the runners/analysts of the MOOC to say with confidence that the student learned the material.
As austenallred mentions, current education research is looking at the motivation of learning, as well as what constitutes motivation (and learning for that matter).
So what? How is this at all relevant?
If we look at how many people have walked into a bookstore or library, have browsed through a book but never bothered to actually buy and read it, would we say books are a failure? Would you say books have a 95% fail rate? This is such a pointless train of thought.
MOOCs are marketed as courses, and sometimes as an alternative form of higher education. Furthermore, I doubt the people who go to the trouble of designing curricula for MOOCs expect students to drop out partway through. Comparing MOOCs to college courses seems like a better match to the image the MOOC companies themselves have promoted. And if we make that comparison, completion definitely matters.
Of course any particular MOOC can be made more or less like a normal class by putting the content behind a hefty paywall, enforcing strict due dates, and adding perverse performance incentives.
Are there metrics that have measured this? What percentage of users find a MOOC useful even if it wasn't completed?
What I was trying to get at is that by looking at completion rates we might not be looking at the whole picture. Especially if the question we're revolving around is whether the course is worth the creation/hosting costs.
> some never even log in to take MOOCs they register for
> we are too busy trying to understand why people never log in in the first place
So you have at least 4 classes of 'user':
- did not register
- registered but never logged in again
- logged in at least once
- completed course
It seems to me that there's a different way to categorize the users in measuring course effectiveness:
- did not find the course useful/helpful
- did find the course useful/helpful
So if you are able to craft a course that has magically high engagement, but people don't find it particularly useful - you've still failed.
Maybe the value that course-makers hope to deliver through completion of the course simply isn't high enough?
If a user completes 70% of a course but doesn't see the need for the rest of it, then stopping at that point is the logical move. What sets college/university apart in this is the degree, which has a very high social impact. The fact remains that MOOC achievements are still widely looked down upon as somehow lesser than their college equivalents.
This whole discourse has encouraged me to revisit those courses I found most effective and complete them. At the very least I can help those courses' stats and provide some further narrative of their utility.
The issue then looks at what is "success" in a MOOC. If the goal is to just have videos online, I'll just watch YouTube (as I have my own lecture series there). However, observing is considered one of the lower levels of learning, and things like Bloom's taxonomy point out that there needs to be some type of interaction for better learning gains. These interactions require the student to take a more active role in the learning process. If they are not interacting with the system (logging in, watching videos, completing exercises, etc.), then they are not taking this needed active role. This is where "drop out" begins to be quantified and where we can then measure what worked and what didn't.
To address your example, if the user stops at 70%, researchers will ask "why?" From there, analysis of student behaviors, effectiveness of interface/instruction/material, etc. will follow. If the ability to study these things is confounded by the fact that the vast majority quit before completion, it becomes harder to answer the "why" and "how do we fix it" questions.
Again, what we quantify as "successful" is still up for debate; but if a student drops out, it becomes harder to tell whether they learned from the course and whether it was the student or the material that drove that decision.
> If we then look to justify why bother, it's a little hard to warrant hosting costs, etc.
Presumably the people who never log in aren't imposing any load on the system, no? Just rescale to 2% of the size your user-metrics say you need.
Plus, there are instructors and materials that aren't great, and are more useful for the basic introduction to concepts in the first few lessons, which can then enable self-learning that's more effective than the poorly taught meat & potatoes of the course.
(I've finished probably 3-4 MOOCs and didn't finish about the same amount.)
2% from several millions is more than the number of students enrolled in any UK universities https://en.wikipedia.org/wiki/List_of_universities_in_the_Un...
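Rough arithmetic behind that claim, where both the registration and enrollment figures are order-of-magnitude guesses rather than exact data:

```python
# Order-of-magnitude sketch: 2% of a multi-million MOOC registration
# versus the enrollment of a large UK university. Both numbers are
# illustrative assumptions, not official statistics.
mooc_registrants = 3_000_000
active_fraction = 0.02
active_learners = round(mooc_registrants * active_fraction)

large_uk_university = 40_000  # assumed rough enrollment of a big campus

# Even at a 2% "success" rate, the MOOC serves more learners
# than the campus institution enrolls in total.
```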
My personal thought is it's the same reason people quit gyms, fail at New Year's resolutions, or even quit free-to-play games. It is difficult to create and maintain a habit.
I think what stops us from using that model is that each course might define its terms differently and assume slightly different pre-requisites such that it's infeasible to teach just one small thing in a way that it can be directly applied to someone else's minicourse.
I wish that lecturers would move to YouTube and structure each lecture as its own instead of having the burden to fit them into a course. For example, if I don’t have the appropriate prerequisite knowledge to understand what I’m currently watching, I should be able to search for and learn it quickly instead of having to browse through intermediate lectures.
This one certainly hits me hard. With a mother who dropped out of university and a father who didn't go, my sister and I both have good degrees (she's a newly qualified vet and I'm a startup's birth and death out of uni), and nowhere to get decent advice.
I know a lot of advice I did get was wrong, my professors basically said they don't have good advice for me, and I know I'm not making the most of the experience I have behind me.
Searching for relevant mentorship is a very very hard problem.
The important stuff I learned before graduating my bachelors was things like:
1) It's better to be a small fish in the big pond (my career counsellor told me this when I was 16; I think I disagreed with it in my first 3 years of uni but now I can totally agree.)
2) You don't owe anyone anything and you aren't tied to anyplace on earth (Mum - grew up moving a lot so when I had an opportunity to attend a more prestigious university my mum gave me a lot of support)
3) Always swap jobs every few years (my uncle worked for the same company for 18 years before being laid off - he has done well but nearly everyone he started working with are execs now)
Most related to your comment is something I was told a few years after university where I was pushing work to fast track my professional development as much as possible. A senior manager told me there is only so much other people can teach you, in the end you need to do the time and build the experience.
I chose to do more difficult courses at a more prestigious university. I don't think this was great for my well-being in the short term (I always felt like the idiot in class), but in the long term being associated with the brand has helped A LOT. So it was probably more beneficial to get worse grades at a "better" uni.
Professionally, rather than being a gas data scientist in a small city, moving to a more competitive, more established field of web analytics/marketing (with many more peers) in a larger city has led to a much higher salary.
The big pond on the other hand, pays a bit more but is much less interesting. The big pond is almost certainly never going to dry up though.
A similar analogy is it's better to be the worst player in the best band than the best player in an awful band. You're going to learn less and learn it more slowly in the latter situation. I think this is more what the author was going for.
I'd actually recommend that young professionals try a variety of situations as they switch jobs. I learned much different lessons from different situations at different stages of my personal development. Though starting that growth path with a position at a large company isn't a bad idea if it's an option.
Just as an example, I had a friend in your situation but whose parents encouraged him to go to an expensive school and not worry too much what to study. You can probably guess what would've happened if he didn't end up doing his own research instead.
Again, for them, it was a very different world.
It's still pretty clear that a (4 year) degree changes access to jobs, I guess it's less clear that it will matter for a given person.
And of course the less in demand your degree is, the more the math favors cheaper schools.
Such policy is even trickling down to public schools with big endowments:
The place where people get killed is when they go to a private school that is expensive and not particularly an academic standout (of course there's many more mediocre private schools than elite private schools).
This isn’t entirely true. Tuition aid for most students is based on the parents’ income. My parents are (and were when I went to school) upper middle class on paper. They have never been great with money, though, so they couldn’t cover tuition out of pocket and hadn’t set aside funding in advance. They definitely helped with living costs but tuition was covered by loans (and partially by scholarships). So I left with loans to cover virtually all of my tuition.
If your parents are middle class and can’t cover your tuition, you’re probably in a similar situation. Combined with a low-demand degree, this could easily result in crippling debt.
> The place where people get killed is when they go to a private school that is expensive and not particularly an academic standout
My sister did this. A couple of years at a private religious school before she transferred to a public school. The student loans from those couple of years are absurd and dwarf the rest of her loans. She’s still paying them off a decade later.
It was just assumed that if your parents were above the "means" they would help out any way they could because that's what was expected of them as parents.
College was also seen as an enabler and not an expectation for my generation.
Or...if your parents spent all the college fund sending your sister to UCLA you could join the army and emerge as an "adult" so your parents' income wasn't taken into account and get grants and loans on top of the G.I. Bill.
I don't think my parents could've afforded to send my sister to Stanford back then, which cost roughly the equivalent of your average state school today (completely guessing here). They probably would've found a way though.
My point still stands though, means testing was intended to get the best and brightest into college even if they're poor, couldn't even imagine Reagan/Bush entertaining the idea of giving middle-class kids a free ride -- in fact I can even picture Reagan on the TV saying "Uncle Sam ain't your baby daddy!"
Note that the UK does exactly this, last I checked.
The US already does this for federal loans, though it's not the default option, and the servicer (Navient) has been accused of actively steering borrowers away from the option, because it hurts the value of the loan-backed securities that Navient sells.
Edit: my advice to my kids, get and stay educated (don't care how, college is one way), work hard and treat people decently. Don't need much more than that to succeed.
Edit: Dave Chappelle said in an interview that his dad warned him not to go to hollywood, "you might not make it." Dave said "..that depends on what making it is, dad." And summed up by saying that if he could "make a teachers salary as a comedian", he's made it. That's just awesome.
Worth watching this bit here:
NOTE: He convinces his dad to support him going into comedy. It's brilliant. "I was the first person in the family 'not' to go to college, that had not been a slave." - Dave Chappelle
I completely agree. As someone just reaching a bit over 3 years out of a bachelor's program, I had no one that I could talk to for a solid 20 years of my life about computers, let alone what I wanted to do. Like a bunch of people here, computers were my "thing" but no one around me seemed to take much interest in them. When it came time to go through the college hunt in high school, my mother had no idea what to tell me since although she had a background in math, she worked as a lab tech, social worker, and teacher. I ended up railroaded onto a liberal arts education until I reached my community college, where I encountered people - not just professors - who knew that the magic hardware box could do other things besides send and receive email. I didn't even know it was POSSIBLE to get a degree in computer science until community college, and later interacting with people I met from video game communities. I still couldn't actually get my bachelor's in computer science for financial reasons (have a LibArts BA), and even with knowing about CS as a path, I didn't have anyone who could help me or even just talk to about what to do. Professors at my university didn't know what to do with me, and the university career center tried to shove me into roles like inner-city teacher, advertising, and camp counselor.
It wasn't until two years ago that I became good friends with someone who had been through a lot from his college to several jobs in industry, and he helped me get on track to doing something I wanted to do. Went from a dead-end job doing forced cowboy coding and data entry at a small insurance company to QA automation at a nice software company that works in the transport industry. Even now, I wouldn't really consider him my mentor because I know he's got a lot more work responsibility than I do and I don't like asking him questions that remind him of work, especially since he's my friend first. I firmly believe that if I had someone LIKE my friend as an actual mentor earlier in my life, I'd have been studying things I liked doing way earlier.
Something that occurred to me after I hit submit is that I still feel that I don't have an actual mentor, yet many articles/blogs and people I've spoken to insist that it's easy and almost natural to find and have one, be it at the workplace or somewhere else; any articles I've read or listened to (an example of the latter being the Hello World Podcast) about people who have done it without mentors were about people who grew up with computers/programming texts in the home. I'm still not sure if this is a reality or something that happens to maybe 5% of the programmer population.
True story: When I was in college, a girl came to me with a question: If she put Borland C++ on her laptop, could she compile programs on it just as easily as on her desktop? I replied that I don't see why not.
Then she asked: "But isn't there a chip in my desktop that does the compiling? Does my laptop have one, too?"
This was a CS major. And an A student. At a technical university.
I feel your pain, man.
And yes, for undergrad education, community or even state college may be more suitable to your interests. A hard-learned lesson for me. The goal of Ivy League undergrad, it seems, is to mold you into a "certain kind of man". "Harvard men", for instance, look and behave a certain way, and know and say the right things. So that when the Harvard men at the helm of power see you, they recognize you as "one of their own" and "good people".
Some of us already have some idea what sort of person we are, and don't want to become anything else. The good news is, we can get all the education we need for pretty cheap, even in the USA. The bad news is, Google is looking for "Stanford men"...
(I should have replied, "Yes, it's called the CPU. And yes, every computer has one.")
As I walked them through that, it dawned on me that the problem wasn't recalling or applying the formula, but that they had no idea what x = cos theta, y = sin theta, theta ranges 0 to 2*pi, z ranges 0 to 2 would graph. No idea and no idea how to even begin...
Go to school when you feel you need something they can teach you, and not before.
What did they do instead? Both went into CS!
You also don't have access to the behind-the-scenes influence, e.g. being a legacy at Harvard or similar. I was surprised to hear my mum saying that if we had stayed in Brummagem they would have used my grandfather's influence to try and get me into King Edwards.
In the old days if you wanted to learn something, often you'd pay $20 to $50 for a book. For example, Amazon order search claims that somewhere I have a copy of "Paradigms of Artificial Intelligence Programming" by Norvig, which is supposed to be a decent book in the field, cool. Ironically maybe that IS the textbook for the famous AI MOOC class, I don't know, maybe it should be if it isn't. However I never finished the class, only glossed thru the book, and barely remember either.
I was curious about the topic, not trying to meet someone else's arbitrary goalpost, and the "graduates" metric isn't a useful number.
Kinda the difference between running because its a beautiful day out and I need exercise, and running a competitive marathon against live athletes in person solely for the goal of getting a number when I run thru a gate. I would guess in the general public the former outnumbers the latter 50 to 1, so the MOOC stats aren't unusual at all.
Another interesting analogy: "The Mona Lisa" is a failure if you measure its worth solely by the number of PhD students who credit it for their selecting "Art History" as a major. That doesn't mean it needs to be burned as a failure.
The context is missing but is important. When I say MOOCs have failed I mean something very specific: asynchronous online learning has proven to be dramatically less effective than synchronous, guided learning. Better than nothing existing, but we're learning that only a small subset of people are able to effectively learn from them.
In other words, take a random sample of 100 people with the same level of motivation. 90% will complete a live, guided course. 5% will complete a MOOC of the same quality. Which is unfortunate because the live/synchronous part is what is expensive.
There was a time when we thought education was just a bunch of video lectures, and that if we got a bunch of professors to record their lectures, college would be free.
We now know that doesn’t work, at least for the vast, vast majority of people, both because we’re unable to discipline ourselves and because most true learning happens in a multi-way environment with accountability measures.
From my anecdotal experience, the people taking the "live, guided course" have committed to:
- synchronous attendance at a particular time, probably at a particular place
- probably a dollar investment
- face-to-face contact with instructor
- near-traditional learning environment
- browsing on an electronic device
Internally we have taken sets of students who apply to our free "mini code bootcamp" and sent half to an archived version published each night, while sending the other half a live link that becomes an archive each night. We didn't control scientifically, just an A/B test, but the shift has been breathtaking.
We can also take students who are struggling to learn from our MOOCs and move them to the live course; they finish almost every time. I know the latter is unscientific, but it becomes obvious over time.
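For what it's worth, even a rough A/B split like that can be sanity-checked with a two-proportion z-test. A minimal sketch, with entirely hypothetical numbers (the thread doesn't give the actual counts):

```python
from math import sqrt

def completion_z(live_done, live_n, arch_done, arch_n):
    """Two-proportion z-statistic comparing completion rates of the
    live track vs. the archive-only track."""
    p_live = live_done / live_n
    p_arch = arch_done / arch_n
    # Pooled proportion under the null hypothesis of equal rates
    pooled = (live_done + arch_done) / (live_n + arch_n)
    se = sqrt(pooled * (1 - pooled) * (1 / live_n + 1 / arch_n))
    return (p_live - p_arch) / se

# Hypothetical: 45 of 100 finish the live track, 10 of 100 the archive track.
z = completion_z(45, 100, 10, 100)
print(round(z, 2))  # well above the ~1.96 threshold for p < 0.05
```

With a gap that large, even modest cohort sizes clear significance easily, which is consistent with the author calling the shift "breathtaking" despite the informal setup.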
"In these course[sic], “dropping out” is not a breach of expectations but the natural result of an open, free, and asynchronous registration process, where students get just as much as they wish out of a course and registering for a course does not imply a commitment to completing it."
Also the discussion of the "Colbert Bump", where increased publicity to less-motivated (just curious) students doubled completion but tripled registration (thus reducing completion rate).
Also the point about whether to allow or disallow registration after the course begins (late registrants are ineligible for certification, so they are automatically "failed").
"HarvardX could boost its certification rate by closing those courses to new registrants or restrict courses to those most likely to complete them. Instead, it keeps courses open to maximize the number of students who are learning something new."
I absolutely agree that (at least some) users can be helped by live instruction or supplemented online instruction, but I disagree that commonly cited, low completion rates measure students with the "same level of motivation".
For instance, are MOOC students more likely to not complete the course because they have already found work in the meantime? Perhaps because they have more freedom to look for work during business hours, for example, rather than sitting in on a live class during the optimal employment search time?
If the goal is to find work, there is no incentive to keep going in the course after you have met your goal.
If you want the stats for our full computer science academy I can give you those; they're good. 50% hired within one month, median $90k salary.
Beyond broadening access, if you have to be somewhere in person you have to make it a priority at that time, and you are not interrupted while you are there. In-home study is very easily interrupted and easy to postpone, due to other very real duties, until it is too late.
The motivation hit from meeting other students is also missing.
The first couple of things I'd check would be:
- Double-blind: What if you tell students it is live, but it is actually pre-recorded?
- Are the live+archive and archive-only videos the same?
- Are the live+archive students actually watching the video when it is live, or are they using the archive feature?
- Do archive-only students who consistently watch at the same time do better?
- Does time/day of watching affect results? (archive-only students get the link at least [duration of class] later.)
But random assignment is a good start!
When I take a MOOC, I'm pledging at least 5 hours a week. That covers 2 hours of videos, 2 hours of homework and 1 hour of additional research / review.
If I pledge 5 hours /week to a traditional class, I don't get nearly as much learning. I have to commute to the class, which is at least 30 minutes each way. Now my 5 hours buys me 2 hours of class, 2 hours of commuting and 1 hour of homework. I've lost an hour of homework and an hour of research.
I know which one is better for me.
With the side issue that 100% of everyone in my 20 person automata theory class back in ninety-something is a much smaller number than 2% of 24000 people signing up for Ullman's automata theory MOOC.
The fundamental problem is authority along with excessive broadness. If I want to learn Y and the MOOC teaches X, Y, and Z, I will fail the MOOC. There are also Skinner-box conditioning issues: my assignment for something like using the Bootstrap framework will be scratching my own itch on a personal project or doing some project at work, such that again I'll fail the MOOC because I'll refuse the assignment; I already have a more ambitious self-assigned project and am not doing both it and someone else's much less interesting assignment. Whereas if I sit in class I'll have peer pressure and financial pressure to do the class assignment regardless of work or hobby interests.
Sort of: a traditional in-person or online class is a goal in and of itself; MOOCs are a means to an end for work or hobby for most.
What does complete mean here? Complete within a given timeframe? Doesn't that miss the entire gain that MOOCs bring to the table? Which is that education becomes a lifelong endeavour instead of something you try and fit into a few years and, for most, never return to again?
Sounds like MOOCs need to learn from educational institutions that actually have a track record in distance learning.
Saying MOOCs are a failure because only 2% complete the course is like saying the OU is a failure because only 2% of their web site visitors sign up.
I suspect that a MOOC offered with the same incentives and structure as a live class - ie, as part of a degree-granting program with mandatory tests at set dates - would have significantly higher completion rates.
If it is the former, then it is definitely a success. If it is the latter, then I think that not only percentage rates of success are important, but also the nominal completion rate itself.
A MOOC supplies the opportunity for someone getting a degree in communications, business, or some other not-entirely-technical field, who might wonder what a computer science or high-level mathematics course is like, to go and find out. Does their failure in such a course mean that the class itself is a failure, or can the fact that they had access and were able to put a toe in the water mark success?
I would suggest the latter.
I don't see how that is a failure? MOOCs are a platform for "just in time" learning. If you run into a problem at work you're not going to sign up for a four year university degree program to figure it out, but there is a good chance you will sign up for a MOOC and drill down into the specific parts that you need to close the gap to solve the specific problem you have. And when the next problem arises, you may continue further into the course. There is no reason to cover everything at once. This flawed opinion about failure may be directly related to #3.
> Economically we vastly undervalue education.
Because it hasn't shown economic value. The percentage of the working age population who have attained a postsecondary education has skyrocketed over the past 50 years, but incomes have remained stagnant. If the more and more people attaining a postsecondary education really were adding $10k to their yearly income like the article suggests, incomes would be rising substantially.
Those who have a degree, on average, make $10k more than those who don't. But since this simply measures high-achieving people against low-achieving people, income conclusions cannot be meaningfully drawn from it. It is important to remember that universities and colleges reject the lowest-achieving people at enrolment time, preventing them from signing up for class entirely. Medium-achievers who make it past the initial filter are put through academic rigour that sees them drop or fail out. That leaves only the highest achieving people able to graduate.
Nobody should be surprised that high-achieving people are able to make more money than low-achieving people. This would remain true even if colleges and universities never existed. Finding something that correlates with high achievers does not tell us anything other than that this group of people is more likely to be high-achieving.
Allowing more and more medium-achievers to graduate from university still leaves them as medium-achievers in life and they will still end up in the medium-achiever work that they always would have (after all, someone has to do it). All while incomes remain stagnant.
The whole point of OP is that education does not correlate with achievement, because there are any number of showstopper issues - mostly lack of up-front cash, and/or debt, often in really quite small amounts - that keep talented people from achieving a much more productive and rewarding life.
The bigger issue is that social class determines "achievement" far more than either education or talent do. And this is incredibly toxic and destructive, especially over the longer term, because it creates a poverty economy where the entire economy runs at some small fraction of its true potential.
The fact that a few people are super-rich doesn't alter this. It's about the opportunity cost of not recognising, nurturing, and rewarding talent and ability on an industrial scale - while using narratives like yours to pretend that the problem is individual personal failure, and not systemic economic self-harm.
Not because of education. The education is simply irrelevant and has no bearing on income. The market doesn't value that output. Nobody goes to the grocery store and pays more for produce that is certified as grown by university educated farmers. Why would they?
Not necessarily because they know what they are doing, but because they have that option - there is no practical reason to not hire a college graduate for almost any job nowadays because there are so many of them looking for work at any rate beyond part time minimum wage.
Even if their degree is wholly inappropriate for your job, and even if your job has no reason to expect a college degree, just the fact they have one means two things - A. they can put up with rote bullshit while having the choice not to for years, and B. they are almost always indebted and thus are going to be more loyal to the paycheck.
And I bet that is true. The person who graduated with a BS in agriculture science is able to complete the BS in agriculture science for the same reason that they will do well in the workplace. The fact that the person was selected by the school to enrol in an ag science program means that the person was already determined to be better suited to the workplace than someone from the average population. They were already the best hire before they even enrolled to study agriculture science.
I mean, if you handed an agriculture science BS to a homeless person with a drug addiction, I bet the high school graduate is going to start looking far more appealing in comparison. However, someone who is homeless with a drug addiction is going to be immediately escorted off a college campus even if he tried to get a degree. The degree isn't the appealing part to employers. It's the type of person who is able to attain a degree.
> there is no practical reason to not hire a college graduate for almost any job nowadays because there are so many of them looking for work at any rate beyond part time minimum wage.
I am not sure what you mean. Only about 30% of the workforce have a college degree. Employers don't have the choice to hire college graduates for every job, even if they wanted to. Although the fact that you say a large portion of that 30% are struggling to find work is fascinating and, as income is a result of supply and demand, supports the idea that incomes are not rising with attainment.
> Even if their degree is wholly inappropriate for your job, and even if your job has no reason to expect a college degree, just the fact they have one means two things - A. they can put up with rote bullshit while having the choice not to for years, and B. they are almost always indebted and thus are going to be more loyal to the paycheck.
Which is all well and good, but the fact remains that the income bonus isn't showing up in the data. These people are making the same amount as they would have had they not gone to college. They are making more than the drug addicted homeless person, yes, but that is not because of school. They would still be making more than the drug addicted homeless person even without school. And the Bill Gates of the world would still be making more money than you even if school wasn't a thing. Income diversity across the population has been around as long as we have been able to measure income and it was never a result of who went to school and who didn't.
At that it failed. It is not cheaper education for the non-traditional, non-full-time student with a lack of money. Those students drop out as fast as, or faster than, non-traditional students at colleges (who drop out a lot too).
It is a great, free, easy-to-use option for the educated professional, though. But that was not the original sell.
It seems premature to say that it isn't provided that. You have your entire life to gain a college-level education. The primary reason that brick and mortar schooling is provided up front, in a relatively short span of time, is because it is impractical to visit that brick and mortar school on a regular basis throughout your entire life. Like the article points out, a major hurdle for a large segment of the population in accessing brick and mortar schooling is proximity to those brick and mortar schools.
Online learning has no such constraints. You can be out in the middle of nowhere and still pop into class whenever you feel like it. If it takes you 80 years to complete your college-level studies, great. To say that is a failure, only a few years after becoming available, seems to miss the point of what alternative means. If a MOOC had to be exactly like brick and mortar schooling, it wouldn't be an alternative, it would be the same thing.
Although I concede that a fatal flaw of MOOCs is that there are an infinite number of options online. While it may be impractical to visit 100 different brick and mortar schools to see who teaches a specific topic the best, online it is easy to do so and can be done in an instant. With that, there is no incentive to go through your studies only in one spot that can be tracked by a single entity. This makes it challenging from a business perspective, but not an education perspective.
I would rather say it's because neuroplasticity is high while you're still relatively young. In later years it takes more effort to learn and retain.
However, they don't drop out owing tens of thousands of dollars in student loans. That in itself would seem to be a major advantage.
Your numbers are out of proportion with every single statistic that measures salary of a college graduate vs salary of a high school graduate.
Maybe incomes have remained stagnant because the overall income is dropping nationwide and college graduates are the ones keeping the average stable.
Suffice to say your statement is not accurate. Education shows massive economic value.
However, EVEN if it didn't show massive economic value, exposing young people to a variety of ideas and perspectives before they enter the work world gives them a solid foundation to make decisions and critically think about life is massive value in itself.
Human beings are limited by the information in their heads and don't typically seek out new information voluntarily. Education gives them new information to reason about the world with.
The original article gives the $10k figure and a quick Google search suggests other sources agree with it. But really, the exact figure is irrelevant. The fatal flaws with that way of measuring income remain. The numbers don't really matter here.
> Maybe incomes have remained stagnant because the overall income is dropping nationwide and college graduates are the ones keeping the average stable.
I don't think the data supports this. Incomes are stagnant for even the lowest income group. Unless you are suggesting that college graduates in the lowest income group are responsible for keeping that lowest income group afloat? If that is the case, what should we take from that? Why are college graduates in the lowest income group in the first place if college attainment leads to higher incomes?
Adding $10k to your yearly income does not add $10k to the aggregate yearly income. In large part, educational credentials are a positional good - if we gave out twice as many law degrees, we would not have twice as many practicing lawyers. In other words, completing a degree often helps you get a job at the expense of others who do not hold degrees.
Until it gets automated.
I think you're over-assuming that people's "achievement level" is a completely general, fixed number, when there are a lot of suggestions that it's more situational.
There is some suggestion that the supply of graduates isn't well matched to job market needs, which usually manifests in people being opposed to anything that isn't STEM. But that's not the same thing as an argument about achievement.
Readings that may be relevant to those who find this Twitter thread intriguing:
- "On the Wildness of Children," on what kind of educational experiences we might imagine for our children if we consider what children are like first: http://carolblack.org/on-the-wildness-of-children/
- "Deschooling Society" (on my reading list): https://archive.org/embed/DeschoolingSociety
- parents care more about extracurricular and athletic pursuits than academic. They'll help their kids play a sport, but rather than try to help their children learn, they'll complain that the teachers are not doing their jobs
- children react to changes in environment the same way adults do, but this isn't considered by schools because they know the students have no choice but to comply
- many children still go hungry at school, making learning difficult to impossible
- one in ten children are diagnosed with ADHD; somewhere between 1 in 20 and 1 in 10 may be on stimulants, antipsychotics, antidepressants, etc
- private schools may pay half the salaries of public school teachers, and their curriculums are not as strict (virtually no required curriculum)
- children are still told today (by teachers) that they will be destitute and homeless if they do not go to college, but the teachers do not provide any extra assistance either, leaving the children either extra stressed or prone to giving up
- teachers pay, pensions, etc and overall school budgets are still cut short by politicians whenever they need to trim their budgets, while other areas of budgets are increased
- children in 8th grade often have a 3rd grade reading level in many parts of the country, but especially major cities with a history of segregation and a lack of access to jobs
- some kids are still booted from one school to the next and used as scapegoats if anything negative is on their transcript
I don't expect teachers to teach my kids sports; I do expect them to teach math and history. Not sure what is wrong with that.
Teachers have to teach 30 kids at a time, who all have different backgrounds and lives. They can't possibly give them all enough individual attention and time. And somehow they're supposed to get them all to surpass all the factors holding back each specific kid in a myriad of ways, to succeed at the same material, at the same time, at the same pace.
They have to be babysitters, disciplinarians, counselors, role models, and educators. And then they get yelled at by parents or lectured by administrators for not keeping up to this standard. It's crazy. It's one of the reasons so many teachers quit within five years. It's one of the reasons it's so hard to find good teachers. And of course, they never pay them accordingly.
If it weren't for the fact that teachers are idealistic and want to help children learn, we'd have a nation of morons, because only desperate people would subject themselves to this career.
Teachers taught me things my parents were not good at and did not remember. Right now, teachers are teaching my kids things I was never good at. A math teacher I know (a friend) is teaching kids math their parents don't know, right now. And I have met those kids; they do know that stuff.
So yes, teachers can teach kids what the kids' parents don't know.
In your attempt to defend teachers, you made them sound completely useless. Some are, but plenty of them are not.
After about 3 years of work, I managed to scrape together just enough to enroll in my local community college for two semesters, one class per semester. Total cost? At that time about $400 including books. And that $400 could have vaporized with one flat tire, or one visit to the doctor.
An acquaintance of my parents heard that I was going to college, suddenly took an interest and hired me sight unseen to come work at his startup in an entry-level developer job at 3x my then current pay. I worked there for a couple years and when I was close to finishing college, another acquaintance hired me and doubled my pay. After that I was attached to a growing career and haven't looked back.
But it all started with getting that tiny little bit of money together.
You aren't surprised because you have experienced this. But any economist would look at this and say something like this:
> You claim that there are people where an investment of $400 would
> quickly (within a couple of years) give them a salary increase in
> the range of $20,000 to $50,000 per year. That is obviously untrue.
> If it WERE true, then at least ONE person in the country would have
> spent $4,000 to seed a 10-person program which granted the $400 in
> exchange for a promise to contribute $8,000 back in a couple of
> years. A 20x replication factor would easily swamp administrative
> costs and participants who refuse to pay afterward and the program
> would still be growing exponentially.
I think the source of the fallacy is that while there ARE people for whom $400 is a difficult-to-surmount barrier beyond which no other major impediments stand between them and middle-class success, they are an impossible-to-find minority in a sea of people who are willing to ASK for $400 and those for whom $400 is a major impediment, but further impediments would prevent them from succeeding.
Alternately, we can look at the literally thousands of charities (including schools themselves) that offer merit-based and need-based scholarships, and conclude that the economist is partly right and the system DOES exist, but isn't meeting all of the need.
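The economist's replication arithmetic is easy to make concrete. A toy sketch of the hypothetical revolving fund, using only the numbers from the quote ($400 grants, $8,000 promised back, $4,000 seed) plus an assumed default rate:

```python
def fund_size(seed, grant, repayment, cycles, default_rate=0.5):
    """Simulate a revolving micro-grant fund: each cycle the fund issues
    as many `grant`-sized grants as it can, and the fraction of
    recipients who repay each contribute `repayment` back."""
    fund = seed
    for _ in range(cycles):
        students = fund // grant                     # grants issued this cycle
        fund = students * repayment * (1 - default_rate)
    return fund

# $4,000 seed, $400 grants, $8,000 back per repayer.
# Even if half of all recipients never repay, two cycles
# turn $4,000 into $400,000 - a 10x growth per cycle.
print(fund_size(4000, 400, 8000, cycles=2))
```

Which is exactly the economist's point: if the $400-barrier population were easy to identify, such a fund would grow explosively, so the binding constraint must be finding those people, not the capital.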
I'm only half-joking; I think the problem here is lack of information. That the student was likely to move from $10/h to $70k/y is something that the teachers might know, but random lenders probably don't.
On the other hand, I don't know if I want to see financial companies partnering with schools to provide loans - seems ripe for abuse.
Austen has been railing against places like University of Phoenix, etc for a while as what's been discovered about them is just nuts. The stat that stood out for me was that at one point they had a substantial number of recruiters (thousands) but exactly zero job placement staff.
That way it'd have easy access to short-term capital needs like this.
We’re working on having a fund available for living expenses, but “I’m going to pay you cash so you can go to school” is a different risk profile than “This School is free until you’re hired.”
I personally would loan money for this sort of thing if there was a reasonable ROI and someone vetting the students.
Before: simply getting familiar with a subject required me to jump through bureaucratic admission hoops, go through arbitrary prerequisites, pay tons of money, bend my life around some class schedule and then get mediocre lecturing from some random guy who happened to teach the subject at the nearby university at that time.
After: I can watch lectures from world-class professors and universities at any time, starting with any difficulty level, on a wide variety of subjects. I can also get free help from other people if I need to.
How is this a failure? MOOCs are one of the greatest thing that ever happened to the Internet and education.
I'm currently in the process of "fixing" my college education by watching lectures on fundamental subjects like biology (which I never had in college for some reason), chemistry (which was taught really badly) and linear algebra (taught awfully). It's astounding how many new things I'm picking up.
Plus, there are classes like AI, Cryptography, Model Thinking and so on. Nothing of this sort was available to me in college at all.
On 1: The "is this required" question is direct result of students being goal oriented with their education and time. They do it because they want job or because of social pressure. It sounds depressing, until you realize the alternative is not caring about job when choosing what are you going to learn, which is criticized in other points.
We can't have this one both ways. Either it is for purpose of job and they will ask that question, or it is hobby and they will care less about utility (e.g. job).
I, perhaps cynically, think some universities have even started "hacking" that process. In Nottingham you can hire graduates from either of our two big universities and the university pays part of their salary!
1. selection bias (unemployed or low-income graduates would feel embarrassed to respond)
2. the mean of the (biased) sample being much higher than the median (due to outliers)
"99% of people, when left long blocks of time alone to work on something without anyone to be accountable to, will watch Netflix." I have to agree with this, and yet the appeal remains a deep mystery to me.
It's also worth noting that a number of those points are specific to the US system. Basically anything involving money. The maximum tuition LambdaSchool (OP's school) charges for six months is almost enough for three years of tuition here in the UK, or about 50 years in Germany.
In this day and age I would hazard a guess that over 90% of university students... are there for a job and that's it.
A lot of people are asking me about number 7:
> 7. We had one student on the edge of homelessness, was $400 short on bills and almost had to quit because of that. I personally loaned him the money, and his income moved from $10/hr to $70k+/yr. It only took $400, but he didn’t have anywhere to get that from. Insane.
We train people to be software engineers for free in exchange for a portion of their future income for two years (we only get paid if they make above a certain salary threshold doing what we train them to, otherwise we don't get paid). So as a school we focus on eliminating the risk and upfront cost to the student, which lets many brilliant people access a world-class education who otherwise might not afford to. We're also incentivized to make the perfect school that will cause you to be successful in your career, and those incentives make an enormous difference.
The average stats from our first classes are remarkable: our average student increased his or her annual income by over $40,000/yr, 50% were hired within weeks, and the median salary was $90,000, including a lot of low-cost-of-living areas.
Since we're paying for all the other costs of running a world-class school without upfront revenue (we're spending over $300,000/month all-in right now), we're not in the place to pay for living expenses as well, and currently students need to cover their own living expenses for six months. Usually they can live with family, some have part-time jobs, we do have a part-time program, etc. but it's never easy.
This kind of small discrepancy forcing people to quit is something we see all. the. time. I probably see a similar situation with different small dollar amounts weekly. There are times when I am fairly confident I can swing someone's future earning power by several million dollars, but I don't have the thousand dollars it takes to do that. It's incredibly frustrating.
We can partner with lending companies, but if they're paying for students' living expenses they also require students to pay their tuition upfront and be saddled with debt, which goes against our mission, and we think there has to be a better way.
I'm looking into raising some sort of non-profit, living-expenses-only fund/endowment to lend during these sorts of shortfalls, but I'm new to raising nonprofit dollars. If anyone has any suggestions I'm happy to hear them. I don't have a good sense of what the interest rates would need to be for this to work in a for-profit world, so we'll see.
Like how people purchase a mining contract for BitcoinCash .. except you get a delayed payout, and your investment directly helps educate people. Some sort of variant on Patreon.
Do people on here agree with this? I think that is far too cynical... education (even this guy's education) tends to be misaligned from a person's actual interests, and so most would rather not do it in their free time. But I do think people want to do work/productive things and not just watch Netflix.
Give me a few days to a week of pure relaxation, and my creative juices start to flow.
If the students had their own projects that they were working on, would you know about it? Offhand I'd guess they have things they'd rather be working on than your assigned coursework, so if there's no benefit, why do it?
Relevant background: I've held a career that is a dream of high profile projects: game studio owner at age 17, beta developer for original Mac in '83; on Mandelbrot & DeVanney's original Fractal research team, 3D graphics research community during the 80's; engineer on the 3DO and then original PlayStation OS teams; 15 years as a lead game developer of major release titles; 5 years VFX developer and artist and financial analysis for 9 major release VFX heavy feature films; and now 10 years in machine learning and facial recognition.
I love to mentor developers. I ran a free coworking space for 3 years where I tried mentoring many, many starting-their-career developers. I also have both undergrad and graduate school teaching experience.
Out of perhaps 3 dozen mentoring situations, only 2 people actually took the opportunity seriously. Both of them went from nearly minimum wage shit coding jobs to real development companies making in excess of $70K. The other 34 people I tried to mentor never even finished the first "let's see if this person is serious" task I gave them.
I remember myself testing the waters or getting interested/curious in multiple things, dropping out of most, and rarely keeping more than some. Even within the same general area, sometimes I was motivated to continue and other times not.
But then again I went to school in Europe with cheap tuition, so time is the biggest investment here.
There, I said it.
What did you study, and what do you do for work?
Organisation A runs a traditional class that takes in 100 students. 50 complete it.
Organisation B runs a MOOC that, with a bit of advertising, gets 2500 people to sign up for free. 50 complete it.
Maybe my made-up numbers are off by orders of magnitude. But "2% completion rate" on its own doesn't sound so terrible if it reaches a much wider audience to start with - which both "free" and "online" could contribute to.
I have signed up to many MOOCs and completed a couple. I consider that a net benefit. MOOCs are not going to replace university education any time soon in my opinion, but for people who don't have the university option in the first place or want to take a single course every now and then they're a great thing.
People _register_ for a course, presumably, with the goal of finishing it. (Otherwise, why not do a google/wiki search for topic X? Find that specific topic on youtube/MIT OCW?)
When the vast majority of people fail to complete the goal, you decrease their confidence in the process. Unless someone starts with strong academic confidence, they may be demoralized by giving up/failing a course. We shouldn't be happy that 5% of people enjoy high school math class if 95% of people graduate hating it, and never touch the subject again.
We can move goalposts and now pretend that MOOCs primarily exist as a random buffet of knowledge and not replacements for actual courses, but that's not how they were positioned.
"That pill I sold you, turns out it's no better than placebo... but placebos are 30% effective, so it all worked out, right?"
Most of the free MOOCs I've registered for I've not really had a particularly strong intent to finish. (Paid MOOCs are a different story.)
> (Otherwise, why not do a google/wiki search for topic X? Find that specific topic on youtube/MIT OCW?)
Because of a preference for interactive vs. static content, for one thing.
> We can move goalposts and now pretend that MOOCs primarily exist as a random buffet of knowledge and not replacements for actual courses, but that's not how they were positioned.
Actually, being low-barrier-to-entry so as to support more experimental, low-commitment exploration—which logically implies a lower completion rate—is a big part of how free MOOCs were positioned.
I suppose this is sad for academics, who teach as though they expect all of their students to become academics. But the students are just being realistic -- for almost all of them, an education is preparation for a career. I was in academia for a little while and it was amazing how many professors just didn't understand that.
The basic thesis is that school isn't mostly about teaching/learning - it's mostly about signaling to future employers that you are smart enough (and conformist enough) to get through school. He guesses it's about 80/20 between signaling and actual education.
It's interesting to read most discussions of education with that idea in mind - his assumptions solve a lot of puzzles of how/why education works the way that it works.
That's because they are responding to the incentives that have been placed before them. Learning, when it happens, is done in pursuit of some form of achievement/validation (grades, test scores, etc.). Yes, properly evaluating learning capacity and knowledge accumulation is difficult, as evidenced by the popularity of algorithm/fizzbuzz whiteboard problems. However, the major issue is that culturally there is a lack of passion for admitting ignorance and trying to learn ("We'll fix the problem with common sense!"), as well as an open hostility by some towards those who are educated.
"9. 99% of people, when left long blocks of time alone to work on something without anyone to be accountable to, will watch Netflix."
This is caused partly (if not mostly) by issue number 1: self-direction and self-control are absent from schooling. When no one is around to tell you what to do and yell at you when you don't do it, it's no surprise that people lack the interest and discipline to pursue self-education when it is available to them.
You're right, I hadn't thought about the time. From 18 to 65 is 47 years; four years is a significant chunk of that. It's still a reasonable chunk if you come out of it with a degree that helps you get a good job (or, to worry less about credentials, with training that makes you more productive, or a mental framework that improves your life). But if you spend the four years, don't get a degree (or don't get one that gets you anywhere), and don't learn anything life-changing, four years is a lot of time.
I don't know the facts-- and none are presented in the tweet-- but I do have a friend whose very job is to run studies on the effectiveness of programs at a University. I am sure this person is not alone.
Even if "most" (asserted) universities don't have a way to measure their effectiveness, the fact is that many of the best do and they all tend to benchmark each other. I would be really curious what the author is basing his assertion on because I think it is likely schools research a lot of things (like their effectiveness) but don't release that information publicly for many reasons.
#1 school for this metric.
In the Top 10 for some other metric.
Even the small California state school I went to goes on and on about the things they're good at: environmental programs and getting grads into the Peace Corps.
This is a bias of the lower-middle to upper classes. A computer, without the knowledge or support to maintain it (more costs), makes that price rise.