Hacker News
University of Wisconsin to offer degrees based on tested ability (wsj.com)
150 points by anigbrowl on Jan 25, 2013 | 96 comments



This is the magic quote: "Now, educators in Wisconsin are offering a possible solution by decoupling the learning part of education from student assessment and degree-granting."

Speaking as an educator (I teach undergrad-level CS), I would love not to have to give high-stakes assessments. I'd still assign projects and give exams to provide feedback and help both of us understand where students are, but the whole process would be so much less stressful (on both sides) if the assessment-for-credit came from someone else, somewhere else.

(Interestingly, I'm also very interested in writing the for-credit assessments and working out how to score them fairly; I just don't want them to be part of my class.)

That said, there's a lot that's a little scary about initiatives like this (from a job security standpoint), but I do think that something along these lines is both inevitable and, ultimately, the right direction to go.


> This is the magic quote: "Now, educators in Wisconsin are offering a possible solution by decoupling the learning part of education from student assessment and degree-granting."

The University of London has offered this http://en.wikipedia.org/wiki/External_degree since 1858.


Yeah, our current conflation of education and credentialing has some issues (though it's by no means new, on human-lifespan timescales). It's interesting to see it breaking down.


Any concern that the student then focuses on the assessment alone, less interested and less engaged in activities/classes/work that would otherwise teach her how to learn, how to work in a team, etc?

Does this focus students on an outcome that is arguably less important than the process to get there?


It sure does. Kinda happens in public schools now - teachers teach to the test because they need the students to pass the standardized tests.


I'd say there's already a sizable contingent that prioritizes scores over pure learning.


Based on your experience, would it be difficult to decouple the assessment of an instructor from the test results of his/her students? If the assessment of a student's learning comes from somewhere else, what other mechanisms could we use to evaluate the effectiveness and (or) competence of their instructors?


> would it be difficult to decouple the assessment of an instructor from the test results of his/her students?

In fact, I think it would be critical to do so. Students may perform well on the exam for any number of reasons other than having a good teacher, and they may perform poorly for reasons other than having a bad teacher.

> If the assessment of a student's learning comes from somewhere else, what other mechanisms could we use to evaluate the effectiveness and (or) competence of their instructors?

Well, it's not like we can use exam scores now to evaluate the college-level teachers. The sad truth is that to a large extent, most people who teach at the college level are evaluated primarily on things other than teaching. I know I'm a pretty good teacher not because of good test scores, or good evaluations, but because students come back to me months or years later and talk about things they learned in my class(es) that set them up well to find their way through a later class or a job. But that kind of feedback is very difficult to quantify.


I saw a suggestion elsewhere on HN that moving to a single, cumulative exam after four years can be very effective: it's almost impossible to pass by just cramming the week before, and it lets students and professors focus on learning for most of the students' time in undergrad instead of on passing a test or teaching to a test.


This is similar to how the system at Oxford works (I think this is how the rest of the British universities do it as well). Your degree class (essentially your GPA) is based solely on tests you take at the end of every year of your degree (three for a BA, four for an MComp). There are practicals (lab sessions) and tutorials (conversations with tutors about problem sets), but those don't affect your degree class unless you do terribly.

With all the testing in American universities, the British (?) system is quite refreshing.


I believe this is the case for most British universities, most "traditional" universities anyway (not sure about former polytechnics, etc.). Sometimes there is also coursework that contributes to your final degree (my CS course had a group project in the second year and a dissertation based on an individual project in the final year), but all the exams take place at the end of the year.

My uni also weighted the exams so that the final year contributed much more to the final "grade" than the first year, effectively putting most of the pressure on the final term of the last year.


I find this extremely interesting, especially coming from a university where the certification actually means something.

I think Coursera, OpenCourseware, etc, get things backwards. They assume that the knowledge is the valuable part, and so use the online format to deliver knowledge. But in reality, the vast majority of people go to college to get the certification, not the knowledge. That's the valuable part of college education and that's what U of Wisconsin is offering.


… but the certification has to mean something or else it wouldn’t work. If the certification were meaningless (i.e. not sufficiently correlated with ability) neither people nor companies would find value in it.

Now, it may be possible to game this sometimes, or for some time, but if it were possible consistently, universities would long since have become meaningless.

Ideally, companies would like to know exactly what abilities you have. However, testing for that is a long, expensive, and complicated task, one you probably need specialized personnel for (with knowledge of exactly which abilities you want to test for and the social-science chops to make an actually worthwhile test happen). That's one of the reasons why universities exist.

But this distance from what companies actually want is also the reason why it's not perfectly efficient, i.e. you sometimes have to learn stuff you don't actually need.


It is not as common these days, but 20-30 years ago many courses had 100% of the mark depend on the final exam. It was very common for students to do nothing for most of the year and then madly cram in the last few weeks.

All this does is make the "teaching" part of the course optional.


> 20-30 years ago many courses had 100% of the mark depend on the Final exam. It was very common for students to do nothing for most of the year and then madly cram in the last few weeks.

It's still that way in most law school classes.


This was how it was in my 4 comp sci classes when I did a semester abroad in Australia in 2003. Attendance was also completely optional, so lots of beach time was put in. Then when it was time for the exam, they put us all in a room together with no supervision. Surprisingly, everyone did very well. It was quite a joke.


What university was that? (Was it even a university? Doesn't sound like any I've heard of...)


I can't decide whether I should googleably post the name of the uni, partly because, despite the negativity in my comment, I had a blast there. The campus was beautiful and the staff and professors were very knowledgeable and helpful. I learned a lot, but unfortunately you didn't need to if you didn't want to. In fact, I seem to recall my classes were only on a pass/fail basis as well, so a D was the same as an A.

It was in Cairns, which actually narrows it down quite a bit if you want to google it.


What, you've never left the United States? Continental European universities are often organized like this. Or have been recently; I can't speak for right now.


Actually, part of the idea behind Coursera, Udacity, EdX etc. is to sell better, more accurate credentials to potential employers. The logic being that the responses to hundreds of exercises are more valuable as data than a single final grade.

http://chronicle.com/article/Providers-of-Free-MOOCs-Now/136...


Coursera has to start thinking about cheating if they plan to succeed there.


> I think Coursera, OpenCourseware, etc, get things backwards.

Well, they did what they could. It's very hard to invent a new institution with a trusted reputation, almost by definition.

Also, the dissemination of knowledge is the part that scales the most. Evaluation is tough and expensive, especially if you are trying even modestly to prevent cheating.

Someone could go to a MOOC to learn the stuff, then get the assessment from UWisconsin.


"Someone could go to a MOOC to learn the stuff, then get the assessment from UWisconsin."

That is the model I expect to predominate in the long term, actually.


How is evaluation "tough" if you have the entire corpus of the student's work in a course? There's less need for testing if you can see all their work. The point of testing is to come up with a simple statistic that sums up someone's knowledge and performance.


It's hard to make sure it's the student and not Joe Blow who he hired to take the test for him, especially remotely.


Certification is nothing but acknowledgement of knowledge. I agree that coming from a reputable source has more currency than the knowledge itself. It would be interesting to see how certification separated from the source of knowledge pans out. There is absolutely no reason, other than administrative ease, for the source of the knowledge to be the one doing the certifying.


> Certification is nothing but acknowledgement of knowledge.

Yes and no. When I hire a Harvard grad, is it because he knows things? Well, it's partly because I assume he isn't ignorant, sure. But it's also because I know he must have done very well on his SATs, and must be a really hard worker and very mature if he was able to get the perfect grades in high school you need to get into Harvard. I also know that his network, by virtue of going to Harvard, is probably better than that of someone who didn't go there.

It's much more complex than you make it out to be.


I expect significant chunks of the Ivy League to survive nearly unscathed.

However, there's only a handful of "Harvards". Whatever exact argument you're trying to make (it's a bad thing? it's not going to happen?), it doesn't scale down past the Ivy League, or at least it doesn't scale very far.

Besides, past the initial adaptation period, it's not like these credentials are going to be handed out like candy (those that are will be ignored, just as such credentials are today); serious ones like engineering degrees are still going to be years-long affairs. Are you going to prefer the student who needed an enormous centuries-old structure holding their hand and dragging them along through a prescribed curriculum, or the one who showed enough dedication to do it on their own? I expect significant chunks of the Ivy League to survive; whether they end up held in the universal esteem they are today I consider to be much more up in the air. They may acquire a much more rarefied audience, while the rest of the world begins to look at them as a sort of easy way out...


With all this online education, the Ivies and a few public ivies are going to be more competitive than ever. They're going to have all the student performance info, and recruit heavily from the best students.


When an excellent education can be had with the internet and self-study, universities will simply curate topic sources, and sell accreditation. Glad to see my alma mater acknowledge this inevitability.


I'm not entirely sure it would be applicable to all disciplines. I wouldn't want to hire a mechanical engineer who had never done group projects, for example, or an aerospace engineer who had never spent time in a wind tunnel.


But that can still be part of the credentialing process. Consider the medical field. A medical degree is only a small part of the credentialing process. In addition, you'll usually need an internship, a residency, and then board certification (optional).

Even at the low end of the medical field, here in Illinois, EMT-I and EMT-P (paramedic) require both clinic time and a practical (live simulation) exam.

You simply implement something similar for engineering.


We even have a word for it: Apprenticeship.

There's more theoretical knowledge behind nursing and engineering compared to carpentry and plumbing, but, really, the idea of on-the-job training with someone who knows what they're doing watching is something we know works because it has worked for hundreds, if not thousands, of years.

So some degree programs will entail an apprenticeship. That's going to be a lot more difficult to game than a multiple-choice exam would be.


Yes, and if the apprentice system is formally part of the process, what would be the point in cheating around the knowledge part? If you cheat your way through the medical exam, so what? You won't last ten minutes in your apprenticeship.

In fact, I've witnessed this, as has pretty much anybody here who has ever given an interview to a college grad with a very pretty GPA who flames out of the interview barely able to tie their shoes in code. Either they literally cheated their way through college, or more likely, just overfocused on the credential part of the process, but they don't make it into a position to do damage. (And if they did, well, whatever system accepted them probably isn't long for this world anyhow or has long since adapted to letting those people in and still somehow keeps them away from Big Red Buttons.)


Um, no. Apprenticeships are strictly craft skills and sit at the base of the engineering pyramid.

An apprentice might make a wind tunnel model designed by an engineer but they would probably know zero about CFD.

I do wish people who have never worked in proper engineering would stop throwing craft and apprenticeship into discussions when they have no understanding of what they are talking about.


Funny. Here in Germany you can

- study for your CS degree (formerly 'Diplom', now Bachelor/Master) at a university for a couple years

- do a 2.5-3 year thing combining school with working at a company (actually having a company take you on is required for these kinds of degrees most of the time). It's an apprenticeship (using the same process, word, and methodology as you'd use in a craft trade).

Do you get the very same job offers, the same salary? Probably not. But I'd say you have the chance to work on the same stuff as any bachelor/master of CS. I think the disadvantage is merely in the first couple of years, and is often unfair (note: not always; sometimes the more theoretical CS background counts for a lot, of course).

So, yeah. You can do an apprenticeship here, starting at 16 (you'd be done at 19) or at 18-19 (you'd probably be allowed to shorten it to 2 years, so done at 20 or 21).

Disclaimer: I have neither a "normal" degree from a university nor did I do this apprenticeship. But it works.


With your point "same job offers, the same salary? Probably not", I think you just made my point.

And I seem to recall the German education system has some quite strict streaming. I think going from apprentice -> technician -> engineer (or a similar level of job) might be harder in Germany than it is in the UK or US.

Having said that, Germany has a very powerful vocational training system unlike anywhere else in the world.


And here I am, having none of these things, posting on HN from a very reasonable (will I ever stop looking for better things?) and well-paid development job.

So, two things.

1) You claimed that an apprenticeship is only for crafts and I told you that this isn't the case here. Locally your 'strictly craft skills' comment would be wrong. Very much so.

2) Yeah, a degree matters in some/most places, like everywhere else in the world, I guess. That's not related to apprenticeships in particular, though (having done something like that is better than having nothing of value in writing at all, like... me). And it's too general to be 'truthy'.

Back to the topic: Apprentices (and former apprentices) are not unable to do the job of an engineer.


Well, you have to look at the wider picture. Germany is rare (almost unique) in that engineers are looked up to as the equivalent of a doctor or lawyer.

And consider how many senior engineers at Audi, BMW, and Mercedes started as apprentices.

In the English-speaking world this is not the case; we are looked upon as greasy, almost subhuman.

For example, when the wife of the no. 2 at BT Labs (you know, the place where the first computer was designed and built) said that her husband was an engineer, she was asked patronizingly, "Oh, that's nice, dear, what sort of cars does he work on?"


And yet PE certification is - under a different name - apprenticeship.


Huh. I started working directly after high school; less than 2 years after I graduated, I was The Tech Guy at the largest ISP in Chicago, and just a few years later a developer at the first commercial vulnerability research lab.

I never really even had a chance to think about a degree.

Maybe now I should!


What point are you trying to make here?


I'm not sure I understand your question. I wasn't being sarcastic.


I smelled sarcasm, thanks for clarifying. Otherwise, why would you consider going back to college? Nostalgia for college days, or just because you didn't have that experience? What value do you think it could bring? (I dropped out for 2 years, and I'm still not sure what to do next.)


Seems like you have done great without that piece of paper. What degree would you like to go for? A BS majoring in CS?

Or maybe think about learning some of the things you missed out on by not having the time to go to college (mostly non tech stuff I guess).

Perhaps a degree is really just a well-thought-out education plan more than a certificate. You learn this list of things and you are "educated" to some general level (a liberal arts BA or BS) with an emphasis on some subjects (your majors). More advanced or vocational degrees would be more focused plans on a single area.


The BS might allow me to go to law school, is one of the big things I think about.


Credentials seem to make the most sense when you are just starting out or when you want to start over in a new area because you have little if any experience.

More regulated areas simply require a piece of paper. Not that this always seems to help: in California, to be a mortgage broker you need a real estate license, which requires you to pass several college-level classes, take a long exam, get fingerprinted, and have a certain amount of experience. Yet none of this prevented all the problems we recently saw.


It takes just about ten seconds to verify that a sprinter is world-class, but it takes hundreds or thousands of hours to train that sprinter. Colleges are in the "sprinter" training business. They throw in the verification for free. (Colleges charge for their courses; the exams come included.)

Colleges are just delaying, not avoiding, their end by focusing on the verification. First, because they don't have the cost structure to compete effectively on verification. You don't need to spend $40,000 a year just to take a few tests. And, second, testing is not a stable long-term strategy for colleges because once they lose their status as the best place to learn, they will also lose their status as the best place to be tested. That is, I suspect that once Udacity becomes known as the best place to learn computer science, Udacity will begin offering certification tests that will have more prestige than colleges.


Finally! Separating certification from knowledge. This will eventually be the norm I feel.


I can't tell if you're trying to make an ironic joke or if you mean separating certification from the acquisition of knowledge?


Definitely not trying to make a joke. If someone had knowledge equivalent to a bachelor's level, does it seem correct that, to be considered equal, the individual would have to attend at least 3 years of school full-time (even if he tested out of easy classes) and pay ~$50,000? This is for a cheap school, not a top-tier one.

This has never made sense to me. Knowledge is knowledge - who cares where it came from? I'm not saying other things are not learned in college - yes, teamwork is important and such. But you can learn that in almost any situation. What about experience? If you're not busy going to school and doing BS work, you can actually get a relevant job, show some creativity in your field, or (I wish) hopefully get an apprenticeship. I hope those come back some day - the learning is unparalleled.

Three years ago, I spoke to a friend about designing a computer science bachelor's exam. It could be as grueling as you want, over a 10-day period, and you could easily charge up to $5,000. Most people I spoke to said they had concerns about people "cramming for the exam". In truth, those people do not know how to design tests. It is very easy to prevent cramming, especially with such material; all you need is a little creativity. By "prevent cramming" I mean creative questions that force the candidate to know the material so well that cramming involuntarily turns into learning.

I chose to start with a comp sci exam because it requires the least physical resources for the exam (no circuits, no lab, no nothing). Eventually, other degrees could be created once a proof of concept had been completed.

The original reason universities were created was efficiency (source: Sir Ken Robinson's TED talk). Teaching 50 students by mail, making each one order each textbook and other reference texts, is crazily inefficient. Locating them all in one place, with lectures for all 50 students at once and a library where all relevant texts are held, is much better. What we're seeing now is just a modern version. Instead of lectures, we have video. Instead of a library, we have ebooks/internet/curated content.

Sorry if I rambled for a bit, I'm very passionate about this issue. The thing is, without power in the field, changing education is hard. This is definitely the gateway, and I'm happy that I get to see all the fantastic (but slow for me) changes. The next step is re-optimizing spending in high schools. My mother works at a high school, and the stories I hear about wasteful technology spending are hard to hear.


I too have concerns about rigor. At the same time, I'm incredibly impressed with the world this week.


Wow, nothing in the article mentioning how UW will ensure that the people taking the tests are the same people receiving the degree. Will all of this testing take place at UW?


Online courses often require you to find and hire a proctor locally who meets certain criteria. For example, http://www.uis.edu/colrs/learning/gettingstarted/ProctoredEx...


I've proctored an exam for an online class. There was no vetting. Nobody even checked that I was faculty anywhere.


Say a couple dozen credible schools start to do this. What would be the differentiating factor? Their classroom reputation, which shouldn't apply much to this path at all?


The rigor of their testing, of course.

Possibly to a lesser degree, fees, accessibility, that sort of thing.


Yep.

I would expect MIT's exams to be outright impossible for a mortal to pass.

University of Wisconsin's will be nearly impossible to pass even with concerted study.

Northern Illinois will be possible to pass with study.

and then there will be institutions whose certifications don't mean a whole lot. There's no question about that.

But MOST people will try to pass the impossible ones... because the payoff will be so much bigger. Only a few will pass though. I'm thinking the tests will be designed that way. I strongly suspect getting a degree from UMichigan, UWisconsin, UTexas, Berkeley, etc. will be MUCH easier if you just attend. But this is a good option for those who, for whatever reason, don't wish to do so.


Well, you can easily test your theory in some cases. Many of MIT's open courseware classes include the exams. Are they impossible for someone without any knowledge on the subjects? Pretty much, but if you follow the course content, they are passable.


Would you say that the differentiating factor of, say, a Yale certification is its classroom reputation?


I'm not sure the Yale degree is as valuable if one misses out on the networking opportunities.


What I want to know is if, in the future, I will be able to differentiate between "Is college educated in this subject matter" and "Knows as much about this subject matter as someone who was college educated in it".

There is a keen difference between the two I think.


I'm guessing the latter is the preferred candidate?


Since they both know as much about the relevant subject matter, no. The former.

I know HN has a thing against college education, but I sincerely believe that neither major relevant information nor a piece of paper certifying that were the most valuable things I got from my time in college.

Peer interaction and exposure, exposure to ideas and studies outside of my apparent interests, practice with long-term commitment and responsibility, etc. These are all things that you can certainly, perhaps even easily, have without going to college... but a college education serves as a certification of these things as well.

The "relevant coursework" part of a college degree is not of particular interest to me (I saw complete idiots get the same degree as I did, and I know many people without relevant degrees that know as much as I). Given two candidates of equal knowledge, one with no degree and one with a degree in art history, I would hire the art history guy with little hesitation.


You paint a very black and white picture when we all know that the landscape isn't truly that polarized. Firstly, given two candidates of equal knowledge, one with no degree and one with a degree in art history, I would _look at the work history_ and then make my decision.

A college education doesn't serve as a certification of exposure to new ideas and studies or long-term commitment and responsibility. No, one example of long-term commitment and responsibility is taking the route less traveled, knowing all the while that you are succeeding, while others laugh at your seemingly obvious missteps.

Tell me, how would you evaluate a person that skated through HS with Cs and Ds; duplicated the same effort in community college for one and a half years before dropping out; and then showed up at your company looking for a job? You'd probably show him or her the door, when you should have asked what was done in lieu of grinding through school. Then maybe you would find the person with a certified thirst for new ideas and studies who has more dedication and commitment than any college graduate, but feels that the education system is slow, broken, and nonqualitative.

Allow me to brush this chip off my shoulder. Don't get me wrong, I understand where you are coming from as well, and I love the entire school experience. But UW is taking the right step forward for people who deplore superfluous evaluations such as attendance and homework, to name a few.

UW can consider me one of their first customers.


My thinking was that the college educated applicant has had the course content fed to them whilst the other person has presumably acquired the knowledge more organically.

The non-college acquisition is likely to be part of a wider, more connected body of knowledge too; the part that matches the college-acquired knowledge is just a snapshot, and the college-educated person is more likely to be limited by the boundaries of the course material.

In order to acquire such knowledge in the field the non-college applicant is likely to have needed just the same interpersonal skills and a whole heap of drive and enthusiasm for their subject matter. On top of that they, presumably, have proven ability in their field if someone has been paying them a living whilst they've been applying this knowledge.

>Given two candidates of equal knowledge, one with no degree and one with a degree in art history, I would hire the art history guy with little hesitation. //

Art History is awesome; a single term of that at undergrad level enriched my life considerably. That aside, the guy with art history and (presumably) programming knowledge is likely to have a useful diversity of approach.

I think I'd have a greater expectation that the subset {k} of the candidate's domain knowledge implies a larger superset {K} for the non-college candidate.

[disclosure: I'm uni educated and don't work as a programmer]


I've always found HN to be pro-college, so I'm not sure why you think people have something against it. However, someone who has demonstrated a college-level education without direction shows extreme motivation and an ability to adapt to adversity. Going to college is the easy way out, so to speak. Of course the attributes you list are valuable too.

I don't think you can say one is better than other. You will find great and not so great people in both groups. As always, you need to evaluate the individual, not the piece of paper.


There is certainly the other side of the coin, as you point out. Which is better depends heavily on perspective, so yeah, I'll accept that one is probably not definitively better than the other.

I do think that they are certainly different though. It should not be the same piece of paper because both routes, regardless of relative merits, are very different animals.


This sort of program has existed since the '70s. Excelsior, Charter Oak, and Thomas Edison all offer similar programs across a wider range of fields than Wisconsin is offering. The news here is that a school with a name is offering such a program. That said, nursing is a field where school reputation doesn't matter much, and IT isn't a prestigious major. This is better seen as a harbinger of change than as a significant change itself.


I have mixed feelings about this. It could be great good if done correctly, but there is also a lot of room for error.

On a related note of "credit based on knowledge", I am always fond of college professors who have the policy "If you make an 'A' on the final, you make an 'A' in the class", the premise being that you clearly have learned the material, and their job is to make sure you know the material.


I don't know... I feel that a final examination is no test of knowledge. It's a test of memory. It's regurgitation. I'd never do that in my class.

I do have one class where doing well on the final project can get a failing student a passing grade though. But the project requires they apply what they've learned.


My algorithms final was entirely problems that we had never seen before that required that we create new algorithms or heavily modify existing ones. The average score was a 55% (this is a professor who makes extremely hard tests and then curves up).

Had he just been asking us to recite implementations of quicksort and whatnot I might agree with you, but I think that your premise is flawed. Not all final exams are what you have described.


Will the faculty be creating the assessments? I'd wager no. In many places businesses and curriculum-design people weigh in and approve curriculum created by faculty, so I could see those same folks creating assessments.

Perhaps I'm overly pessimistic from my experiences, but I'd bet the tests'll be multiple-choice, just like the UW entrance exams are.

Because it's easier for a machine to grade them.

EDIT: the article does say the faculty will be creating the tests. That does give me hope.


I don't understand that, unless your final consists of multiple choice questions. Why can't you just set novel problems that require written answers? Americans are in love with multiple-choice tests for some bizarre reason. Most countries in Europe conduct extensive testing through school, but multiple choice only makes up a small portion of the whole exam. Students have to answer essay questions and show their work on problems.

I see no reason why this UW initiative can't work the same way. I hope that it can work this way and attract credibility, since it would make it easier for people like myself who find the spiraling cost of college tuition hard to manage.


Presumably, the cost of taking the tests is the cost of having someone thoroughly vet your work, in addition to development costs. If it costs, say, $1,000, you'd expect that to mean people spend around 10 hours actually verifying your work. And so, even if you have to write 10 papers and develop 10 projects, 10 hours of grading should be plenty to provide a fairly comprehensive assessment (30 minutes per large assignment is much more than most professors put in now, at least in my department). For $1,000, you could even raise the evaluation time to something like 40 hours (at ~$15/hr, plus administrative costs) by using TAs who really understand the material.
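(A rough back-of-envelope sketch of those numbers in Python; the $100/hr faculty rate, the $400 of administrative overhead, and the fee itself are purely illustrative assumptions from this comment, not anything from the article:)

    # Illustrative only: splitting a hypothetical $1,000 assessment fee
    # between grading time and fixed overhead.
    fee = 1000             # what the student pays for the assessment
    faculty_rate = 100     # assumed $/hr if faculty do all the grading
    ta_rate = 15           # assumed $/hr if TAs do the grading
    admin_overhead = 400   # assumed fixed administrative costs

    faculty_hours = fee / faculty_rate             # -> 10.0 hours of grading
    ta_hours = (fee - admin_overhead) / ta_rate    # -> 40.0 hours of grading
    print(faculty_hours, ta_hours)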


This is why college won't get any cheaper: there will be more value in having a qualified person evaluate not only the test, but all your schoolwork. You can game a test, but it's harder to game a few years of school.


The "bizarre reason" is laziness. You can grade multiple-choice tests with a key. I don't think they really assess anything myself. I think a test like you describe is better but I don't see it in practice at the places I've worked.

Thus I'm not hopeful.

And to be honest, they're gonna get their money from you one way or another. :)


I received my BA through a similar program that used to be offered by the State of New York through their fully accredited Regents College program (now called Excelsior College, and not offering this pathway to a degree anymore). When I started I already had about a semester's worth of credits from various colleges; I sent in those transcripts and took two CLEP tests (each worth about three units), and three GRE subject exams in Economics, History, and English Literature (each worth 30 units, or one full year). A balance of Humanities, Social Science, and Science/Math credits were required.

I started in Sept. 1987; I passed my last exam in December and started graduate school in January (didn't graduate). It was a wonderful program. I even wrote a book about it and sold maybe 25 copies.

That degree has helped me get several jobs that required a college degree, and now I'm getting ready to go back to grad school again.


I consider this a farce, but U of W isn't a top tier college, so that's OK. It's a good college. They're really devaluing their regular degree by offering a test-based certification.

The future of education isn't in testing. That's not education at all, really. The future is going to be in helping the student to develop a body of work that can be evaluated. There will be testing, but here's what it will be: every bit of work will be recorded, and evaluated. It'll mostly be evaluated by computer programs.

Think about it. If you have a test with 150 questions, how is that really better than, say, 500 homework problems? If you have a few 500 word test essays, how is that better than, say, 5,000 words, with some stats about time taken, revisions, and other information about the body of work?


Good teachers aren't lecturing. They're facilitating. They're leading discussions, showing examples, and helping students learn. The reason online classes work is that students figured out that instructor/content-focused courses don't require students to actually attend.

Look at "flipped classrooms" where students read and watch lectures at home and then come to class prepared to discuss and perform.

I don't think I'm an exceptional teacher, but I think I'm quite good, and students who've taken online classes prefer having some face-to-face time with me and other students, because I don't lecture and read PowerPoints for 60 minutes.

Any instructor/professor that lectures, recites, and PowerPoints is absolutely in danger of being replaced by an online course because they add little value.


I think educators have made learning so prevalent throughout society, creating learning infrastructure, that they now must adapt to their own success. A person can basically use the same mechanisms that the educators created to teach themselves in their home or work. It's proof of success that they now must incorporate this into their institutions.


Wow, that's even better than the University of Illinois at Springfield, which requires you to do 4 semesters of online courses.


This is so they can ride out the end while the degrees mean something. I live in WI and am familiar with what's going on here with the higher-ed budget.

If you give people a test they can take online, how long do you think it will be before the answers get out there and people are just easily getting degrees? The degrees will mean less than they do now.

I can tell you from both the student side and the teacher side that students who cannot perform the tasks required in their field can still pass tests. The "fizz-buzz" failers, for example.
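(For anyone who hasn't seen it: "fizz-buzz" refers to a trivially simple screening exercise that a surprising number of credentialed applicants fail. A minimal Python version, roughly as it's usually stated, is:)

    # Print the numbers 1..100, but "Fizz" for multiples of 3,
    # "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both.
    for i in range(1, 101):
        if i % 15 == 0:
            print("FizzBuzz")
        elif i % 3 == 0:
            print("Fizz")
        elif i % 5 == 0:
            print("Buzz")
        else:
            print(i)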

I will be 100% okay with this approach, however, if they are performance-based assessments. The college I teach at is moving in that direction: students must practice for, and then demonstrate, the required competencies.

I've never been a believer in tests, so I like performance based learning, especially for programming.


>If you give people a test they can take online, how long do you think it will be before the answers get out there and people are just easily getting degrees?

The tests are very unlikely to be online. The most probable scenario is that they are proctored at local professional test centers just like nearly every other distance education program.


You think so? If students can never show up to a classroom, why would they show up for a test? I've taken DE courses from a few different places - all of them used the LMS for the final. No proctor needed.

Proctoring costs money. The idea here is to charge for degrees while keeping resources the same, or reducing them. I can give you a few online tests, charge you $12,000 for an accredited degree, and we both win. You get a degree without having to show up, and nobody has to expend resources other than a little bit of time to grade your stuff.


That's the future of universities: they will be centers for research and certification. Learning is a 24/7 activity nowadays.


Award a bachelor's degree based on knowledge, not credits - what a novel idea :)


Hey, someone's got to make money off of those free online courses...


Distance learning has always existed; it's not a new thing invented by internet people.

The problem has always been that University branding matters, so a Cambridge degree counts as real in a way that an Open University degree doesn't. (http://www.open.ac.uk/)

In the same way, the new online courses are not good enough. (http://en.wikipedia.org/wiki/Disruptive_innovation)

This problem has been bad because some people are hardworking and talented enough but are excluded from places like Cambridge and Harvard for socially damaging reasons: they are from the wrong background, too poor, etc.

It would always have been saner to separate the credentials from the expensive on-site tuition, so the poor kid from Madras would find it easier to get his Cambridge degree (http://en.wikipedia.org/wiki/Srinivasa_Ramanujan).

But the University prestige has allowed places like Harvard and Stanford to jack up the list prices of their fees, and their competitors nationally and internationally have followed along. The top US places have been able to do so, I have been told, because of the expansion of student loan availability, leading to a pricing level that may not be feasible. (http://www.theonion.com/articles/man-has-alarming-level-of-p...)

The result is that colleges may have priced themselves out of the market (http://www.time.com/time/nation/article/0,8599,2116059,00.ht...) (http://www.bloomberg.com/news/2012-10-22/five-reasons-colleg...)

I hope more Universities do the right thing, and respond in an accommodating manner, as Wisconsin has done, as a way to climb down from that tree. It's about time. (http://www.insidehighered.com/news/2013/01/25/public-univers...)


Elite universities are priced like many other goods: the list price is not the price most people will pay.

They set a very high list price and then offer "scholarships" because this allows them to efficiently segment the market. If they did it the other way around (cheap "base" price and then escalating prices), it would cause horrible brand damage and arouse regulatory interest.

Financially, though, it's identical.

Notice how many retail places offer "cash discounts" rather than having "credit card surcharges".

Financially identical. Psychologically, worlds apart.


The pricing is simple. They have a lot of wealthy legacies - students whose parents went to the school - and those kids get in. They also want to get the top talent, and for them, there are scholarships. The model is simple - the massive capitalist economy requires intelligence and a lot of will. The fact is, genetics and chance, while favoring the rich, don't favor them that much, and can give you mediocre-performing offspring. Someone has to do the hard work - and it's going to be the ones who get in with scholarships, who will get hired by the children of the wealthy. I know, it sounds distasteful, elitist, and plutocratic, but that's how the system works.


Your last step doesn't follow. Scholarship students get great connections. Example: Sergey Brin.


Nevertheless, the price most people pay is some factor of the list price and that can be very steep.

For example, here is the price page for the Harvard MBA: http://www.hbs.edu/mba/financial-aid/Pages/cost-summary.aspx

Harvard do not give more than 50% scholarships.

So we are talking about 54K over a 2-year period, just for fees, excluding accommodation, food, and health insurance.

That is enough to keep all kinds of people out. And this is for MBA, a course that has a relatively high probability of leading to a job. The effect will be much more acute for Art History.


The MBA is a profit center. To see what it costs for poor kids, just look at the undergrad scholarship policy. Odds are, they will get a free ride if they have great grades.


Harvard does not have a professional school of art history. Art history is a leisure person's degree anyway, and has no relevance to personal economics. Spaceship rides are expensive too, and similarly irrelevant to personal advancement.


There are 2 problems:

A college education exists to teach you how to learn. A programmer who took 30 years to acquire skills that took someone else 4 years could be wholly inappropriate, especially if you want problems solved in a reasonable time. My professors proudly stated that they had forgotten more than they had learned. A college education carries a cognitive component that must also be addressed.

A successful college degree requires persistence, organization, and emotional stability. A critical deficit in these attributes may make somebody incompatible with the modern work environment. It is not hard to imagine someone who bails early or is not sufficiently organized to assess the problem.

tl;dr Something other than technical skills happened during those 4 years



