Skills aren't taught at university; we develop them to cope with university (jeffdechambeau.com)
87 points by jeffdechambeau on Jan 26, 2012 | 41 comments



> Why not make the problems harder and let students use every possible tool or resource to solve them?

Some of us do exactly that. Many of the exams I sat in grad school (Brown) were that way, and nearly all of the exams I have set are open-book, open-notes, open-internet, and take-home. I've always felt that's a better reflection of what I actually want to be testing anyway.

I'm not the only professor that does this, but the cost is that exams take a lot longer to grade, generally. So it's impractical at scale.


In my experience, a lot of students strongly prefer memorizing to thinking, and will penalize you on student evaluations if you force them to.

My evaluations went up a full point when I switched from writing my own open note/open book exam to using the department's standard "memorize all the things and do mechanical transformations" exam. Students complained a lot less also.


I had a professor who offered a novel scheme that left me pretty happy with take home exams. Students are usually turned off because a good take-home is really, really hard. What this professor did was give you the exam, mark incorrect problems, and then if the class average was poor, give you a second copy. You would turn the second copy in a few days later, and you would get some fraction of the credit for any improvement in your score. (Say you got a 60 the first time and an 80 the second time, you might end with a 65-70)
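A minimal sketch of how that regrade arithmetic might be computed, assuming the professor awards a fixed fraction of the improvement; the fraction itself is not stated in the comment and is purely illustrative (it only implies a final of 65-70 for a 60-to-80 improvement):

    # Hypothetical sketch of the regrade scheme described above.
    # The credit_fraction value is an assumption, chosen to match
    # the 65-70 range the commenter gives for a 60 -> 80 improvement.
    def final_score(first, second, credit_fraction=0.25):
        improvement = max(0, second - first)
        return first + credit_fraction * improvement

    print(final_score(60, 80, 0.25))  # 65.0
    print(final_score(60, 80, 0.50))  # 70.0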

The result was that, after being challenged by difficult problems that require your complete focus, you are given some feedback on what you got wrong, at which point you go back to the book to work out what the real answer is. That revision step is key; the promise of some recovery points is mostly just motivation to take it seriously.

I felt like those exams were huge learning experiences. The caveats are: it is more work for the professor; this was an upper division course, so students were serious and the class size was small; the best problems are hard and open ended.


Mechanical exams are compatible with cramming. That's why students like them; that's how they've studied for years.

Eventually it stops working.


It's not just that they're easier to grade; closed-book exams actually have a useful purpose IMO. In some classes there is a core set of ideas that is extremely important to understand and a much larger set of application concepts. Because tests require memorization, it is much easier to focus only on the core ideas, whereas homework or projects that did this would be frowned upon for being "too easy."

I've had this especially happen with finals, like my Differential Equations final, where I managed to get all the concepts of the entire semester onto a single page when studying. I had a lightbulb moment where I realized the important parts of the class were really simple; it's just that all the twists and challenges of application made it seem more complicated.

Similarly, I got mad at a grad teacher recently when he would not make our last lecture a review because, according to him, we shouldn't need "hand-holding." It ticked me off because a day to summarize what we've learned over several months seems like a smart idea to me. I guess I feel like tests can be similar: a way to filter out the things that are taught that are good to learn but not as important as others. Kind of like when a teacher says, "If you leave this class and only remember a single thing, I want it to be this."

On the other hand, I see tests not working for some people. I have a close friend who has memory problems. She is so frustrated by college because, due to her memory, the way she applies the material is completely different from the way she is tested.


    > Similarly I got mad at a grad teacher recently when he would not make our last lecture a review
Huh. A graduate course that ended with a review would feel really weird to me. I've never seen one or heard of one, and when taking a course I never asked for a review or saw a classmate ask for one. (My direct experience is with graduate courses in mathematics at three universities: one in Mexico, one in Canada, and one in the US.)

Edit: I just realized I'm not sure if the phrase "grad teacher" means a teacher of a graduate course; if not, disregard my comment.


> A graduate course that ended with a review would feel really weird to me.

Also, I don't think most of the courses can be summarized in a review lecture.


Yeah, I meant teacher of a grad course, and I should probably give a better explanation: it wasn't so much about a class not having a review as about the teacher's attitude. Also, it was a computer science class; being both a comp-sci and math major, I know reviews are far more useful in computer science, especially in this particular class, a distributed systems course that was a general requirement, covered tons of topics (the book was closer to a dictionary than a textbook), and where no one really understood what was taught.

The thing is, any speaker will tell you that if you want to teach someone something, you should tell it to them three times: intro, actual discussion, and conclusion. So I view a last-day study session as the conclusion part, making it a useful teaching tool. I totally understand that graduate students should be able to learn on their own and not need something like that, but I still think teachers should keep an open mind and try to teach their students as well as they can.

However, our teacher told us several times, in kind of an arrogant manner, that he wouldn't have the review because otherwise he would be "holding our hands." He didn't have the attitude that it's a tool, and he didn't just ignore it; instead he looked down on it, as if wanting a review meant you were stupid. This guy constantly used our being grad students as an excuse not to teach us; he would expect us to know entire classes that were never required, never taught, and not prerequisites, simply because we were grad students. We actually had a project worth half the grade that centered on a major networking assignment when the entire class had no networking experience. The whole class was miserable, we learned nearly nothing, and then he curved a ton so we would pass and he wouldn't look so bad. I just hate the attitude that a teacher's job isn't to teach but instead to read the book to the class. I encountered that with all my required classes in grad school, while the electives were amazing, with teachers who were extremely challenging but at the same time actually taught us enough to deal with the challenges.


To a certain extent, I agree with you. But to play the devil's advocate, I must say that an open-whatever test carries an inherent danger of creating inequality in access to information.

For example, if you were to have an open-internet test in 2003, people who knew about Wikipedia would have a significant advantage over people who didn't. Or if the student is poor, he or she might not have access to fancy electronics or even a laptop, which would make the open-internet test difficult. Even if it was an open-book test, what if the student were poor enough to be sharing a book with a friend? And there is also an element of luck when it comes to open-notes tests. If the teacher only allows a single page of notes, what are the chances that a question is on a subject you omitted from your notes?

I've only listed tech-savviness, income, and luck as potential ways to make the test unfair, but I'm sure there are plenty more if you include any element beyond the study material itself other than memory. I agree that in real life we don't have to memorize everything (as a matter of fact, I'm all for critical thinking over memorization in schools), but there are certainly some benefits to the traditional test-taking methods.


Some exams at my university do allow you to bring all the notes and books you wish (no internet though). Although that sounds like a great deal, I don't really like those exams.

Normally, if professors let you bring all your notes, they increase the complexity of the tasks and maybe even cut the time. So unless you wrote yourself a good summary or an index to your notes, you will spend too much time looking through your stuff and won't be able to complete the test.

If, however, you wrote a good summary, chances are that you won't actually need it, since that's pretty much the best way to learn. But the exam is still harder than it would have been without books, so you are worse off.


> that's pretty much the best way to learn. But the exam is still harder than it would have been without books, so you are worse off.

So... they're the best way to learn, but you're worse off for getting them? Think about this argument a little more critically---I think what you're trying to say is that such exams are harder (which is true), but what is the purpose of the exam, and of the course itself? Surely an exam structure that is more effective at causing you to learn the material is better for you, not worse, not despite but because of the fact that it makes you synthesise a better understanding of the material.


Making them just open-book, open-notes is an easy compromise that scales nicely.


I agree, but only for certain subjects. For example, I was able to use a note card with equations for my physics courses. Memorizing equations is absolutely pointless; it doesn't test your comprehension of physics, nor does it hint at your level of intelligence. Give someone a notecard for a history exam and any sort of assessment of subject understanding is eliminated. Some courses are memorization-based and there's no denying it. If your thought process for taking these types of exams is somehow comparing things in your mind, great, but you're still recalling everything from your memory. It's not like there's an equation you can plug in to know how WWII ended; it's just something you know and memorize.


I think you've been taking the wrong history classes. The interesting stuff is the "why" and making connections between different events; and if you're tested on that sort of thing, a notecard with names-and-dates is about as helpful as a notecard with equations in a physics exam.


For an advanced history class, where students are expected to be able to make informed opinions about complex issues, this will work (and I suspect most advanced history classes don't focus much on memorization, or even tests for that matter).

But for general education history courses, the notion of "why" is just as much memorization as a list of dates. What if you disagree about the "why" that the teacher/textbook claimed? You'll lose points.

Besides, for all the history courses I've taken (just gen ed ones), tests weren't a big part. There were usually short quizzes on sections of the text book, which were testing for reading comprehension more than memorization. Essays and research papers were most of the grade.


Once again, that's YOUR thought process. It seems that you happen to link events together to remember an outcome. By simply knowing a few pieces of information, it's not likely most will "solve" the outcome. This has nothing to do with writing a well-crafted, opinion-based essay such as "I believe person x did this for these reasons..." 99% of the time in history classes you know the outcome; it's not your job to solve it.


What subjects in academia does this truly not apply to? If you can use a book to get the answers on the test, you can use a book to get the answers when you need them "in real life." The only exceptions I can think of are obvious trades like police or doctors.

You gave history as an example. I must ask then: what is the point of a history education? If you just like history and want to learn more about it, you'll try to learn the "real" stuff (beyond memorization) on your own. If you want to be a history teacher (especially for grade school), sure, you can probably consider the dumb memorization useful. If you actually want to work in the field (i.e. do research), the memorization won't be useful.


If a college-level history course is asking students to memorize and regurgitate simple historical facts and nothing else, they're doing it wrong. There are many ways to create history exams that require not only facts but also critical thinking.


Most of the undergrad CS classes I took at Berkeley were somewhat open. In the lower-div classes, you would usually get unlimited notes, and the upper-div classes would usually give you only a few pages (usually 2 to 4). It seemed to be a good compromise between scalability and usefulness.


In reference to memorizing material, the OP writes: In the “real world,” having a copy of your notes is called being prepared. Instead, university exams expect us to tie one hand behind our backs and master a skill we’ll seldom if ever use again.

The truth is in the middle of the dichotomy you set up. Yes, having notes, reference materials, etc. in the real world is being prepared. BUT, even in the real world, there's a reasonable expectation that you'll store and retrieve as much of that information as possible in and from your own memory. For example, I'm pretty adept with Python. I occasionally need to look at the standard library reference; yet I try to commit as much as I can to memory. If I didn't, I would spend the majority of my time searching instead of doing. It goes without saying that a person who knows something off the top of his or her head is more efficient than someone who doesn't. You're correct that in the real world you'll rarely be in a situation where you don't have access to a reference of some kind. However, to say at the same time that memorization is a skill you'll seldom if ever use again is incorrect at best and reckless at worst.


The counterpoint here would be that memorization happens more naturally when practicing, as frequently used knowledge is automatically memorized.

I remember school courses in history and chemistry where memorization requirements were the largest component of the course credit. What I suspect is the real reason: measuring student understanding and engagement is a hard problem, and memorization is an easily measured proxy for both. The problem is that it is easily gamed; learning decays into this gaming process, and then students promptly forget the material after the exams. In my experience, though, universities mostly do allow notes in exams.


Precisely. My math courses never tested my memorization outright.

But if pouring 40-50 hours a week into a class for a couple of months didn't result in memorization of the important stuff as a byproduct, you probably weren't going to do well anyway.

I'd like to be optimistic and say that this is how memorization began to be tested in schools. Instructors noticed that the best students seemed to have things memorized, so they started testing this, as it's an easy thing to test.

For some reason, the analogy of a doctor treating symptoms rather than the cause of the illness comes to mind.


I had a statistical machine learning course whose exam was mostly factual questions, closed-notes, and oddly enough I think it was reasonably relevant, despite the fact that I usually dislike pure memorization. It didn't ask for specific formulas, but more like concepts and terminology, and how they'd be applied. It's not that these are specific things you should memorize, but that it's at least a necessary condition: if you can't, without notes, say what an expectation is, what a loss function is, what nonparametric regression is, etc., and when you might use some of these things, then you probably didn't pay attention in class or work any of the problem sets, because after a semester of actually doing the course you should definitely know all that without even really thinking.

So even if an A doesn't guarantee you actually know statistics, it's at least, imo, justifiable to say that a low grade means you definitely don't know statistics. You can always argue that you'd look things up if it was open book, but past some point if you don't know any of the material or even the basic terminology of the field, saying you could look it up amounts to saying that you could learn statistics from scratch if you needed to. It's sort of a test of, "can you hold a reasonably intelligent conversation on the topic without constantly checking Wikipedia on your smartphone for basic definitions".

(That kind of exam is probably also particularly suited to statistical ML because not knowing those things is the most common kind of real-world mistake... the details of an algorithm you can always get from an R package or Weka, but not knowing how to analyze a problem or what the main issues even are can't be solved by open-source code.)
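As a concrete anchor for the kind of basic definitions named above, here is a minimal, hypothetical Python sketch of a loss function and an expectation estimated as a sample mean; the names and data are made up purely for illustration and are not from the course being described:

    # Illustrative only: a squared-error loss and a Monte Carlo
    # estimate of an expectation (a sample mean).
    import random

    def squared_error_loss(y_true, y_pred):
        # A loss function: penalizes a prediction by its squared
        # distance from the true value.
        return (y_true - y_pred) ** 2

    # The expectation of a random quantity is its long-run average;
    # here it is estimated by averaging many draws from N(0, 1).
    samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
    estimated_mean = sum(samples) / len(samples)  # close to 0.0
    print(squared_error_loss(1.0, 0.5), estimated_mean)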


> It's not that these are specific things you should memorize, but that it's at least a necessary condition: if you can't, without notes, say what an expectation is, what a loss function is, what nonparametric regression is, etc., and when you might use some of these things, then you probably didn't pay attention in class or work any of the problem sets, because after a semester of actually doing the course you should definitely know all that without even really thinking.

Terminology is easy to remember once you understand the concept, and the things you mentioned are things that you do not memorize. Those things you have to understand. You can memorize a formula, and you can memorize a list of applications of a given concept, but both of them are worthless if you don't understand, on a gut level, what the concept is and thus where to apply it.


That's a good theory.

I often hear this kind of complaint about algorithmic tests in interviews, that you have to "memorize" these algorithms, but I've never understood that position: if you program constantly, these kinds of algorithms really seem easy to do, and you don't need to memorize them.


> Why not make the problems harder and let students use every possible tool or resource to solve them?

In science, we call this a thesis or research project. I don't see the need for all exams to take the same format (although some do so successfully), as closed-book exams test something quite different - the depth and breadth of your internal, longer-term comprehension.

> An “education,” whether for its own value or to help you get a job, is–at least to me–about developing the skills to find the information you need, assess its value, integrate it into the context at hand, and make a better decision than you otherwise could have.

An education, at least to me, is about building up an inner edifice of knowledge, so you can work fast and formulate original and hopefully brilliant ideas and insights, with the skills the author mentions being accessory to this (and something that should really be in place by high school). The author writes as if knowledge is something to be retained as fleetingly as possible, to make room for whatever the next task is. But information you have committed to long-term memory can cross-pollinate, become a greater structure, open up new horizons. Information that you merely load and discard cannot, at least not in the same way.

> In the “real world,” having a copy of your notes is called being prepared.

In my world, being (professionally) prepared means that you have authoritative mastery of a subject. Of course you often refer to notes, and have the skill to quickly and perhaps temporarily assess and assimilate new concepts. It does not follow that holding the detail of our degree subjects at arm's length is a virtue, or that having to rely on our own memories in examinations is somehow "bad education". Yes, the open-book exam format has its place, but so does the traditional one.

If you want a better education, try regarding your knowledge as something to be made more enduring, not more ephemeral.


I go to the University of Redlands.

In the past two years, they decided to remove my major (Computer Science) completely from the curriculum for new students (so current sophomores are unable to join the department except to minor), they cut tens of professors from the faculty, and they cut down on several other costs.

In the same time period, they spent tens of thousands of dollars redesigning the main website, several thousand more redesigning the internal student-facing website (based on Blackboard and Moodle in a bizarre zombie formation), and upgraded several nonfree services they provide (Outlook Web Access, Blackboard, Datatel). They also bought and paid for the construction of a park just south of campus, which cannot feasibly be used by students but serves essentially to advertise the university. They also purchased additional radio ads, several front-page ads in the LA Times, and well-placed billboard advertisements.

Higher education in America has made its priorities clear, I think.


I couldn't agree with this more; the education system seems to obsess over structure. Concepts get clumped together into classes, which are clumped again into majors. Prerequisites seem to cripple broad intellectual development and force specialization.

I do see the importance of this changing, with increasing weight being given to public work shared over a blog or Github vs. academic transcripts. Hopefully institutional education can evolve to help individuals learn how to search, triage and analyze in a given field.


I had lots of open book/note exams. They were also much harder. I even had an exam where the professor allowed us to use our laptops and the Internet. You can imagine how much harder that exam was.

There were very few exams where I wasn't allowed a limited amount of notes (usually a full notecard or sheet of paper). The ones that didn't allow notes were in classes like intro psychology, where most of the class is just memorizing things anyway.


I'd have hoped branding was a skill taught at "Western," and that their top-notch Business School could have helped them avoid this disaster.

Given that I am in the same province as this university, I actually had to look up which school he was talking about, because I wasn't sure. Everyone in Ontario, and most of Canada, already calls this school Western, rather than its (now former, it seems) official name, The University of Western Ontario. Calling it Western University misses the point of what the Western brand is. Party school or not, that isn't, as an adult, among the first few things I think of when it comes to Western, as opposed to when I was an undergrad driving to their campus for a weekend party.

While the naming issue is off-topic, and my apologies for that, I did find this helpful/humorous article on the rebranding effort.

http://oncampus.macleans.ca/education/2012/01/26/thats-weste...


I agree with his sentiment for the most part. The reason hiring managers prefer those with university degrees, I imagine, is that, without knowing anything else about a candidate, they have some idea that such a person has the coping mechanisms to deal with the rigors of university. This is of course not perfect, but when you're looking at a stack of resumes, I imagine this has to be a differentiator.

FTA: "Why not make the problems harder and let students use every possible tool or resource to solve them? Even students singularly focused on learning for its own value would get so much more out of the experience."

From my own experience, my open-book exams in Physics were ridiculously difficult. We would use all sorts of tricks to cram as many equations as we could onto the double-sided piece of paper we were allotted. I feel that if you agree to an open-book exam, the difficulty of the exam can (and should) increase dramatically. Be careful what you wish for ;-)


Open-book exams should be ridiculously difficult, as they are intended to test understanding rather than rote memorization. Anyone can memorize a formula; figuring out when and how to use it is what the class should be teaching.

Similarly, in a programming class, the exam should not be testing whether or not you've memorized the Java boilerplate code. Sure, if you use it enough you will memorize it, but that's not what makes a good programmer. Either provide the boilerplate or let the students look it up; the focus should be on solving the problem.


> I feel that if you agree to an open-book exam, the difficulty of the exam can (and should) increase dramatically. Be careful what you wish for ;-)

Most exams at Stanford CS are open book/open notes. I much preferred this to my undergrad experience of having to memorize stuff.


> my open-book exams in Physics were ridiculously difficult. We would use all sorts of tricks to cram as many equations as we could onto the double-sided piece of paper we were allotted.

Doesn't sound like it was an open book exam if you were just allowed a cheat sheet.


Real life is pretty hard too. We let so many people into university that difficulty almost necessarily has to drop in order to keep enrollment numbers up.

Did your school push you nearly hard enough?


Of course real life is difficult. My point was regarding open-book exams.


For those referring to open-book exams, one of my law professors explained it to me this way.

It's a trap for the unprepared. Poor students think that they can rely on the books in an exam, but there's just not enough time to read, parse and apply the source material. Either you learned ahead of time or you didn't. Students who learned the material use it as a reference, not a source.

It also, he said, allows him to give "proper exams".

I quit studying law, but it was still an eye-opener.


(Relatively social) people also learn from the examples of others, which is why it can be so helpful to go to a school filled with smart high-achievers.


"Don't let your schooling interfere with your education"


It's not what your degree makes of you, it's what you make of your degree.

University is as much about learning coping skills as it is about learning to get things done under the silliest circumstances. Strangely, this has a bearing on real life once you enter the workforce.

Likewise for learning: as soon as we stop learning, we're no longer growing. Anything that no longer grows gets left behind.

What skills do universities teach startups? If we're lucky enough to have instructors who can spark our minds, anything.


Mastery (or, taking it a few notches down, competence) has both a memorization and an application component. Can't get around that. The problem is that people assume that the memorization component must be learned by memorizing, and the application component by applying. Rather, the memorization component comes from applying, and the application flows from memorization.

Your ability to apply your knowledge to the world hinges on your ability to remember what your knowledge is. Even now, when you just 'google it', that relies on your memory of broad concepts and ideas. Recognition IS memory (a specific kind of memory, anyway).

So yeah, on one hand, over-the-top tests of memorization are ridiculous, since they just result in everyone cramming and then forgetting 3 hours after the final, and then, when the next course building on that knowledge rolls around, you lose 3 weeks doing review. On the other hand, you can't get away from memorization, or from testing your memory, in school.




