
Skills aren't taught at university; we develop them to cope with university - jeffdechambeau
http://jeffdechambeau.com/skool-sux.html
======
blahedo
> _Why not make the problems harder and let students use every possible tool
> or resource to solve them?_

Some of us do exactly that. Many of the exams I sat in grad school (Brown)
were that way, and nearly all of the exams I have set are open-book, open-
notes, open-internet, and take-home. I've always felt that's a better
reflection of what I actually want to be testing anyway.

I'm not the only professor who does this, but the cost is that such exams
generally take a lot longer to grade, so it's impractical at scale.

~~~
yummyfajitas
In my experience, a lot of students strongly prefer memorizing to thinking,
and will penalize you on student evaluations if you force them to.

My evaluations went up a full point when I switched from writing my own open-
note/open-book exam to using the department's standard "memorize all the
things and do mechanical transformations" exam. Students also complained a lot
less.

~~~
sliverstorm
I had a professor who offered a novel scheme that left me pretty happy with
take-home exams. Students are usually turned off because a good take-home is
really, really hard. What this professor did was give you the exam, mark the
incorrect problems, and then, if the class average was poor, give you a second
copy. You would turn the second copy in a few days later, and you would get
some fraction of the credit for any improvement in your score. (Say you got a
60 the first time and an 80 the second time; you might end up with a 65-70.)
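
A sketch of the scoring in code, assuming a fixed recovery fraction k (my
reading of the scheme, not the professor's stated formula):

    # hypothetical regrade: partial credit for any improvement
    def final_score(first, second, k=0.25):
        return first + k * max(0, second - first)

    final_score(60, 80, k=0.25)  # 65.0
    final_score(60, 80, k=0.50)  # 70.0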

The result was that, after being challenged by difficult problems that require
your complete focus, you are given some feedback on what you got wrong, at
which point you go back to the book to work out what the real answer is. That
revision step is key; the promise of some recovery points is mostly just
motivation to take it seriously.

I felt like those exams were huge learning experiences. The caveats: it's
more work for the professor; this was an upper-division course, so students
were serious and the class size was small; and the best problems are hard and
open-ended.

------
epenn
In reference to memorizing material the OP writes: _In the “real world,”
having a copy of your notes is called being prepared. Instead, university
exams expect us to tie one hand behind our backs and master a skill we’ll
seldom if ever use again._

The truth is in the middle of the dichotomy you set up. Yes, having notes,
reference materials, etc., in the real world is being prepared. BUT even in
the real world, there's a reasonable expectation that you'll store and
retrieve as much of that information as possible in and from your own memory.
For example, I'm pretty adept with Python. I occasionally need to look at
the standard library reference, yet I try to commit as much as I can to
memory. If I didn't, I would spend the majority of my time searching instead
of doing. It goes without saying that a person who knows something off the top
of his or her head is more efficient than someone who doesn't. You're correct
that in the real world you'll rarely be in a situation where you don't have
access to a reference of some kind. However, to say at the same time that
memorization is a skill you'll seldom if ever use again is incorrect at best
and reckless at worst.
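
A concrete (made-up) example of that efficiency: if you remember off the top
of your head that the standard library has bisect, keeping a list sorted is a
one-liner; if you don't, you spend the time searching instead of doing.

    import bisect

    scores = [12, 30, 47, 80]
    bisect.insort(scores, 55)  # recalled from memory, not searched for
    print(scores)              # [12, 30, 47, 55, 80]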

~~~
nu23
The counterpoint here would be that memorization happens more naturally
through practice, as frequently used knowledge is automatically memorized.

I remember school courses in history and chemistry where memorization
requirements were the largest component of the course credit. What I suspect
is the real reason: measuring student understanding and engagement is a hard
problem, and memorization is an easily measured proxy for both. The problem is
that this proxy is easily gamed; learning decays into the gaming process, and
students promptly forget the material after the exams. In my experience,
though, universities mostly do allow notes in exams.

~~~
gxs
Precisely. My math courses never tested my memorization outright.

But if pouring 40-50 hours a week into a class for a couple of months didn't
result in memorization of the important stuff as a byproduct, you probably
weren't going to do well anyway.

I'd like to be optimistic and say that this is how memorization began to be
tested in schools: instructors noticed that the best students seemed to have
things memorized, so they started testing for it, as it's an easy thing to
test.

For some reason, the analogy of a doctor treating symptoms rather than the
cause of the illness comes to mind.

~~~
_delirium
I had a statistical machine learning course whose exam was mostly factual
questions, closed-notes, and oddly enough I think it was reasonably relevant,
despite the fact that I usually dislike pure memorization. It didn't ask for
specific formulas, but rather for concepts and terminology, and how they'd be
applied. It's not that these are specific things you should memorize, but that
it's at least a necessary condition: if you can't, without notes, say what an
expectation is, what a loss function is, what nonparametric regression is,
etc., and when you might use some of these things, then you probably didn't
pay attention in class or work any of the problem sets, because after a
semester of actually doing the course you should definitely know all that
without even really thinking.
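
A minimal sketch of those three terms in Python (my own illustration, not
anything from that exam):

    import numpy as np

    # Expectation: the probability-weighted average of a random variable.
    values = np.array([1.0, 2.0, 3.0])
    probs = np.array([0.5, 0.3, 0.2])
    expectation = np.sum(values * probs)  # E[X] = sum_i x_i * p(x_i) = 1.7

    # Loss function: a penalty for the gap between prediction and truth.
    def squared_loss(y_true, y_pred):
        return np.mean((y_true - y_pred) ** 2)

    # Nonparametric regression: no fixed functional form; this
    # Nadaraya-Watson smoother predicts by locally weighted averaging.
    def kernel_regress(x_train, y_train, x_query, bandwidth=0.5):
        w = np.exp(-((x_query - x_train) ** 2) / (2 * bandwidth ** 2))
        return np.sum(w * y_train) / np.sum(w)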

So even if an A doesn't guarantee you actually know statistics, it's at least,
imo, justifiable to say that a low grade means you definitely don't know
statistics. You can always argue that you'd look things up if it was open
book, but past some point if you don't know _any_ of the material or even the
basic terminology of the field, saying you could look it up amounts to saying
that you could learn statistics from scratch if you needed to. It's sort of a
test of, "can you hold a reasonably intelligent conversation on the topic
without constantly checking Wikipedia on your smartphone for basic
definitions".

(That kind of exam is probably also particularly suited to statistical ML
because _not_ knowing those things is the most common kind of real-world
mistake... the details of an algorithm you can always get from an R package or
Weka, but not knowing how to analyze a problem or what the main issues even
are can't be solved by open-source code.)

~~~
TeMPOraL
> It's not that these are specific things you should memorize, but that it's
> at least a necessary condition: if you can't, without notes, say what an
> expectation is, what a loss function is, what nonparametric regression is,
> etc., and when you might use some of these things, then you probably didn't
> pay attention in class or work any of the problem sets, because after a
> semester of actually doing the course you should definitely know all that
> without even really thinking.

Terminology is easy to remember once you understand the concept, and the
things you mentioned are things you do _not_ memorize; those you have to
understand. You can memorize a formula, and you can memorize a list of
applications of a given concept, but both are worthless if you don't
understand, on a gut level, what the concept is and thus where to apply it.

------
gomphus
> _Why not make the problems harder and let students use every possible tool
> or resource to solve them?_

In science, we call this a thesis or research project. I don't see the need
for all exams to take the same format (although some do so successfully), as
closed-book exams test something quite different - the depth and breadth of
your internal, longer-term comprehension.

> _An “education,” whether for its own value or to help you get a job, is–at
> least to me–about developing the skills to find the information you need,
> assess its value, integrate it into the context at hand, and make a better
> decision than you otherwise could have._

An education –at least to me– is about building up an inner edifice of
knowledge, so you can work fast, and formulate original and hopefully
brilliant ideas and insights, with the skills the author mentions being
accessory to this (and something that should really be in place by high
school). The author writes as if knowledge is something to be retained as
fleetingly as possible, to make room for whatever the next task is. But
information you have committed to long-term memory can cross-pollinate, become
a greater structure, open up new horizons. Information that you merely load
and discard cannot – at least not in the same way.

> _In the “real world,” having a copy of your notes is called being prepared._

In my world, being (professionally) prepared means that you have authoritative
mastery of a subject. Of course you often refer to notes, and have the skill
to quickly, and perhaps temporarily, assess and assimilate new concepts. It
does not follow that holding the detail of our degree subjects at arm's length
is a virtue, or that having to rely on our own memories in examinations is
somehow "bad education". Yes, the open-book exam format has its place, but so
does the traditional one.

If you want a better education, try regarding your knowledge as something to
be made more enduring, not more ephemeral.

------
MarkTraceur
I go to the University of Redlands.

In the past two years, they decided to remove my major (Computer Science)
completely from the curriculum for new students (so current sophomores are
unable to join the department except to minor), they cut tens of professors
from the faculty, and they cut down on several other costs.

In the same time period, they spent tens of thousands of dollars redesigning
the main website and several thousand more redesigning the internal student-
facing website (based on Blackboard and Moodle in a bizarre zombie formation),
and they upgraded several nonfree services they provide (Outlook Web Access,
Blackboard, Datatel). They also bought and paid for the construction of a park
just south of campus, which cannot feasibly be used by students but serves
essentially to advertise the university. They also purchased additional radio
ads, several front-page ads in the LA Times, and well-placed billboard
advertisements.

Higher education in America has made its priorities clear, I think.

------
johncoogan
I couldn't agree with this more: the education system seems to obsess over
structure. Concepts get clumped together into classes, which are clumped again
into majors. Prerequisites seem to cripple broad intellectual development and
force specialization.

I do see this changing, though, with increasing weight being given to public
work shared on a blog or GitHub versus academic transcripts. Hopefully
institutional education can evolve to help individuals learn how to search,
triage, and analyze in a given field.

------
RandallBrown
I had lots of open book/note exams. They were also much harder. I even had an
exam where the professor allowed us to use our laptops and the Internet. You
can imagine how much harder that exam was.

There were very few exams where I wasn't allowed at least a limited amount of
notes (usually a full notecard or a sheet of paper). The few that allowed none
were in classes like intro psychology, where most of the class is just
memorizing things anyway.

------
nekojima
I'd have hoped branding was a skill taught at "Western", and that their top-
notch Business School could have helped them avoid this disaster.

Given that I am in the same province as this university, I actually had to
look up which school he was talking about, because I wasn't sure. Everyone in
Ontario, and most of Canada, already calls this school Western, rather than
its (now former, it seems) official name, The University of Western Ontario.
Calling it Western University misses the point of what the Western brand is.
Party school or not, that isn't among the first few things I think of as an
adult when it comes to Western, as opposed to when I was an undergrad driving
to their campus for a weekend party.

While the naming issue is off-topic (my apologies for that), I did find this
helpful/humorous article on the re-branding effort.

http://oncampus.macleans.ca/education/2012/01/26/thats-western-university-to-you/comment-page-1/

------
Yhippa
I agree with his sentiment for the most part. The reason I imagine hiring
managers prefer those with university degrees is that, without knowing
anything else, they have some idea that such a person has the coping
mechanisms to deal with the rigors of university. This is of course not
perfect, but when you're looking at a stack of resumes I imagine it has to be
a differentiator.

FTA: "Why not make the problems harder and let students use every possible
tool or resource to solve them? Even students singularly focused on learning
for its own value would get so much more out of the experience."

From my own experience, my open-book exams in Physics were ridiculously
difficult. We would use all sorts of tricks to cram as many equations as
possible onto the double-sided sheet of paper we were allotted. I feel that if
you agree to an open-book exam, the difficulty can (and should) increase
dramatically. Be careful what you wish for ;-)

~~~
jeffdechambeau
Real life is pretty hard too. We let so many people into university that
difficulty almost necessarily has to drop in order to keep enrollment numbers
up.

Did your school push you nearly hard enough?

~~~
Yhippa
Of course real life is difficult. My point was regarding open-book exams.

------
jacques_chester
For those referring to open books, one of my law professors explained it to me
this way.

It's a trap for the unprepared. Poor students think that they can rely on the
books in an exam, but there's just not enough time to read, parse and apply
the source material. Either you _learned_ ahead of time or you didn't.
Students who learned the material use it as a reference, not a source.

It also, he said, allows him to give "proper exams".

I quit studying law, but it was still an eye-opener.

------
leot
(Relatively social) people also learn from the examples of others, which is
why it can be so helpful to go to a school filled with smart high-achievers.

------
AznHisoka
"Don't let your schooling interfere with your education"

------
j45
It's not what your degree makes of you, it's what you make of your degree.

University is as much about learning coping skills as it is about learning to
get things done under the silliest of circumstances. Strangely, this has a
bearing on real life once you enter the workforce.

Likewise for learning: as soon as we stop learning, we're no longer growing,
and anything that no longer grows gets left behind.

What skills do universities teach startups? If we're lucky enough to have
instructors who can spark our minds, anything.

------
icegreentea
Mastery (or, taking it a few notches down, competence) has both a memorization
component and an application component. You can't get around that. The problem
is that people assume the memorization component must be learned by
memorizing, and the application component by applying. Rather, the
memorization component comes from applying, and the application flows from
memorization.

Your ability to apply your knowledge to the world hinges on your ability to
remember what your knowledge is. Even now, when you just 'google it', that
relies on your memory of broad concepts and ideas. Recognition IS memory (a
specific kind of memory, anyway).

So yeah, on one hand, over-the-top tests of memorization are ridiculous, since
they just result in everyone cramming and then forgetting three hours after
the final, so that when the next course rolls around and builds on that
knowledge, you lose three weeks doing review. On the other hand, you can't get
away from memorization, or from testing your memory, in school.

