Hacker News
The Practice is not the Performance: Why project-based learning fails (pathsensitive.com)
241 points by nancyhua on Feb 24, 2018 | 128 comments



This reads like a very thinly veiled ad for the author's $500 Software Design web class.

Project-based learning won't work, traditional learning won't work, only his class will work. In fact, the source he quotes (himself) claims he can give you a year's worth of experience in an hour.

But why does project-based learning fail? Well, in his example Bob, who wants to learn network programming, pairs with Alice, who somehow mysteriously takes over the entire project and does all the networking code, so Bob learns nothing. That's sort of like arguing that you can't learn guitar from a teacher because a velociraptor might stroll by and eat the teacher.

I don't know what type of learning is best, but I know you need a little more evidence than that for a post titled "Why project-based learning fails".


Agree with this. The theoretical example seems tilted to advance the author's viewpoint. Would love some data on project-based learning.


The Eight-Year Study of progressive education, from nearly a century ago, is probably the oldest result showing project-based learning works well. Micki Chi's ICAP is the current one. You see the same result repeated with a new methodology every couple of decades in education research.

Neither shows project learning is the only thing which works well; just that it works better than the model the author proposes or than traditional classrooms.

The key fallacy the author makes is equating project-based learning with free-form project work, which indeed doesn't work. Say I want you to learn machine learning: I need a carefully designed and sequenced series of projects which exercise all the skills along the way. That's not an uncommon fallacy; many progressive schools made the same mistake and basically failed.

The random coding project model, in isolation, as the author described, falls flat on its face just as the author describes. It works pretty well for simple/broad things (e.g. learn an API), but for anything with depth, expert sequencing of knowledge and design of projects/assessments becomes important.


> but one student reports that she learned much more from traditional classes.

You can't argue with that. She got more from a 4-year computer science education than the 6-week bootcamp! QED

But seriously, as another commenter mentioned, these maximalist headlines don't really help us get to a better education system. Deliberate practice is very important, but that doesn't mean project-based learning "fails".

I'm tutoring a student 1-on-1 at the moment, and my biggest concern as a teacher is always motivation. The delight that comes from building your own little real-world website/app (a simple cookie clicker clone in her case) after only a few weeks of programming is so useful in getting students to actually appreciate the value of what they're learning, and then it just snowballs.

Some good points raised in the article, but perhaps a little click-baity and one-sided.


Agree with you absolutely. In a broad field like CS, one can't sustain interest for long unless there is some practical usefulness.


Yes, that's what I thought. I learnt as much from the practical and project-based parts of my CCNA course - even more so when I came in on a day off and spent a day investigating some edge cases where the course material was incorrect.


> This reads like a very thinly veiled ad for the author's $500 Software Design web class.

My takeaway as well.

> In fact the source he quotes (himself) claims he can give you a year's worth of experience in an hour.

Did he invent a form of mind-meld?


I was imagining some sort of CS shark in its prime with tons of experience while reading the article. PhD candidate? Hm...


I attended a university with a strong co-op program, and I personally believe it is the best possible solution. Classes were focused on theory and my three 6-month co-ops were spent working on meaningful software projects at real companies. As a result, I was able to build up a solid background in Computer Science theory, didn't have to deal with the shortcomings of group class projects, and learned how to ship production-worthy code under the mentorship of real, senior engineers (and got paid to boot).

Why more universities don't leverage co-op programs, especially given the exorbitant and rising cost of college tuition in the US, is beyond me.


I don't know.

I attended a regular computer science program and I am glad I did. The goal of university is not to prepare you as best as possible for working life, but to teach academic thinking.

Class projects usually have a specific purpose, like teaching computer vision, operating systems or building compilers.

You'll spend the rest of your life working. There is no need to rush into it, in my opinion.

Then again, I didn't study in the US, so I didn't have the tuition problem.


For you, university was a means to the end of learning academic thinking. For me, it was a means to the end of attaining a decent job.

There are certainly arguments to be made that university is by definition, in original intent, and in theory solely a vehicle for teaching academic thought, but I think the pragmatic counter to them is a poll of "why are you here?" [1], which should be considered valid.

The end result is that university is in practice not one or the other but both: it must teach some academic thought, and provide some job-readiness.

I wish to digress for a moment into why that is not just a failing of many students to understand school's true purpose: many people (hi) do not come from a background where lifetime economic security is a guarantee, and while programming is a rare example (at least in North America) where if you spend the time, you can probably do it, and if you can do it, you can probably be paid well for it, there are unfortunately still gatekeepers. HR departments frequently resort to very base credentialism to sort through candidate resumés, and those with schooling typically come first and get higher base offers of compensation. Not to speak of the network social effects of spending years with other entrants into the field, and the opportunities for extra-curricular pursuits which also are well regarded by those gatekeepers.

These among other things have forced many people with mere occupational goals to pursue university degrees. I wish it were not like this, because my professors universally did not feel that that was a good reason to be there, and occasionally avoided very useful practical topics on principle, but I (and many within my cohort) felt there was no better alternative for a successful outcome in life, and it has mostly worked out.

1 - I of course have not conducted this poll, but I do strongly believe that at least a plurality would respond with job-attainment as their primary goal, and would not be surprised by an even greater response.


Well then, might it be wise to realise that “why are you here” is a question that is based on perception?

A generation of financially secure parents have pushed the mindset of Degree==Job. An understandable choice, based on the idea of attaining the best start possible in life.

Whatever the reasons, whatever the perceptions, the role of the university is unchanging and should remain steadfast: a university is there to create new academics.


I feel that the perception is in fact material to the reality, because if people do go to university not to become academics, and do leave it not as academics, and do feel it has met its purpose in those cases, then clearly it is not presently there solely for that cause.

But I think we actually agree -- I would just amend your statement to be an aspiration rather than a description: "A university should be there to create new academics."

I feel saddened that in practice (in great part due to the reasons we've each outlined), it is not.


>The goal of university is not to prepare you as best as possible for working life, but to teach academic thinking.

Because academics don't work on projects? C'mon.

Except for a few things (like math), I wanted a lot of it to be taught in a project-based fashion at university. There was a lot of CS material we learned where the professors were super unclear about why it was important or when the correct time to apply the knowledge was. That not only made the teaching less useful, it seriously sapped my motivation.

Years later I was surprised by the applicability (or not) of some of the knowledge taught to us, and I think the lecturers could probably have guessed that I would run into these situations - they just didn't put much (if any) emphasis on bringing it up.

Now I'm teaching (as an amateur and not in a university, but still), I'm teaching in a project based fashion because it's how I wanted to be taught.

I do this partly because I'm trying to present every topic as a problem first in a context that seems realistic. I feel like getting students to recognize and 'feel' the problem themselves before teaching them the solution helps with their motivation and the applicability of that knowledge - wherever they decide to use it.


I had actual part time coding jobs for the majority of the time I was in undergrad. My family didn’t have much so I had to work.

Even though I was forced to work out of necessity, it turned out to be the best move. Learning how to balance theory with “we need this done in a couple of hours to try and land the next client” has been so valuable to my career (even through grad school). Engineering at every level is about trade-offs, and nothing makes that more apparent than working a job.


I'm pretty sure you don't understand how co-op programs work at universities. You actually take the exact same courses as the "academic thinking" students, you just also alternate with work in a relevant industry.

Can you explain why you think taking courses and working flipping burgers or some other summer job is better than taking those same courses and being mentored by practicing engineers, programmers, scientists, poly-sci, <professional in your chosen field>?


I am assuming that those 18 month of work experience will be at the expense of coursework.


No, you take the exact same number of courses as anyone else would. You can do a 5-year program with 3 co-ops or a 4-year program with 2. Of course, you also don't have summer breaks and there is less time off between the fall and spring semesters.

I'd also like to point out that if you argue college should be about creating academics, plenty of my friends went to my school and went on to become one. They simply used their co-ops to do research, either on campus, for another university, or for a company.


This is not the case at pretty much every co-op school I know of, including my own. Usually the programs utilize summer breaks and take 5 years, though you can often do 2 co-ops in 4 years or the like.


> The goal of university is not to prepare you as best as possible for working life, but to teach academic thinking.

This is a well-worn false dichotomy. It sets up scientific understanding of a field against job skills. Scientific understanding is important because it saves you months or years of flailing around in the dark, trying to solve problems without understanding them, or worse, producing half-baked work that could be so much better if only you had a clue. But without context, it's very hard to make knowledge stick, and projects provide that context.


How does a strong co-op program work, and why do you think it's the best? I'm from a SEA university program and I don't believe we have any of those. Would be awesome to understand more about the process.


Typically a student must maintain good academic standing and apply specifically to a co-op program to attend, and then an administrative department within the school will run a class on interviewing skills and resumé-building, before helping to coordinate a matching process between students and participating companies for work terms.

In many cases, large companies have special days where they'll come onto campuses to do interviews, and they may take a significant cohort of students each semester. There are often rules like, "you have 24 hours to respond to an offer and may only reject one offer" to ensure good faith between parties and high match percentages when some companies may be more prestigious or slow in their process.

Students then spend 3-16 months (this depends on the school's academic timing and the company's own structure -- 4 months in summer is most common) at a company and if it's a good company, they'll be given real work, strong mentorship, and deep integration into a team, and often students near graduation will get offers of employment directly at the end of their term.

There are also usually very tedious written work-term reports to fill out as a bureaucratically useful artifact of the process. They're typically expository essays of topics like "what I learned at $company", and everyone hates them.

Students will have 1-4 (usually 3) terms out in the real world over the course of their time in school, and graduate with meaningful practical skills and usually a taste of what's available to help assess opportunities.

It's on the whole a great system for a lot of people, particularly those who are using university to increase their employability.


Because education as a system isn't designed for actually learning, it's designed for perpetuating a meaningless and arbitrary game of credentialism to justify the socio-economic hierarchy.

And to babysit children so that their parents can spend more time participating in capitalism.


I guess you're down-voted for both having a non-obvious opinion and phrasing it in a hostile way.

You are correct however, but it's more understandable than you make out.

Rather, the education system is designed to impart the skill of information recall -- which was once essential to actually acquiring the target skills (i.e., you needed to remember a lot about chemistry before doing any). Remembering sentences about a skill has no relationship to being able to do it; it is only an aid while you are learning to do it.

That hasn't been true for perhaps 20 years, maybe only 10 given the prevalence of smartphones since 2007.

The whole system is set up to impart the skill of recall and to grade according to your ability to recall. There is an implicit general awareness that you're actually not acquiring the target skills.

E.g., after 5-7 years of French in the UK, no 18-year-old can speak French. And so on for every subject.

The same is true all the way up until PhD. It's our historical model of what education was for: giving you the library up-front.

The skills were to be obtained in employment, and with any self-motivated practice.

Today that is painfully ridiculous, and there's really almost no value in it. Leaving the warehousing, grading and certifying functions of educational institutions their only apparent use.

PS. There's an argument that education hasn't needed to be this way since the advent of public libraries and cheap books, which is still the late 19th C. This is plausible enough too. Our template of education is still basically a mix of medieval and Victorian, with the presumption that books are hard to obtain or difficult to survey oneself.

Note that primary/elementary schooling isn't set up this way: it is deliberately skilling. This makes it very effective and very important.


> eg., after 5-7 years of french in the UK, no 18 year old can speak french. And so on for every subject. The same is true all the way up until PhD.

Those seem like a series of stridently unrelated and evidence-free statements.


Completely unsupported, in fact.

Of course no foreign part-timer can speak French like a native, but able students can certainly hold a basic conversation and read French to a reasonable level.

Language degrees include an exchange program which offers extended full immersion, so it's simply nonsense to say that a graduate in French won't be able to speak it.

But... the problem with CS is that it has no idea what it is. Most engineering degrees have well-formed requirements, which include a lot of math and domain-specific detail.

The sciences and math have a core curriculum which hasn't really changed all that much for fifty years or so now.

CS has... what? What's the basic skill set, what are the core requirements, and how are they recognised?

No one agrees. Employers mostly want "Minimum Viable Developers" who can crank out code using the Framework du Jour.

Academics do what academics do.

You can certainly make the case for any one specific curriculum, which will include a mix of theory and programming projects in various languages and environments.

But there's really no such thing as a definitive core CS skill set. For everyone who says "You should know C" someone else is going to say "Bad example - learn this teaching language instead."

And so it goes for the rest. Machine code? Compiler theory? DSP? It's all optional.

At the same time, CS has spent far too little time on the psychology of programming and system design. There's a fair amount of unicorn chasing, in the form of tidy concepts like type theory and side-effect free coding, but far too little research into designing languages and practices that are verifiably better at managing concurrency, handling versioning, and minimising avoidable bugs.

So the "learn theory vs learn by practice" question is a side issue, and can't be answered unless there are specific goals. For now, there's no such thing as a "qualified developer" in the sense that there are qualified (actually chartered) engineers.

Until there is, how can anyone decide which teaching approach is better when there's limited agreement about what needs to be taught?


School gives some a useful fiction, which is primarily motivational. E.g., campus, classrooms, and professors make you feel better about spending your time on some esoterica.

The other benefit is subject gatekeeping. Departments, at least in STEM, don't offer pointless classes.

An autodidact might not know what's valuable and might not always muster the energy to get through. They also might not have the integrity to give themselves a fair test.

So yeah, school is useful beyond addressing the scarcity of books.


School is useful insofar as it's guiding and motivating, yes.

This is tutoring.

Everything else it does is a waste of time. That's where you'll find most of the actual time spent though, and it is certainly not an educational aim. The aim is to grade on recall. It is not to motivate, or guide.


>The aim is to grade on recall.

No. At best this means you had a terrible experience, and are generalizing in ignorance. At worst you are dug into a position and are willing to defend it without integrity (a sadly popular and often successful approach).

Many (most?) of my tests were open book and they were hard because they were puzzles requiring you to have absorbed the book to the extent that you could reason beyond it. The tests would probe our understanding of an idea by asserting a small change in the initial assumptions, and have us complete the derivation with that new assumption. Test questions were novel complications of the problems in the book.

That was my experience, and it was a good one. I'm sorry you didn't get that.


This is my current experience as well: open book, open notes means being familiar enough with the concepts at play that when I ask you to do something fairly specific, you can reason out a series of commands/functions to get the job done.


I don't know why this is downvoted. I think that Bryan Caplan's book The Case against Education - Why the Education System Is a Waste of Time and Money (https://press.princeton.edu/titles/11225.html) makes a pretty strong argument to support the related points.


Most people on Hacker News have (as far as I've noticed) very much enjoyed their university programs and will defend them quite vehemently. Often these programs were quite high quality, with lots of close contact with TAs and professors.

Further, for people who did theoretical CS or math programs, it can feel quite frustrating to work in a field with so many "self-learned" people and find that they are often ignorant of what they don't know.

Finally, many people here have Ph.D.'s or have worked in academia or teaching in various ways.

This all results in people being somewhat skeptical of advice to "don't do school".


On the flip side of this, many of the most capable and intelligent people I have worked with had no degree (or a non-CS one, e.g. fine art is particularly common), and when I try to think of times my algorithms, OS, networking, maths, or even data structures courses have been directly beneficial to my daily work, I have little to show.

Those all had knock-on effects and have indirectly made me more capable and confident in my analyses of performance or occasionally in troubleshooting, but it is very rare that I spend a day doing something deeply theoretical, and it is conversely very common to spend days hitting my head against walls with configuration, integration, or versioning of build/deploy/test components and dependencies. My CS theory is no good there, only methodical troubleshooting and asking for help in IRC are effective, and anyone can learn that by doing.


Yeah, I'd agree with this. You see it play out in interviewing when the "academics" ask questions that are quite useless in determining whether or not someone is qualified to build a web app.

It's as if interviewers are just stroking their own ego trying to recreate the relationships they encountered in academia only with the roles reversed.


Even for the people getting PhDs in, say, mathematics? Or the undergrads studying mathematics en route to getting their PhD?


PhD is skilling in academic skills.

Undergrad and most masters programs are still just going for information recall.

This is why PhDs are specific, intensive, daily, guided, etc. These are all the characteristics of skill acquisition.

Compare, say, with piano tuition. All skill acquisition is basically a form of apprenticeship. PhDs are apprentice academics.

Ask yourself the simple question: after finishing this education what skillful activities can I actually do?

After primary education: reading, writing, remembering, etc. After secondary: basically the same. After tertiary: basically the same. After PhD: run a particular kind of experiment, analysis, etc. After 10 years in a programming team: solve professional programming problems. After 10 years with a piano tutor: play the piano.

The question "what can I do?" seems always to be answered with: not much. This should be quite shocking, and today it is -- really, that is the correct answer. Your BSc amounts to "not much".


The answer is that you've learned to learn.

You got some breadth of knowledge; not everything is relevant, but it helps put things in perspective. This is crucial.

But the most important thing is that you've learned how to quickly adapt and pick up new things. This is what is valuable, and in combination with a broad understanding it is very much worth the education. It is okay if the understanding is shallow or even obsolete; the point is you have the tools to recognize what you need to learn, and you've learned how to quickly brush up on the relevant parts necessary to solve the problem at hand.

What kind of answer did you expect? My BSc amounted to me being very proficient in framework X? That would have been a waste of time and that is also what often is the alternative to an education.

People who self-learn are often very good at specific tasks but have vast areas which, for all intents and purposes, are magic.

Maybe lack of education is why everything has to be made in javascript today?


No, a BSc in physics should and can impart actual skills in physics.

It would require the whole process to be led by a tutor in small groups. Each day you would work through experiments, the mathematics related to them, and write reports about them. You would read books together, and work through their practice exercises together.

You would be set to individual practice as often as a piano student is. And tutored as often as a piano student is, or much more often, given the compressed time.

For computing, you would need daily tutored high-skill computer use. You need tutored programming. Your time would be mostly in projects, with seminars where you read through the theoretical things together. The tutor would be guiding your progress individually, dealing with your issues and showing you the skill of programming, etc.

The "lecture" is a spoken textbook delivered to hundreds of students. It isn't giving you any skills; it's not designed to do that. Mistaking it for skill acquisition, which is what most people do, is pathological.

> Maybe lack of education...

You've confused the worst excess of the current educational system (knowledge about a framework!) as what I am arguing for.

That is what we have now.

Skills are modes of thought, imagination, creativity, attention, deliberation, etc. They require a lot of training in each. To be a programmer is to think, imagine, attend to, and deliberate differently.

A framework has nothing to do with it. This is confusing "skills" with who-knows-what: recalling how to solve a problem.

In your terms, all of education is at the moment, learning the API of a framework.

What it should be is how to think and act like a programmer.

All of education is learning the pattern of notes in Beethoven's Fifth.

What it should be is how to play the piano, and compose for yourself.


A BSc in Computer Science imparts actual skills in computer science. Which is not the same as software engineering. Someone with a BSc in physics couldn't fill the role of a mechanical or electrical engineer (not least because those engineering professions are regulated).


What education consists only of lectures?

Also, you've misread me completely. When I talked about learning framework X I explicitly said that it was a waste and that education will give you something much better.


Learning to learn is an important skill.

But so is the ability to do a job.

Some companies have the resources available to take a fresh grad that knows theory, concepts, and how to learn, and then get them up to speed on how to actually do the job they've been hired for. Some WANT people that only know how to learn and the concepts, so they can teach them how to do it the "X Company Way".

But a whole lot don't. A whole lot of companies need to hire people that can slot into a position and hit the ground running. A whole lot of companies need people that can deliver on the job description.

A strong Computer Science education will probably make you a significantly better programmer in the long run, but if you can't get a job or keep one because you aren't prepared to step in and do the job, it doesn't really matter.


Ability to do a job and education are orthogonal. No, you don't necessarily learn how to do a job during your education, but that is a skill quickly picked up (or never learned) afterward. As is the case for anyone not getting any education.

> Some WANT people that only know how to learn and the concept, so they can teach them how to do it the "X Company Way"

Those companies would do well not to hire ambitious people who have not had any education, correct. But part of getting an education is avoiding such mindless jobs.


I have always felt that these takes highly overstate how difficult the practical part is. It is significantly easier than it is made out to be.

Also, you should not expect even seniors to hit the ground running, unless you hire on a total match of culture, code style, and the exact libraries being used. In that case you should not be surprised by a lack of qualified applicants. Even experienced seniors need to learn every single time they change jobs and thus tech stacks.

Moreover, many agile companies are basically organized to cater to inexperienced programmers.


There’s simply a huge range of PhD programs out there - I think that makes such a broad conclusion difficult to conclusively assert. Some concrete examples which spring to mind:

- A PhD in Performance from the University of Indiana
- A PhD in any technical field from, say, MIT
- A PhD in comparative literature from Yale
- A PhD in clinical psychology (which could be therapy-focused or research-focused, further subdividing it) from UC Berkeley
- A PhD in EE/CS from Berkeley

I have no doubt all of those PhDs will be able to do a lot. I have no doubt that some, but not all, of what said PhDs can do will be valued by employers. And that is why academia and scholarship are distinct from business and industry - they have different, but sometimes overlapping, aims. And a PhD is explicitly training to be a scholar/researcher - it’s not a professional degree, though it sometimes has value in a professional environment.


I am describing education as a system. Plenty of real learning happens within it, but this is not what the system is optimized for or what it depends on.


> So, for programming, we need to (1) figure out the core concepts to teach, and (2) pick languages that make the concepts readily available.

James, the main argument for project-based learning is its ability to make learning interesting. Considering that students aren't vessels that need to be filled, but torches which need to be lit, this can trump other approaches. The importance of being interested in the material is hard to overstate.

You may still be right, but I was surprised to see the arguments for project missing this one.


There's a lot of psychology at work here which is not being given due credit.

People who are theorists at their core typically believe that the learning is interesting by itself, usually because they are focused on developing their own models for how things work. It's a subjective-creative process that's exciting: You get to play detective! They are getting creative outcomes _while_ learning about theory. They do not need to wait for a project outcome in order to feel accomplished. Their only other answer, like OP's, to learning things that aren't interesting by themselves but still need to be learned is rote learning, or drills. What other method could possibly educate the student and yet not introduce other rabbit holes that distract from the topic?

Project-motivated people find this approach boring, and they are ready to go down the various rabbit holes involved in the project, if necessary to achieve their envisioned outcome. They are focused on _applying existing theory_ to effect their concrete outcomes, which is an interest that straddles the theoretical and real-world-application zones. They get happy brain chemicals by e.g. watching people use a project they've built. These types use a referential thought process for higher leverage and shorter timelines. So rather than developing their own model from scratch (NIH, the theorist's pet method) which will take them a long time and might not be that effective, they borrow someone else's model ("hey, let's use Framework X, it has tons of features!") and make it serve their subjective outcome-vision. Seeing the vision brought to life is where their creative fulfillment is activated.

CS departments are stuffed full of the former case. Engineering departments are stuffed full of the latter case.

It can be absolutely maddening to be either one, and then find yourself being asked to accommodate the other's learning / executing style.


This is so true, and boy do I wish I had known this before choosing a CS program. I think there is much more education (or counseling) that could be done for high school students who are unfamiliar with CS and software engineering.


I think this is overly reductionist.

Plenty of “theorists at heart” still think it’s boring to do lots of rote drills. :-)

It’s not like doing drills really helps you develop new models/tools/concepts from first principles especially more than it helps you adapt existing models/tools/concepts to new situations.

* * *

I think the article under discussion here misses the point in a different way. He assumes that it’s impossible to get meaningful fast feedback about various particular aspects of your work (and focusing on those deliberately) while working on “projects”. But at least in computing (and probably most fields if you look around), that’s not necessarily true.

We have tons of quality feedback available in computing, starting with tools like a REPL or a debugger which interactively show you the results of small actions, and then compilers and static analyzers and profilers and ....

Then there’s IRC and mailing lists and open source ticket trackers etc., where expert strangers will spend tons of their time helping you out for free just to pass the time. If your project happens to be something that other people are interested in and you are working in the open, there’s feedback from customers, other programmers, etc. If working in a group of mixed ability, there are the other group members or possibly a skilled mentor/tutor, and if working in a company there are coworkers.

There are oodles of open source code, documentation, free textbooks, targeted videos of people demonstrating how to use particular development tools or their general method, or showing off particular tricks, much of which is reasonably searchable and can be called up from the internet on demand when you're stuck on a particular problem/roadblock. You can relatively easily try to implement parts of projects yourself, then go find an example project where someone else implemented the same thing, and compare the two. Typically you can find both toy examples and production code, with their entire version history, internal project discussions, bug tracker history, ....

Arguably you could learn a whole lot by coding up 30 different examples of data structures in C, one after another, getting expert feedback after each one. And then moving on to coding solutions to 20 different dynamic programming problems. And then 10 compilers for different little languages. And then 10 different CRUD apps for made-up restaurants. Or whatever....
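To make that concrete, here's what one of those dynamic programming drills might look like, sketched in Python rather than C (the particular problem, minimum coin change, is just an illustration):

```python
# One drill from the hypothetical sequence above: a classic dynamic
# programming exercise done in isolation, with no project around it.
def min_coins(coins, amount):
    """Fewest coins summing to `amount`, or -1 if impossible."""
    INF = float("inf")
    best = [0] + [INF] * amount          # best[a] = fewest coins for amount a
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and best[a - c] + 1 < best[a]:
                best[a] = best[a - c] + 1
    return best[amount] if best[amount] != INF else -1

print(min_coins([1, 5, 10, 25], 63))  # -> 6  (25+25+10+1+1+1)
```

The point of a drill like this is the fast expert feedback loop on one isolated subskill, exactly what a sprawling project rarely provides.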

But you could also get lots of good experience by building a whole project that you personally care about from bottom to top, using it for a while and watching where the bugs come up, seeing what parts are completely wrongly architected, etc., and then trying again several times (ideally with a bit of introspection and research in between) until you have a better idea what to do.


What I'm saying regarding rote drills is, when you tell a theorist that developing a model by yourself isn't interesting or helpful to you, their only other thought is, "you must be one of those weirdos who benefits from rote learning?"

Project-based learning simply doesn't come to their mind; it's completely off their radar because it is, as op implies, potentially full of little all-consuming rabbit holes that are unrelated to the CS principles in question. Look at the great theorist Feynman and his "computer disease" theory. He had a deep aversion to getting sucked into things like IT concerns, because theoretical model-detecting was his core motivator.


Feynman was complaining about something different. He was in the middle of working on large-scale projects and got annoyed that his coworkers got completely distracted by irrelevant (but amusing) tangents instead of doing their next task that everyone else was blocking on.

This is a problem of the coworker’s lack of executive planning / control / time management skills, not anything about learning via projects or theorizing. Arguably if the coworker had done a bit more project-based learning he would have better practiced those skills.


Agreed -- in the right kind (probably nonexistent?) of educational atmosphere, encouraging students to just build something interesting using the techniques that are being taught could serve the same purpose in my mind as project-based teaching.

It feels like what really good teachers excel at is inspiring students to spend more time thinking about the subject than is strictly necessary, even getting them to like it, so their minds can run away with the subject.


>the main argument for project-based learning is ability to make learning interesting

Exactly. Btw, are there any popular video games out there that actually help kids write mods for them?


> the main argument for project-based learning is ability to make learning interesting

James already addresses this assertion in the very first sentence of the blog post. And the second sentence.


Which first sentence? The first sentence doesn’t have anything to do with that argument. Neither does the second.


The first two sentences EXPLICITLY state the type of arguments that James is responding to. Click on the links in those first two sentences and read the arguments.

The "main argument" of those posts -- i.e., the posts James is responding to, is not that "project-based learning... make[s] learning interesting".

That may be your main argument, but it's not the argument that the people who James is responding to are (mainly) making.


In Denmark we have two universities that almost exclusively do project-based learning. Basically, students are put in groups and assigned a mentor who is responsible for teaching them CS.

Coupled to this are traditional courses in basic math/algorithms/scientific theory.

These lines produce candidates that are wasted supperior to our traditional universities. Well, if you need them to actually work with CS in the real world, that is. I'm an employer, not a scientist, so I have no idea if they produce better candidates for research. But for real-world jobs, these kinds of candidates are the only ones coming out of the universities that I can safely put in a position and expect to see become productive after 1-3 months.

Traditional candidates take 6 months of mentoring before they start earning their salaries. They've often never even deployed anything resembling a real-world project. They've never worked in teams, and simply don't know how to do so. They don't know how to communicate with non-IT people. And so on.

As I said, project based teaching may not produce better CS candidates in terms of how good they are at CS, but it does produce candidates that can deliver a finished quality product on time.


I think this is right on: from school directly into consulting. I completely agree that these people are far superior in the workplace. But hey, whoever decided that an academic computer scientist is a better hire than a software engineer did something horribly wrong.

The problem is, as you mention, that _universities_ are not, and should not be, geared towards industry. That's why we have the engineering programs, which are. They are expertly taught all the Microsoft technologies, exactly what the mainstream Danish consultancies expect from their employees.

Furthermore, I reckon that "work with CS" for you means implementing Sitecore websites, doing a bit of C#, and maybe programming a TypeScript webapp? To me those tasks have nothing to do with CS. They require no knowledge of type systems, complexity theory, etc. (Keep in mind that CS at Danish universities is datalogi, which is the academic discipline and does not directly translate into CS as the American program.)


That sounds great. If you're not practicing the theory you're learning, then it's just slipping away into oblivion.

I've always said that traditional computer science programs should really be split up into the "theory" part of computer science, and software engineering curriculum (the practice).


"wasted supperior" Did you mean "way superior" or, better yet, "far superior"?


Vastly superior?


So do you think Aalborg students are better than DTU's?


CS and SE are different lines, I think most DTU students would fall under project based learning.

I think Aalborg CS students are far superior to students from Aarhus or KU, but only as production-ready candidates that can work in a team, not necessarily for R&D.


What about IT University Copenhagen?

I'm curious as I graduated from DTU several years ago. But since I'm not Danish and I moved back to Oxbridge I lack insights on the different universities out there.

DTU was a very nice theory + project-based learning setup. IMHO, along with Chalmers and a few others, a really underrated option. It was quite heavy and rigorous on the theory side of things.


As someone at the end of their undergrad, I have to agree that group-based projects tend to be unpleasant. The article describes the distribution of work harming full understanding of the content when everyone contributes; however, that's usually the best case.

Most students are oversubscribed with four or five classes. There tends to always be some portion of a group which contributes trivially or simply doesn't. When you've got a group of five, one is typically unreachable, apathetic, or just exhausted. Pairs are a gamble, a bad partner being a massive workload. When you're the person who's carrying the group, you usually end up grokking everything in the project because you touched all of it. However, since you touched all of it, you're exhausted and likely pissed.

What I've found works is individual work-sample projects, particularly those timeboxed to 'hopefully a lab period, but you have a week'. The GPGPU course at my school has labs which are 'fill in the blanks' for CUDA code. As the course goes on, those blanks get progressively more complex and make you flex your core understanding of the course. For more theory-driven courses, the standard set of assignments works nicely.

As an aside, group projects seem to be partially motivated by the TAs and profs trying to deal with larger class sizes. Taking a senior-level graphics class with 30 people lets you write 4 gnarly OpenGL projects which the TA marks in depth; a software engineering course with 200 people, a weekly deliverable, and 5 TAs? Dividing that by 5 is more realistic.

(For context I go to a school not known for their undergrad CS program; experiences may differ in other institutions)


> Most students are oversubscribed with four or five classes.

Perhaps that's part of your problem?

At the school I'm going to (KTH) you always[0] take exactly two courses at a time. That makes scheduling group assignments a lot easier. You also tend to take most classes with the same people, so once you've found a decent group you can stick with them for most project courses.

[0]: Almost, you're allowed to take an extra course if you ask nicely and have kept up with coursework so far


You don’t need to have a group of apathetic students in an organized course in order to learn via projects.

Lots of people have taught themselves to program (and many other subjects and skills) by doing their own individual projects.


Yeah, but there is no denying that the external structure is helpful and the dedicated office hours can be a real lifesaver.


What a bizarre article. While the whole "Learning Styles" stuff is largely debunked by scientific research, we do know that different things motivate different people.

I would argue that motivation is the single most important aspect of learning - if you're not motivated to learn something, you're not going to do it. I tried to learn programming skills on and off all through my teenage years and early 20s, but it always petered out because I never had anything specific I really wanted to do with them. I lacked the motivation to just learn the theory without having a project to work on.

When I finally found a project that excited me and could be broken down into manageable pieces to learn, I made infinitely more progress.

Recently I've been going back and learning actual computer science concepts. I've got enough programming knowledge that the computer science concepts actually interest me now, and I have the motivation to spend time on theory and exercises that I can't immediately apply to a project.

But I never would have gotten to that point without project based learning to begin with, because I never would have been able to stick with it. I wouldn't have cared enough.

Jimmy might be motivated enough by learning the theory that it works for him. But it doesn't work for everyone, and trying to act like there's any one single right way to educate people, when people are so massively different in so many different ways, seems to be fairly arrogant.


"Learning Styles" wasn't debunk, oversimplifying extremist claims were debunked. There are different ways to learn, but they are effective for different people to different extents at different times.


Can you please explain what you mean by "Learning Styles is debunked by scientific research" ?


"I am a visual learner." That has been debunked.


The author is not criticizing project-based learning in general, but rather how some curricula (I wonder whose?) use group projects as the only form of learning.

This feels a bit like a strawman. I agree group projects are not the best way to teach individual concepts. However, they're the best way to put what you've learned into practice. To use the author's martial arts analogy, group projects are like sparring in a safe environment.

Group projects are also the best way to practice your group communication skills, something isolated exercises cannot help with.


Group-project communication and work communication are different. I have rarely had problems communicating at work, but group projects were usually crap.

The worst part about group projects is when one person picks all the interesting tasks and learns while the others don't. That means the people who know the least and need practice the most are the ones pushed into ancillary roles, with only a little practice happening.


> The worst part about group projects is when one person picks all the interesting tasks and learns while the others don't.

The reason this happened is a lack of group communication skills. I would blame your instructor for not pre-addressing problems like this before things got started, or for not checking in at all.


Pre-addressing these problems won't make a shy person instantly not shy. It won't make a controlling person not controlling, nor change someone who is not self-aware or doesn't think much about others.

While these exist in the workplace too, the setup is different, the goals are different, and the dynamic is completely different. And you can usually find yourself a non-group setup with your own accountability and tasks.

The most important difference is that in school you are supposed to learn, and this turns the whole thing into a waste of time where you don't learn - and then you miss the stuff you were supposed to learn.


> Pre-addressing these problems won't make a shy person instantly not shy.

Of course. But it at least sets the expectation that these things need to be talked about. Otherwise students won't know what to do, or even realize what's happening until after the work division has been decided.

Not just that, but a competent instructor should have checked in with the groups to make sure this sort of thing doesn't happen.


I think this has some good points, but ultimately comes to the wrong conclusion. The key idea, from my reading, was "isolate subskills and drill them."

I don't think that's necessarily separate from project-based learning.

To steal an example from another commenter, I'm actually nearing the end of a figure drawing course at a local community college (because, you know, sometimes I want to not be coding :) ). The three components of drawing (as described by this professor) are line, value (shading) and gesture (shape/flow).

One way you could teach that class would be:

- Drill line. You're gonna draw straight lines, curved lines, squiggly lines until you can draw any arbitrary bezier curve you want.

- Drill value. Just draw a metric ton of gradients.

- Drill gesture. Draw lots of figure shapes without ever really worrying about filling them in.

- In the final class, try to put it all together and draw a whole person.

A much better way to teach it would be:

- Draw a whole person, but don't worry too much about value and gesture - just try to get the line right.

- Draw a whole person, but don't worry too much about line and gesture - just try to get the value right.

- Draw a whole person, but don't worry too much about value and line - just try to get the gesture right.

- Draw a whole person at the end using everything you've learned.

The latter way "drills" subskills, but it also practices the overall skill of "drawing a person."

In the same way, I'd strongly advocate project-based "drilling." Build an app to learn guis. Build an api-consuming cli to learn about apis. Build a chat app to learn networking. And if you want students to learn about networking, either assign it as an individual project or make sure all students work on the networking code (his chat app example was just a poorly-implemented assignment, not a problem with project-based learning).

Overall, it's hard to argue that someone who's spent most of their time building projects won't be better at building projects than someone who has not.


Let's be clear here: an academic is writing about project-based learning. Of course concepts are important. But then, according to Edison, genius (and, many would argue, innovation) is 1% inspiration and 99% perspiration.

Academic success or conceptual understanding alone are not good indicators of professional success. Perseverance and collaboration are at least as important, as is the ability to sell. Also, let's not forget: the market is usually buying experience, not conceptual knowledge with an unproven track record of application.

Projects may not be the best way to learn in depth about an isolated aspect if the project is end to end. But that is setting up a straw-man to be torched. A proper way to learn about networking e.g. would be measuring TCP performance and experimenting with the ramp-up. Or writing a protocol decoder for something non-trivial.


I disagree with this article for the most part, it doesn’t make sense to say that just because some group projects are not optimal implies all project based learning is bad. I think doing projects individually is hugely important both from learning theory and also learning software design. In college we had group projects and individual projects going on simultaneously, one of which was building google maps for a US state given only some location data. This project simultaneously taught me about various data structures and algos (Kd-tree, graph search, autocorrect) and we later extended it to use networking in a client server architecture. I think it’s invaluable to have students do many parts of a project individually and then come together to build larger projects and this addresses the concerns of the author.


I think this comment touches on the idea that, given a few hundred students enrolled in a course, there will be some subsets of students who get more out of one type of teaching/learning methodology vs some others.

Therefore, at random places at random times there will be courses that emphasise some methodology/ies where a majority of the students do well, or not, given some kind of bell curve distribution of student-methodology success.

The outcome of which is that, occasionally, someone can write a convincing argument for or against some particular methodology.

Teaching and learning, at scale, is difficult; outcomes are ill defined and hard to measure; correlation, causation, competing priorities and influences.


The 'in industry' equivalent is 'on-the-job-training'. Employers love the idea that a software professional instead of taking a few days to acquire a new skill or technology, can just do it by using said skill on a client's project without preparation and learn as you go along.

This always fails as there is never time budgeted for it in the schedule, so the software professional is not just facing the traditional underestimated deadline, but is now facing it with partially unknown tools and under intense pressure to cut every corner and rush the first thing that doesn't blow up completely. Next project, the 'new tech' is now considered 'known', so the kludges become practice and decent learning never occurs.


Ironic that this [0] is the top story at the same time as this article is on the front page.

How can an academic argue against Constructivist Learning Theory [1] without mentioning Piaget [2] or Papert [3], from the birthplace of Scratch [4] no less?

The author recommends "design drills for it" while at the same time deriding project-based learning as simulating work. I cannot think of a better way to encourage stimulus-response coding than "doing drills". Projects provide context, which provides anchors for knowledge. Projects provide a constant stream of problems with the motivation to solve them; the teacher should be there to guide the student towards the knowledge and skills to solve the problems as they arise.

[0] https://news.ycombinator.com/item?id=16453192

[1] https://en.wikipedia.org/wiki/Constructivism_(philosophy_of_...

[2] https://en.wikipedia.org/wiki/Jean_Piaget

[3] https://en.wikipedia.org/wiki/Seymour_Papert

[4] https://scratch.mit.edu/


This advertisement article is on the front page because a voting ring promoted it.


Project based learning works really well for me and it seems to work well for the 20,000+ people who have taken one of my courses (based on feedback I've gotten over the years).

For example, in 10 hours of time I went from 0 knowledge about Elixir / Phoenix (or functional programming) to having my own Phoenix app up and running with multi-login passwordless token based auth, webpack integration, etc, etc.. I now feel like I have a really good handle on how an Elixir app can be set up with Phoenix and how all of the front end aspects of a web app fall together with it (routes, templates, views, endpoints, plugs, etc.).

I spent the least amount of time possible just looking at Elixir's beginner guide to get a feel for the syntax and then I started my own app and just looked up stuff as I needed it. Nearly all of that time was spent doing "feature based development" on the app.

"I need to add a /faq page, ok, how do I generate a controller and hook up a new route, let me check the docs."

"My app layout is getting a little gnarly, how can I use template includes to split out my navigation, let me check the docs."

This style of learning is how I teach my https://buildasaasappwithflask.com course too. We cover over 50 general web development topics, but topics are covered in the context of building a real-world app, implementing features as we go. Then there are self-guided homework assignments to implement even more features into the app.

How do you prefer learning?


I agree in general with the argument put forward. However:

While it is true that principles of programming are better taught in environments that are more suited to that (in my days that meant Modula-2 and Common Lisp), one should not just assume that the skills acquired can be transferred by students, unaided, to different contexts. As a TA I later watched in horror as students who did perfectly well in programming classes reverted to the bad habits they had taught themselves in hobby programming before college, once they were asked to do a project in the 'industrial' IDEs and languages they were used to back then.

Our 'learning' is far more contextualized than we believe and transfer is hard, not automatic.


That's funny. I came to the same conclusion as this article after reading The Art of Learning, by Josh Waitzkin, which the writer also mentions.

Project based learning is inefficient because you repeat things you know lots of times. The repetitions are spread out too far to be burned into memory.

I find that even after 6 years of experience, I still have to Google how to convert an int array to a string array. I've done dozens of projects, but this thing was never optimized for. It works fine for completing a project, but becomes a drag when trying to implement more complex code.
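For example, the conversion mentioned above is a one-liner in Python, yet the exact idiom differs in every language, which is probably why it never gets burned in:

```python
# The micro-task in question: int array -> string array.
nums = [1, 2, 3]

strs = [str(n) for n in nums]   # list comprehension idiom
# equivalently: strs = list(map(str, nums))

print(strs)  # -> ['1', '2', '3']
```

The drill argument is that repeating exactly this kind of tiny conversion, many times in a short window, is what turns it into instinct.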

The ideal would be to internalize it as a kind of instinct. When something is instinct, the subconscious can calculate it and work out solutions.

I think the best kind of learning would be a kind of coding dojo, where students repeat similar routines until they become part of their instincts.


Hmm, I would say that the key is figuring out what the project should be and how it should be carried out. That's where you need somebody mentoring.

You really don't learn network programming from programming a chat app. That project is broken for that learning outcome. If you asked a network expert, he would probably suggest implementing the OSI model as a C library or something like that.

I would also be very cautious about the group aspect of project work. And again, you would need somebody with experience in the field to help you divide the tasks, if you insist on doing group work. Another suggestion would be to do it alone, and get the communal experience into the project in other ways, maybe through reading groups with people doing similar projects.


Project based learning is not inherently better at teaching a given subject. But it's a superior method for teaching how to write computer software because it's an opportunity to motivate, train, discuss and drill on core software developer tools.

#1 problem with folks out of college? They have no idea how to start or maintain a basic workflow. Project based learning provides space for this.


If two people were racing to "learn" a programming language then yes, project based learning would fail.

This dude doesn't recognize the audience that most blog posts target.

Most people reading blog posts about learning code are NOT in college and ARE NOT employed by colleges.

So the challenge of learning is more about motivation, and projects tend to help you stay motivated more easily than little learning lectures and components.


How do you know?


It's an interesting article. I'm not sure I agree with everything though, but maybe I do. Let me explain.

Let's compare programming to figure drawing for a minute. What you could do to learn, for example, is study the details of some part of the figure, like the eye. Well, that's cool, but it's not gonna teach you to draw the figure. That's because the hard part in drawing the figure is dynamics, balance and proportions. It's not so much that the details are "easy" and don't require learning, it's that, if your dynamics, balance and proportions are off, the details will be too.


Every member of the team must be an expert in drawing the mannequin right.

Why? That's not at all clear, even from your example. "If the mannequin is wrong the details will be wrong" does not imply "everyone involved must know how to do the mannequin". Just that someone must.

Let's go from figure drawing to a truly multidisciplinary art -- 3D graphics. At least in the gaming industry, the rigger (creator of the skeleton and joints, relative expert in anatomy) is not necessarily the modeler, is not necessarily the animator, is not necessarily the texture artist, is not necessarily the 3D shader expert. None of these need to be particularly good at any of the others' skills.


Well, I am self-taught. Actually, I am still learning; I just started a year back. And IMO, project-based learning has been really beneficial in my case. In school, I grew up believing I was a bad student, that I didn't learn anything because I was too stupid to learn. Now, after my attempt at learning to program, I understand that it was not an inability to learn but an inability to enjoy learning just for the sake of knowledge. When I learn something, I need to know where I need it, where it fits into the bigger problem.

And this is where project-based learning has been helpful: what I know about topics like formal grammars, parsing, concurrency, distributed consensus, which data structure to use for a particular use case, basics of cryptography, sockets and protocols, etc. is mostly because I took up projects that I found interesting, and these projects needed me to have an idea about these things.

And this is also project-based learning's biggest disadvantage. Whatever I learn, I learn from googling and reading the top few links. This leaves me with serious gaps in my knowledge until I come across another article or resource that fills the gap, or someone in some forum points it out to me. Learning in a formal setting is pre-emptive, but it also means there is an authoritative figure who gives you a structure that you follow to gain a thorough understanding of the topic. So I guess rather than saying one is superior to the other, attempts should be made to mix and match the good things from both styles.


A really core question(/assumption in many comments) is "is a computer science education only for SV job prep?" As usual, we could stand to remember that the world doesn't revolve around us. People learn to program for a lot of other reasons too.

If you want the quickest path to a SV job, you learn the most modern frameworks and drill them until you scrawl the boilerplate in your sleep. Project-based learning is extremely appropriate for this. You absolutely don't need four years of education to crank out a comfortable 6-figure salary off this knowledge. (In fact, it might be a liability if your pet framework ages out.) You only have to be able to fill a need (even one will do) which a lot of companies have.

If you're looking for anything else, a diet of only back-to-back projects might not be the best way. If you're jumping head first from javascript into kernel modules you might consider a book. If you're looking to automate some bio or physics lab equipment, you can't expect getting good at a few popular frameworks to carry you. If you're learning math there are advantages to having people grade your work and offer office hours.

And if we're being really honest, this non-project-based stuff is pretty useful as SV job prep too. Not knowing things like linear algebra, statistics, and discrete math can be career limiting. I mean, how many of us are really ready to jump the gap to doing serious ML work?


> After finishing his first two Android apps

I think this guy's definition of project based learning is "large finished project".

My idea of project based learning (which as a self learner I use quite a lot), is very _very_ small examples, try to make those examples work in the most unpolished way from a user perspective (focus on the core functionality) and explore the different ways in which they can work as a way to explore the topic or idea that you are interested in. _not_ "try to make some end product based on this idea", that's a whole different venture.

In his example of making a chat app to learn about networking:

> Over the next three weeks, Bob spends a lot of time building the GUI for the app, which he already knows how to do, and only a couple hours implementing the client protocol.

If he wanted to learn about MVCs and GUIs, this is indeed a good project. If he really wanted to learn about networking from the perspective of chat, he should either have avoided the GUI and come up with the simplest possible CLI-based interface... or avoided the interface altogether and just experimented with the core networking code, sending hardcoded strings or piping in some text without a nice CLI. I mean, this smartphone chat app is really a strawman IMO.
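To illustrate, the "core networking code, no interface" version of the exercise can be sketched in a few lines of Python. The tiny echo server here is a stand-in for a real chat server; all names and addresses are illustrative:

```python
# Minimal sketch of the core networking of a chat app: no GUI, no CLI,
# just hardcoded strings over a TCP socket. A toy echo server stands in
# for the real chat server so the example runs on its own.
import socket
import threading

def echo_server(server_sock):
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(4096)
        conn.sendall(data)          # echo the message straight back

server = socket.socket()
server.bind(("127.0.0.1", 0))       # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello from bob")   # the "hardcoded string"
    reply = client.recv(4096)

print(reply.decode())  # -> hello from bob
```

Starting from something like this and growing it (framing messages, handling multiple clients, surviving disconnects) keeps every hour of the project on the networking, which was the stated learning goal.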

Disclaimer: I've read nothing about project based learning, I only recognise it as a concept that has emerged in my own personal learning strategies as I am sure it has in many others.


> Disclaimer: I've read nothing about project based learning, I only recognise it as a concept that has emerged in my own personal learning strategies as I am sure it has in many others.

You may not have read anything about it, but you've nailed it.

If it's project-based learning, the learning has to take precedence over the project.

This is why I had side projects when I was a software engineer.

Most day job projects have to work, with minimal (technical, schedule, etc.) risk, and minimal development time and other costs. For a low-innovation project, this generally means using familiar tools. For an innovative project there's still plenty of skill development, but often in circumscribed areas. (This is true for journeyman to master engineers. When you're early in your career, you get learning opportunities for free out of any halfway-decent environment.)

A side project can take an arbitrary amount longer, or fail, without letting your team or organization down. (Or, a side project with a team can be designed as a learning experience, if that's a shared team goal. This is gigging with your band, not building with your crew.) This creates the space for prioritizing new concepts and skills, and practice on weaker skills.


The problem with teaching programming in functional languages is that those languages use a different PARADIGM than imperative languages. Learning the functional paradigm is very useful, but it cannot replace programming in an imperative language like C. They are different. They teach you different things. For my taste and my career as mostly a performance / graphics programmer, reading the source code of DOOM was a much better, more interesting, and more inspiring school than learning to program in SML in a university course.


Yeah, but the problem with the current method is introducing solutions before the problem. Learning a tool is way easier if you see the value of that tool. I had assignments writing XML schemas. No idea why, but we had to write them. I didn't retain any of that, but later found them very valuable for defining interfaces. If I'd known the use, I'd have been more motivated to learn how to create them.


I agree that small exercises are good for teaching individual skills (e.g. a chess player practicing with only a few pieces on the board), but projects put the reason you are practicing into context. It's a motivator to practice. It shows smaller skills combined to produce some desired behavior.

I think a critical piece left out of the article is the role of emotional affect. Many people are learning because they want to produce a big, realistic set of behaviors (a project), so a good learning curriculum will motivate through a project and drill through small, independent exercises.

Another example: the article mentions Haskell being more useful than Node for teaching CS concepts, or piano for teaching music theory. While this might be true, we also need to learn how motivated students will be to use these tools over others. At the end of the day, the usefulness of a tool for teaching concepts will be (its effectiveness when used) * (how motivated people are to use it).


The author suggests there's a choice between scaffolding and project-based learning. There isn't. Scaffolding -- or breaking things down, and 'putting on training wheels' so the learner doesn't get overwhelmed -- is an established, successful teaching strategy with plenty of research to defend the technique. Project-based learning builds on the idea that making things real, instead of just drills, leads to better learner motivation and retention. Project-based learning also has evidence-based support. The idea that these two learning strategies are in opposition is false. You can absolutely break activities down into components, teach those first, and then do integration practice -- all while doing project-based learning. Think 'mini-projects'.

Disclaimer: I'm working on https://sagefy.org/


I think you need a mix. When I was in high school, I taught myself how to program through making games. Progress was slow, and I wrote bad code, but I had fun, and it's the reason I'm a programmer today.

Then I went to college and learned the fundamentals. I started doing projects in my spare time (often by attending hackathons), and now those projects were getting a lot better. The academics were crucial. However, I had friends that never did projects outside of class, and as a result it hurt their learning.

Then, fast-forward to my first job, and I learned just as much in my first year there as I did during my entire time in college.

So IMO, project-based learning without fundamentals (which is what I was doing in high school) ends up being really slow and can produce poor results. But learning only fundamentals can be equally bad. You need both!


This article reads to me like someone arguing whether it's better to learn by reading or watching videos. Maybe both? Neither?

For me, what good is it to learn all of the fundamentals if I'm not sure those fundamentals are getting me closer to my goal? Especially in the world of tech, there is no short supply of "critical fundamentals", but woefully less in actual implementation.

And in the tech world, there's a lot of implementation that lacks even a cursory understanding of fundamentals.

IMO, this isn't a problem in the approach of learning, but the learner stopping at a superficial understanding on either side.

Some people don't like to jump into new endeavors without understanding the concepts. And others don't want to be bothered with abstracts until they can see the results.

Good students beat bad ones, regardless of the journey that took them there.


I think the author has confused engineering and science.

For example, for learning Web Development, I'd prefer a project based model where I learn concepts and apply them to my project side by side. This way, although the end result would be mess and likely incomplete, I get to learn a lot from the mistakes and issues that come in the way.

On the other hand, for learning about compilers, a traditional semester-based course, with 2-3 lectures/week and maybe a coding assignment/week that can be completed in an hour or two, would work best. In this case, while I don't have any real project at the end, I do learn the core concepts.


My own experience both as a learner and teacher is that project-based learning works great once you know the fundamentals and terribly otherwise.

Imagine you want to teach a cohort of incoming CS students to build simple web-server-with-database applications.

The "traditional" way to do this is to first teach them classes on web dev, OOP, and databases. Afterwards, you give them a software project.

I agree with "modernists" that if you just teach the three classes, but don't set a project at the end that integrates the knowledge, you end up with people who are really good at passing whatever style of exam you set them but can still be poor at actually building applications. That's one reason why recruiters care about your portfolio as well as your CV, I guess. It's also why universities, since time immemorial, have required individual projects and dissertations to get a degree - after teaching you the classes, for your "masterpiece" you work on a project.

The problem with making everything a project, and assuming they'll learn or teach themselves the basics as and when they need them, is that without some understanding of what the building blocks of an application are, you end up bogged down in the details and developing by trial and error.

Real examples I've seen: spending the best part of a week on code to manually parse incoming HTTP headers (missing the fact that the server library already does that for you); writing custom code to serialise lists into a database field because you don't understand foreign keys (and of course breaking 1NF in the process); loading individual items for which you have an "id" by "SELECT *" on the whole table and then doing the filtering on the server, in O(N); and of course writing code that's vulnerable to both XSS and SQL injection left right and centre.

Even in a project unit where every student has a slightly different and very open-ended project, there's some common ground, e.g. "using prepared statements", for which there is such a thing as "one right way to do it", and my experience is that actually teaching this kind of thing in isolation is the only way that works.
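For readers who haven't hit this one yet, here's a small sketch of the prepared-statements point using Python's built-in sqlite3 module (the table and the input string are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (name) VALUES (?)", [("alice",), ("bob",)])

# Vulnerable: building SQL by string concatenation lets the input rewrite
# the query itself.
evil = "x' OR '1'='1"
unsafe_rows = conn.execute(
    "SELECT * FROM users WHERE name = '" + evil + "'"
).fetchall()
print(len(unsafe_rows))  # 2 -- the injected OR clause matched every row

# Safe: a parameterized ("prepared") statement treats the input purely as
# data, never as SQL.
safe_rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (evil,)
).fetchall()
print(len(safe_rows))  # 0 -- no user is literally named "x' OR '1'='1"
```

The "one right way" really is a one-line habit, which is exactly why it's teachable in isolation before any project starts.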


The nice thing about project-based learning is that it is well grounded, by definition. Theory is useless without some form of grounding.

Group projects... well, that is a another thing entirely.


Absolutely agree with you. Without context and purpose there is no meaning in action. And CS is a very broad field. Learning theory without practice is like wandering a forest without knowing the path. If the forest is small, it may work, but if the forest is big then you will definitely get lost.


I study computer eng. with a project-based approach (teams of 2, 2-week projects) and I believe that it is much better than regular classes. Yes, you miss theory; yes, you don't go into the details of everything; but you get more real-world experience and you actually need to work by yourself to learn something. In regular classes you can be passive and just listen, but with projects you have to research, talk with people, try (and fail). If you have great mentorship, you will gain even more during those "sprints".

We also have a co-op program, which is the best thing in the world, really. I simply don't understand why most universities don't have one. We do 4 months of school, then 4 months of work in a company (then repeat). 5 internships during a 4-year bachelor's give you almost 2 years of experience when you get out of university. Since they are paid internships, you can live well during school semesters and you don't have to work while studying. In addition, you know much better what type of job you want, since you tried so many different companies. I personally know that I like smaller companies that operate in the fintech market, and that I like backend more than anything else I tried.


The author argues against project-based learning but then states his "5-minute attempt to come up with an alternative way to teach these skills", which is, effectively, all projects or pieces of a project. Ultimately it doesn't boil down to project-based learning being ineffective, but to a lack of teaching communication in engineering teams. If a student's goal is to learn network programming, then they should speak up that they want to focus on that aspect of the project.

I took a few CS courses post-grad, after already beginning my programming journey, and found the methods of teaching highly ineffective (the practice problems were too abstract and most students couldn't draw the connection to real-life problems - this at a uni with a top CS program). Upper-level courses should definitely keep their focus on the complexities of theory, but entry-level CS courses would serve students better by fostering their interest in a topic and showing what it's like in the work force. Then, if it's still something they want to pursue, teaching the theory makes sense, as it's something they would end up trying to learn on their own anyway.


It's hard for me to agree with much of anything the author states here. This snippet is why:

> I don’t teach network programming and haven’t tested these specific ideas. But, in my specialty of software design and code quality, I use exercises built on similar principles all the time.

In the real world, learners do not come packaged with all the tools and links and resources they need to hammer out their solutions. Classrooms are great when you can have a textbook and pre-defined problems and such, but that is not how it works outside of the classroom (or the virtual classroom, as the case may be).

"Software design" is not a component isolated from network programming. It is the bridge between the big data always getting piped in to the (app / product / website) from the network, and helping the (user / customer / other developer), which is where most engineers are hired to create value.

It is a good idea to attempt to design software with "code quality"; conceptually that sounds like a fantastic idea. But it is impossible to do without also factoring in a predictive element of how it will be used or accessed or even discovered over networks, stores, whatever.


I think the author misses one major point: doing lessons and homework is boring. Building real apps and making things is fun and exciting. I started teaching myself Chinese when I was a teenager, and I decided to study it at school. As soon as it turned into a routine with lessons and homework, I lost all interest.

Project-based learning is an awesome way to learn a new framework or language, especially if you already have a lot of experience. I also learn very well from tutorials when I'm a beginner at something. For instance, I followed some Blender tutorials to learn 3D modeling, and I think that's the only way I could learn. The Blender UI is far too complex to just click around and figure things out.

However, there are cases where you really do need to study a book and sit exams. I have many years of experience with AWS, but I'm not an expert and have lots of gaps in my knowledge. I've been thinking about getting an AWS certification, so that I can deeply understand all of their services.


> I do want to dispel the idea that everything would be better if only we switched to project-based learning. The opposite is closer to true.

Both of these maximalist notions are so far from true, it's not even worth asking which one is closer.

> Project-based learning is the most inefficient form of learning that still works.

Where it lacks efficiency in learning, it boasts incredible precision in application. Project-based learning is practical, if not fundamental. You will learn to do something tangible, but of course this may not be the most efficient way to teach conceptual lessons. Concepts and understanding (aka "the math") will be required to decrease learning curves for future applications.

So yes, both project-based and course-based learning are important, it isn't useful to completely eliminate one or the other, but I do feel the world is currently partial towards course-based learning.


It fails to address an important aspect of learning by projects: Being exposed to the problems. You are orders of magnitude more productive when learning the solutions to problems you have encountered before. This kind of motivation can only be achieved if you really do stuff.


I attended a program built around Team-Based Learning (TBL) and projects. The program was very well designed to ensure that you kept up with the course and put the topics into practice in realistic situations (projects), both as an individual and then as a team.

This is incredibly effective because it reinforces your individual learning with the knowledge of your peers and provides a feedback loop to the professor who would tailor lessons based on what the class was having difficulties with.

TBL is not an easy program to run, though, and requires a lot more forethought and time from the professor who is running it for it to be effective. There are some courses where it is a huge improvement over traditional teaching methods, and some where I would argue it isn't.


Seems like having to choose only one learning/teaching style is a false dichotomy.

E.g. I've been doing software a long time, so like the GitHub guy in OP who liked project-based learning, that works really well for me when I basically already know what I'm doing. E.g. I know C-based languages really well; put me on a project with a new language and I'll pick it up, np. Learning a new language by drilling would be very boring for me.

But if I'm learning something truly new/novel to me, e.g. machine learning or Haskell, where I have zero existing intuition for how it works, then drilling constituent parts is very useful.


I think the right learning method depends on your goals. If you're trying to get immediate, deep experience that a general (still quick-feedback) but intense brush-over of all the fundamentals cannot provide, then project-based learning is good for you. But if you're looking not only to master the fundamentals but also to advance to the point of being able to build on them, then traditional course-based learning is good for you.

I don't think there is a real good or bad but rather depends on your goal and motivations. Depending on the goal, some people might learn more from projects than course content and vice versa.


I do a little teaching on the side. My primary objection to common (can't say if it is traditional) programming instruction is not theory vs. practice; it is "of what".

Most of my time coding is spent poring over code that I didn't write (at least not recently), building the design in my mind, finding the spot to change, and then making that change.

In contrast, most school coding assignments are write-from-scratch one-offs, usually involving only one student's code.


CS grads are so insecure when it comes to this subject. Truth is, most schools bury you in theory and barely teach you anything that is used in the field. A motivated self-starter can easily surpass someone who went the traditional route. I only recommend school if it's free. If you have to take loans or go into debt, don't do it. That money would be far better used to start your own venture and learn on the job.


I think the author is totally wrong and is just trying to sell some classes. I learned 3/4 of my skill set by coming up against examples when joining a work project or coming upon problems when trying to fix things. There is no functional difference between practice and performance when it comes to writing software, only the goals of the exercise.


I think this is missing the main point of Project Based Learning: you choose what you want to learn. You cannot treat motivation as a constant when comparing educational methods. Students get much more from their learning when it is intrinsic and specific to their interests, as is the case when doing a project.


Some great points in OP's critique. But CS isn't music theory. Music theory is not really known well by most musicians who are excellent players. Theory is for theorists. I'm not saying that most musicians know zero theory-- they know what key they are in-- but seriously, that only happens after they can play lots of concertos. Pianists tend to know more, but that's because it's staring them in the face every time they play. (Speaking as a career musician: performer, composer, teacher of performance and theory.)

Yes, musicians drill lots of different things, but honestly, it depends what it's for. If practicing improv with chord changes, one needs to know those changes inside and out, but that is truly just one way-- if one is improvising freely, one needs to practice by doing. It really is the only way. Improv is the creative aspect-- classical is the perfect-reproduction aspect. I'm thinking software development is a kind of hybrid.

Also-- there isn't a pro music job on earth that anyone could learn well enough in a bootcamp of even six months with no prior experience, so the good news is-- no software engineer is regularly competing with Horowitz yet. Sure, there are virtuosos in programming, but as a musician, you are surrounded by them and are already one yourself as an undergrad-- at least! This is good news for creativity in teaching programming.

In terms of motivation of students: from what I have experienced as a programming student in several programs and as an observer-- the material studied is not inspiring. In music, we study Mozart and Thelonious Monk. Puccini and Stravinsky. Reich and Cage. In programming, we studied models and diagrams. We never looked at "great code". When I asked if there was inspiring code to see, my teacher guided me to online lessons made by the school. I asked if there was a Jupiter Symphony of code somewhere, desperately looking for a way to be inspired as I am with great writers and composers. My teacher treated me like I was "being difficult". I dropped out of this program and learned on my own. This was one of the tippy-top bootcamps in NYC. I was hoping it would be there for me in these ways-- it wasn't.

I'm left with this: I can't imagine any musician getting hired to teach with such a low-level understanding of... the world of interesting examples-- even a beginning music professor can direct students to inspiring examples of greatness in the field. Teachers need to be paid well for the time it takes to develop interesting methods of teaching. Teaching is a complex hybrid of science and art. Like all of these, it requires time and mastery, but most important: innovation!


Programming can be creative but it is not performance art. I'm not sure what you were expecting? Even if you found something (subjective) it may not be very functional. Completely different animal. You seemed to be missing the point of software engineering.

Funny you chose Horowitz, Monk, Stravinsky and Cage as examples, all of whom either never attended, rejected, or had limited academic education.


Actually, software engineering isn't a completely different animal from music-ing. It's the most similar animal I can think of. Its pedagogy is behind music's pedagogy by two, maybe three hundred years (for obvious reasons). It could learn a lot from the errors in music pedagogy.

For one example off the top of my head, the OP mentioned that we practice scales. Well, yes. But it really is "how" we practice scales that matters. Practicing a scale from bottom to top and back isn't really useful. But lots of people do it like crazy. They drill this. But nowhere in music (maybe performance art, though?) is there a scale in any key all the way up four octaves and all the way back down. Nowhere. So practicing that is really a waste of time. It isn't a waste of time if you want to just learn how to play anything fast, though, because you "know those notes". Playing fast is a different physical feeling that is counter-intuitive, really. But people practice scales like this for the key's sake, and that is really not a great use of practice time. A better use, in this analogy, would be if you wanted to see if you could get around a key in any direction, with leaps and figures and all the things that may show up in a key. Then practice that. Practice figures, loops, improv leaps-- in a key. That will make you a better sightreader and a better player in that key way faster than scales. I have the data for hundreds of students, myself, and colleagues to prove it. This information doesn't get to the regular Saturday student, but believe me, it is known by professionals and great teachers. It isn't unconventional at all, despite the many parents who get worried about their kids not playing scales like they did or like they heard in movies or TV commercials. Mozart practiced like this. Paganini practiced like this. I have to make the argument to concerned parents at least once a year anyhow.

I'm not saying drilling isn't important. It is. But it's what to drill that is the thing. It is also not churning out a bunch of meat compilers and calling those musicians-- or programmers. One thing programmers could drill right off the bat would be typing for programming. I see a lot of bad typing. I'm bad at it. I realize that's a basic idea.

Programming is creative in exactly the way that playing the blues is creative. Everybody does it differently-- those who develop their own style are the greats, but in the end, it is still blues. Not sure why the academic education of those composers has anything to do with anything. Certainly those composers were some of the most educated, intellectual people who have lived, Monk being the least traditional in his expertise, but he sure understood social history. And even he headed over to work with Hall Overton when he wanted some respect from the establishment at Juilliard and for his Town Hall concert.

My point being: in current educational models for programming (not talking MIT or Stanford, because I wouldn't know, and I'm sure it's different there, because they really pay for it to be so) we aren't reaching for great teaching. We aren't reaching for inspiration. We aren't reaching for technical innovation. But there is a whole lot we can learn from master teachers who have come before in other fields. I think the OP has a point, but I think project-based learning is very important. NOT, however, at the exclusion of drills that are creative. Not at the exclusion of coming up with techniques that other fields have not thought of yet. I mean-- sheesh-- we have accessible mountains of teaching/learning data in computing. No one is using that to make their education better. Why?


If you're interested in going through the literature about learning, I recommend starting with Learning and Motivation in the Postsecondary Classroom by Marilla D. Svinicki.


I feel like articles like this are just stumbling onto the reason why many sciences have both coursework and labwork as part of the curriculum.


Your actions are recognized as skill only if they are useful in real life.


PBL works best in the presence of learning objectives, such that the project has domain deliverables, but is also explicitly designed to lead to learning outcomes. (In a team project, these objectives are mostly defined at the level of the student, not the team.) Ideally, these learning objectives are shared with, and maybe even, in a college course and maybe earlier, co-developed with the student.

“5 PBL Pitfalls to Avoid” [^1] may not be the final or best word on PBL, and it focuses more on PBL in primary education, but it's relevant and I was able to find it in ten seconds via Google. ([^2], which links to this, is the top hit for “project based learning”.) “The Practice is not the Performance” would be a stronger article if it either (1) responded to PBL as it's actually used (in my understanding and experience) in schools, or (2) argued that, and why, college-level applications typically stumble into the pitfalls in [^1]. (Which may indeed be true, but I'm not prepared, on the content of “The Practice” alone, to weigh the author's claims over my own experience.)

The author's other point is around isolation practice. Well, yes. Olin's current Software Design course (intro to Python and programming, typically taken in the second semester) takes a textbook approach[^3] to the basic concepts of programming and computational thinking, and introduces projects as a concurrent stream. But also — musicians, athletes, and other performance practitioners are often advised not to perform isolations to the exclusion of longer phrases or entire pieces, lest (1) they forget how to combine the elements into a fluid whole, or (2) they remove the feedback loops that tell them what areas of weakness to isolate. The software analogue of (1) might be design principles and practices that don't appear with isolated elements or at small scale. The software analogue of using isolations within a project context might include discovering that you don't understand a language feature well enough to use it or debug it, taking time out to explore it by reading or experimentation, and then returning to the project. This doesn't downplay the importance of isolations, but it uses them as a complement to PBL, not a substitute.

I think there's a lot to be learned about software engineering (which I think is what the author means when he refers to “CS”) from instruction in music and in studio arts, but it's better learned from a deeper look at both formal and informal instruction in these areas.

There are some open questions, not (I believe) mentioned in the article but raised in some of the comments here:

* Who is PBL appropriate for? For example, Olin College uses PBL — mostly within the context of Team-Based Learning (TBL) — extensively. However, the Olin admissions process[^4] evaluates candidates within the context of a group activity, which overlaps substantially with PBL and TBL, and provides not only information about the candidates to the students, faculty, staff, and alumni on the team, but also provides a taste of the TBL/PBL experience to prospective students, that they can use to self-select prior to matriculation.

* In what domains is PBL applicable? It seems to me a clear win for many engineering, design, and entrepreneurship topics. Olin's Quantitative Engineering Analysis[^5][^6] is a multi-year experiment in applying PBL to introductory-college-level math and physics.

[^1]: Frank McKay, “5 PBL Pitfalls to Avoid”. https://www.edutopia.org/article/5-pbl-pitfalls-avoid

[^2]: Edutopia, “Project-Based Learning”. https://www.edutopia.org/project-based-learning?gclid=CjwKCA...

[^3]: Literally. Allen Downey, who designed an early version of the course and is on the faculty, wrote the textbook: _Think Python_. http://greenteapress.com/wp/think-python-2e/

[^4]: Olin College, “Candidates Weekends”. http://www.olin.edu/admission/candidates-weekends/

[^5]: Olin College, “QEA”. http://meet.olin.edu/olin-isms/quantitative-engineering-anal...

[^6]: Olin College, “Changing Course”. http://www.olin.edu/the-wire/2016/changing-course/


tl;dr: If you only do one type of X-based learning, it will fail to teach you everything you need to know. This is literally covered last in the disclaimer.

The author never talked about whether this was actually a problem.

And frankly, it's a shameless plug for a course that literally uses project-based learning in it.



