Project-based learning won't work, traditional learning won't work; only my class will work. In fact, the source he quotes (himself) claims he can give you a year's worth of experience in an hour.
But why does project-based learning fail? Well, in his example Bob, who wants to learn network programming, pairs with Alice, who somehow mysteriously takes over the entire project and does all the networking code, so Bob learns nothing. That's sort of like arguing that you can't learn guitar from a teacher because a velociraptor might stroll by and eat the teacher.
I don't know what type of learning is best, but I know you need a little more evidence than that for a post titled "Why project-based learning fails".
Neither shows project learning is the only thing which works well; just that it works better than the model the author proposes or than traditional classrooms.
The key point the author makes is that free-form project work doesn't work. Let's say I want you to learn machine learning: I need a carefully designed and sequenced series of projects which exercise all the skills along the way. Believing otherwise is not an uncommon fallacy; many progressive schools made the same mistake and basically failed.
The random coding project model, in isolation, as the author described, falls flat on its face just as the author describes. It works pretty well for simple/broad things (e.g. learn an API), but for anything with depth, expert sequencing of knowledge and design of projects/assessments becomes important.
You can't argue with that. She got more from a 4-year computer science education than the 6-week bootcamp! QED
But seriously, as another commenter mentioned, these maximalist headlines don't really help us get to a better education system. Deliberate practice is very important, but that doesn't mean project-based learning "fails".
I'm tutoring a student 1-on-1 at the moment, and my biggest concern as a teacher is always motivation. The delight that comes from building your own little real-world website/app (a simple cookie clicker clone in her case) after only a few weeks of programming is so useful in getting students to actually appreciate the value of what they're learning, and then it just snowballs.
Some good points raised in the article, but perhaps a little click-baity and one-sided.
My takeaway as well.
> In fact the source he quotes (himself) claims he can give you a year's worth of experience in an hour.
Did he invent a form of mind-meld?
Why more universities don't leverage co-op programs, especially given the exorbitant and rising cost of college tuition in the US, is beyond me.
I attended a regular computer science program and I am glad I did. The goal of university is not to prepare you as best as possible for working life, but to teach academic thinking.
Class projects usually have a specific purpose, like teaching computer vision, operating systems or building compilers.
You'll spend the rest of your life working. There is no need to rush into it, in my opinion.
Then again, I didn't study in the US, so I didn't have the tuition problem.
There are certainly arguments to be made that university is by definition, in original intent, and in theory solely a vehicle for teaching academic thought, but I think the pragmatic counter to them is a poll of "why are you here?"[1], and it should be considered valid.
The end result is that university is in practice not one or the other but both: it must teach some academic thought, and provide some job-readiness.
I wish to digress for a moment into why that is not just a failing of many students to understand school's true purpose: many people (hi) do not come from a background where lifetime economic security is a guarantee, and while programming is a rare example (at least in North America) where if you spend the time, you can probably do it, and if you can do it, you can probably be paid well for it, there are unfortunately still gatekeepers. HR departments frequently resort to very base credentialism to sort through candidate resumés, and those with schooling typically come first and get higher base offers of compensation. Not to speak of the network social effects of spending years with other entrants into the field, and the opportunities for extra-curricular pursuits which also are well regarded by those gatekeepers.
These among other things have forced many people with mere occupational goals to pursue university degrees. I wish it were not like this, because my professors universally did not feel that that was a good reason to be there, and occasionally avoided very useful practical topics on principle, but I (and many within my cohort) felt there was no better alternative for a successful outcome in life, and it has mostly worked out.
1 - I of course have not conducted this poll, but I do strongly believe that at least a plurality would respond with job-attainment as their primary goal, and would not be surprised by an even greater response.
A generation of financially secure parents have pushed the mindset of Degree==Job. An understandable choice, based on the idea of attaining the best start possible in life.
Whatever the reasons, whatever the perceptions, the role of the university is unchanging and should remain steadfast: a university is there to create new academics.
But I think we actually agree -- I would just amend your statement to be an aspiration rather than a description: "A university should be there to create new academics."
I feel saddened that in practice (in great part due to the reasons we've each outlined), it is not.
Because academics don't work on projects? C'mon.
Except for a few things (like math), I really wanted a lot to be taught in a project-based fashion at university. There was a lot of CS stuff we learned where the professors were super unclear about why it was important or when the correct time to apply the knowledge was. That not only made the teaching less useful, it seriously sapped my motivation.
Years later I was often surprised by the applicability (or not) of some of the knowledge taught to us, and I think the lecturers could probably have guessed that I would run into these situations; they just didn't put much (if any) emphasis on bringing it up.
Now I'm teaching (as an amateur and not in a university, but still), I'm teaching in a project based fashion because it's how I wanted to be taught.
I do this partly because I'm trying to present every topic as a problem first in a context that seems realistic. I feel like getting students to recognize and 'feel' the problem themselves before teaching them the solution helps with their motivation and the applicability of that knowledge - wherever they decide to use it.
Even though I was forced to work out of necessity, it turned out to be the best move. Learning how to balance theory with "we need this done in a couple hours to try and land the next client" has been so valuable to my career (even through grad school). Engineering at every level is about trade-offs, and nothing makes that more apparent than working a job.
Can you explain why you think taking courses and flipping burgers at some summer job is better than taking those same courses and being mentored by practicing engineers, programmers, scientists, poli-sci, <professional in your chosen field>?
I'd also like to point out that if you argue college should be about creating academics, plenty of my friends went to my school and went on to become one. They simply used their co-ops to do research, either on campus, for another university, or for a company.
This is a well-worn false dichotomy. It sets up scientific understanding of a field against job skills. Scientific understanding is important because it saves you months or years of flailing around in the dark, trying to solve problems without understanding them, or worse, producing half-baked work that could be so much better if only you had a clue. But without context, it's very hard to make knowledge stick, and projects provide that context.
In many cases, large companies have special days where they'll come onto campuses to do interviews, and they may take a significant cohort of students each semester. There are often rules like "you have 24 hours to respond to an offer and may only reject one offer" to ensure good faith between parties and high match percentages when some companies may be more prestigious or slow in their process.
Students then spend 3-16 months (this depends on the school's academic timing and the company's own structure -- 4 months in summer is most common) at a company and if it's a good company, they'll be given real work, strong mentorship, and deep integration into a team, and often students near graduation will get offers of employment directly at the end of their term.
There are also usually very tedious written work-term reports to fill out as a bureaucratically useful artifact of the process. They're typically expository essays of topics like "what I learned at $company", and everyone hates them.
Students will have 1-4 (usually 3) terms out in the real world over the course of their time in school, and graduate with meaningful practical skills and usually a taste of what's available to help assess opportunities.
It's on the whole a great system for a lot of people, particularly those who are using university to increase their employability.
And to babysit children so that their parents can spend more time participating in capitalism.
You are correct, but it's more understandable than you make out.
Rather, the education system is designed to impart the skill of information recall -- which was essential to actually acquiring the target skills (i.e., you needed to remember lots about chemistry before doing any). Remembering sentences about a skill has no relationship to being able to do it; only that, as you're learning to do it, they will be an aid.
That hasn't been true for the last 20 years or so, perhaps only 10, with the prevalence of smartphones since 2007.
The whole system is set up to impart the skill of recall and grade according to your ability to recall. There is an implicit general awareness that you're actually not acquiring the target skills.
E.g., after 5-7 years of French in the UK, no 18-year-old can speak French. And so on for every subject.
The same is true all the way up until PhD. It's our historical model of what education was for: giving you the library up-front.
The skills were to be obtained in employment, and with any self-motivated practice.
Today that is painfully ridiculous, and there's really almost no value in it, leaving the warehousing, grading, and certifying functions of educational institutions as their only apparent use.
PS. There's an argument to say education hasn't needed to be this way since the advent of public libraries and cheap books, which is still late 20th C at the latest. This is plausible enough too. Our template of education is still basically a mix of medieval and Victorian, with the presumption that books are hard to obtain or difficult to survey oneself.
Note that primary/elementary schooling isn't set up this way: it is deliberately skilling. This makes it very effective and very important.
Those seem like a series of strident, unrelated, and evidence-free statements.
Of course no foreign part-timer can speak French like a native, but able students can certainly hold a basic conversation and read French to a reasonable level.
Language degrees include an exchange program which offers extended full immersion, so it's simply nonsense to say that a graduate in French won't be able to speak it.
But... the problem with CS is that it has no idea what it is. Most engineering degrees have well-formed requirements, which include a lot of math and domain-specific detail.
The sciences and math have a core curriculum which hasn't really changed all that much for fifty years or so now.
CS has... what? What's the basic skill set, what are the core requirements, and how are they recognised?
No one agrees. Employers mostly want "Minimum Viable Developers" who can crank out code using the Framework du Jour.
Academics do what academics do.
You can certainly make the case for any one specific curriculum, which will include a mix of theory and programming projects in various languages and environments.
But there's really no such thing as a definitive core CS skill set. For everyone who says "You should know C" someone else is going to say "Bad example - learn this teaching language instead."
And so it goes for the rest. Machine code? Compiler theory? DSP? It's all optional.
At the same time, CS has spent far too little time on the psychology of programming and system design. There's a fair amount of unicorn chasing, in the form of tidy concepts like type theory and side-effect free coding, but far too little research into designing languages and practices that are verifiably better at managing concurrency, handling versioning, and minimising avoidable bugs.
So the "learn theory vs learn by practice" question is a side issue, and can't be answered unless there are specific goals. For now, there's no such thing as a "qualified developer" in the sense that there are qualified (actually chartered) engineers.
Until there is, how can anyone decide which teaching approach is better when there's limited agreement about what needs to be taught?
The other benefit is subject gatekeeping. Departments, at least in STEM, don't offer pointless classes.
An autodidact might not know what's valuable and might not always muster the energy to get through. They also might not have the integrity to give themselves a fair test.
So yeah, school is useful beyond addressing the scarcity of books.
This is tutoring.
Everything else it does is a waste of time. That's where you'll find most of the actual time spent, though, and it is certainly not an educational aim. The aim is to grade on recall, not to motivate or guide.
No. At best this means you had a terrible experience, and are generalizing in ignorance. At worst you are dug into a position and are willing to defend it without integrity (a sadly popular and often successful approach).
Many (most?) of my tests were open book and they were hard because they were puzzles requiring you to have absorbed the book to the extent that you could reason beyond it. The tests would probe our understanding of an idea by asserting a small change in the initial assumptions, and have us complete the derivation with that new assumption. Test questions were novel complications of the problems in the book.
That was my experience, and it was a good one. I'm sorry you didn't get that.
Further, for people who did theoretical CS or math programs, it can feel quite frustrating to work in a field with so many "self-learned" people and find that these are often ignorant of what they don't know.
Finally, many people here have Ph.D.'s or have worked in academia or teaching in various ways.
This all results in people being somewhat skeptical of advice to "don't do school".
Those all had knock-on effects and have indirectly made me more capable and confident in my analyses of performance or occasionally in troubleshooting, but it is very rare that I spend a day doing something deeply theoretical, and it is conversely very common to spend days hitting my head against walls with configuration, integration, or versioning of build/deploy/test components and dependencies. My CS theory is no good there, only methodical troubleshooting and asking for help in IRC are effective, and anyone can learn that by doing.
It's as if interviewers are just stroking their own ego trying to recreate the relationships they encountered in academia only with the roles reversed.
Undergrad and most masters programs are still just going for information recall.
This is why PhDs are specific, intensive, daily, guided, etc. These are all the characteristics of skill acquisition.
Compare, say, with piano tuition. All skill acquisition is basically a form of apprenticeship. PhDs are apprentice academics.
Ask yourself the simple question: after finishing this education what skillful activities can I actually do?
After primary education: reading, writing, remembering, etc.
After secondary: basically the same.
After tertiary: basically the same.
After PhD: run a particular kind of experiment, analysis, etc.
After 10 years in a programming team: solve professional programming problems.
After 10 years with a piano tutor: play the piano.
The question "what can I do?" seems always answered with: not much. This should be quite shocking, and today it is -- really, that is the correct answer. Your BSc amounts to "not much".
You got some breadth of knowledge; not everything is relevant, but it helps put things in perspective. This is crucial.
But the most important thing is that you've learned how to quickly adapt and pick up new things. This is what is valuable and in combination with a broad understanding is very much worth the education. It is okay if the understanding is shallow or even obsolete, point is you have the tools to recognize what you need to learn and you've learned how to quickly brush up on the relevant parts necessary to solve the problem at hand.
What kind of answer did you expect? My BSc amounted to me being very proficient in framework X? That would have been a waste of time and that is also what often is the alternative to an education.
People who self-learn are often very good at specific tasks but have vast areas which, for all intents and purposes, are magic to them.
It would require the whole process to be led by a tutor in small groups. And each day you would work through experiments, the mathematics related to them, and writing reports about them. You would read books together, and work through their practice exercises together.
You would be set to individual practice as often as a piano student is. And tutored as often as a piano student is, or much more often, given the compressed time.
For computing, you would need daily tutored high-skill computer use. You need tutored programming. Your time would be mostly in projects, with seminars where you read through the theoretical things together. The tutor would be guiding your progress individually, dealing with your issues and showing you the skill of programming, etc.
The "lecture" is a spoken textbook delivered to hundreds of students. It isn't giving you any skills. It's not designed to do that. Mistaking it for skill acquisition, which is what most people do, is pathological.
> Maybe lack of education...
You've confused the worst excess of the current educational system (knowledge about a framework!) as what I am arguing for.
That is what we have now.
Skills are modes of thought, imagination, creativity, attention, deliberation, etc. They require a lot of training in each. To be a programmer is to think, imagine, attend to, and deliberate differently.
A framework has nothing to do with it. This is confusing "skills" with who knows what: recalling how to solve a problem.
In your terms, all of education at the moment is learning the API of a framework.
What it should be is how to think and act like a programmer.
All of education is learning the pattern of notes in Beethoven's Fifth.
What it should be is how to play the piano, and compose for yourself.
Also, you've misread me completely. When I talked about learning framework X I explicitly said that it was a waste and that education will give you something much better.
But so is the ability to do a job.
Some companies have the resources available to take a fresh grad that knows theory, concepts, and how to learn, and then get them up to speed on how to actually do the job they've been hired for. Some WANT people that only know how to learn and the concepts, so they can teach them how to do it the "X Company Way".
But a whole lot don't. A whole lot of companies need to hire people that can slot into a position and hit the ground running. A whole lot of companies need people that can deliver on the job description.
A strong Computer Science education will probably make you a significantly better programmer in the long run, but if you can't get a job or keep one because you aren't prepared to step in and do the job, it doesn't really matter.
> Some WANT people that only know how to learn and the concepts, so they can teach them how to do it the "X Company Way"
Those companies would do well not to hire ambitious people who have not had any education, correct. But part of getting an education is avoiding having such mindless jobs.
Also, you should not expect even seniors to hit the ground running, unless you hire on a complete match of culture, code style, and the exact libraries being used. In which case you should not be surprised by a lack of qualified applicants. Even experienced seniors need to learn every single time they change jobs and thus tech stacks.
Moreover, many agile companies are basically organized to cater to inexperienced programmers.
- A PhD in Performance from the University of Indiana
- A PhD in any technical field, from say, MIT
- A PhD in comparative literature from Yale
- A PhD in clinical psychology (which could be therapy-focused or research-focused, further subdividing it) from UC Berkeley
- A PhD in EE/CS from Berkeley.
I have no doubt all of those PhDs will be able to do a lot. I have no doubt that some, but not all, of what said PhDs can do will be valued by employers. And that is why academia and scholarship are distinct from business and industry - they have different, but sometimes overlapping, aims. And a PhD is explicitly training to be a scholar/researcher - it’s not a professional degree, though it sometimes has value in a professional environment.
James, the main argument for project-based learning is its ability to make learning interesting. Considering that students aren't vessels that need to be filled, but torches which need to be lit, this can trump other approaches. The importance of being interested in the material is hard to overstate.
You may still be right, but I was surprised to see the arguments for projects missing this one.
People who are theorists at their core typically believe that the learning is interesting by itself, usually because they are focused on developing their own models for how things work. It's a subjective-creative process that's exciting: You get to play detective! They are getting creative outcomes _while_ learning about theory. They do not need to wait for a project outcome in order to feel accomplished. Their only other answer, like OP's, to learning things that aren't interesting by themselves but still need to be learned is rote learning, or drills. What other method could possibly educate the student and yet not introduce other rabbit holes that distract from the topic?
Project-motivated people find this approach boring, and they are ready to go down the various rabbit holes involved in the project, if necessary to achieve their envisioned outcome. They are focused on _applying existing theory_ to effect their concrete outcomes, which is an interest that straddles the theoretical and real-world-application zones. They get happy brain chemicals by e.g. watching people use a project they've built. These types use a referential thought process for higher leverage and shorter timelines. So rather than developing their own model from scratch (NIH, the theorist's pet method) which will take them a long time and might not be that effective, they borrow someone else's model ("hey, let's use Framework X, it has tons of features!") and make it serve their subjective outcome-vision. Seeing the vision brought to life is where their creative fulfillment is activated.
CS departments are stuffed full of the former case. Engineering departments are stuffed full of the latter case.
It can be absolutely maddening to be either one, and then find yourself being asked to accommodate the other's learning / executing style.
Plenty of “theorists at heart” still think it’s boring to do lots of rote drills. :-)
It’s not like doing drills helps you develop new models/tools/concepts from first principles any more than it helps you adapt existing models/tools/concepts to new situations.
* * *
I think the article under discussion here misses the point in a different way. He assumes that it’s impossible to get meaningful fast feedback about various particular aspects of your work (and focusing on those deliberately) while working on “projects”. But at least in computing (and probably most fields if you look around), that’s not necessarily true.
We have tons of quality feedback available in computing, starting with tools like a REPL or a debugger which interactively show you the results of small actions, and then compilers and static analyzers and profilers and ....
Then there’s IRC and mailing lists and open source ticket trackers etc., where expert strangers will spend tons of their time helping you out for free just to pass the time. If your project happens to be something that other people are interested in and you are working in the open, there’s feedback from customers, other programmers, etc. If working in a group of mixed ability, there are the other group members or possibly a skilled mentor/tutor, and if working in a company there are coworkers.
There are oodles of open source code, documentation, free textbooks, and targeted videos of people demonstrating how to use particular development tools or their general method, or showing off particular tricks, much of which is reasonably searchable and can be called up from the internet on demand when you're stuck on a particular problem/roadblock. You can relatively easily try to implement parts of projects yourself, then go find an example project where someone else implemented the same thing, and compare the two. Typically you can find both toy examples and production code, with their entire version history, internal project discussions, bug tracker history, ....
Arguably you could learn a whole lot by coding up 30 different examples of data structures in C, one after another, getting expert feedback after each one. And then moving on to coding solutions to 20 different dynamic programming problems. And then 10 compilers for different little languages. And then 10 different CRUD apps for made-up restaurants. Or whatever....
But you could also get lots of good experience by building a whole project that you personally care about from bottom to top, using it for a while and watching where the bugs come up, seeing what parts are completely wrongly architected, etc., and then trying again several times (ideally with a bit of introspection and research in between) until you have a better idea what to do.
Project-based learning simply doesn't come to their mind; it's completely off their radar because it is, as OP implies, potentially full of little all-consuming rabbit holes that are unrelated to the CS principles in question. Look at the great theorist Feynman and his "computer disease". He had a deep aversion to getting sucked into things like IT concerns, because theoretical model-detecting was his core motivator.
This is a problem of the coworker’s lack of executive planning / control / time management skills, not anything about learning via projects or theorizing. Arguably if the coworker had done a bit more project-based learning he would have better practiced those skills.
It feels like what really good teachers excel at is inspiring students to spend more time thinking about the subject than is strictly necessary, even getting them to like it, so their minds can run away with the subject.
Exactly. BTW, are there any popular video games out there that actually help kids write mods for them?
James already addresses this assertion in the very first sentence of the blog post. And the second sentence.
The "main argument" of those posts (i.e., the posts James is responding to) is not that "project-based learning... make[s] learning interesting".
That may be your main argument, but it's not the argument that the people who James is responding to are (mainly) making.
To this is coupled traditional courses in basic math/algorithms/scientific theory.
These programs produce candidates that are vastly superior to those from our traditional universities. Well, if you need them to actually work with CS in the real world; I’m an employer, not a scientist, so I have no idea if they produce better candidates for research. But for real-world jobs, these kinds of candidates are the only ones coming out of the universities that I can safely put in a position and expect to become productive after 1-3 months.
Traditional candidates take 6 months of mentoring before they start earning their salaries. They’ve often never even deployed a project in anything resembling a real-world environment. They’ve never worked in teams, and simply don’t know how to do so. They don’t know how to communicate with non-IT people. And so on.
As I said, project based teaching may not produce better CS candidates in terms of how good they are at CS, but it does produce candidates that can deliver a finished quality product on time.
The problem is, as you mention, that _universities_ are not, and should not be, geared towards industry. That's why we have the engineering programs, which are. There, students are expertly taught all the Microsoft technologies, exactly as the mainstream Danish consultancies expect from their employees.
Furthermore, I reckon that "work with CS" for you means implementing Sitecore websites, doing a bit of C#, and maybe programming a TypeScript webapp? To me those tasks have nothing to do with CS. They require no knowledge of type systems, complexity theory, etc. (Keep in mind that CS at Danish universities is datalogi, which is the academic discipline and does not directly translate into CS as the American program.)
I've always said that traditional computer science programs should really be split up into the "theory" part of computer science, and software engineering curriculum (the practice).
I think Aalborg CS students are far superior to students from Aarhus or KU, but only as production-ready candidates that can work in a team, not necessarily for R&D.
I'm curious as I graduated from DTU several years ago. But since I'm not Danish and I moved back to Oxbridge I lack insights on the different universities out there.
DTU was a very nice theory + project-based learning setup. IMHO, along with Chalmers and a few others, a really underrated option. It was quite heavy and rigorous on the theory side of things.
Most students are oversubscribed with four or five classes. There tends to always be some portion of a group which contributes trivially or simply doesn't. When you've got a group of five, one is typically unreachable, apathetic, or just exhausted. Pairs are a gamble, with a bad partner meaning a massive workload. When you're the person who's carrying the group, you usually end up grokking everything in the project because you touched all of it. However, since you touched all of it, you're exhausted and likely pissed.
What I've found works is individual work sample projects, particularly those timeboxed to 'hopefully a lab period but you have a week'. The GPGPU course at my school has labs which are 'fill in the blanks' for CUDA code. As the course goes on, those blanks get progressively more complex and make you flex your core understanding of the course. For more theory-driven courses, the standard set of assignments works nicely.
As an aside, group projects seem to be partially motivated by the TAs and profs trying to deal with larger class sizes. Taking a senior-level graphics class with 30 people lets you write 4 gnarly OpenGL projects which the TA marks in depth; for a software engineering course with 200 people, a weekly deliverable, and 5 TAs? Dividing that by 5 is more realistic.
(For context I go to a school not known for their undergrad CS program; experiences may differ in other institutions)
Perhaps that's part of your problem?
At the school I'm going to (KTH) you always take exactly two courses at a time. That makes scheduling group assignments a lot easier. You also tend to take most classes with the same people, so once you've found a decent group you can stick with them for most project courses.
Almost: you're allowed to take an extra course if you ask nicely and have kept up with the coursework so far.
Lots of people have taught themselves to program (and many other subjects and skills) by doing their own individual projects.
I would argue that motivation is the single most important aspect of learning - if you're not motivated to learn something, you're not going to do it. I tried to learn programming skills on and off all through my teenage years and early 20s, but it always petered out because I never had anything specific I really wanted to do with them. I lacked the motivation to just learn the theory without having a project to work on.
When I finally found a project that excited me and could be broken down into manageable pieces to learn, I made infinitely more progress.
Recently I've been going back and learning actual computer science concepts. I've got enough programming knowledge that the computer science things actually interests me now, and I have the motivation to spend time on theory and exercise that I can't immediately apply to a project.
But I never would have gotten to that point without project based learning to begin with, because I never would have been able to stick with it. I wouldn't have cared enough.
Jimmy might be motivated enough by learning the theory that it works for him. But it doesn't work for everyone, and trying to act like there's any one single right way to educate people, when people are so massively different in so many different ways, seems to be fairly arrogant.
This feels a bit like a strawman. I agree group projects are not the best way to teach individual concepts. However, they're the best way to put what you've learned into practice. To use the author's martial arts analogy, group projects are like sparring in a safe environment.
Group projects are also the best way to practice your group communication skills, something isolated exercises cannot help with.
The worst part of group projects is when one person picks all the interesting tasks and learns while the others don't. That means the people who know the least and need practice the most are the ones pushed into ancillary roles, with very little practice happening.
The reason this happened is because of a lack of group communication skills. I would blame your instructor for not pre-addressing problems like this before things got started, or not checking in at all.
While these dynamics exist in the workplace too, the setup is different, the goals are different, and the dynamic is completely different. And at work you can usually find yourself a non-group setup with your own accountability and tasks.
The most important difference is that in school you are supposed to be learning, and this turns the whole thing into a waste of time where you don't learn - and then you miss the material you were supposed to learn.
Of course. But it at least sets the expectation that these things need to be talked about. Otherwise students won't know what to do, or even realize what's happening until after the work division has been decided.
Not just that, but a competent instructor should have checked in with the groups to make sure this sort of thing doesn't happen.
I don't think that's necessarily separate from project-based learning.
To steal an example from another commenter, I'm actually nearing the end of a figure drawing course at a local community college (because, you know, sometimes I want to not be coding :) ). The three components of drawing (as described by this professor) are line, value (shading) and gesture (shape/flow).
One way you could teach that class would be:
- Drill line. You're gonna draw straight lines, curved lines, squiggly lines until you can draw any arbitrary Bezier curve you want.
- Drill value. Just draw a metric ton of gradients.
- Drill gesture. Draw lots of figure shapes without ever really worrying about filling them in.
- In the final class, try to put it all together and draw a whole person.
A much better way to teach it would be:
- Draw a whole person, but don't worry too much about value and gesture; just try to get the line right.
- Draw a whole person, but don't worry too much about line and gesture; just try to get the value right.
- Draw a whole person, but don't worry too much about value and line; just try to get the gesture right.
- Draw a whole person at the end using everything you've learned.
The latter way "drills" subskills, but it also practices the overall skill of "drawing a person."
In the same way, I'd strongly advocate project-based "drilling." Build an app to learn guis. Build an api-consuming cli to learn about apis. Build a chat app to learn networking. And if you want students to learn about networking, either assign it as an individual project or make sure all students work on the networking code (his chat app example was just a poorly-implemented assignment, not a problem with project-based learning).
Overall, it's hard to argue that someone who's spent most of their time building projects won't be better at building projects than someone who has not.
Academic success or conceptual understanding alone are not good indicators of professional success. Perseverance and collaboration are at least as important, as is the ability to sell. Also, let's not forget: the market is usually buying experience, not conceptual knowledge with an unproven track record of application.
Projects may not be the best way to learn in depth about an isolated aspect if the project is end to end. But that is setting up a straw man to be torched. A proper way to learn about networking, for example, would be measuring TCP performance and experimenting with the ramp-up, or writing a protocol decoder for something non-trivial.
Therefore, at random places at random times there will be courses that emphasise some methodology/ies where a majority of the students do well, or not, given some kind of bell curve distribution of student-methodology success.
The outcome of which is that, occasionally, someone can write a convincing argument for or against some particular methodology.
Teaching and learning, at scale, is difficult; outcomes are ill defined and hard to measure; correlation, causation, competing priorities and influences.
This always fails because there is never time budgeted for it in the schedule, so the software professional is not just facing the traditional underestimated deadline, but is now facing it with partially unknown tools, under intense pressure to cut every corner and rush the first thing that doesn't blow up completely. Next project, the 'new tech' is considered 'known', so the kludges become standard practice and decent learning never occurs.
How can an academic arguably be arguing against Constructivist Learning Theory without mentioning Piaget or Papert, from the birthplace of Scratch no less?
The author recommends to "design drills for it" while at the same time deriding project-based learning as simulating work. I cannot think of a better way to encourage stimulus-response coding than "doing drills". Projects provide context, which provides anchors for knowledge. Projects provide a constant stream of problems with a motivation to solve them; the teacher should be there to guide the student towards the knowledge and skills to solve the problems as they arise.
For example, in 10 hours of time I went from 0 knowledge about Elixir / Phoenix (or functional programming) to having my own Phoenix app up and running with multi-login passwordless token based auth, webpack integration, etc, etc.. I now feel like I have a really good handle on how an Elixir app can be set up with Phoenix and how all of the front end aspects of a web app fall together with it (routes, templates, views, endpoints, plugs, etc.).
I spent the least amount of time possible just looking at Elixir's beginner guide to get a feel for the syntax and then I started my own app and just looked up stuff as I needed it. Nearly all of that time was spent doing "feature based development" on the app.
"I need to add a /faq page, ok, how do I generate a controller and hook up a new route, let me check the docs."
"My app layout is getting a little gnarly, how can I use template includes to split out my navigation, let me check the docs."
This style of learning is how I teach my https://buildasaasappwithflask.com course too. We cover over 50 general web development topics, but topics are covered in the context of building a real-world app, implementing features as we go. Then there are self-guided homework assignments to implement even more features into the app.
How do you prefer learning?
While it is true that principles of programming are better taught in environments that are more suited to that (in my days that meant Modula-2 and Common Lisp), one should not just assume that the skills acquired can be transferred by students, unaided, to different contexts. As a TA I later watched in horror how students who did perfectly well in programming classes reverted to the bad habits they had taught themselves in hobby programming before college, once they were asked to do a project in the 'industrial' IDEs and languages of the day.
Our 'learning' is far more contextualized than we believe and transfer is hard, not automatic.
Project-based learning is inefficient because you spend lots of time repeating things you already know, while the repetitions of the new material are spread out too far to be burned into memory.
I find that even after 6 years of experience, I still have to Google how to convert an int array to a string array. I've done dozens of projects, but this was never optimized for. It works fine for completing a project, but becomes a drag when trying to implement more complex code.
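For what it's worth, that particular conversion is a one-liner in most languages; a quick Python sketch of the usual idioms:

```python
# int list -> string list: the kind of small idiom that never gets drilled.
nums = [1, 2, 3]

strings = [str(n) for n in nums]         # list comprehension
strings_alt = list(map(str, nums))       # equivalent, with map()
joined = ",".join(str(n) for n in nums)  # straight to a single string

print(strings, strings_alt, joined)  # ['1', '2', '3'] ['1', '2', '3'] 1,2,3
```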
The ideal would be to internalize as a kind of instinct. When something is instinct, the subconscious can calculate it and work out solutions.
I think the best kind of learning would be a kind of coding dojo, where you repeat similar routines until they become part of your instincts.
You really don't learn network programming from programming a chat app. That project is broken for that learning outcome. If you asked a network expert, he would probably suggest implementing the OSI model as a C library or something like that.
I would also be very cautious about the group aspect of project work. Again, you would need somebody with experience in the field to help you divide the tasks, if you insist on doing group work. Another suggestion would be to do it alone, and get the communal experience into the project in other ways, maybe through reading groups with people doing similar projects.
#1 problem with folks out of college? They have no idea how to start or maintain a basic workflow. Project based learning provides space for this.
This dude doesn't recognize the audience that most blog posts target.
Most people reading blog posts about learning code are NOT in college and ARE NOT employed by colleges.
So the challenge for learning is more about motivation, and projects tend to help you stay motivated more easily than little learning lectures and components.
Let's compare programming to figure drawing for a minute. To learn, you could, for example, study the details of some part of the figure, like the eye. Well, that's cool, but it's not gonna teach you much. That's because the hard part in drawing the figure is dynamics, balance, and proportions. It's not so much that the details are "easy" and don't require learning; it's that if your dynamics, balance, and proportions are off, the details will be too.
Why? That's not at all clear, even from your example. "If the mannequin is wrong the details will be wrong" does not imply "everyone involved must know how to do the mannequin". Just that someone must.
Let's go from figure drawing to truly multidisciplinary art -- 3D graphics. At least in the gaming industry, the rigger (creator of the skeleton and joints, relative expert in anatomy) is not necessarily the modeler, is not necessarily the animator, is not necessarily the texture artist, is not necessarily the 3D shader expert. None of these need to be particularly good at any of the other skills.
If you want the quickest path to a SV job, you learn the most modern frameworks and drill them until you can scrawl the boilerplate in your sleep. Project-based learning is extremely appropriate for this. You absolutely don't need four years of education to crank out a comfortable six-figure salary off this knowledge. (In fact, it might be a liability if your pet framework ages out.) You only have to be able to fill a need (even one will do) which a lot of companies have.
And if we're being really honest, this non-project-based stuff is pretty useful as SV job prep too. Not knowing things like linear algebra, statistics, and discrete math can be career limiting. I mean, how many of us are really ready to jump the gap to doing serious ML work?
I think this guy's definition of project based learning is "large finished project".
My idea of project based learning (which as a self learner I use quite a lot), is very _very_ small examples, try to make those examples work in the most unpolished way from a user perspective (focus on the core functionality) and explore the different ways in which they can work as a way to explore the topic or idea that you are interested in. _not_ "try to make some end product based on this idea", that's a whole different venture.
In his example of making a chat app to learn about networking:
> Over the next three weeks, Bob spends a lot of time building the GUI for the app, which he already knows how to do, and only a couple hours implementing the client protocol.
If he wanted to learn about MVCs and GUIs, this is indeed a good project. If he really wanted to learn about networking from the perspective of chat, he should have either avoided the GUI and come up with the simplest possible CLI-based interface... or avoided the interface altogether and just experimented with the core networking code, sending hardcoded strings or piping in some text without a nice CLI. I mean, this smartphone chat app is really a strawman IMO.
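To make the 'no interface at all' version concrete, here is a minimal Python sketch (the port number and message are arbitrary choices for illustration): the entire exercise is sockets, with a hardcoded string standing in for any GUI or CLI.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007   # arbitrary local port for the experiment
ready = threading.Event()

def server():
    # Minimal TCP server: accept one client, echo its message back once.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                       # tell the client we're listening
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(b"echo: " + data)

threading.Thread(target=server, daemon=True).start()

# The "client" is just a hardcoded string over the wire: no GUI, no CLI.
ready.wait(timeout=2)
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello")
    reply = cli.recv(1024)

print(reply)  # b'echo: hello'
```

Everything Bob would learn about networking lives in the dozen socket lines; nothing is spent on presentation.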
Disclaimer: I've read nothing about project based learning, I only recognise it as a concept that has emerged in my own personal learning strategies as I am sure it has in many others.
You may not have read anything about it, but you've nailed it.
If it's project-based learning, the learning has to take precedence over the project.
This is why I had side projects when I was a software engineer.
Most day job projects have to work, with minimal (technical, schedule, etc.) risk, and minimal development time and other costs. For a low-innovation project, this generally means using familiar tools. For an innovative project there's still plenty of skill development, but often in circumscribed areas. (This is true for journeyman to master engineers. When you're early in your career, you get learning opportunities for free out of any halfway-decent environment.)
A side project can take an arbitrary amount longer, or fail, without letting your team or organization down. (Or, a side project with a team can be designed as a learning experience, if that's a shared team goal. This is gigging with your band, not building with your crew.) This creates the space for prioritizing new concepts and skills, and practice on weaker skills.
I think a critical piece left out of the article is the role of emotional affect. Many people are learning because they want to produce a big, realistic set of behaviors (a project), so a good learning curriculum will motivate through a project and drill through small, independent exercises.
Another example: the article mentions Haskell being more useful than Node for teaching CS concepts, or piano for teaching music theory. While this might be true, we also need to consider how motivated students will be to use these tools over others. At the end of the day, the usefulness of a tool for teaching concepts will be (its effectiveness when used) * (how motivated people are to use it).
Disclaimer: I'm working on https://sagefy.org/
Then I went to college and learned the fundamentals. I started doing projects in my spare time (often by attending hackathons), and now those projects were getting a lot better. The academics were crucial. However, I had friends that never did projects outside of class, and as a result it hurt their learning.
Then, fast-forward to my first job, and I learned just as much in my first year there as I did during my entire time in college.
So IMO, project-based learning without fundamentals (which is what I was doing in high school) ends up being really slow and can produce poor results. But just learning fundamentals can be equally bad. You need both!
For me, what good is it to learn all of the fundamentals if I'm not sure those fundamentals are getting me closer to my goal?
Especially in the world of tech, there is no short supply of "critical fundamentals", but woefully less in actual implementation.
And in the tech world, there's a lot of implementation that lacks even a cursory understanding of fundamentals.
IMO, this isn't a problem in the approach of learning, but the learner stopping at a superficial understanding on either side.
Some people don't like to jump into new endeavors without understanding the concepts. And others don't want to be bothered with abstracts until they can see the results.
Good students beat bad ones, regardless of the journey that took them there.
For example, for learning web development, I'd prefer a project-based model where I learn concepts and apply them to my project side by side. Although the end result would be a mess and likely incomplete, I'd get to learn a lot from the mistakes and issues that come up along the way.
On the other hand, for learning about compilers, a traditional semester-based course, with 2-3 lectures per week and maybe a coding assignment per week that can be completed in an hour or two, would work best. In this case, while I don't have any real project at the end, I do learn the core concepts.
Imagine you want to teach a cohort of incoming CS students to build simple web-server-with-database applications.
The "traditional" way to do this is to first teach them classes on web dev, OOP, and databases. Afterwards, you give them a software project.
I agree with "modernists" that if you just teach the three classes, but don't set a project at the end that integrates the knowledge, you end up with people who are really good at passing whatever style of exam you set them but can still be poor at actually building applications. That's one reason why recruiters care about your portfolio as well as your CV, I guess. It's also why universities, since time immemorial, have required individual projects and dissertations to get a degree - after teaching you the classes, for your "masterpiece" you work on a project.
The problem with making everything a project and assuming they'll learn or teach themselves the basics as and when they need them, is that without some understanding of what the building blocks are to build an application you end up bogged down in the details and developing by trial and error.
Real examples I've seen: spending the best part of a week on code to manually parse incoming HTTP headers (missing the fact that the server library already does that for you); writing custom code to serialise lists into a database field because you don't understand foreign keys (and of course breaking 1NF in the process); loading individual items for which you have an "id" by "SELECT *" on the whole table and then doing the filtering on the server, in O(N); and of course writing code that's vulnerable to both XSS and SQL injection left right and centre.
Even in a project unit where every student has a slightly different and very open-ended project, there's some common ground, e.g. "using prepared statements" for which there is such thing as a "one right way to do it" and my experience is that actually teaching this kind of thing in isolation is the only way that works.
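As a concrete sketch of that "one right way" (the table and hostile input are invented for illustration), here is the injection-prone string-built query next to the prepared-statement form, using Python's sqlite3:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

name = "alice' OR '1'='1"  # hostile input

# Wrong: string interpolation lets the input rewrite the query itself.
rows_bad = conn.execute(
    f"SELECT * FROM users WHERE name = '{name}'").fetchall()

# Right: a prepared statement treats the input strictly as data.
rows_ok = conn.execute(
    "SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(len(rows_bad), len(rows_ok))  # 1 0 -- the injection matched every row
```

The placeholder version returns no rows because the hostile string is compared as a literal value, never parsed as SQL; that's the behavior worth teaching in isolation.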
Group projects... well, that is a another thing entirely.
We also have a co-op program, which is the best thing in the world really. I simply don't understand why most universities don't have one. We do 4 months of school, then 4 months of work in a company (then repeat). Five internships during a 4-year bachelor's give you almost 2 years of experience when you get out of university. Since they are paid internships, you can live well during school semesters and you don't have to work while studying. In addition, you have a much better idea of what type of job you want, since you tried so many different companies. I personally know that I like smaller companies that operate in the fintech market, and that I like backend more than anything else I tried.
> I don’t teach network programming and haven’t tested these specific ideas. But, in my specialty of software design and code quality, I use exercises built on similar principles all the time.
In the real world, learners do not come packaged with all the tools and links and resources they need to hammer out their solutions. Classrooms are great when you can have a textbook and pre-defined problems and such, but that is not how it works outside of the classroom (or the virtual classroom, as the case may be).
"Software design" is not an isolated component from network programming. It is that bridge between big data always getting piped in to (app / product / website) from the network, and how to help ( user / customer / other developer) where most engineers are hired to create value.
It is a good idea to attempt to design software with "code quality"; conceptually that sounds fantastic. But it is impossible to do without also factoring in a predictive element of how the software will be used, accessed, or even discovered over networks, stores, whatever.
Project-based learning is an awesome way to learn a new framework or language, especially if you already have a lot of experience. I also learn very well from tutorials when I'm a beginner at something. For instance, I followed some Blender tutorials to learn 3D modeling, and I think that's the only way I could learn. The Blender UI is far too complex to just click around and figure things out.
However, there are cases where you really do need to study a book and sit exams. I have many years of experience with AWS, but I'm not an expert and have lots of gaps in my knowledge. I've been thinking about getting an AWS certification, so that I can deeply understand all of their services.
Both of these maximalist notions are so far from true, it's not even worth asking which one is closer.
> Project-based learning is the most inefficient form of learning that still works.
Where it lacks efficiency in learning, it boasts incredible precision in application. Project-based learning is practical, if not fundamental. You will learn to do something tangible, but of course this may not be the most efficient way to teach conceptual lessons. Concepts and understanding (aka "the math") will be required to decrease learning curves for future applications.
So yes, both project-based and course-based learning are important, it isn't useful to completely eliminate one or the other, but I do feel the world is currently partial towards course-based learning.
This is incredibly effective because it reinforces your individual learning with the knowledge of your peers and provides a feedback loop to the professor who would tailor lessons based on what the class was having difficulties with.
TBL is not an easy program to run though, and requires a lot more forethought and time from the professor that is running it for it to be effective. There are some courses where it is a huge improvement over traditional teaching methods and some where I would argue it isn't.
E.g. I've been doing software a long time, so like the github guy in OP who liked project based learning, that works really well for me when I basically already know what I'm doing. E.g. I know C-based languages really well, put me on a project with a new language, and I'll pick it up, np. Learning a new language by drilling would be very boring for me.
But if I'm learning something truly new/novel to me, e.g. machine learning or Haskell, where I have zero existing intuition for how it works, then drilling constituent parts is very useful.
I don't think there is a real good or bad but rather depends on your goal and motivations. Depending on the goal, some people might learn more from projects than course content and vice versa.
Most of my time coding is spent poring over code that I didn't write (at least not recently), building the design in my mind, finding the spot to change, and then making that change.
In contrast, most school coding assignments are write-from-scratch one-offs, usually involving only one student's code.
Funny you choose Horowitz, Monk, Stravinsky and Cage as examples, each of whom either never attended, rejected, or had only limited academic education.
“5 PBL Pitfalls to Avoid” [^1] may not be the final or best word on PBL, and it focuses more on PBL in primary education, but it's relevant and I was able to find it in ten seconds via Google. ([^2], which links to this, is the top hit for “project based learning”.) “The Practice is not the Performance” would be a stronger article if it either (1) responded to PBL as it's actually used (in my understanding and experience) in schools, or (2) argued that, and why, college-level applications typically stumble into the pitfalls in [^1]. (Which may indeed be true, but I'm not prepared, on the content of “The Practice”, to weigh the author's claims over my own experience.)
The author's other point is around isolation practice. Well, yes. Olin's current Software Design course (intro to Python and programming, typically taken in the second semester) takes a textbook approach[^3] to the basic concepts of programming and computational thinking, and introduces projects as a concurrent stream. But also — musicians, athletes, and other performance practitioners are often advised not to perform isolations to the exclusion of longer phrases or entire pieces, lest (1) they forget how to combine the elements into a fluid whole, or (2) they remove the feedback loops that tell them what areas of weakness to isolate. The software analogue of (1) might be design principles and practices that don't appear with isolated elements or at small scale. The software analogue of using isolations within a project context might include discovering that you don't understand a language feature well enough to use it or debug it, taking time out to explore it by reading or experimentation, and then returning to the project. This doesn't downplay the importance of isolations, but it uses them as a complement to PBL, not a substitute.
I think there's a lot to be learned about software engineering (which I think is what the author means when he refers to “CS”) from instruction in music and in studio arts, but it's better learned from a deeper look at both formal and informal instruction in these areas.
There are some open questions, not (I believe) mentioned in the article but raised in some of the comments here:
* Who is PBL appropriate for? For example, Olin College uses PBL — mostly within the context of Team-Based Learning (TBL) — extensively. However, the Olin admissions process[^4] evaluates candidates within the context of a group activity, which overlaps substantially with PBL and TBL, and provides not only information about the candidates to the students, faculty, staff, and alumni on the team, but also provides a taste of the TBL/PBL experience to prospective students, that they can use to self-select prior to matriculation.
* In what domains is PBL applicable? It seems to me a clear win for many engineering, design, and entrepreneurship topics. Olin's Quantitative Engineering Analysis[^5][^6] is a multi-year experiment in applying PBL to introductory-college-level math and physics.
[^1]: Frank McKay, “5 PBL Pitfalls to Avoid”. https://www.edutopia.org/article/5-pbl-pitfalls-avoid
[^2]: Edutopia, “Project-Based Learning”. https://www.edutopia.org/project-based-learning
[^3]: Literally. Allen Downey, who designed an early version of the course and is on the faculty, wrote the textbook: _Think Python_. http://greenteapress.com/wp/think-python-2e/
[^4]: Olin College, “Candidates Weekends”. http://www.olin.edu/admission/candidates-weekends/
[^5]: Olin College, “QEA”. http://meet.olin.edu/olin-isms/quantitative-engineering-anal...
[^6]: Olin College, “Changing Course”. http://www.olin.edu/the-wire/2016/changing-course/
The author never talked about whether this was actually a problem.
And frankly, it's a shameless plug for a course that literally uses project-based learning in it.