
Whilst testing like that seems to be possible for granular facts, and most likely is a great way to handle them, what about conceptual understanding?

The only "surefire" way I've found to test that in myself is condensing the concept in writing so that another person could understand it from my notes alone, or actually explaining the concept to someone in person.

That doesn't scale to 20-30 people in a class though, since validating that many explanations is not easy.

Random probing using application of the concept, however, encourages the cooking-recipe approach. There are only n different types of problems you can generate, and learning all of them is feasible. They're easy to grade, but they don't guarantee full understanding.
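
To make that concrete, here's a toy Python sketch (the templates, constants, and wording are all made up). Every generated problem is just one of n fixed templates with fresh numbers, which is exactly why memorizing one recipe per template is enough:

    import random

    G, MU = 9.81, 0.3  # made-up constants for the toy templates

    # Each "problem type" is a template with randomized numbers, so a
    # student only needs to memorize len(TEMPLATES) recipes to pass.
    TEMPLATES = [
        lambda m, v: (f"A {m} kg cart moves at {v} m/s; find its momentum.",
                      m * v),
        lambda m, v: (f"A {m} kg block slides at {v} m/s, mu = {MU}; "
                      f"find the friction force.", MU * m * G),
    ]

    def generate_problem():
        # Pick a template, fill in random numbers, return (question, answer).
        question, answer = random.choice(TEMPLATES)(
            random.randint(1, 10), random.randint(1, 10))
        return question, answer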

Then there's using compound problems, where you start mixing concepts. If a student understands, say, momentum and friction, they should be able to work out a problem involving both. Those are the bane of students who did not invest in conceptual understanding, since the number of recipes to learn suddenly increases exponentially. The first midterm of Physics 101 usually produces a number of exasperated "This wasn't covered in class" complaints. Yes it was, just not this specific recipe.

Now the issue with this type of problem is that each one covers several concepts, so pinning down the culprit(s) isn't exactly straightforward. I guess you could tackle this through deduction based on a number of pairings (say momentum & impulse, momentum & friction, and whatnot...), but noise would mess with that process.
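
A minimal sketch of that deduction idea, assuming each problem is tagged with the concepts it mixes and graded pass/fail (the tags and results below are invented). A concept with a high failure rate across its pairings is the likely culprit:

    from collections import defaultdict

    # (concepts the problem mixes, did the student pass it?)
    results = [
        ({"momentum", "impulse"}, True),
        ({"momentum", "friction"}, False),
        ({"friction", "energy"}, False),
        ({"momentum", "energy"}, True),
    ]

    fails, seen = defaultdict(int), defaultdict(int)
    for concepts, passed in results:
        for c in concepts:
            seen[c] += 1
            fails[c] += not passed

    # Failure rate per concept; the worst one is the suspected weak spot.
    rates = {c: fails[c] / seen[c] for c in seen}
    print(max(rates, key=rates.get))  # -> friction

With noisy grading the rates blur together, so you'd need many more pairings per concept before they separate cleanly, which is the scaling problem again.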



