For instance, immediate feedback ought to improve learning, so they looked at which courses give immediate feedback.
But it would be even better if someone looked at which learners actually learned: you took the course, now build XYZ, or pass a test, or something like that.
There might be a difference in the sorts of “immediate feedback” that are effective. It’s an indirect proxy for what you’re actually after.
Unfortunately, as others have noted, doing this is intractably hard. We've reached out to dozens of coding tutorial companies, but none want to share their visitors' contact information (understandably). We've tried contacting students on our campus and others to find people who've used tutorials, but few have. If you have ideas about how to contact tutorial users, please share!
The even harder problem is measuring learning. There are essentially no reliable, valid measures of any knowledge of programming (exams in courses are mostly unreliable, invalid measures). It's something we're working on in my lab, but it will take years, as it has in math and physics education.
University of Washington
I suspected it was hard, or else, I assume, you would have done it. But I thought the hard part was getting everyone to take a standardized test before and after the courses, not that even the contact information was unavailable!
Also, what do you mean by exams being unreliable? Is there something you can point me to? I didn't know that was the case.
That's not to say it can't be done, but it is a much larger and more expensive piece of research.
I agree, though, that project-based learning, where the student takes an active role in the creation and development of projects, is crucial. If you think about it, when the student enters the real world/workforce, everything they do will be centered around projects of various sizes.