For instance, immediate feedback ought to improve learning, so they looked at which courses give immediate feedback.
But it would be even better if someone looked at which learners actually learned: you took the course, now build something, or pass a test.
There might be a difference in the sorts of "immediate feedback" that are effective. It's an indirect proxy for what you're after.
Unfortunately, as others have noted, doing this is intractably hard. We've reached out to dozens of coding tutorial companies, but none want to share their visitors' contact information (understandably). We've tried contacting students on our campus and others to find people who've used tutorials, but few have. If you have ideas about how to contact tutorial users, please share!
The even harder problem is measuring learning. There are essentially no reliable, valid measures of any knowledge of programming (exams in courses are mostly unreliable, invalid measures). It's something we're working on in my lab, but it will take years, as it has in math and physics education.
University of Washington
I suspected it was hard; otherwise, I assume, you would have done it. But I thought it was hard because getting everyone to take a standardized test before and after the courses would be hard, not because even the contact information was unavailable!
Also, what do you mean exams are unreliable? Is there something you can point me to? I did not know that was the case.
That's not to say it can't be done, but is a much larger and more expensive piece of research.
I agree, though, that project-based learning, where the student takes an active role in the creation and development of projects, is crucial. If you think about it, when students enter the real world and the workforce, everything they do will revolve around projects of various sizes.
Disclaimer: I'm working on sagefy.org
It seems to be a double-edged sword if you plan on using assessments (the only way I can think of to assess prior knowledge) to do it, because people want immediate satisfaction. Unfortunately, if you told people to take a 25-question pre-quiz before starting the course, I am sure most would simply leave the website.
Another way to assess knowledge would be to give the learner topics of varying difficulty to select from; it's unlikely that they'd choose something so easy it'd be boring, and if the topic turns out to be too hard, there should be an escape route to something easier.
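One way to sketch that idea in code, as a minimal toy: the topic names, the one-step difficulty scale, and the update rule below are all invented for illustration, not taken from any real system.

```python
# Sketch of adaptive topic selection: the learner picks from a shortlist
# near their estimated level, with an "escape route" one step easier.
# Topics and the 1-4 difficulty scale are hypothetical.

TOPICS = {
    1: ["variables", "printing"],
    2: ["loops", "conditionals"],
    3: ["functions", "lists"],
    4: ["recursion", "classes"],
}

def suggest(level: int) -> dict:
    """Offer topics at the learner's level plus an easier fallback."""
    easier = max(level - 1, 1)
    return {
        "recommended": TOPICS.get(level, []),
        "escape_route": TOPICS[easier],
    }

def update_level(level: int, solved: bool) -> int:
    """Nudge the difficulty estimate up on success, down on struggle."""
    if solved:
        return min(level + 1, max(TOPICS))
    return max(level - 1, 1)
```

The point of the escape route is that the learner never hits a dead end: even a wrong self-assessment only costs one easier step, not an exit from the site.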
A simple example is an online user manual, http://support.casio.com/global/en/wat/manual/5411_en/ . The topic of each section is clearly indicated, and when parts of another section are mentioned, a direct link is given. In this way the information is presented as the reader wants it.
The book Code Complete: A Practical Handbook of Software Construction provides a brief description at the start of the book of what each chapter contains and suggests reading orders based on what the reader is looking for. These are examples of the reader choosing the order.
Examples of relevant questions appear in the maths textbook Mathematical Techniques: An Introduction for the Engineering, Physical, and Mathematical Sciences (4th edition), where sections contain "check your understanding" boxes. In these boxes a problem relying on the information in the section is given, followed by a fully worked solution. The sections end with more problems related to the immediate section, followed by questions that involve knowledge from previous chapters. In the same way, code examples could be presented, with a build-your-own-program challenge at the end of each chapter.
I've always been a huge proponent of example-based learning, and less about theory and generalized concepts.
Still, though, I think the online tech ed space has a lot of growing up to do. In time, we'll all wise up to provide better experiences for learning. Until then, for the most part, we're stuck with:
1. Google: "How to do X in Y"
2. Watch 3 different videos or read 3 different articles
3. Apply it.
4. Run into a snag.
5. Google / Stackoverflow
Too many new learners are stuck, unable to make the leap out of the browser-based IDE and into running their own code locally on a machine or on a server.
Learning multiple languages (of different paradigms) led me to re-implement code from one language in another. Even though the first version may have been heavily based on some tutorial, when translating my own code I had to be able to reason about it, break it apart, and re-compose it.
This line of thought helped make me more capable of talking about and thinking through my code as an idea, where the implementation was just an exercise.
If anyone else is teaching themselves, I would advise the following:
+ Learn a high-level OOP language
+ Learn a high-level functional language
+ Translate programs between the two, focusing on really working with the respective paradigm.
+ Move on to something lower level. Many people will say go with C to learn manual memory management. I found Rust interesting, though difficult at first. Up to you; just seek lower-level understanding.
+ Learn an assembly language, fake or real. Shenzhen I/O and Human Resource Machine are good introductions to this.
+ Build skills in your languages through katas. Learn to think and write in a language with minimal reference. This is simple, persistent, repetitive practice.
+ Read about an area of discrete math you find interesting. You will likely not be as strong at math as formally educated peers; discrete math can at least give you some good flair.
+ Seek constant review from peers.
+ Don't always agree with peers. Find ways to defend your viewpoint with good arguments, not "Java sucks." Be able to say why you have the right idea.
+ Learn simple circuits. They help you think atomically. There are even sites that have virtual breadboards so that you can experiment and learn for free.
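To make the "translate between paradigms" step concrete, here's a tiny sketch in Python (which supports both styles) of the same computation written object-oriented and then functionally. The class and function names are just illustrative:

```python
from functools import reduce

# Object-oriented version: a running total lives in mutable object state.
class Accumulator:
    def __init__(self):
        self.total = 0

    def add(self, value):
        self.total += value
        return self.total

# Functional version: no mutation; the total is folded out of the inputs.
def accumulate(values):
    return reduce(lambda total, v: total + v, values, 0)

acc = Accumulator()
for v in [1, 2, 3]:
    acc.add(v)

assert acc.total == accumulate([1, 2, 3]) == 6
```

The exercise isn't the arithmetic; it's noticing that the OOP version threads state through time while the functional version threads it through arguments, and being able to explain why both compute the same thing.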
Recipes can work, but they are just the first step. Learning how to "remix" a recipe and make it your own is the next. It's a long journey, for sure. I would agree that online resources that are the most visible are generally very entry-level and hand-holdy. I would also add that there is a vast amount of deep knowledge freely available for those wishing to go further.
Loksa, D., Ko, A. J., Jernigan, W., Oleson, A., Mendez, C. J., & Burnett, M. M. (2016, May). Programming, problem solving, and self-awareness: effects of explicit guidance. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 1449-1461). ACM.
I think self-sufficiency should include building a local environment you can work in without having to rely on some SaaS/cloud-based IDE.
Is there even a single tutorial that a self-taught undergraduate coder (one who has already gone through a number of simple tutorials, can code in a few languages like Python and C#, and can use GitHub) can take to learn how real-world apps are made, what professional coders (professional ≠ veteran; it just means a person who earns their living doing it, usually working in a team) actually do all day long at their workplaces, what they are expected to know, how to solve interview puzzles, and everything else needed to go and get a coding job with reasonable ease?
As far as I know, many companies need coders who can join a team and start writing and maintaining plain, ordinary code: not necessarily brilliant code, just code that is readable, maintainable, reasonably fast and reliable, and solves the simple tasks a particular piece of it is meant to. As far as I know, a huge portion of people could learn to do this with ease (you don't need an IQ over 150; 100 is enough, even 90 will do, which covers the majority of the population). But people don't know how.

And knowing a programming language's syntax and being able to write a simple script for yourself isn't enough: 99% of people who speak a language and can write a simple essay have no idea how to write a novel (not necessarily a best-seller, even a mediocre one). They need to be taught to structure it, introduce and develop characters, describe places and situations nicely, bake an idea in, maintain suspense, organize their writing process, and so on. It all seems arcane to them, while it is essentially just a number of plain and simple principles that could be outlined in a single manual. If only somebody with both the knowledge (skill) and the talent to explain would bother to write one, instead of a huge tome practically obsolete by its release date.
Let's use Java for simplicity. We're going to do a non-trivial web-based app, so we dip into the exciting world of Java EE.
We'd probably need to throw Spring Boot or CDI onto the pile of things to be a little familiar with.
Then you need to teach how app servers work: getting one set up, which computer/OS, etc.
Zooming in on the build process, "Setting up Jenkins" is a big tutorial all on its own, and you'd need to know Git or SVN or one of the others.
Then you step into building a CI pipeline and a test suite (functional, unit, and integration tests), and building reporting on top of all that.
That would be a mega tutorial series. And the moment you finished it, people would start picking it apart for being out of date, or attacking it over language and tooling choices.
But yeah, the churn in the industry is crazy, so it would take some work to compile a coherent, orthodox list that didn't go out of date the moment you published it.
Most article writers aren't differentiating their audience enough, making their articles difficult to read for most people. There is a huge variance of "beginners", for example. Further, most developers who write these articles don't have a holistic learning framework in mind. Lastly, I'm seeing far too many technical tutorials just for SEO or attracting eyeballs lately. These articles are great for attracting attention but are typically technically shallow and add to the confusion rather than help.
Probably not a big deal, but I thought it was an interesting connection.
After the research, I raised some money through a Microsoft Research grant to support some additional work and pay for bandwidth for a decade. Just $4,000 left until I have to pull the plug :( If you like Gidget, please make a donation to support our skeleton crew.
Most people creating educational content simply do not have those skills.
1) Academics in education talk about the use of technology in education but often don't know how to use technology in depth.
Story: at the World Conference of Online Learning, held at a very capable hotel, there were no power bars for laptops, and no sessions were recorded. Fewer than 5% of the 1,500 attendees (academics) carried or used a laptop or tablet for notes; typing fast to take notes drew comments. This was, apparently, one of the leading conferences in the world: over 100 countries, 500 speakers, and 200 sessions.
2) Some academics seem to regularly trash educational technology that may in fact be delivering well for students, beyond the academic's ability to understand.
One talk I attended at this conference was a Ph.D. essentially skewering Duolingo as a poor way to learn a language. The arguments may have had merit, but I found it hard to take the talk seriously, because it is well known that millions of users report positive learning experiences with tools like Duolingo. Learning to speak a language is incredibly valuable for many people.
3) If academics don't use the right terminology...
It's a little baffling why academics use the word pedagogy when speaking about the use of technology in education.
Pedagogy is how children learn, not adults. Andragogy is how adults learn.
"Pedagogy", "improving learning experiences", and "improving student outcomes" sit on a long chain of lean-startup-type groupspeak, where people continue doing what they do but sprinkle a little innovation dust on it. Want to upset an academic? As a non-academic, correct them to use "andragogy" instead of "pedagogy".
4) Innovation in Education + Technology.
Technologists can learn education more easily than educators will learn technology. These two groups need to come together, or, as I suspect, educators competent in understanding the capabilities and possibilities of technology will need to exist in higher learning. The rate of change in society may be hard to keep up with under a tenure-seeking mindset.
5) K-12 educators seem a lot more tech friendly than post-secondary educators.
Post-secondary educators often seem threatened by technology replacing them, and fail to recognize that improved digital learning experiences will be like better textbooks.
If our educators aren't levelling up on tech use, they won't be able to help the students of the future.
Other claims you made are wrong. That people report positive learning experiences with Duolingo is not evidence that they can read, write, or speak a language. In fact, there are several studies of Duolingo that show the exact opposite: there's massive variation in learning outcomes, and it really only supports the most motivated and resourceful of learners. But that's what research is for: to test assumptions with data.
The existing government-run education model is completely broken. It is openly hostile to innovative technology. It is a matter of philosophy, and it cannot be fixed. If the system is broken, it is time to have it replaced with something better. Eventually, something better will come along, and citizens/taxpayers will begin to question why this old model still exists, just as people ask why certain businesses still exist.
If there is any innovation to happen in education, it will not come from traditional universities or schools.