This dual set of brain regions acts as a gatekeeper to long-term memory.
It only lets in things it thinks you might need later.
And it grades this need based on emotion.
So, if you want to recall something later, you need a purpose, and if the purpose ties into something emotional, all the better.
(I know some people are going to fight me on the assertion that emotion regulates memory, but do a little research before telling me I'm wrong.)
An example: before you start studying a topic, tell yourself, consciously, why you are going to learn this specific thing and how you want to use it. After studying, tell yourself how you are going to use what you have just learned, and imagine yourself applying it in different scenarios. Even these two simple exercises help the brain make the connections the GP described.
I don't want to get into the debate over whether NLP has a scientific basis; I just want to emphasize that most of these techniques were assembled by examining how experts in the relevant area do the things that made them successful.
I don't have enough time right now for a longer answer or sources, will try to expand later.
My spaced repetition system requires that I copy ideas I want to recall. I'm sharing my summary of the main ideas in case they help you.
# Learning by Building Connection
New ideas have to connect with what’s already there; they cannot simply be stored, as in a filing cabinet.
Throwing facts at people doesn’t work. You have to:
* connect ideas to other ideas and to everyday things
* understand an idea in multiple ways (e.g. words, visuals, etc.)
* maintain connections once you’ve made them
* remember that you don’t need to unlearn connections to make room for new ones
# Maintaining what you learned
Recalling is better for retention than re-reading. The spacing effect shows that we forget things quickly after first seeing them, but if you exert effort to recall them at increasing intervals, you retain far more than you would by cramming.
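The increasing-interval idea above can be sketched in code. This is a minimal toy scheduler assuming a simple doubling rule; real systems (e.g. SM-2, the algorithm behind Anki) also weight intervals by how hard the recall felt. The `next_review` function and the dates are illustrative, not any particular commenter's system.

```python
from datetime import date, timedelta

def next_review(last_interval_days: int, recalled: bool) -> int:
    """Return the next review interval in days.

    Toy rule: each successful recall doubles the interval;
    a failed recall resets it to one day.
    """
    if not recalled:
        return 1
    return max(1, last_interval_days * 2)

# Simulate a card reviewed successfully four times in a row.
interval = 1
review_day = date(2024, 1, 1)
schedule = []
for _ in range(4):
    review_day += timedelta(days=interval)
    schedule.append(review_day)
    interval = next_review(interval, recalled=True)

print([d.isoformat() for d in schedule])
# → ['2024-01-02', '2024-01-04', '2024-01-08', '2024-01-16']
```

The gaps grow 1, 2, 4, 8 days: each review lands just as you'd otherwise be forgetting, which is the whole point of spacing over cramming.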
# Learning through deep connections
You have to process ideas on a deep level to make them stick. A great way to learn something at a deeper level is to explain it to someone.
We also learn by doing, by manipulating things or concepts.
This all makes sense in light of the "human cognitive architecture". Working memory is severely limited while long-term memory is effectively unlimited. Thus, committing things to long-term memory frees up the scarce resource of working memory. In fact, there are many studies which indicate that the key differentiator between experts and novices is that they have a huge long-term store of schemata which can be applied in the problem domain and allow more efficient representation and manipulation in working memory.
"Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching"
> Evidence for the superiority of guided instruction is explained in the context of our knowledge of human cognitive architecture, expert–novice differences, and cognitive load. Although unguided or minimally guided instructional approaches are very popular and intuitively appealing, the point is made that these approaches ignore both the structures that constitute human cognitive architecture and evidence from empirical studies over the past half-century that consistently indicate that minimally guided instruction is less effective and less efficient than instructional approaches that place a strong emphasis on guidance of the student learning process. The advantage of guidance begins to recede only when learners have sufficiently high prior knowledge to provide “internal” guidance. Recent developments in instructional research and instructional design models that support guidance during instruction are briefly described.
> The past half-century of empirical research on this issue has provided overwhelming and unambiguous evidence that minimal guidance during instruction is significantly less effective and efficient than guidance specifically designed to support the cognitive processing necessary for learning.
> Our understanding of the role of long-term memory in human cognition has altered dramatically over the last few decades. It is no longer seen as a passive repository of discrete, isolated fragments of information that permit us to repeat what we have learned. Nor is it seen only as a component of human cognitive architecture that has merely peripheral influence on complex cognitive processes such as thinking and problem solving. Rather, long-term memory is now viewed as the central, dominant structure of human cognition.
> These results suggest that expert problem solvers derive their skill by drawing on the extensive experience stored in their long-term memory and then quickly select and apply the best procedures for solving problems. The fact that these differences can be used to fully explain problem-solving skill emphasizes the importance of long-term memory to cognition. We are skillful in an area because our long-term memory contains huge amounts of information concerning the area.
> The aim of all instruction is to alter
long-term memory. If nothing has changed in long-term
memory, nothing has been learned.
I'll quit there, but that's only a few pages in and it's all quite good and definitely worth reading IMO.
Do reading assignment, discuss/argue about it in class with a TA or professor to mediate and clarify stuff, maybe write a paper afterwards.
Engaging with the content is kind of a universal way to learn by doing.
Maybe that's OK for philosophy and sociology, but in other fields students will be actually asked to contribute to the world and so must be trained to do rather than to memorize.
Except for scholars (who are primarily historians) what of philosophy and sociology is worth remembering anyway? Most of philosophy was written before Darwin and certainly before any significant brain science research was done. Much of sociology is also questionable:
The only reason education focuses on memorization is because in 1892 a committee chaired by the President of Harvard decided that high school education in the USA should prepare students for collegiate study in particular subjects of his choosing. Should we, today, continue to follow the direction of those professors, who were only interested in ensuring that, when students arrived at their departments, they were sufficiently indoctrinated to be scholars in their fields?
Also, if I'm driving to a relatively new destination that I've been to before and think I'll go to more times, I try to remember the way without looking at the GPS, because I've found that if I look at the GPS every time, I don't end up remembering the route as well.
If Quality is something intrinsic to the observer/universe pair, learning is maximized along the path of most resonance.
Learning-to-learn tricks can often desensitize you to your "true passion" (this is assuming it exists) because there isn't a coherent narrative that allows your experience and skills to compound in a unique way.
Instead you have a collection of brute-force gained skills to solve problems for other people and by becoming good at this become numb to what "you" uniquely can see as a problem and create unique solutions to.
Analogies may be the exception because they grow, relying on roots and branches. An analogy that doesn't feel ham-fisted is naturally connected to your existing pattern of "you".
I also like "Refactoring your Wetware": https://pragprog.com/book/ahptl/pragmatic-thinking-and-learn...
However, from another angle, it's basically about using the idea of language learning to teach basic cognitive psychology.
(My PhD was in human memory, and I can't stop thinking about ways this book could be used as part of a cool learning course.)
Re: networks, I agree that there is likely evidence for distinct patterns of activation in various neuroimaging studies, but having worked in memory + neuroimaging, I think there's a serious risk that people will take something like "statistically significant difference in brain activity" and use it as a substitute for "substantial differences in learning behavior / retention". (This is a well-known problem in imaging.)
I'm not too familiar with L2 acquisition research, though; those are just my impressions from thumbing through some of the field. Would def love to hear some study recommendations :).
Greg Wilson (who founded Software Carpentry) has a great collection of thoughts on learning to program in general: http://third-bit.com/
Learning at scale via MOOCs seems to be enormously effective. EdX alone has issued 250K certificates to its 2.5M registered users, roughly a 10% completion rate, mostly in CS.
I'd be interested to see YC Startup Schools own results as well. Do at least 10% of Startup School 2017 grads go on to full time work on their companies?
I feel the need to inform you that you are, more than likely, quite ill-informed. Zines are a spectacular human achievement, worthy of your attention and respect. Otherwise, how do you explain this goddamn amazing piece of work:
Essentially, zines were blogs before the web.
I agree that the hand drawn style isn't the best way to consume this sort of content.
Each project needs time for the brain to digest it in order to make the right decisions.
You save time by going slow but working asynchronously, in parallel.