
Overtaxed Working Memory Knocks the Brain Out of Sync: Study - bovermyer
https://www.quantamagazine.org/overtaxed-working-memory-knocks-the-brain-out-of-sync-20180606/
======
wm_throwaway
(Throwaway)

I work in human memory neuroscience / linguistics.

1. The 7-plus-or-minus-two study is entirely bunk: Welsh digit words are
longer to say, and spans measured in Welsh come out below 7. This kind of WM
capacity is about verbal rehearsal.

2. Here's background reading on the complexity of, and debates over,
oscillation functions:
[https://neurophysics.ucsd.edu/courses/physics_171/Buzsaki%20...](https://neurophysics.ucsd.edu/courses/physics_171/Buzsaki%20G.%20Rhythms%20of%20the%20brain.pdf)

~~~
wodenokoto
What is so controversial about this that you needed a throwaway?

------
mromanuk
> In 1956, the renowned cognitive psychologist George Miller published one of
> the field’s most widely cited papers, “The Magical Number Seven, Plus or
> Minus Two.” In it, he argued that although the brain can store a whole
> lifetime of knowledge in its trillions of connections, the number of items
> that humans can actively hold in their conscious awareness at once is
> limited, on average, to seven.

So your mind can hold 4-5 things at the same time, not 7, or it goes "out of
sync".

edit: added quote.

~~~
bovermyer
This seems to agree with my own experience. I've never been able to hold
seven things in my head at once; it's definitely closer to four.

~~~
DiffEq
...and you may not be able to do a handstand. But if you learn the technique,
you can do one easily within a few weeks or less. Memory is no different:
learn recall techniques and you will be able to remember more.

~~~
graeme
Do you have any evidence that this applies to short-term working memory?
That's what the article is about. The limit seems to be 3-7, and it's not
clear that it's malleable.

~~~
namibj
Well, if I'm sober from a mental perspective, I can reverse >= 11 digits
heard at a speed of 1~2 Hz.

~~~
graeme
Are you chunking them?

I don't really have a good sense of what that speed implies mind you.

~~~
osrec
I guess they meant 1 to 2 numbers said out loud per second

~~~
namibj
You are correct, and I can neither confirm nor deny your parent's question,
as I've been out of (mental) shape due to the time of day. I don't think I
did, last time I tried.

------
azhenley
I just wrote up a human memory section in my dissertation a few days ago.

There seem to be two limits to working memory more widely accepted than
George Miller's 1950s number: (A) 3-5 chunks [1], or (B) it _depends_ greatly
on the context and the type of chunk (e.g., visuospatial).

[1] N. Cowan. Metatheory of storage capacity limits. Behavioral and Brain
Sciences, 24(1):154-176, 2001.

------
CodeCube
I'm curious how this might apply to your average tech worker/programmer type
person... Are there common missteps we make that overtax working memory, and
how can we adjust our working rituals to mitigate them?

~~~
rashkov
In my opinion, unnecessary abstractions -- especially splitting code up into
different files -- are a tax on working memory and lead to difficulty
achieving flow state. Every layer of indirection takes up residence in working
memory, so it had better be necessary; otherwise it should be omitted, even at
the cost of some code duplication.
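To make the tradeoff concrete, here's a toy Python sketch (all names made up):
the same pricing rule written through layers of indirection versus inline.

```python
# Over-abstracted: three hops to discover that a "discount" is just 10%.
def get_rate():            # hop 1
    return 0.10

def apply_rate(price):     # hop 2
    return price * (1 - get_rate())

def final_price(price):    # hop 3
    return apply_rate(price)

# Direct: the whole rule fits in one glance -- one chunk of working memory.
def final_price_inline(price):
    return price * 0.90    # 10% discount

assert final_price(100) == final_price_inline(100)
```

Both return the same number; the difference is how many hops the reader has to
hold in their head to be sure of it.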

~~~
clarry
> In my opinion unnecessary abstractions -- especially splitting code up into
> different files

It gets especially bad when someone else wrote that code and you're just
getting into it. Layers upon layers of abstractions scattered across classes,
functions, files. Each adds very little on its own, but you kinda have to
keep it all in your head (or write it down, as I do) if you're planning to
grok it and fix that bug.

It's kinda like reading some piece of code not in its final form, but as a
pile of diffs being applied to it. Fun. :-)

I find older style procedural code with longer functions generally much easier
to get into than any OOP.

~~~
p1necone
As a counterpoint, I find breaking code up into separate classes and functions
to help greatly with minimizing what I need to keep in my working memory.

I don't often need to know what every single line does all at once, nor can I
actually keep that much information in my head anyway. I'd rather be able to
say "that's the function that frobbles the subductor, it's 5 lines long, and
right now I don't care how it does it."

Sure, the bug might be in that function - but if it's only 5 lines long and
does one simple thing, I can write a bunch of tests for that one thing and
work it out. When the code is broken up like this, I can keep an even larger
system in working memory all at once.
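For instance, a hypothetical tiny function (names invented, in the spirit of
the frobbled subductor) that does one simple thing and is trivially testable
in isolation:

```python
def frobble(subductor):
    """Sort the subductor's parts and drop duplicates -- one simple thing."""
    return sorted(set(subductor))

# Tests pin down the one thing it does; everywhere else it's a single chunk:
# "the function that frobbles the subductor".
assert frobble([3, 1, 3, 2]) == [1, 2, 3]
assert frobble([]) == []
```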

~~~
taneq
I think the "functions should be small and do only one thing" maxim is
important but needs to be paired with "things should be small and done in only
one function." The more you split logic across multiple functions, the more
likely that the bug won't be in any one function but in the way that you
composed them.
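A toy Python illustration of that failure mode (the helpers are made up):
each function is correct on its own, but the bug lives in their composition.

```python
def strip_ends(s):
    # Correct in isolation: trims leading/trailing whitespace.
    return s.strip()

def split_words(s):
    # Correct in isolation: splits on single spaces.
    return s.split(" ")

# The composition is buggy: strip() never touches the *inner* double space,
# so split(" ") produces an empty "word". No single function is at fault.
result = split_words(strip_ends("  hello  world "))
print(result)  # ['hello', '', 'world']
```

Unit tests on each helper pass; only a test of the composed pipeline catches
it.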

~~~
rashkov
This is one of the most astute criticisms that I've seen of coding style,
along with your earlier post regarding locality of reference and duplication
of statements vs duplication of intent. I think these are really great
principles for writing clear and beautiful code. Are these ideas something
that you've developed from experience? Is there a wider context and
conversation around this kind of thing, that you know of?

------
Nomentatus
OMG we still have mercury delay line memories as our "working memory", no
wonder the world is a damn mess.
[https://en.wikipedia.org/wiki/Delay_line_memory](https://en.wikipedia.org/wiki/Delay_line_memory)

"Miller thinks the brain is juggling the items being held in working memory
one at a time, in alternation. “That means all the information has to fit into
one brain wave,” he said. “When you exceed the capacity of that one brain
wave, you’ve reached the limit on working memory.” "

File it under "path dependence," Pops. TIE (This is Evolution.) (a la "TIL")

~~~
p1necone
It's interesting - we effectively have delay line memory, but rather than
bits, the individual chunks we can carry are big, fuzzy, nebulous concepts
that possibly encode quite a lot of information in one chunk - albeit with
many errors.

~~~
namibj
Consider that there are ways to keep semi-coherent, non-quantizing thoughts.
You can synthesize such an abstract concept if you step through a list of
'named' concepts that overlap with it, to generate the combination of all the
'named' concepts. You can even associate one such abstract concept with
another, like when you learn a single vocab word, but the repetition you need
to enable long-term memory has a strong impedance mismatch with the way you
generated that abstract concept.

Also, there are ways to form thought structures that you can't use a normal
single-step debugger on, as the intermediate states have no useful
interpretation. They are, however, much, much more capable: e.g. able to let
you scan through aisles at a large store while walking reasonably fast
through it, turning your head left and right. Your eyes scan roughly at
first, then move to get more detail on those areas in an aisle that the
thought process wants more detail on. Due to the inherent delay/reaction
time, this needs an interleaving of about 3~6 steps of delay between
subsequent, sequential viewings of the same area, if all these viewings are
decided with full knowledge of this area. The higher the interleaving, the
higher the load on this working memory, but the fewer multi-viewings of the
same area have to be done without intermediate deciding, as you can't handle
more areas than this interleaving factor at the same time.

A nice aspect of this fuzzy nature of associations is that you can directly
combine fuzzy associations from two such abstract concepts you can
synthesize. Don't refine those concepts too much, though. Compare, e.g., the
visual impressions from one historic city center with those of another. Don't
try to list each and compare them one-by-one; try to get a fuzzy state of
never-individually-enumerated visual impressions from one city center, place
it to the side (you can imagine it on a side in a virtual space with rough
directions relative to your brain, but don't try to relate it to the physical
world, or you snap out of the coherence and possibly lose part of the
memories), and then gather the same for the other city center on the other
side. Maybe switch between them a few times, like 2 or 3 times each, and then
just dissolve the separation, e.g., drop the association of which side each
concept was on, and all the other mental-space location information. Just
consider it no longer important; don't think about it in the moment. Prepare
how to drop one of like 4 or 6 or so low-complexity semi-abstract thoughts
(there should not be a linguistic expression accurately describing it that is
shorter than 7 syllables; this counts for each thought separately) before you
do the fusion. If you then run through the things you get when you brainstorm
small linguistic expressions (like maximum 5 syllables, preferably fewer) or
visual things (drawable to recognition in under 30 straight lines of finite
length, at the visual recognition skill common in Pictionary), you get what
both of these city centers have in common, with much of the sampling done on
the combined probability distribution this essentially is. The reduction in
noise/errors is related to what a quantum computer does, but the limits are
sadly much lower.

Be careful: you might like to use this to get rid of (some) emotions, and
that can hit feelings like hunger/thirst without trying to. It usually takes
years to get a good handle on those after you lose them.

Manipulation of these lower-level/monolithic thought processes can be done by
creating a self-feedback one that is trained to report to you as a
one-dimensional, non-quantized "feeling" (you probe it similarly to how you
consciously probe a specific bodily sensation, but by asking for an abstract
concept instead of a region of your body; naming it creates too much
overhead, as you don't need to refer to it directly from linguistic
communication). Like how you can feel how dry your eyes are when they are
dry, with less quantization than what you'd use to put it in words, and with
less fudging than if you tried to put it in numbers (even if those have
decimals).

------
hyperpallium
Is there evidence of a correlation between the size of working memory and
brilliance/genius? Or is it a universal human mind limit?

Although, for any even moderately complex problem, abstractions or chunks of
some kind are necessary, so proficiency in recognising, using, and creating
them matters more than holding one or two more items. Still, being able to
hold more likely helps - especially in managing them. Perhaps it helps a lot,
and the development of some abstractions may actually be impossible without
greater working memory.

------
yosito
The trend of turning untested hypotheses into clickbait headlines is getting
really bad. There is little evidence so far that the headline is true; it's
just one hypothesis.

~~~
dang
Ok, we've added the qualifier ": Study" to the title above. But I don't think
this complaint is so fair. As far as clickbait goes this title is tame.

~~~
yosito
The qualifier should be ": Hypothesis"; there is no study that points to this
conclusion.

