
How much bandwidth does the spinal cord have? - jacobedawson
https://www.reddit.com/r/askscience/comments/7l56sb/how_much_bandwidth_does_the_spinal_cord_have/
======
wcoenen
The top-voted answer assumes that a neuron firing or not firing (once per
refractory period) can be counted as one bit for the purpose of calculating
bandwidth.

But I think the peripheral nervous system uses firing _frequency_ to encode
intensity, so I'm not sure you can really equate one firing with one bit.

For example, over 15 refractory periods there could be anywhere between 0 and
15 firings, thereby encoding 16 different possible intensities. That would
effectively be only a 4-bit message per 15 periods, not 15 bits per 15 periods.
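A quick sketch of the arithmetic (the 15-period window is just the illustrative number from above):

```python
import math

periods = 15  # refractory periods in the observation window

# Counting each period as an independent fire/no-fire bit:
bits_if_binary = periods                     # 15 bits

# Rate coding: only the total spike count (0..15) carries meaning:
intensities = periods + 1                    # 16 distinguishable levels
bits_if_rate_coded = math.log2(intensities)  # 4.0 bits

print(bits_if_binary, bits_if_rate_coded)    # 15 4.0
```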

~~~
zamadatix
The classic goodput vs. throughput debate. E.g. Gigabit Ethernet is actually
transmitted at 1.25 Gbit/s on the physical layer because of 8b/10b coding.
Ethernet also requires interpacket gaps, and the packetization itself adds
overhead. Then of course there are the other protocol overheads layered on top
you could consider, plus however much overhead the data you were transmitting
carries due to its own encoding.

In the end there is really "bandwidth" which is how much symbol "space" is
available and various levels of "goodput" which is the rate of whatever you're
calling useful data.
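As a rough sketch of how those layers stack up for Gigabit Ethernet with a full-size frame (the constants are the standard figures; calling the 1500-byte payload the "useful data" is itself one of the choices described above):

```python
# On the wire: 1.25 GBd, with 8b/10b leaving 1.0 Gbit/s of usable symbols.
line_rate_bps = 1.25e9
throughput_bps = line_rate_bps * 8 / 10      # 1.0 Gbit/s

# Per-frame costs for a full-size frame, in bytes:
payload = 1500
preamble = 8    # preamble + start-of-frame delimiter
header = 14     # dst MAC + src MAC + EtherType
fcs = 4         # frame check sequence
ifg = 12        # minimum interpacket gap

bytes_on_wire = preamble + header + payload + fcs + ifg   # 1538
goodput_bps = throughput_bps * payload / bytes_on_wire

print(round(goodput_bps / 1e6, 1))           # ~975.3 Mbit/s of payload
```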

~~~
rzzzt
baud vs. bps

~~~
zamadatix
Other things like unusable symbols (e.g. IFG added to above) make it more
nuanced than that.

------
stevenjgarner
Do bundles of nerves (such as a spinal cord) have crossover interference,
limiting their bandwidth like copper cables? This was one of the drivers in
telecommunications towards the adoption of fiber optics over copper cables. The
600+ pair copper telco cables could not sustain 600 separate DSL circuits, as
the high-frequency signals would interfere with each other (crosstalk). In many
FTTH deployments, fiber was a cheaper transport than installing additional
multi-pair copper cables to meet the required customer demand. Of course, the
enhanced bandwidth capability of fiber optics, together with the ability to
have many more "circuits" over a single fiber, were other drivers towards the
adoption of optical broadband. If there is no crossover interference with
bundles of nerves, is that only because of the myelin sheath? Obviously any
significant interference might have an effect on the calculations of spinal
cord bandwidth.

~~~
smegger001
The nerves have a layer of fatty substance called myelin that acts as
sheathing/shielding around the nerve fibers to prevent crosstalk.
[https://en.wikipedia.org/wiki/Myelin](https://en.wikipedia.org/wiki/Myelin)

~~~
swordsmith
Myelin's function is to speed up signal conduction along the axon, not to
prevent crosstalk between axon fibers.

------
cr4zy
Similar question on Quora from 2012[1] per sense:

    
    
      Vision: 10Mb/s => 100Mb/s
      Hearing: 30Mb/s
      Touch: 135Mb/s
      Smell: 100k neurons
      Taste: 100kb/s
      Proprioception: ??
      Balance: ??
    
      Total: ~10Mb/s=>~1Gb/s
    

Internal brain bandwidth is also worth mentioning as this is the last
remaining wetware advantage over hardware due to the three dimensionally fully
connected heterogeneous cortical substrate. I can't seem to find a figure on
that though.

[https://www.quora.com/How-much-bandwidth-does-each-human-sen...](https://www.quora.com/How-much-bandwidth-does-each-human-sense-consume-relatively-speaking)

~~~
jlawson
I really question the hearing one, at least.

Shouldn't this basically just be the bandwidth of a headphone signal which is
at the lowest quality where you can hear degradation if it goes any lower?

For me that's something like 3 MB per minute for well-compressed Ogg files, or
50 KB per second. Yet that answer's 30 Mbit/s is roughly 75x higher than that
(or 600x, if you misread the figure as megabytes).

I can imagine some small difference, but to say my ear has evolved to transmit
~75x more sound data than I can actually perceive sounds off.
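Spelling out the comparison (the ~3 MB/min Ogg figure is my own estimate; the ratio depends on whether the Quora figure is read as megabits or megabytes):

```python
ogg_bytes_per_s = 3_000_000 / 60     # ~3 MB per minute -> 50 KB/s

quora_figure = 30e6                  # the Quora answer's "30 Mb/s"

# Read as megabits per second (the usual meaning of Mb):
ratio_if_bits = quora_figure / (ogg_bytes_per_s * 8)    # 75.0

# Misread as megabytes per second:
ratio_if_bytes = quora_figure / ogg_bytes_per_s         # 600.0

print(ratio_if_bits, ratio_if_bytes)
```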

~~~
Terr_
> Shouldn't this basically just be the bandwidth of a headphone signal

One issue with this approach is that it's very hard to tell the difference
between data that was never sent versus data that is there but our developing
brains were trained to ignore because it didn't turn out to be useful.

A simple example of this would be language differences, where certain
important language features simply aren't noticeable to non-native speakers,
despite having the same sensor-hardware.

~~~
perl4ever
Well, don't neurons get pruned as you grow up? So you don't necessarily have
the same hardware.

~~~
Terr_
I was thinking more about everything leading up to the brain, such as the
composition of your retina and optic nerve.

------
ramshanker
Follow up question I googled.

What is the resolution of our Eyes?

Ans: 576 megapixels [1]..... Holy..... And here we are, trying to rely on a
couple of cameras for autopilot.

[1] [http://www.clarkvision.com/articles/eye-resolution.html](http://www.clarkvision.com/articles/eye-resolution.html)

~~~
dahart
> Ans: 576 Megapixels

I’m skeptical. Since the human eye has fewer than 150M sensor cells, and our
sensors aren’t as good as 8- or 10-bit pixels, that answer is overestimating
by roughly 4x, and possibly a lot more.
[https://en.m.wikipedia.org/wiki/Photoreceptor_cell](https://en.m.wikipedia.org/wiki/Photoreceptor_cell)

~~~
Veedrac
He misconveyed the source. The claim was about the detail the eye can perceive
in a given field of view over time, by moving the eyes across it, not the
amount of information the eye captures at any one moment, which is vastly lower.

It's mostly irrelevant for cars TBH, since crashes are quick events.

~~~
edejong
That’s a simplification. The events leading up to a crash are important. The
spatial understanding of the situation on the road, the behavior of other
drivers and other factors all play a role in the seconds before a crash.

~~~
Veedrac
Yes, but there's a time-accuracy trade-off when you see things by scanning
around. A good, wide camera trades remembered peripheral detail for
lower-resolution but temporally up-to-date detail, which I'd expect matters
more in a crash.

------
DonHopkins
Something I posted earlier about spinal cords and high fidelity music:

[https://news.ycombinator.com/item?id=18750902](https://news.ycombinator.com/item?id=18750902)

Speaking of spines and copyright issues:

In K W Jeter's excellent dark cyberpunk novel "Noir", intellectual property
theft is viewed as literally killing people by removing their livelihood, so
copyright violators were punished by having their still-living spinal cords
stripped out and made into high quality speaker cords in which their
consciousness is preserved, usually presented to the copyright owner as a
trophy.

"In the cables lacing up Alex Turbiner's stereo system, there was actual human
cerebral tissue, the essential parts of the larcenous brains of those who'd
thought it would be either fun or profitable to rip off an old, forgotten
scribbler like him."

[https://marzaat.wordpress.com/2018/01/27/noir/](https://marzaat.wordpress.com/2018/01/27/noir/)

>There’s a lot to like in the novel.

>My favorite section is the middle section where the origin of the asp-heads
is detailed via McNihil’s pursuit of a small time book pirate and the
preparation of the resulting trophy. The information economy did, in this
future, largely come to place. As a result, intellectual property theft is
viewed as literally killing people by removing their livelihood. Therefore,
death is a fitting punishment. McNihil, in his point by point review of the
origin of asp-heads, notes that even in the 20th Century there was the phrase:
“There’s a hardware solution to intellectual property theft. It’s called a
.357 magnum.”

>Actually it’s decided that death is too good and too quick for pirates.

>Their consciousness is preserved by having their neural network incorporated
in various devices. (Turbiner likes to use stripped down spinal cords for
speaker wire.)

>This sounds like a cyberpunk notion but, in other parts of the novel, Jeter
takes a swipe at such hacker/information economy/internet cliches as
information wanting to be free (McNihil destroys a nest of such net hippies)
or the future economy being based on information. Villain Harrisch sneers at
the notion stating that information can be distorted but atoms – and the
wealth they represent – endure.

>Still, his novel is chock full of the high-tech, low-life that characterizes
cyberpunk.

(I'd quote some more, but as a high-tech, low-life net hippie, I'm afraid of
having my nest destroyed and getting my spine ripped out!)

------
garbre
I reject the premise that the nervous system has "bandwidth" in a sense
comparable to digital communications. Yes, nerves fire in discrete action
potentials, but every step of nervous transmission also involves a processing
step. Let's not forget: a huge benefit of neural nets is _dimensionality
reduction_ , which is at once compression but also the extraction and
abstraction of salient information. Does this represent the gain or loss of
information? It's a basically meaningless question; the question is how does
the system as a whole work, and how well?

Nor is it clear what the endpoint of a communication is, which is another
issue. Does information get counted twice if it's used unconsciously by the
brainstem and also rises into awareness to be used by the neocortex? The list
of questions can go on.

This bandwidth thing is one of the questions I find frustrating, on par with
people wondering if a simulated piece of brain has feelings (the answer is
NO). Why is left as an exercise for the reader.

~~~
carrozo
I keep thinking of this piece:

[https://aeon.co/essays/your-brain-does-not-process-informati...](https://aeon.co/essays/your-brain-does-not-process-information-and-it-is-not-a-computer)

~~~
jkachmar
I don't particularly like that essay because the author seems focused on the
idea that "your brain is a computer" is a metaphor rather than a theory (see
[1] for a more nuanced discussion).

The author correctly points out that past eras developed metaphors to explain
how the mind might work based on the technological innovations they were
familiar with, but I think there's a lot more nuance in the computational
theory of the mind.

Namely that the notion of computation is much more abstract, and potentially
more portable across disciplines, than some of the historical examples that
the author of that Aeon piece brings up.

Anyway, obviously I don't have any real answers but for whatever reason the
brain-as-a-computer theory rings pretty true for me and I've enjoyed reading
essays and watching talks about the topic [2] [3].

[1] [https://medium.com/the-spike/how-to-find-out-if-your-brain-i...](https://medium.com/the-spike/how-to-find-out-if-your-brain-is-a-computer-644a1a4fed1b)

[2] [https://medium.com/the-spike/yes-the-brain-is-a-computer-11f...](https://medium.com/the-spike/yes-the-brain-is-a-computer-11f630cad736)

[3] [https://www.youtube.com/watch?v=lKQ0yaEJjok](https://www.youtube.com/watch?v=lKQ0yaEJjok)

^-- This is the first part in an ongoing series of lectures that Joscha Bach
has been giving at the Chaos Communication Congress; if you watch it and find
it interesting, you should check out the subsequent installments.

~~~
jv22222
I followed your links and ended up in this rather excellent essay:

[https://medium.com/the-spike/your-cortex-contains-17-billion...](https://medium.com/the-spike/your-cortex-contains-17-billion-computers-9034e42d34f2)

One of the final points the author makes is that the brain might be a neural
network made up of as many as 89 million neural networks.

That's a staggering concept.

If true, I wonder how anyone stays sane with that level of entropy in the
system!

~~~
Veedrac
The "neural network of neural networks" thing is a bit of lavish exaggeration
TBH, because a network of networks is just a larger network. The human brain
has about 100 trillion synapses, which I think is a less obscure statistic to
marvel at.

------
kadendogthing
Here is a rule:

>Please don't submit comments saying that HN is turning into Reddit. It's a
semi-noob illusion, as old as the hills.

Allow me to point out a few characteristics of this post:

* A submission to reddit

* No actual scientific or academic details supporting his statements. Just some other known quantitative data about the spinal cord. OP even characterizes most of his statements as "gross assumptions"

* This post to HN has been upvoted.

Here is a scientifically reasonable perspective on this subject:

[https://www.reddit.com/r/askscience/comments/7l56sb/how_much...](https://www.reddit.com/r/askscience/comments/7l56sb/how_much_bandwidth_does_the_spinal_cord_have/drkxah0/)

------
nielsbot
I laughed at this quote (tl;dr) from the article: "about a 4K movie every two
seconds".

Hope that doesn't qualify as a spoiler.

~~~
Ididntdothis
Latency is quite high though as far as I know.

~~~
gingabriska
But I read somewhere that the brain is able to adjust heartbeat and breathing
at a very fine frequency, and keeping them in sync requires low latency, and
all this is done without concurrency locks.

So does the brain have a massively parallel architecture where key functions
have their own core?

~~~
inciampati
In biology, every single protein (or other biochemical entity) can be thought
of as its own core. It's parallel in an amazing and beautiful way, or in the
most boring way possible; it just depends on whether you're a computer
scientist or a biologist.

~~~
bdamm
There are to this day computer science researchers having conniptions about
the impossibility of massively parallel lock-free architectures based on
eventual consistency. Yet literally all they must do is open their eyes and
see the truth!

~~~
jplayer01
It helps that our brains and bodies are analog. It doesn't matter if a
particular variable is changed by different factors at the same time; the
total magnitude of all the changes is what matters. Thus, lock-free.

~~~
farazbabar
Kind of like CRDTs (conflict-free replicated data types, for the biologists
amongst us).
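For the computer scientists, a minimal sketch of that idea: a grow-only counter CRDT, where concurrent updates merge by taking per-replica maxima, so replicas converge without any locking (the class and names here are an illustration, not any particular library):

```python
class GCounter:
    """Grow-only counter CRDT: each replica increments only its own slot;
    merging takes the element-wise max, so merges commute, are idempotent,
    and all replicas converge to the same total."""

    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.counts = {}

    def increment(self, n=1):
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def merge(self, other):
        for rid, c in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), c)

    def value(self):
        return sum(self.counts.values())

# Two replicas update concurrently, then sync in either order:
a, b = GCounter("a"), GCounter("b")
a.increment(3)
b.increment(2)
a.merge(b)
b.merge(a)
assert a.value() == b.value() == 5
```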

------
CodiePetersen
You aren't really "sending data", though. Each one of the neurons just ends up
triggering a circuit, and some of the firing signals are actually inhibitory,
stopping other neurons from firing. Not to mention all of the chemical
signaling going on, and the different types of patterns/behaviors that get
activated through repetitive or limited activity.

So really, calculating bandwidth is kind of pointless. One, neurons use more
dimensions than just on/off electrical signals as their state; and two, when
you think about it, neurons don't really pass information along, they just
trigger circuitry.
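The inhibition point can be sketched with a toy leaky integrate-and-fire neuron (all constants here are arbitrary illustration, not physiology):

```python
def simulate(inputs, threshold=1.0, leak=0.9):
    """Toy leaky integrate-and-fire neuron: excitatory inputs are positive,
    inhibitory inputs negative; the neuron fires (and resets) only when the
    accumulated membrane potential crosses the threshold."""
    v, spikes = 0.0, []
    for t, i in enumerate(inputs):
        v = v * leak + i
        if v >= threshold:
            spikes.append(t)
            v = 0.0
    return spikes

# Same excitation, but one inhibitory input suppresses the first spike:
print(simulate([0.6, 0.6, 0.6, 0.6]))    # [1, 3]
print(simulate([0.6, -0.6, 0.6, 0.6]))   # [3]
```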

------
bryanrasmussen
I didn't read through everything, but people seem to be assuming some sort of
'normal' spinal cord without specifying what that is. So in this bandwidth
analogy they have for spinal cords, what is the effect of age, disease, and
injury?

------
diehunde
I think Google used to ask this question to candidates.

------
parentheses
Solid back of the envelope job.

------
jonshariat
- Modem - Cable - Fiber

Maybe in the future we'll have... Spinal Tap

~~~
grayed-down
This is a place for serious discussion. No humor or attempts thereof will be
tolerated as evidenced by the steadily graying color of your comment text.

