Hacker News
What Would Real Brain-To-Brain Communication Look Like? (discovermagazine.com)
27 points by DiabloD3 on Oct 18, 2015 | 8 comments



I use this form of brain-to-brain communication that involves using other parts of my body to make sounds that I assume other people can interpret using their ears and map onto ideas. It takes a lot of training during childhood to get this down, but once you do, it's pretty efficient, and you can transmit fairly complex ideas fairly quickly, especially if you can assume knowledge and context on the receiving end.


Brain-to-brain communication very much interests me, because it comes down to the essence of how we encode knowledge in biological signals. As with Google DeepDream, it's just absolutely fascinating to think about how a concept, like what a dog looks like and how it relates to other things, gets encoded as a series of bits.

I know we still don't understand, beyond a very probabilistic level, how the brain encodes this information biologically, but I've heard a lot about prosthetics. Can anyone explain how they control robotic arms using just the brain [1]?

[1] http://www.instructables.com/id/Mind-Controlled-Robotic-Arm/


I'm sceptical that the school-kid project shown in your link actually works as described. Johns Hopkins University Applied Physics Laboratory made a real one, I think by surgically rerouting 'spare' nerves to various points on the inside surface of the patient's chest and then putting electrodes on the exterior surface.

There's an interesting video explaining a bit: https://www.youtube.com/watch?v=9NOncx2jU0Q

from

http://www.jhuapl.edu/newscenter/pressreleases/2014/141216.a...
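As a rough illustration (not APL's actual pipeline), the surface electrodes in that approach pick up muscle activity (EMG) from the reinnervated chest muscles, and a controller maps activation level to a prosthesis command. A minimal sketch of that idea, with made-up threshold and window values:

```python
import numpy as np

def emg_envelope(emg, fs, window_s=0.1):
    """Rectify a surface-EMG trace and smooth it with a moving average
    to get an activation envelope. fs is the sampling rate in Hz."""
    rectified = np.abs(emg - np.mean(emg))          # remove DC offset, rectify
    n = max(1, int(window_s * fs))                  # smoothing window in samples
    kernel = np.ones(n) / n
    return np.convolve(rectified, kernel, mode="same")

def gripper_command(emg, fs, threshold=0.5):
    """Map peak muscle activation to a binary open/close command.
    The threshold would be calibrated per patient in a real system."""
    return "close" if emg_envelope(emg, fs).max() > threshold else "open"
```

A real controller classifies patterns across many electrode channels to select among multiple grips and joint motions, but the core idea, thresholding or decoding muscle activation, is the same.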


Ramez Naam wrote a cool sci-fi book (a trilogy, actually) on this topic called Nexus, though it's not based on much real science/technology.


Another book that explores this (pretty in-depth) is Forever Peace by Joe Haldeman. It won a couple of awards for sci-fi writing, and I found it enjoyable.

In it there are remote-controlled humanoid killing machines that are piloted by soldiers from a bunker through an electronic hole in their heads. They can communicate sub-lingually, hear each other's thoughts, and feel each other's feelings, to such an extent that the men in the platoon (it's 5 men, 5 women) start getting PMS.


There's a different type of brain-to-brain interface: I remember reading something about neurofeedback (EEG biofeedback), where they trained two users to increase a certain brainwave pattern (their alpha waves) in a synchronized fashion, and that created a strong feeling of connectedness between them.

I'm not sure this is true, though, because the field of neurofeedback has plenty of bullshit in it.


Perhaps it could reduce the latency of speech.


If we're going to compare human conversation to network performance, I think the limiting factor with speech is throughput, not latency. If I have a complex idea I can start explaining it pretty quickly, and whoever I'm talking to will understand the individual words I use easily enough, but it still might take a while to communicate the whole idea.



