
Previous efforts were less accurate; you may remember hearing jargon like “alpha waves” or “beta waves”. Participants would learn to move cursors by learning to produce the neurological activity the system was listening for.
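A minimal sketch of that older style of control, assuming a single EEG channel and simple FFT-based band-power thresholding (the sampling rate, bands, and cursor rule here are illustrative, not any particular device's):

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Estimate average spectral power in the [low, high] Hz band via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Simulate one second of EEG at 250 Hz: a strong 10 Hz (alpha) rhythm plus noise.
fs = 250
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(fs)

alpha = band_power(eeg, fs, 8, 12)   # alpha band: 8-12 Hz
beta = band_power(eeg, fs, 13, 30)   # beta band: 13-30 Hz

# Old-style control: whichever band the participant amplifies wins,
# and the cursor is nudged accordingly.
cursor_dy = 1 if alpha > beta else -1
```

The participant's only lever is making one band louder than the other, which is why this kind of interface is so coarse.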

This, on the other hand, watches for complex neural activity: it learns which pattern appears when a participant pictures drawing an A.

Think of the difference like this: before, the input was controlled by turning your whole body; now, the input can take individual sign-language letters.
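The pattern-matching idea above could be sketched as nearest-centroid decoding over per-electrode activity. Everything here is hypothetical for illustration (the electrode count, the random "templates", the distance metric), not the actual decoder:

```python
import numpy as np

# Hypothetical learned templates: one activity pattern per imagined letter,
# recorded across 64 electrodes. Real systems fit these from training data.
rng = np.random.default_rng(1)
templates = {letter: rng.random(64) for letter in "ABC"}

def decode(rates, templates):
    """Return the letter whose stored activity pattern is closest to `rates`."""
    return min(templates, key=lambda k: np.linalg.norm(rates - templates[k]))

# A noisy observation of the pattern for "A" should still decode as "A",
# because the noise is small relative to the distance between templates.
observed = templates["A"] + 0.05 * rng.standard_normal(64)
decoded = decode(observed, templates)
```

The point of the analogy: instead of one scalar knob (band power), the decoder distinguishes many distinct high-dimensional patterns, one per intended symbol.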




>hearing jargon like “alpha waves” or “beta waves”. Participants would learn to move cursors by learning to create the neurological activity that was being listened for.

Previous efforts included moving a cursor with an implanted Utah array, which I don't think was based just on broad brain waves, was it?

Are you thinking of non-invasive devices? Neuralink didn't invent implanted microelectrode arrays. Some of their press has been around being the first wirelessly connected to a receiver, but it wasn't first there either. What's new, I think, is mainly their roadmap for scaling to more and more electrodes and their custom processing chip, but they aren't sure whether the polymer coating they are using will hold up long term.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3715131/




