
I've worked alongside a group researching similar techniques; it doesn't actually need to be implants - http://bretl.csl.illinois.edu/projects

I think we were able to get up to something like 20 characters per minute, and most of the gain came from UX design: we went from 6 to 20 characters per minute largely through interface changes. I wouldn't be surprised if the implants improve the speed, but UX could probably have an equally large impact.




i think i believe that. what is the easiest thought abstraction our sensors can capture? well, the abstraction is largely defined by the UI. i like to think of it like language: UI components (words) come together to enable complex actions (sentences or thoughts). it raises questions like, what language does the brain speak in certain contexts, for certain outputs? that's going to be interesting to follow. what if we all think in wildly different ways, and that makes this hard? i can't see why we would, but i don't have a background in real brains

it may be that our current "AI" tools are helpful here--they're really good at composing "languages" that tie together different types of data. tying noisy brain-sensor data to our English alphabet seems like an example of that.
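To make the "tying noisy sensor data to an alphabet" idea concrete, here's a minimal toy sketch (every name, dimension, and data point below is invented for illustration, not from any real BCI system): each letter gets a hypothetical prototype sensor pattern, and a nearest-centroid decoder maps a noisy reading back to the closest letter.

```python
# Toy sketch: decoding noisy "sensor" feature vectors into letters with a
# nearest-centroid classifier. All prototypes and dimensions are made up.
import random

ALPHABET = "abcd"  # tiny alphabet for the sketch
DIM = 3            # pretend each letter evokes a 3-dim sensor reading

# Hypothetical per-letter "prototype" sensor patterns (random, but fixed).
random.seed(0)
PROTOTYPES = {ch: [random.uniform(-1, 1) for _ in range(DIM)] for ch in ALPHABET}

def noisy_reading(ch, noise=0.1):
    """Simulate a sensor reading for a letter: its prototype plus Gaussian noise."""
    return [v + random.gauss(0, noise) for v in PROTOTYPES[ch]]

def decode(reading):
    """Map a noisy reading back to the letter with the closest prototype."""
    def dist2(proto):
        return sum((a - b) ** 2 for a, b in zip(reading, proto))
    return min(PROTOTYPES, key=lambda ch: dist2(PROTOTYPES[ch]))
```

A real decoder would learn the prototypes from calibration sessions and use far richer features, but the shape of the problem is the same: a learned mapping from a noisy signal space to a small symbol set, which is exactly the kind of pairing current ML tooling is good at.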



