
I think I believe that. What is the easiest thought abstraction that our sensors can capture? Well, the abstraction is largely defined by the UI. I like to think of it like language: UI components (words) come together to enable complex actions (sentences, or thoughts). It raises questions like: what language does the brain speak in certain contexts, for certain outputs? That's going to be interesting to follow. What if we all think in wildly different ways, and that makes decoding hard? I can't imagine why it would, but I don't have a background in real brains.

Our current "AI" tools may turn out to be helpful here--they're really good at learning "languages" that tie together different kinds of data. Tying noisy brain sensor data to the English alphabet seems like an example of exactly that.
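For concreteness, here's a minimal sketch of that "noisy sensor sequence to alphabet" idea: a small recurrent model trained with CTC loss, the standard trick for aligning an unsegmented time series with a character string. Everything here (the channel count, the GRU size, the synthetic random data standing in for a real recording) is an illustrative assumption, not a description of any actual BCI pipeline.

    # Sketch: map a (time, batch, channels) sensor stream to per-step
    # log-probabilities over the alphabet plus a CTC blank symbol.
    # All sizes and the synthetic data below are hypothetical.
    import string
    import torch
    import torch.nn as nn

    ALPHABET = string.ascii_lowercase + " "   # 27 output symbols
    BLANK = 0                                 # CTC blank index
    N_CLASSES = len(ALPHABET) + 1             # +1 for the blank
    N_CHANNELS = 64                           # assumed sensor channel count

    class SensorToChars(nn.Module):
        def __init__(self, channels=N_CHANNELS, hidden=128):
            super().__init__()
            self.rnn = nn.GRU(channels, hidden)
            self.head = nn.Linear(hidden, N_CLASSES)

        def forward(self, x):                 # x: (T, N, channels)
            h, _ = self.rnn(x)
            return self.head(h).log_softmax(-1)  # (T, N, N_CLASSES)

    model = SensorToChars()
    ctc = nn.CTCLoss(blank=BLANK)

    # Synthetic stand-in for a noisy recording paired with the text "hello".
    T, N = 200, 1
    x = torch.randn(T, N, N_CHANNELS)
    target = torch.tensor([[ALPHABET.index(c) + 1 for c in "hello"]])  # shift past blank

    log_probs = model(x)
    loss = ctc(log_probs, target,
               torch.tensor([T]),             # input length per batch item
               torch.tensor([target.shape[1]]))  # target length per batch item
    loss.backward()                           # one illustrative training step
    print(float(loss))

The interesting part is that CTC never needs the sensor stream pre-segmented into letters; the model learns the alignment itself, which is presumably why this family of approaches suits messy, continuous brain signals.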



