Brain-Computer Interface Smashes Previous Record for Typing Speed (ieee.org)
161 points by 5rest 3 months ago | hide | past | favorite | 126 comments

> A machine learning algorithm then decodes the brain patterns associated with each letter, and a computer displays the letters on a screen. The participant was able to communicate at about 90 characters, or 18 words, per minute.

> By comparison, able-bodied people close in age to the study participant can type on a smartphone at about 23 words per minute, the authors say. Adults can type on a full keyboard at an average of about 40 words per minute.

I had no idea average typing speed was this slow, though the extreme slowness of touchscreen typing is no surprise.

I'm curious about the WPM test methodology; for example, whether all those studies use a standardised method or not. Because at least the various WPM testing tools and websites are extremely inconsistent in different respects.

My own typing speed ranges from ~40WPM to ~110WPM depending on what tool or test method I'm using. For example some tests just use any random word from the dictionary, which of course includes very long and complicated ones, others go as far as to limit themselves to the 100 most common English words. And even with those factors being identical I can still get different speeds depending on the UI; the cursor movement, presentation of the text and input latency all have a measurable impact on the end result.
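For what it's worth, most of those tools do share one convention: a "word" is 5 characters, spaces included, which is why word-list choice changes scores so much. A minimal sketch (the function name is mine):

```python
# The usual typing-test convention: a "word" is 5 characters, including
# spaces, so tests built from short common words inflate WPM relative to
# tests drawing on the whole dictionary.
def wpm(chars_typed: int, seconds: float) -> float:
    return (chars_typed / 5) / (seconds / 60)

print(wpm(90, 60))  # 18.0 -- the article's 90 chars/min = 18 WPM
```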

My average typing speed is 80-110, but I can max out at 121. This is for standard English words, not the crazy blocks of letters and special characters used for competition, which IMO aren't practical for the average user.

I actually spend a few minutes every few weeks taking typing tests etc, as it's a valuable skill to me.

I figure that if the average typing speed is somewhere between 50 and 80 WPM, and I can type 50% faster, then in theory I'm responding to emails and commenting in general 50% faster.

It's just a nicer experience using a computer if you can type fast. Sometimes it's easier to command-backspace and retype an entire line/word than it is to move the cursor and correct the error. It's also just plain nice to have only your thoughts to worry about when typing and not have the cognitive overhead of trying to keep track of where all the letters on the keyboard are.

All in all, I would recommend improving your typing speed if you spend any meaningful amount of time on a computer.

A fun 1-minute test that is somewhat practical here: 10fastfingers.com

Are tasks like responding to emails and adding comments limited by typing speed? I would have guessed most typing tasks, aside from copying text, are limited by knowing what to type.

Pretty much all of the time I'm typing natural language (chat apps, emails, writing this comment here) I'm going to be limited more by my typing speed (~120WPM) than by the speed at which I can think of the words I want to type.

A pretty trivial demonstration of this is that I can talk much faster than I can type, and I would expect that you can too.

I can talk much faster than I can type, but I have much higher standards for precision and language quality when writing. When I write in natural language, I almost always carefully read through my message once or twice before sending. I don’t think doubling my typing speed (currently around 80 wpm, I guess) would save much time.

There's a time and place for everything. Sometimes that level of precision is required, sometimes it's not.

I think it's frequently the case that you determine what you're going to type in a block-wise fashion. You know roughly what the next paragraph/sentence should look like, and then "speak" through your typing. If you type faster, you will almost certainly "speak" faster in this way.

Sometimes, yes. If your brain gets too far ahead of your writing, then you need to pause thinking or you miss stuff.

If this happens to you a lot, you might benefit from training your typing speed up. (or writing, or dictating, or whatever)

I feel obligated to mention https://typeracer.com

https://monkeytype.com is much cleaner IMO if you're not trying to battle anyone.

Cool site! 113 wpm at 95% accuracy... I can live with that.

It's nicer for "pure" typing speed of letter-only words, but I appreciate that typeracer will include text with punctuation.

Well, there went the last 30 minutes of my life. Turns out I can sustain 86 wpm, which is better than I expected. Was fun. Thank you :)

Addictive. Thanks for sharing :)

65-year-olds have now had the same 30+ years as everyone else to get up to speed on computers, but most still lean on the "hey, you know, I'm X years old, so you know me and computers don't get along, especially with the texting, heh heh."

There are also plenty of competent typists and computer users at that age, so it's all about the influences. I wonder how I can avoid that attitude and the brain-plasticity excuses.

> It’s all about the influences

This is key. Everything else is an excuse.

My Mum is in her 70s and until she lost feeling in one of her hands (chemotherapy 23-ish years ago) she was an exceedingly fast touch typist. Even without feeling in her hands, she could still type at ~40 WPM, possibly higher depending on what she was typing. Arthritis means she now mostly uses a tablet.

But looking at other people in that age range, I can think of literally zero from church who even know how to touch type. It's probably the difference between someone who had to do secretarial work and someone who did not.

I do think your first statement is absolutely on the mark. e.g., "I'm old, and I don't understand it, so I won't try."

The old adage is still true: whether you think you can or you can't, you're probably right.

As someone who is older, I'd phrase it slightly differently. "I'm old, my time on earth will be gone in less time than you've been alive, this thing you're all excited about seems silly and pointless and doesn't solve any problem I have, so I won't waste any of my precious time on it."

This is part of why I've never used Facebook, Twitter, or any of the rest of it.

Makes sense, though I think the same reasoning is often used to excuse obviously inefficient habits. Spending a week to learn to competently use a computer would likely save most people much more than a week over the following ten or even five years.

I agree with elmomle, but mostly because I've had a number of customers many years ago (back when I worked in tech support) who were older (70+, sometimes 80+) who were absolutely willing to learn new technologies and were generally sharp people.

The difference is definitely much less about not finding new technologies useful or irritating; it's a difference, I think, between people who are genuinely curious and interested in learning new things and those who aren't.

> wonder how I can avoid that and brain plasticity excuses

I've worked on this question personally. I decided that the obvious first step to gaining the mental flexibility of youth, is to mimic youth:

* Explore: When we're young, we constantly try new things, even when there isn't an apparent ROI. We try new arts, new experiences, new ideas, new hobbies, etc. We are not afraid to ignore the established way and invest in something new - often for the novelty (or rebellion) of it. We give the new things time; we play. We are curious, not critical - we wonder why and explore the idea instead of criticizing it and shutting it down. When we are old, we often stick to what we know well and criticize the rest.

* Push yourself: In school or as a junior employee, you can't say 'I've always done it this way' or 'I'm not interested in learning something new'. You have to learn and adapt. When we're old, power corrupts - most people make those excuses and they are generally accepted. Nobody else will push you, as a rule.

There are limits in life; I don't have as much free time now as when I was young, but that's not a deal-breaker (and I use time much more efficiently now, including by prioritizing and by knowing myself much better). Also, I don't 'play' like a 6 year old or even a 25 year old; I do it my way.

I also saw it as an interesting experiment: how much of the mental change was due to changes in practice, and how much to biology? I can't provide empirical data, but Exploration especially seems to have changed my life, not only mentally but, significantly, emotionally: I'm much more optimistic, less jaded, and more emotionally connected than I was. Life is invigorating. A warning, though: I'm challenging some norms of age, and therefore peers don't grasp it and sometimes reject me. I wish I could get through to them.

I can do about 45-50 wpm on my phone with swipe typing. It would probably be faster, but words like "our" and "off" are hard, because it seems like luck whether the keyboard picks "or" or "our", "of" or "off".

Swype is pretty great for writing with a phone. I kind of wish there was an improved Swype though.

Did you know that Swype stopped being available on Android in 2018?[1] I just learned this, and I still haven't found a keyboard that lets you do Ctrl+C and Ctrl+Z like Swype did. Tried SwiftKey and Gboard.

1. https://en.wikipedia.org/wiki/Swype

I use swipe-style input (don't know whether it's actually swype) for most things. It's somewhat annoying that it can't tell the difference between words with identical swipe patterns.

I find it more annoying that it can't even attempt to render words that it doesn't know. The reason is exactly the same as its inability to distinguish "isn't" from "orange" -- there just isn't enough information in a swipe to identify any intended letters. But a failure to recognize a novel word means you can't correct the IME - you have no other option but to switch input methods.

I think Swype could rely more on repeatable patterns. E.g., if you're swyping in a straight line and want a letter included along it, you do a little loop on it. With "our" I do a little loop on "u" and it doesn't pick "or" anymore. But it does then also pick "out"...

It should be way more accurate on the starting and ending characters. If I'm starting with "i" I'm not going for "orange".
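As a toy illustration of that idea (my own made-up weights, not how any real swipe keyboard works), a decoder could simply trust the endpoint keys far more than the interior of the path:

```python
# Hypothetical sketch: rank candidate words for a swipe, weighting the
# start and end keys heavily. All weights here are illustrative.
def score(candidate: str, start_key: str, end_key: str, path_keys: str) -> float:
    s = 0.0
    if candidate[0] == start_key:
        s += 5.0  # endpoints are the most reliable signal
    if candidate[-1] == end_key:
        s += 5.0
    # weak credit for interior letters that lie somewhere along the path
    s += sum(1.0 for ch in set(candidate[1:-1]) if ch in path_keys)
    return s

# A swipe from "i" to "t" along i-n-p-u-t should never surface "orange".
candidates = ["orange", "int", "input"]
best = max(candidates, key=lambda w: score(w, "i", "t", "input"))
print(best)  # input
```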

One thing that frequently trips me up are names. I'm swyping a regular word and it thinks I want to use a name I've never used before.

SwiftKey definitely notices if you do an extra movement on a letter—it's quite easy to differentiate between "to" and "too" with that strategy, even when it doesn't have the rest of the sentence for context.

> It should be way more accurate on the starting and ending characters. If I'm starting with "i" I'm not going for "orange".

The whole point of using a very-low-fidelity input method is that it's faster. How much care and effort are you planning to put into entering each word?

Swype is absolutely fantastic when it works right. I also agree that it would greatly benefit from having a better algorithm behind it.

Given how accurate search suggestion algorithms are becoming, I find it surprising that keyboard suggestions are as bad as they are. If swype could have the same predictive abilities as search engines it would be a major boost in speed and comfort.

I was with my previous girlfriend for ~5 years and have basically been using Gboard the entire time. We communicated via various messaging services literally every single day. If I were to try to swipe-type her name right now, there is a 50% chance that it would suggest a different but similar name that I think I have literally never accepted as a suggestion.

I was expecting this feature when I got my first Android phone over a decade ago, and I am still waiting for it. Is there some engineering step that I'm missing? Model my typing history and use that for prediction weightings. Is this unfeasible on my phone hardware?

Swype lets you long-press a suggestion and choose to never suggest that again.

Google's Android keyboard (GBoard) has swiping capability. Still not great, I think the problem is the keyboard isn't able to go back and correct previously typed words based on subsequent contextual information.

I suspect much of the population either doesn't type much or doesn't make much of an effort to type faster. My average is in the 140-160 WPM range (to type at 40 WPM consistently I would have to be deliberately slow, as even one-handed I'll easily go over 60), although I never learned touch-typing formally; I attribute it to spending extensive amounts of time using IM, which basically forces you to type quickly if you want to keep a reasonably natural pace of conversation and not lose your thoughts. Then again, I've interacted with much younger coworkers, many of whom were surprised at the speed at which I could return a reply (in full sentences with punctuation, without abbreviations and such) in a chat, so I'm not sure the motivation is still there today --- especially with audio/video calls being more common.

How do you manage to sustain 140-160 WPM? I consider myself a very fast typist, and I don’t think I ever broke 100 WPM (it’s the maximum I ever got in one of those typing tests)

I use a keyboard with a very light and springy feel, and low travel. It also helps to not think of individual letters but rather entire words, so you can be pressing multiple keys nearly simultaneously and "buffer" multiple words ahead.

(You'll know that you're buffering when you take a typing test and by the time you see that you've mistyped something you've already typed several more correct words, so it pays to be accurate before you try to buffer ahead.)

Touchscreen typing on native keyboards always feels slow to me, but with apps like SwiftKey which remap the boundaries of the virtual keyboard to better match your intended keystrokes I can type significantly faster. Add swiping and completion to that and I'm pretty confident one can easily beat that 23wpm mark

I thought Apple had a patent on this technology, interesting to see others do it

Apple was extremely late to the party, and it was one of the first high-profile features people said was blatantly copied from Android, which had had Swype and SwiftKey for years.

Exactly. Importantly, SwiftKey was acquired by Microsoft after first being released as a standalone app following a couple rounds of VC funding.

> I had no idea average typing speed is this slow, though the extreme slowness of touchscreen typing is no surprise.

same same.

We had to type at least 60 wpm in middle school in order to pass the "typist class" (around 2000), though I did not maintain that level.

edit: clarification downthread https://news.ycombinator.com/item?id=27423762

Keyboard wpm is about double; ~37-44wpm is P50. Still not super fast though.

> The fastest of these previous typing-by-brain experiments allowed people to type about 40 characters, or 8 words, per minute.

Ah, so it's the previous record for BCI typing speed, not "typing speed" in general. The original headline (on the IEEE side) is misleading.

For the record, it looks like the record for typing on standard keyboards is 212 WPM (on both QWERTY and DVORAK, amusingly), while the record on a stenography keyboard is 375 WPM.

Any idea on the speed of dictation?

Edit: Found a reference that average dictation speed is around 150wpm, but also mentions some tests going up to almost 300. So it would seem the average person could be almost 4 times faster speaking rather than typing. https://www.reference.com/business-finance/average-dictation...

Yes, I thought the exact same as you -- that the headline meant a BCI achieved like 400 wpm or something.

Once this mind-reading thing matures, so much software will have to be rewritten. Even now I sometimes joke around the office that "the PC is slower than me", because I've given it a sequence of commands and it's still performing them one by one while I wait for it to catch up. Now imagine this: commands going directly from brain to machine. In the engineering world, it would increase productivity by at least 4x, I'd guess.

Currently, to draw a line in AutoCAD, I have to either move the mouse to a "line" button and click it, or type the command "line" if my hands are on the keyboard. If I had a direct interface with the machine, just the thought of "draw a line" would put the software in line-drawing mode. Add eye tracking, and I could simply look at the point on the screen where the line should start and say in my head "here"... the opportunities are endless.

The computer will have a difficult time keeping up with the number of commands we are capable of issuing to it. BRING IT ON.

It was 2006 when I did speech-to-text data entry of 200+ mobile numbers using good old Windows XP. Progress seems to have been too slow since then.

The ideal CAD operator has a series of personalized single-key shortcuts under their left hand.

I challenge you to put your top ten commands on single key or double-same key (e.g. zz) shortcuts on the left side of the keyboard. Your CAD skills will speed up significantly once these become muscle memory.

If you're already able to do input faster than the computer can process your commands, how would a faster input method increase productivity?

Well, there is a wide variety of commands/actions. For the compute-intensive ones, a brain-computer interface wouldn't make any difference. However, there are also many mundane commands with near-instantaneous processing; those are bottlenecked by humans' mechanical limitations, and they are the ones that would improve tremendously.

Okay, first of all, this is a great achievement.

But comparing speeds isn't really necessary or even relevant. For a person who literally has no other way to communicate, nobody will complain that they can type 20% slower than a person with a smartphone. The fact that they can type at all is what's important.

Now, for the speeds: I type 80 wpm / 400 cpm, but, more importantly, I can also think freely while I do so. I don't even consciously know where the keys are on the keyboard; I need to focus on a character and then watch where the finger goes, as if I were a bystander, a passenger in a mech body observing it do something.

I can only assume that writing with a brain-computer interface requires incredible focus, and most likely engagement of the visual cortex, which would preclude using it for anything else (so no ability to imagine any imagery while typing).

> But comparing speeds isn't really necessary or even relevant.

This isn't a comparison between this and typing on a keyboard.

It's a comparison to the prior state of the art in this technology.

Most people use their visual cortex to see what they are typing, though.

EDIT: I mean, on the screen (and perhaps some on their keyboards).

Only while they are learning. Have you noticed when you type on the phone your fingers are flying more or less without you having to actually search for the letter?

I have wondered about this, and I think what is happening is that you actually learn the QWERTY layout on your phone; the fingers already know more or less where to go, and what you do visually is just aim where your finger needs to go (i.e., find the center of the button). The rest (like reading the letter) is most likely redundant.

I don't know how to or have time to test this, though.

I look away from the screen while typing, especially in bars when I've got a long chain to get in. This has the side effect of a few particular old dudes telling me to fuck off every once in a while, to stop showing off... as I slam out 90 wpm and look blankly past them.

Why make a cynical assumption at all? Francis R. Willett, Donald T. Avansino, Leigh R. Hochberg, Jaimie M. Henderson, and Krishna V. Shenoy are real people. You could, you know, just ask them.

What cynical assumption? It is well known that the visual cortex can't do too many things at the same time. The famous gorilla experiment shows this. And if you need to imagine anything graphical, it most likely involves the visual cortex.

Shiny title with the actual typing speed so well hidden.

It's silly that it's hidden, because 18 WPM for a quadriplegic still seems extremely impressive to me, given that it's working purely from brain electrodes and not some muscle the person can control. Yes, the headline is silly, because hiding the number ended up causing the snark about 18 WPM being slow.

I'm surprised, though, that the most efficient way to do this is still to have the person imagine physically drawing the letters by hand. I know motor neurons are probably our most reliable output, but I would still think that, with all the advances in training from noisy data in the past decade, learning what the thought of "A", "B", etc. looks like in the head would be doable.

Or even what the thought of hearing or saying "A", "B" etc looks like. The auditory cortex is activated when we imagine sounds. Or, if they wanted to stick to motor neurons, could they have the person imagine saying the letters with their mouth?

I'm sure they've thought about this stuff and it's harder than it seems, of course. But I would just predict that brain-computer interfaces 20 years from now won't involve imagining using your hand to write letters.

Great point! The electrode arrays used in these studies are implanted in the participants' motor cortex in an area called the "hand knob". Neurons here are highly modulated during motor output, not only for the hand but also for the rest of the body. This area is close to the output, though, and abstract higher-level features like the "idea of an A" aren't likely to be decodable from neurons there. Future research is looking at exactly this, but it will likely require recording in a different area.

For context, I did my PhD in the lab that did the work in this article.

When typing in passwords on a phone I have to visualize the keystrokes by thinking about the finger movements and concentrating on my forearms without really moving my fingers. Generally once fast and once slow to think about the actual activations.

> 18 WPM for a quadriplegic still seems extremely impressive to me

Honest question: how so? We should expect a direct neural interface to far exceed the speed of any manual input device, especially after 40-50 years of research.

Because I'm basing it on where things were 5-10 years ago (almost nothing like it actually working yet), not some hypothetical place where the research could be after 50 years.

GPT-3 is also very impressive to me, even though 30 years ago I thought we'd have Hal by now.

Some problems just turn out to be way harder than anyone anticipated, and so when they make advances I'm impressed.

> We should expect a direct neural interface to far exceed the speed of any manual input device

Counterpoint: If this were the case I would have already heard about techies getting brain implants to optimise their communication.

Since that hasn't happened, the only logical assumption is that available neural interfaces are slower than existing manual input methods.

Because manual input devices aren't available to quadriplegics and earlier solutions (e.g. sip/puff controller used to select single letters from an on-screen keyboard) are relatively very slow.

18 words/min

Lots of discussion of WPM speed. Prof. Shenoy presented at our weekly seminar (shameless plug - https://twitter.com/RiceNeuro) a couple of weeks ago. I think he considers the goal to be reaching the rate of speech, which is about 150 WPM. Beyond that, I think that there's a strong argument that our brain is not completely generating/processing thought fast enough to substantially increase output rate.

> “It’s at least half way to able-bodied typing speed, and that’s why this paper is in Nature.”

> The participant was able to communicate at about 90 characters, or 18 words, per minute.

> By comparison, able-bodied people close in age to the study participant can type on a smartphone at about 23 words per minute, the authors say. Adults can type on a full keyboard at an average of about 40 words per minute.

I used to type "semi-competitively" (aka just spending a lot of time on typeracer.com) and could get around 170 WPM if I put enough effort into it. (At least when I was younger. Probably slower now.)

18 WPM would make me feel like a snail. However, I feel fairly confident that within 1 - 3 decades some kind of BCI will let any user surpass the equivalent of 200 - 300 WPM after a bit of training. And hopefully even with a device that sits on your head rather than in your brain.

So I'm just kind of looking at this like ML research circa 1990. We're hardly even in the infancy stage yet.

Where do you get that confidence? Reading the article, I'm mostly seeing somewhat deliberately hyped-up claims and hand-wavey references to AI, neither of which give me much confidence in this tech taking a tremendous leap anytime soon. Still very impressive, but as someone who knows nothing about the tech, I'm not sure why I should expect it to improve so quickly?

The fact that we can apparently detect repeatable patterns of brain activity, and do it across a few modalities (handwriting, cursor movements) shows that a lot of the core problems are solved. We’ve achieved Proof of Concept.

Greater sensor fidelity, a greater quantity of data, and improved ML approaches, combined with alternate detection targets (particularly subvocalisation), mean that getting to faster-than-parity WPM seems like a realistic result. Incremental improvement of tech is something we humans are quite good at.

Well, I did say 1 - 3 decades, which isn't "anytime soon" I think. I just think the potential has been demonstrated, and to me it seems fairly feasible that if you hypothetically had a device that could transmit thoughts at near the speed of thought, you could get around 200 - 300 WPM. If it could somehow interpret more abstract things like images, then the information density could be higher, too.

I could be wrong, though. Maybe people's thinking rates wildly vary. And maybe my 170 WPM was pretty close to my actual speed of thought. It also may depend a lot on if it's something you're thinking about on the fly vs. regurgitating existing thoughts.

> 18 WPM would make me feel like a snail.

On the other hand, this interface is for people who have a current typing speed of approximately 0 WPM (or ~8 WPM if they were lucky enough to have the previous leading BCI technology available to them). So it's all about perspective.

Very true. I'd be overjoyed to have it if I were in such a situation.

Thanks, I thought from the headline this was enabling over 200wpm maybe.

Same. One of the reasons I hate using my phone to type is how slow I am compared to using a full keyboard. This would be hell for me.

The original iPhone is the perfect size for my hands and thumb typing.

I was quite fast in landscape due to the placement of everything and my ability to also rock the phone while I typed. This increased my speed dramatically.

Sadly, that is not really an option anymore, as I gladly gave it up for the larger real estate of the 6" models.

If I have something long to say and I'm not too concerned about accuracy, I just use speech-to-text.

That makes me wonder whether cell phones should let you customize the keyboard, esp. the size of it.

The current generation is a bunch of thumb typers. Ye abandon all hope..

I don't know if it's cultural, but as someone of the 'current generation' (Gen Z), I feel that the opposite is true and typing speeds increase as you get younger (to some limit).

I've tried to back this up and have only found a single, questionable data source[0] on the topic, but I do attend several typing competitions, and some of the young ages are phenomenal. I was beaten a few months ago by a 15-year-old who was typing 220+ wpm for 5 minutes.

[0] http://drwes.blogspot.com/2011/01/results-are-in-age-vs-typi...

I'll second the other Gen z person in the replies and say that lots of us hate thumb typing and if we're going to generalize across every single living person by one factor, my experience is that typing speed goes down with age.

How long until an inner monologue cannot hide from the "authorities"?

Also, Macross Plus, spoiler, for the subconscious throwing up images in a high stress environment; https://youtube.com/watch?t=2077&v=Fgg6p9gSeR0

It's too bad the ieee.org site itself no longer has words on it. It's just a JavaScript application spinner GIF.

works for me, it auto-reloads the page for some reason tho

Speaking of BCI, there are startups intent on implanting thousands of nanoscale sensors directly in the brain to gather copious amounts of data directly.

"You are seeing this advert because we detected that you thought about the Budwesier 'Wassup!' advert that aired 22 years ago and it increased dopamine uptake by 3.7%"

I wouldn't call it "copious amounts of data directly"; you still gotta train yourself to use it, and the interface itself has to be trained to recognize you.

It might be enough to make people walk again though.

Partially correct, except the vast numbers of sensors have additional multiple uses including detailed monitoring, biofeedback, and actual input.

I write at 50 wpm, type at 100, and speak at 150; a brain interface would have to hit 300 or more to make it worth wearing another wearable.

That's surprising to me. I mean, if I could put on a headband or hat and "type," hands free and wirelessly, on any connected Bluetooth device, that would be pretty incredible even if it only went at a portion of my current speed/accuracy.

Obviously, that's still sci fi at this point, but that's what you seem to be rejecting here.

To each their own, I guess.

I type only slightly slower than I think, so really no keyboard interface would be worth my while.

This is a really good point, I work in an area that requires careful written communication. So sometimes my thought output is only five words per minute, and that includes multiple revisions which would look crazy using a brain interface. :)

What's the better measure of typing speed: full words correct with no typos, or full words typos included?

A buddy of mine used some timing app for our typing speeds; you had to backspace repeatedly to retype a word if you misspelled it. I argued that that's unrealistic: you can just use spell check, or catch the typos during your editing read-through.

> What's the better measure of typing speed

Full words correct with no typos.

If you want to measure this, you're interested in the combination of reading speed and muscle memory while typing. If you're making typos, your muscle memory is not good. Using a spell checker to automatically fix those words... Well, the number is not comparable, then.
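There's also a middle-ground convention some tests use, "net WPM", which counts everything typed but docks a word per uncorrected error. A rough sketch (real tests vary in the details):

```python
# "Net WPM" convention: gross WPM minus one penalty "word" per
# uncorrected error per minute, floored at zero. Details differ
# between typing-test tools; this is just the common textbook form.
def net_wpm(chars_typed: int, uncorrected_errors: int, minutes: float) -> float:
    gross = (chars_typed / 5) / minutes
    return max(0.0, gross - uncorrected_errors / minutes)

# 500 characters in 2 minutes with 4 uncorrected errors
print(net_wpm(500, 4, 2.0))  # 48.0
```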

That feels pedantic to me. A more important, but also more difficult, measure would be of information conveyed, regardless of how it's accomplished.

A very effective friend of mine once said if your messages are perfect you're spending the oo much time on them.

It depends on the context. If you write for a living, then one relevant metric is how quickly you type with no errors (although perhaps a better metric would be how quickly you get error-free work product out, including revisions).

If you're talking about who can send text messages to another person faster, then the "typos included" metric makes more sense.

> the oo much time

Was that intentional? Looks like the other two replies may have read right past that without noticing it...

"Measuring typing speed" is just silly, if you ask me. For the reasons you already pointed out.

Typing speed is not about understanding information to its fullest, it's just a somewhat quirky metric for the speed of replication of words.

How does ‘muscle memory’ apply to a brain computer interface?

OP was talking about comparing typing speeds with his buddy, not the neural interface.

My typing class in high school only accepted perfect speed tests. No errors at all. Given how many typos I see in even business correspondence, I wonder if 'editing read through' has gone the way of the manual typewriter :-P

Apologies to those who've heard it before, but whenever BCI comes up I feel obliged to point out that, when you connect a computer to a brain, the brain is the more sophisticated information processor.

Use hypnosis.

You can have effective fast communication to the machine without any fancy technology (and certainly without surgery!). The subjective experience is "think to type". I expect that these days you can use a camera and ML to "read thoughts" directly from facial muscles, no need for GSR or IMUs on the fingers.

Interesting that the method for this interface was imagining writing the word, I guess too many random thoughts would enter your writing if it was just based on mental-verbalizing (saying it in your head).

It would be really cool to have a BCI that runs off mental-verbalizing and have a chopper-style rapper use it (e.g. Busta Rhymes). Imagine people doing transcription learning chopping as a skill to get an edge on their job!

This is the "next" technology I'm excited about. A cap/something that lets me provide another input device to my machine where I can treat it like a keyboard. Imagine being able to think about a chord and have that input sent to stdin.

What could be extra cool is just imagining a loop construct in your head and it just appears in your text editor.

Is there already a scene for this? Sign me up.

I'll be excited when it lets me do things I can't yet, like control a second pair of arms. Every piano player's dream.

Robotic extra thumb: https://www.newscientist.com/article/2277955-this-robotic-ex...

But they use toes to control the thumb instead of BMI.

A neat thing is that this uses a language model to reduce the error rate a lot. Autocorrect, conceptually, though theirs is offline rather than as-you-input. In a way the clever software involved is maybe the thing that most makes it relevant to this site.

Other tricks we use for noisy touchscreen input could apply here too. For one thing, we've got different models for collecting input. You could try offering autocompletions/predictions to make typing long words faster (long a thing in e.g. single-switch input tools), if watching a screen doesn't slow the person down much. You could try a swipe-like flow where you collect a chunk of data imprecisely but fast (mentally type or scribble a whole word, say), then offer choices.

(Thinking about other extremes, I wonder if there's a way to get faster input using a T9-like reduced alphabet (3x3 grid, pick row and column?) or something like that. Or if it could someday work for people to try to speak or visualize words instead of handwriting.)
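To make the reduced-alphabet idea concrete, here's a toy sketch (the cell layout and word list are made up for illustration): the user only picks one of 9 cells per letter, and a dictionary disambiguates the cell sequence, exactly like T9 did on phone keypads.

```python
# T9-like reduced alphabet: 9 cells, ~3 letters each. The user selects
# only the cell (a much coarser, easier-to-decode signal than a full
# letter), and a dictionary resolves the ambiguity afterwards.
CELLS = ["abc", "def", "ghi", "jkl", "mno", "pqr", "stu", "vwx", "yz"]
WORDS = ["hello", "world", "brain", "speed", "words"]  # toy dictionary

def cell_of(ch):
    """Index of the grid cell containing a given letter."""
    return next(i for i, letters in enumerate(CELLS) if ch in letters)

def candidates(cell_seq):
    """All dictionary words consistent with a sequence of cell picks."""
    return [w for w in WORDS
            if len(w) == len(cell_seq)
            and all(cell_of(c) == s for c, s in zip(w, cell_seq))]

seq = [cell_of(c) for c in "hello"]  # what the user would select
print(seq, candidates(seq))          # [2, 1, 3, 3, 4] ['hello']
```

With a realistic dictionary, sequences map to several words and a language model would pick among them, which is the same cleanup step the paper already relies on.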

You might be able to glue the letter decoding and language model together more closely--feed the letter-decoding NN's uncertainty and less-probable guesses (25% chance this 'e' was really an 'l') to the lang model.
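That kind of fusion can be sketched with a toy beam search: sum the decoder's log-probabilities with a character language model's, and keep the best few hypotheses at each step. Everything here (the probabilities, the bigram table, the beam width) is invented for illustration, not taken from the paper.

```python
import math

# Toy per-character probability distributions from a hypothetical
# neural decoder; note the 'e' vs 'l' confusion at the second step.
decoder_output = [
    {"h": 0.9, "n": 0.1},
    {"e": 0.6, "l": 0.4},
    {"l": 0.8, "i": 0.2},
    {"l": 0.7, "i": 0.3},
    {"o": 0.9, "c": 0.1},
]

# Tiny character bigram model standing in for a real language model;
# unseen pairs get a small floor probability.
bigram = {
    ("h", "e"): 0.5, ("e", "l"): 0.4, ("l", "l"): 0.3,
    ("l", "o"): 0.4, ("n", "e"): 0.1, ("h", "l"): 0.01,
}
FLOOR = 0.001

def lm_prob(prev, cur):
    return bigram.get((prev, cur), FLOOR)

def beam_decode(frames, beam_width=4):
    """Combine decoder and language-model log-probs via beam search."""
    beams = [("", 0.0)]  # (decoded text, cumulative log-probability)
    for frame in frames:
        candidates = []
        for text, score in beams:
            for ch, p in frame.items():
                lm = lm_prob(text[-1], ch) if text else 1.0
                candidates.append(
                    (text + ch, score + math.log(p) + math.log(lm)))
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
    return beams[0][0]

print(beam_decode(decoder_output))  # -> hello
```

The language model pulls the ambiguous second character toward 'e' because "he" is far more probable than "hl", which is the whole point of fusing the two scores instead of post-correcting.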

You could learn a person-specific language model, seeding it with a person's writings/speech before the injury or disease (if available), whatever else you think they might need (family/friend names, care requests, etc.), and training the language model as they use the interface (paper already does that with the character-recognizing model).

You could do explicit "mutual training"--while the machine samples how you write letters, it can show you its certainty scores (based on the model it has so far) or maybe something graphical to help you write the letters how it expects (exaggerate certain differences etc.). They already have an "optimized alphabet" that maximizes the machine-visible differences between letters.

From their paper, the existing language model already did very well at producing clean results in this test, but the more you can refine the cleanup, the more you can potentially sacrifice cleanliness of input for greater speed, and maybe get closer to speaking rate.

FWIW some Googling found http://web.stanford.edu/~shenoy/GroupPublications/WillettEtA...

It’s been clear to me for several years that the future of computing is BCI coupled with augmented reality.

However, considering how much the now-normalized ill effects of the digital privacy dystopia we’re already living in would be multiplied by that development, I really fear for the future.

"brain-computer interface" and "typing" should not go in the same sentence

Maybe it's just me, but when I hear "brain-computer interface", I'm thinking of the brain sending commands directly to the computer, not the brain acting like a virtual keyboard and sending individual keystrokes to a keyboard driver.

The great thing about this is that with the proper use of snippets and auto completion, 90 characters per minute can quickly translate to more than 18 wpm.
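As a rough sanity check (the 5-characters-per-word convention is standard; the snippet numbers below are invented for illustration), the arithmetic looks like:

```python
# One "word" is conventionally 5 characters, which is how the
# article's 90 cpm becomes 18 wpm.
RAW_CPM = 90
print(RAW_CPM / 5)  # -> 18.0

def effective_wpm(raw_cpm, trigger_len, expansion_len, snippet_share):
    """Effective wpm when a fraction of raw keystrokes are snippet
    triggers: each trigger of trigger_len chars expands into
    expansion_len chars of output."""
    plain_chars = raw_cpm * (1 - snippet_share)
    trigger_chars = raw_cpm * snippet_share
    expanded_chars = trigger_chars / trigger_len * expansion_len
    return (plain_chars + expanded_chars) / 5

# If 20% of keystrokes go to 3-char triggers expanding to 15 chars:
print(effective_wpm(90, 3, 15, 0.2))  # -> 32.4
```

So even modest snippet use nearly doubles the effective rate, at the cost of memorizing triggers.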

will such an interface ever be possible without invasive implants in your brain?

Yes. The "secret" is to use hypnosis to "program" your unconscious mind to emit signals corresponding to your conscious "stream of thought". The result is a subjective experience of thinking words and they appear on the screen. Someone else in this thread speculated about e.g. controlling an extra pair of (artificial) arms to play the piano with four hands. That sort of thing should be straightforward too.

In re: the technology to take the signals and transduce them to e.g. a byte-stream without surgery. First was Galvanic Skin Response (GSR), then when IMUs (Inertial Measurement Units) were miniaturized enough you could use those instead of GSR; nowadays (as I mentioned in a sib comment) you can use a camera and ML to recognize e.g. muscle twitches in the face or whole body.

(I have no idea why people are not more interested in this angle; even BCI enthusiasts seem to have a blind spot here and just go on about surgery and implants. If anyone is interested, let me know and I can explain further. FWIW, I don't bother to do this myself because I can already type as fast as I need to. In fact, I was a hunt-and-peck typist for years! I'm not proud, just explaining why I don't bother to explore hypnotic low-tech BCI.)

I’m interested in hearing more about this approach; are there any papers on it? I see two different ideas here: the use of hypnosis to create unconscious “hooks” into conscious thought. This could be useful if for whatever reason certain features of consciousness are not easy to detect (another comment mentioned that it’s harder to get clear signals from the frontal cortex). Then there is the idea of using muscle twitches / GSR as an information channel. It seems hard to get high bandwidth from this (compared to invasive or EEG-like approaches).

Cheers, and thanks for asking. :)

> are there any papers on it?

None that I know of, but I have never looked. Hypnosis in general has a hard time being accepted by conventional science (going back to Mesmer and Ben Franklin.)

> the use of hypnosis to create unconscious “hooks” into conscious thought.

Yes, this was literally one of the first things I learned when I started studying and using hypnosis: a binary Boolean signal from my unconscious to my conscious minds (idiosyncratically we (my conscious and unconscious minds) settled on twitch of the right arm shoulder for yes, left for no. Technically it's a trinary signal, with no twitch indicating a "reformulate the query" or "does not compute" response.)

Really, the thing to do is improve communication and rapport between the unconscious and conscious minds, operating with deliberate cooperation (rather than the ad hoc programming you get from life.) E.g. when I play chess I do not think about the moves, I look at the board and "just know" which move to make. I can actually decide whether to beat someone or lose, and by what degree! (As you can imagine, it makes chess dull.)

> This could be useful if for whatever reason certain features of conscious are not easy to detect (another comment mentioned that it’s harder to get clear signals from the frontal cortex).

Yes, translating and amplifying signals is trivial, your brain does it all the time. However the entire point of having a "conscious mind" is tied up in being easy to detect. The ego is a communication device.

> Then there is the idea of using muscle twitched / GVS as an information channel. It seems hard to get a high bandwidth from this (compared to invasive or EEG-like approaches).

I actually have no clear idea of the bandwidth limits from the unconscious mind to some specific sensor system. Keep in mind that touch typing is already a form of this: the unconscious mind moves the fingers and a stream of text goes into the computer. Achieving that sort of bandwidth should be no problem: set up a binary signal on each finger and an "ACK" on the thumb and you can transmit bytes in parallel, eh? Build something like this "squeezebox" keyboard[1] or a dataglove[2] and use chords for greater bandwidth.
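The eight-fingers-plus-thumb-ACK scheme amounts to sampling 8 binary channels into one byte per ACK; a minimal sketch of the decoding side (the bit ordering is an arbitrary choice for illustration):

```python
def fingers_to_byte(bits):
    """Pack 8 binary finger signals into one byte, sampled at the
    moment the thumb 'ACK' fires. Index 0 is the most significant
    bit (an arbitrary convention for this sketch)."""
    assert len(bits) == 8
    value = 0
    for b in bits:
        value = (value << 1) | (b & 1)
    return value

# Signaling the letter 'H' (ASCII 72 = 0b01001000):
print(chr(fingers_to_byte([0, 1, 0, 0, 1, 0, 0, 0])))  # -> H
```

At that point throughput is limited only by how fast the ACK can cycle, which is exactly the touch-typing comparison above.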

We also have the face, a high-bandwidth output channel, and I have no doubt that a camera (or two) with a simple neural net could be used to train the system to recognize facial tics and expressions. You would have to be a little sophisticated in how you set up the feedback loops: you want to settle on motions that are easy for the face to make and for the NN to recognize. That's kind of a neat thought experiment. (No pun intended.)

I don't doubt that you could get higher bandwidth from implants, I just don't think it's worth the surgery (for able-bodied folk. For people with paralysis it makes more sense.) Not EEG though (I know a neuroscientist who has experience with high density EEG and from what I gather it's just not that great. Like trying to find whales by analyzing surface waves.)

[1] https://peterlyons.com/problog/2021/04/squeezebox-keyboard/

[2] https://en.wikipedia.org/wiki/Dataglove

Depending on how much precision you need, it's already possible. An external interface can read gross changes in electrical activity that a person can learn to control. But it's pathetic compared to what the nerves can do.

Knowing that the average typing speed of adult humans is a measly 40 wpm gives me more faith in the failure of our species.

Sir Francis Galton pioneered the study of psychometrics, which showed that intelligence is positively correlated with reaction speed. So it's not surprising that the average person is slow at a cognitive task like typing.

Humans type slowly, the end is near! It doesn't matter how fast you type; what matters is that you don't always type stuff like this

.. even worse is how slowly we think!!

> Knowing that the average typing speed of adult humans is a measly 40 wpm gives me more faith in the failure of our species.

Uh, that we can even type is in and of itself a wondrous thing. Enjoy your miasma.

Biology is just carbon based circuitry based on digital base 4 (DNA.) Electronics are just as messy with tin whiskers, fried resistors, capacitors, memory errors, clock mistimings, etc.. Energy itself is miasma. But okay, I'll enjoy my privileged "human" miasma ;-)

Are there any studies precisely correlating IQ with writing speed?

I'm pretty much done citing anything mentioning IQ since it's been so thoroughly 'debunked'.

And I'm at least partly a victim of the "but your IQ is so high - you could do so much!" expectation trap. At least until I learned to let go of other people's expectations and just be myself.

IQ is just a measure of processing speed. You can still calculate anything you want on an old slower CPU with the same memory. It'll just take longer.

It's not that IQ is bunk, it's that effort and discipline counts more than anything when it comes to getting something done.
