So what could be done? An indirect method would probably work better: use existing motor controls (maybe finger gestures?) plus an eye tracker to direct your Doctor Octopus arms (for which power and weight will be the primary concerns).
We won’t get sensory feedback properly, but you might imagine a single external device that channels some sort of sensory trick to 80/20 it.
When will this arrive? Wearable arms probably don’t have a market, but the control scheme and sensory feedback are going to be top of mind for VR/AR hardware devs, I would think.
It would be nice: I could type on the keyboard and drink tea with cookies without stopping. But there’s the mouse too... so I’d need three additional arms. Not like mine: less strength, but thinner fingers and finer precision. If possible, without tremor. And an eye on each palm, so I could see things really up close.
There’s also the issue that a lot of these surgeries involve feedback from a conscious patient, and that’s obviously impossible with a newborn.