Hacker News

Almost certainly not. Because the sense of touch is an important part of the problem and that data isn’t present in videos.



Not just touch but proprioception. Robots in human environments will have to be better at proprioception than 98% of humans. If I bump into you, it's typically anything from annoying to a meet-cute. I'm a pretty big guy, but if you had to choose between me or somebody else to step on your foot, it's probably me you'd want: I'll shift my weight off your foot before you even know what happened (tai chi), and you'll barely notice.

If instead your choice is your high school bully or a robot, well, for now, pick the bully. The robot isn't even being vicious and will still hurt more.


> Because that robot isn’t even being vicious and will hurt more.

Rodney Brooks at the MIT AI Lab was a big advocate of something called "series elastic actuators." The idea was that you didn't let motors turn robot joints directly. Instead, every motor acted through some kind of elastic element. The robots could also measure how much resistance they encountered and back off.

MIT had a number of demos of robots that played nicely around fragile humans. I remember video of a grad student stopping a robot arm with their head.

Now, series elastic actuators sacrifice some amount of speed and precision, so you wouldn't want them for industrial robots. And of course, robots also tend to be heavy and made of metal, so even if they move gently, they still pose real risks.
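The back-off behaviour described above can be sketched in a few lines. This is a toy illustration, not Brooks's actual controller: the motor drives the joint through a spring, so spring deflection gives a torque estimate, and the controller retreats when that torque exceeds a safety threshold. All constants and names here are invented for the example.

```python
SPRING_K = 50.0      # N*m/rad, stiffness of the elastic element (made up)
TORQUE_LIMIT = 2.0   # N*m, back off above this torque (made up)

def control_step(motor_angle: float, joint_angle: float,
                 target_angle: float, gain: float = 0.5) -> float:
    """Return the next motor angle command."""
    # Torque transmitted through the spring is proportional to its deflection.
    spring_torque = SPRING_K * (motor_angle - joint_angle)

    if abs(spring_torque) > TORQUE_LIMIT:
        # Unexpected resistance (e.g. a grad student's head): retreat
        # instead of pushing harder.
        return motor_angle - gain * (motor_angle - joint_angle)

    # Otherwise keep moving toward the target.
    return motor_angle + gain * (target_angle - motor_angle)

# Free motion: joint follows the motor, so the command advances.
print(control_step(0.0, 0.0, 1.0))   # 0.5

# Blocked joint: large spring deflection, so the motor backs off.
print(control_step(0.5, 0.0, 1.0))   # 0.25
```

The point of the elastic element is that contact shows up as measurable deflection before it shows up as crushing force, which is what lets the controller yield in time.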

But real progress has been made on these problems.


I think you're probably right, and those non-linear systems are going to force me to raise my estimate of how long it takes a robot to go from five-year-old-child physicality to ninja physicality. The more complex the feedback mechanisms, the more complexity there is in, for instance, driving a screw as fast as possible.


The robot won't take any enjoyment in it, and won't laugh at your pain. Won't post about it on social media. Isn't going to try to fuck your ex or sister or mom.

I'll take the robot, thanks.


Your friends will though.


"friends"


I'm pretty sure that if I had never opened a door before and I saw somebody opening a door in a video, I would immediately know how to open doors just by watching the video. And that would be any door, with any kind of door handle. Not because I have superpowers, but because I'm an average human.

So, the moment your system needs this kind of data and that kind of data, and by the way a few hundred thousand examples of each, it's very clear to me that it's far from being capable of any kind of generalisation, of learning any kind of general behaviour.

So that's "60 difficult, dexterous skills" today, "1,000 by the end of 2024", and when the aliens land to explore the ruins of our civilisation your robot will still be training on its 100,000th "skill". And it will still fall over and die when the aliens greet it.


Can you train a robot to imagine touch by showing it what touch would feel like in many video scenarios?


I think their robot has a way of converting touch into a video input: the white bubble manipulator has a pattern printed on the inside that a camera watches for movement (see 1:58 of the video).
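The idea behind that kind of sensor can be sketched simply: a camera inside the gripper watches printed dots, and how far each dot moves between a reference frame and the current frame indicates where the bubble is being pressed. This is a hedged illustration of the principle, not their implementation; all positions and the threshold below are invented.

```python
import math

def contact_map(reference: list[tuple[float, float]],
                current: list[tuple[float, float]],
                threshold: float = 1.0) -> list[bool]:
    """True for each tracked dot whose displacement exceeds the threshold."""
    return [
        math.dist(ref_pt, cur_pt) > threshold
        for ref_pt, cur_pt in zip(reference, current)
    ]

# Dots in the unpressed bubble vs. the current camera frame (pixels).
ref = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
cur = [(0.0, 0.1), (10.0, 3.0), (20.0, 0.2)]  # middle dot pushed inward

print(contact_map(ref, cur))  # [False, True, False]
```

The appeal of this design is that touch becomes just another image stream, so the same video-based learning pipeline can consume it.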



