
Vibration-minimizing motion retargeting for robotic characters - guiambros
https://la.disneyresearch.com/publication/publication-process-vibration-minimizing-motion-retargeting-for-robotic-characters/
======
raihansaputra
After seeing the Bartender sample, I'm convinced this is a significant step
toward improving small-scale robotic arms. Now the arm doesn't have to be rock
solid, yet it's still precise enough to do simple tasks. For more
sophisticated arms, this can increase their speed and extend their
capabilities. It could even help exploit the flexibility of the material, e.g.
for throwing/catapulting stuff. Exciting!

~~~
rtkwe
Maybe. One of the issues is that these seem to depend on having a very
accurate model of the animatronic and of how its materials will react to
forces, so that the required damping motions can be baked into the motion.
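
To make that concrete, here's a minimal sketch of classic zero-vibration input shaping (not the paper's method, and all numbers are invented) showing how damping gets "baked into" a command. The shaper is derived entirely from the assumed mode parameters, which is exactly why an accurate structural model matters:

```python
import numpy as np

# Minimal sketch of zero-vibration (ZV) input shaping -- NOT the paper's
# method, just one classic way damping motions get baked into a command.
# The shaper comes entirely from the assumed mode parameters (wn, zeta),
# so a model error shifts the impulse timing and amplitudes directly.

def zv_shaper(wn, zeta, dt):
    """Two-impulse ZV shaper for a mode with natural frequency wn [rad/s]
    and damping ratio zeta; returns impulse amplitudes and sample delays."""
    wd = wn * np.sqrt(1.0 - zeta**2)                   # damped frequency
    K = np.exp(-zeta * np.pi / np.sqrt(1.0 - zeta**2))
    amps = np.array([1.0, K]) / (1.0 + K)              # amplitudes sum to 1
    times = np.array([0.0, np.pi / wd])                # 2nd impulse: half period
    return amps, np.round(times / dt).astype(int)

def shape(command, wn, zeta, dt):
    """Convolve a raw command with the shaper impulses."""
    amps, delays = zv_shaper(wn, zeta, dt)
    out = np.zeros(len(command) + delays[-1])
    for a, d in zip(amps, delays):
        out[d:d + len(command)] += a * command
    return out[:len(command)]

# Example: shape a step command for a (made-up) mode at 4 Hz, 5% damping.
dt = 0.001
step = np.ones(2000)
shaped = shape(step, wn=2 * np.pi * 4.0, zeta=0.05, dt=dt)
```

If the real arm's frequency differs from the assumed one, the residual vibration comes right back, which is the point above.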

------
CharlesW
Wow. I wonder if this would result in more-perceivably-realistic movement when
applied to 3D animation?

~~~
jcims
Yeah, I think some of the simulated ‘optimized’ control outputs looked more
realistic than the intended animation.

~~~
ummonk
Yeah, not just the real optimized motion: even the simulated optimized motion
looked significantly better than the original unsimulated animation.

Makes me think that for actual animations they should simulate physical
systems and then apply optimized control inputs to approximate the animation
the animator provides. It would result in more grounded, realistic-feeling
animations.

~~~
_Microft
That might be because it's no longer too perfect. As far as I know, small
variations in timing are added to computer-generated beats to make them sound
better? This might be a similar thing.

~~~
codeflo
To make a different audio analogy: When they graph the input vs. the optimized
control values, it looks a lot like the ringing artifacts of a low-pass
filter. A low-pass filter removes high frequencies from an input signal. Since
there are physical limits on how fast limbs can oscillate, maybe that’s part
of what makes it look more natural.
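
A rough numerical illustration of that analogy (the cutoff and sample rate are invented, not from the paper): a truncated ideal (sinc) low-pass applied to a hard step command produces exactly this overshoot-and-ringing shape:

```python
import numpy as np

# Illustration of the low-pass analogy (cutoff and rates are invented, not
# from the paper): a truncated ideal (sinc) low-pass applied to a hard step
# command yields the same overshoot-and-ringing shape as the optimized
# control plots.

fs, fc = 1000.0, 5.0                          # sample rate and cutoff [Hz]
n = np.arange(-200, 201)
h = 2 * fc / fs * np.sinc(2 * fc / fs * n)    # truncated ideal low-pass
h /= h.sum()                                  # normalize to unity DC gain

step = np.r_[np.zeros(500), np.ones(1500)]    # raw command: a hard step
smoothed = np.convolve(step, h, mode="same")  # rings before and after the edge
```

The filtered step overshoots by roughly 9% and rings on both sides of the edge, which is the shape visible in the paper's input-vs-optimized-control graphs.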

~~~
_Microft
I noticed that as well, and it reminded me of the Gibbs phenomenon.

[https://en.wikipedia.org/wiki/Gibbs_phenomenon](https://en.wikipedia.org/wiki/Gibbs_phenomenon)
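
For anyone who wants to see it numerically, a quick sketch: partial Fourier sums of a square wave overshoot the jump by about 9% no matter how many terms you add.

```python
import numpy as np

# Quick numerical look at the Gibbs phenomenon: partial Fourier sums of a
# square wave overshoot the jump by ~9% regardless of how many terms you add.

x = np.linspace(-np.pi, np.pi, 10001)

def square_partial_sum(x, n_terms):
    """Sum the first n_terms odd harmonics of a unit square wave."""
    s = np.zeros_like(x)
    for k in range(1, 2 * n_terms, 2):        # odd harmonics 1, 3, 5, ...
        s += (4 / np.pi) * np.sin(k * x) / k
    return s

s50 = square_partial_sum(x, 50)
# The peak hovers near 1.18 (about 9% of the jump of 2) even as n_terms grows.
```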

------
SeeDave
Absolutely beautiful work!

As an amateur roboticist/professional dilettante, I've often been tempted to
treat a robot as little more than a 'normal line in R3' (essentially an end
effector's pitch/roll/yaw at x/y/z in effector state i from 1...n) and motion
planning as little more than flying as the crow flies, with zero consideration
for 2nd- or 3rd-order effects.

While this may be true for simple industrial applications with robot arms of
infinite structural rigidity... I have learned two things from this post:

[1] Physical qualities of the robot arm (beyond its degrees of freedom) can
materially affect path planning.

[2] Sometimes there is more to a robot than an end effector doing work (e.g.
the rapping robot's funky dance moves). It raises the question, though: does
the entire robot arm effectively become an end effector? Can we consider
delighting an audience with cool dance moves 'work' being performed?

Thanks for sharing, and hats off to the Disney team! Great work!

------
_Microft
People limping might serve as an example that a _'state of the art, well
trained neural net'_ can't achieve good motion with less-than-perfect
hardware.

So how much of this is actually addressing the wrong problem? That is,
targeting the actuation instead of how the model should be built to achieve
the result, i.e. damping oscillations?

Here's the machine-learning stick-figure walking model that most of you
probably know. How much of the problem with getting them to walk decently is a
wrong model, with bad constraints on joint angles or limb lengths? Shouldn't
those be evolved iteratively as well, to achieve the best results possible?

[https://www.kurzweilai.net/images/humanoid-walking-training....](https://www.kurzweilai.net/images/humanoid-walking-training.gif)
(not trained well yet, btw)

~~~
icebraining
> People limping might serve as example that a 'state of the art, well trained
> neural net' can't achieve good motion with less than perfect hardware.

Toddlers are a good example of the opposite (they fall down quite a bit even
with "good hardware"). It seems intuitive that you benefit from having good
control of the hardware, even if you can also improve the hardware itself.

~~~
_Microft
I don't consider toddlers as well-trained in that regard. Or ... any other ;)

~~~
icebraining
That's kinda my point :)

------
latchkey
I can imagine adding this to the gimbal stabilization of a drone's camera.
Even with a 3-axis gimbal, which is already pretty stable, I bet it would make
things even better.

~~~
gugagore
You would probably need to use sensing to improve stabilization there. The
output of the approach here is just a position trajectory (I'm assuming). They
manage with that because they have a good enough model of the beams and
masses; I imagine it's harder to get a comparable model of the disturbances
acting on a drone.
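
A toy way to see the model-accuracy point (a single invented flexible mode, not a real drone or animatronic model): feed the same base motion through the "true" plant and through a plant with a 20% frequency error, and the predicted residual vibration no longer matches reality.

```python
import numpy as np

# Toy illustration: a single made-up flexible mode driven by a base motion.
# Compare the "true" plant with a planner's model that is 20% off in
# natural frequency -- the predicted residual vibration stops matching.

def simulate(wn, zeta, base, dt):
    """Deflection x of a mode x'' + 2*zeta*wn*x' + wn^2*x = wn^2*u,
    integrated with semi-implicit Euler."""
    x = v = 0.0
    out = []
    for u in base:
        a = wn**2 * (u - x) - 2.0 * zeta * wn * v
        v += a * dt
        x += v * dt
        out.append(x)
    return np.array(out)

dt = 0.001
t = np.arange(0.0, 2.0, dt)
base = np.clip(t / 0.2, 0.0, 1.0)       # a 0.2 s ramp to the target position

true_resp = simulate(2 * np.pi * 3.0, 0.03, base, dt)         # real hardware
model_resp = simulate(2 * np.pi * 3.0 * 1.2, 0.03, base, dt)  # planner's model
```

Open-loop shaping computed against `model_resp` would leave vibration on the true plant; a drone adds wind and body-rate disturbances on top of the model error, which is where sensing helps.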

~~~
jcims
There's a pretty cool project using ML to help translate control from
simulators to the real world. Video of it here, paper in the description:
[https://www.youtube.com/watch?v=aTDkYFZFWug](https://www.youtube.com/watch?v=aTDkYFZFWug)

This seems like it could also help deliver something similar to what Disney
has done (or possibly improve on it).

------
gtm1260
Fantastic work. Is there any code/libraries online for this or is my only
option to implement the paper?

~~~
guiambros
I also wasn't able to find any sample code, but this thread [1] has some
additional comments about the techniques used in the paper.

[1]
[https://www.reddit.com/r/robotics/comments/cjy77r/r_vibratio...](https://www.reddit.com/r/robotics/comments/cjy77r/r_vibrationminimizing_motion_retargeting_for/)

------
justinclift
Is this different from 6th-order jerk-controlled motion planning, as used by
modern CNC and 3D printer controllers? E.g. TinyG, g2core, recent Marlin (PR
10337).

~~~
raverbashing
I think it's different in a couple of aspects:

- there's a stronger timing constraint on the positioning (more than in
CNC/3D printers)

- the arms/body are significantly more wobbly

- apparently you don't need to model the wobbly response "by hand"
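
For contrast, a jerk-limited "S-curve" of the kind CNC/3D-printer planners use just bounds the derivatives of the command; it knows nothing about the specific structural frequency the arm rings at (limits below are made up):

```python
import numpy as np

# A jerk-limited "S-curve" as CNC/3D-printer planners use: it bounds how
# violently the command changes (limits here are made up) but has no notion
# of which structural frequency the arm actually rings at -- that's what a
# model-based approach like the paper's adds.

def s_curve_velocity(v_target, j_max, dt):
    """Velocity profile reaching v_target using bang-bang jerk (+j_max then
    -j_max), giving a triangular acceleration profile."""
    t_j = np.sqrt(v_target / j_max)            # duration of each jerk phase
    n = int(round(t_j / dt))
    jerk = np.r_[np.full(n, j_max), np.full(n, -j_max)]
    accel = np.cumsum(jerk) * dt               # ramps up, then back to zero
    return np.cumsum(accel) * dt               # smooth S-shaped velocity

dt = 0.001
vel = s_curve_velocity(v_target=0.5, j_max=10.0, dt=dt)   # e.g. m/s, m/s^3
```

The resulting velocity rises monotonically to the target with jerk never exceeding the limit, but it attenuates all high frequencies equally rather than targeting a specific wobbly mode.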

~~~
justinclift
Cool, thanks. :)

Hopefully the algorithms (and code) they're using get open-sourced at some
point, so they can be looked over and possibly incorporated into other
projects.

