Locomotion is largely based upon the designs documented by Cynthia Breazeal (then Ferrell) of the MIT Mobile Robots Lab, under Prof. Rod Brooks, in her thesis on the hexapod robot Attila/Hannibal.
My first attempt used Python, which presented two insurmountable problems: 1) a Raspbian boot time of 1.5 minutes, unacceptable for an embedded device, and 2) Python threading is not sufficient for real-time work. I was attempting to build series elastic actuators, but the timing imprecision (jitter) of the threads was causing wild oscillations... I finally had to accept it was a dead end.
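For anyone curious what that jitter looks like in practice, here is a minimal sketch of the kind of measurement that exposed the problem: run a nominally fixed-rate loop and record how far each wakeup drifts from its deadline. The period and iteration count are illustrative; on a stock Pi under load, worst-case wakeup error can reach several milliseconds, which is fatal for an inner torque loop that assumes a steady period.

```python
import time

def measure_jitter(period_s=0.005, iterations=200):
    """Sleep in a fixed-period loop and return per-cycle timing errors (s)."""
    errors = []
    deadline = time.monotonic()
    for _ in range(iterations):
        deadline += period_s
        time.sleep(max(0.0, deadline - time.monotonic()))
        errors.append(time.monotonic() - deadline)  # > 0 means we woke up late
    return errors

errs = measure_jitter()
print(f"worst-case late wakeup: {max(errs) * 1000:.3f} ms")
```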
I have started over in Elixir + Nerves, which is designed at its core for embedded work. I will admit it is very slow going -- not because of any deficiency in the language or environment. Quite the contrary: I get a 10-second cold boot time and superb stability! Rather, my mind is the limiting factor here. After three decades of imperative programming, the shift to functional programming is a challenge!
I also remember reading a while back about the huge market for maintaining these things once Sony abandoned them. Perhaps you're onto a business!
In the mid 2000s, I had written some code for the Aibo to let it read books aloud.
Fast forward to a couple of years ago. My kids saw a video of me with the Aibo Reader project and wanted an Aibo of their own. Unfortunately, the Aibo as a product is dead. The batteries are now dying and irreplaceable (thanks to Sony and their idiotic insistence upon DRM -- it's not enough to merely provide electrons; the battery must also know the secret handshake for the Aibo to accept it. Maddening!!!) And the few used Aibos that do have working batteries are dying from other mechanical failures (mostly the clutches in the head/neck assembly), yet command a premium price.
So now, my kids still want an Aibo. I got to thinking about how far technology has come in the past 20 years, and started wondering whether I could build a facsimile with the RPi. Thus the birth of this little side project.
A flexible servo horn allows some degree of deformation when torque is applied. Embed a 1/32" neodymium magnet in the horn and a magnetic rotary position sensor in the driven part. As resistance to movement ("torque") increases, the flexible servo horn twists/deflects, the angle between the two parts changes, and the magnetic sensor detects that change.
Picture a torque strain gauge and the way its needle deflects from the centerline. This is similar, but we're measuring in a radial rather than linear sense.
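The math behind the reading is just a torsion spring. Here's a sketch, assuming the horn's torsional stiffness k (N*m per degree of twist) has been calibrated beforehand, e.g. by hanging known weights on the leg; the stiffness value and angles below are invented for illustration.

```python
TWIST_STIFFNESS_NM_PER_DEG = 0.02  # hypothetical calibration value

def estimate_torque(commanded_deg, sensed_deg, k=TWIST_STIFFNESS_NM_PER_DEG):
    """Torque ~ stiffness * twist, like reading a torsion spring."""
    twist = sensed_deg - commanded_deg  # deflection of the flexible horn
    return k * twist

print(estimate_torque(90.0, 93.5))  # 3.5 deg of twist -> ~0.07 N*m
```

Note that k is exactly what the printed spoke stiffness controls: softer spokes give more twist per N*m, hence more sensitivity at the cost of range.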
I went on to 3D-print a flexible servo horn somewhat similar to this:
but with the magnet embedded in the outer rim of the wheel and the sensor in the inner section.
Different sensitivities are obtained by printing horns with different spoke stiffnesses.
The result is a compact, modular torque sensor for about $2 in parts.
Feed the readings to a PID loop and you have a nice controller that can tell when a robot leg is bearing a load, is jammed, etc. This is essential to a proper gait; otherwise your robot is simply an electronic marionette.
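A minimal sketch of what that loop might look like; the gains and torque thresholds are placeholders, and the load/jam classification is my own simplification (torque above a small threshold means the leg is bearing weight, torque above a larger one means it's blocked and the gait engine should react rather than stall the servo).

```python
class JointController:
    """Toy PID controller for one joint, with load/jam detection."""

    def __init__(self, kp=1.2, ki=0.1, kd=0.05, jam_torque=0.5):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.jam_torque = jam_torque  # N*m; illustrative threshold
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_deg, actual_deg, torque_nm, dt=0.01):
        error = target_deg - actual_deg
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        command = self.kp * error + self.ki * self.integral + self.kd * derivative
        loaded = abs(torque_nm) > 0.1 * self.jam_torque
        jammed = abs(torque_nm) > self.jam_torque
        return command, loaded, jammed

ctrl = JointController()
cmd, loaded, jammed = ctrl.update(45.0, 40.0, torque_nm=0.6)
print(jammed)  # True: 0.6 N*m exceeds the 0.5 N*m jam threshold
```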
One of the big challenges was getting 16 additional 16-bit voltage reads (one per servo) back to my RPi over the I2C bus.
Solution: I used one of these multiplexer boards:
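The channel-select dance looks roughly like this, assuming a TCA9548A-style I2C multiplexer (you pick a downstream bus by writing a one-hot byte to the mux) in front of the position sensors. The FakeBus stands in for a real smbus handle so the sketch runs without hardware; the addresses and channel count are illustrative.

```python
MUX_ADDR = 0x70  # common TCA9548A default; illustrative

class FakeBus:
    """Stands in for a real I2C bus object; returns canned 16-bit readings."""
    def __init__(self, readings):
        self.readings = readings            # one value per mux channel
        self.channel = None
    def write_byte(self, addr, value):
        self.channel = value.bit_length() - 1  # recover channel from one-hot byte
    def read_word(self, addr):
        return self.readings[self.channel]

def read_all_sensors(bus, sensor_addr=0x36, channels=8):
    """Select each mux channel in turn and read one 16-bit value from it."""
    values = []
    for ch in range(channels):
        bus.write_byte(MUX_ADDR, 1 << ch)   # route the bus to channel ch
        values.append(bus.read_word(sensor_addr))
    return values

bus = FakeBus(list(range(1000, 1008)))
print(read_all_sensors(bus))  # [1000, 1001, 1002, 1003, 1004, 1005, 1006, 1007]
```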
and I used Adafruit's excellent servo controller board: https://www.adafruit.com/product/815
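That board is a PCA9685: 16 channels of 12-bit PWM. A quick sketch of the angle-to-counts math, assuming a typical hobby servo that maps 0-180 degrees onto 1.0-2.0 ms pulses at a 50 Hz frame (20 ms period); adjust the pulse endpoints for your actual servos.

```python
def angle_to_counts(angle_deg, min_us=1000, max_us=2000, frame_us=20000):
    """Convert a servo angle to a 12-bit PCA9685 'off' count."""
    pulse_us = min_us + (max_us - min_us) * angle_deg / 180.0
    return round(pulse_us * 4096 / frame_us)

print(angle_to_counts(0))    # 205
print(angle_to_counts(90))   # 307
print(angle_to_counts(180))  # 410
```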
and some NeoPixel rings for the eyes:
(wonderful for expressing mood)
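A toy sketch of what "mood" looks like on the rings: pick a color per mood and fill both 16-pixel eyes. The mood table is invented for illustration; on the robot this would feed a NeoPixel driver rather than return a list.

```python
MOODS = {
    "happy":   (0, 255, 64),   # green
    "curious": (0, 128, 255),  # blue
    "alarmed": (255, 32, 0),   # red-orange
}

def eye_frame(mood, pixels_per_ring=16):
    """Return one ring's worth of (R, G, B) tuples for the given mood."""
    color = MOODS.get(mood, (32, 32, 32))  # dim grey for unknown moods
    return [color] * pixels_per_ring

frame = eye_frame("happy")
print(len(frame), frame[0])  # 16 (0, 255, 64)
```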