
Show HN: Robot that copies artist's exact strokes to replicate a painting in 3D - chrischen
http://www.instapainting.com/blog/research/2015/08/23/ai-painter/
======
chrischen
I'm having a problem smoothing the stepper motors with real time motion data.
I was hoping someone on HN could point me in the right direction.

~~~
tlb
You either need microstepping or servo motors with encoder feedback.
Microstepping drives like [0] are buttery smooth.

[0] [http://www.geckodrive.com/gm215-step-motor-motion-controller.html](http://www.geckodrive.com/gm215-step-motor-motion-controller.html)

~~~
chrischen
In the video it's being driven at 1/8th step. I'm considering switching to two
servo motors and using a polar system.

The problem is I need it to also be fast enough to feel lag-free for the
artist.

~~~
542458
You can get good 1/32 drivers like the DRV8825
([https://www.pololu.com/product/2133](https://www.pololu.com/product/2133))
for not too much - I'm pretty confident they'll be able to do what you need
(You can buy knockoffs for even cheaper). Even 1/16 is pretty dang smooth.

It also might be that you're supplying too much or too little current to the
motors - both will result in jerkiness and noise. If your existing drivers
support current limiting, I'd look into tuning that. If they don't, the ones I
linked do.

~~~
chrischen
They are current limiting and can go down to 1/16 step I believe. However,
even at the current settings the motors can't keep up if the artist whips
their arm from one side to the other.
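
One mitigation when the input outruns the motors: rate-limit the commanded
position on the host, so each control tick never asks for more steps than the
steppers can physically execute. A minimal sketch (the tick budget of 200
steps is a made-up number, not measured from this machine):

```python
def rate_limit(current, target, max_step):
    """Move toward target, but never more than max_step steps per control tick."""
    delta = target - current
    if delta > max_step:
        return current + max_step
    if delta < -max_step:
        return current - max_step
    return target

# If the artist whips across the canvas, the plotter lags smoothly instead of
# stalling: repeated calls converge on the target at the motor's top speed.
pos = 0
for _ in range(5):
    pos = rate_limit(pos, 1000, 200)
print(pos)  # 1000
```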

------
salgernon
It doesn't do pressure, but this is along the same lines:

[http://shop.evilmadscientist.com/productsmenu/605](http://shop.evilmadscientist.com/productsmenu/605)

And if you've got an old serial plotter (or 3 or 7) that can emulate or do
HPGL:

[http://music.columbia.edu/cmc/chiplotle/](http://music.columbia.edu/cmc/chiplotle/)

There's something absolutely magical about plotters...

~~~
joshu
The watercolor bot can do pressure.

------
hpoydar
Very cool project. Can you give us some more details about the hardware and
software you're using?

~~~
chrischen
The hardware is [http://www.makeblock.cc/xy-plotter-robot-kit/](http://www.makeblock.cc/xy-plotter-robot-kit/) sans electronics. I ended
up purchasing the electronics anyways because they have a modified Arduino
that uses RJ25, which makes the wiring much more stable (as opposed to pins
stuck into an Arduino).

If you got the individual pieces yourself or 3D printed it can be much
cheaper.

The main electrical components are:

1 x Arduino Uno (I used the modified Uno called "Orion" that has RJ25 ports)
2 x stepper motors
1 x servo motor
4 x microswitches as limit switches
2 x stepper motor drivers (they handle the microstepping)
3 x RJ25 breakout boards (for connecting the limit switches to RJ25 ports)

This was a prototype to prove the concept. I'm going to experiment with 3D
printed parts next in a new design.

If you're feeding in coordinates as Gcode I would recommend loading Grbl. Keep
in mind Grbl has specific pin requirements that make it incompatible with the
Orion's RJ25 ports, so you'd have to wire a normal Arduino. Makeblock supplies
Gcode firmware that works with the Orion.

For the demo in the video, the firmware was custom to handle real-time input
and recording of motion.

On the host computer side, it was just a python script that interfaced with
the Wacom tablet. Specifically:
[https://bitbucket.org/AnomalousUnderdog/pythonmactabletlib](https://bitbucket.org/AnomalousUnderdog/pythonmactabletlib)

I also hacked support for the Myo armband and mouse support.
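
The shape of that host-side glue, roughly: each tablet sample gets scaled into
the plotter's step space before being streamed out. Everything below is a
hypothetical stand-in (the tablet range, work area, and steps/mm are made-up
numbers, not taken from the actual script):

```python
def tablet_to_steps(x, y, tablet_max=(31496, 19685),
                    work_mm=(310.0, 210.0), steps_per_mm=40):
    """Scale a raw tablet sample into plotter step coordinates."""
    sx = x / tablet_max[0] * work_mm[0] * steps_per_mm
    sy = y / tablet_max[1] * work_mm[1] * steps_per_mm
    return round(sx), round(sy)

# the tablet's corners map to the extremes of the plotter's work area
print(tablet_to_steps(0, 0), tablet_to_steps(31496, 19685))  # (0, 0) (12400, 8400)
```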

~~~
dharma1
thanks, this is very cool

------
cocoflunchy
Not sure from the video: does the artist look at the robot while painting, or
at a screen? I feel like it's going to be really hard to paint that slowly, so
it might be better to simulate the whole thing (which will be much closer to
real time) and then reproduce it later on the robot (with some corrections if
needed). But simulating might be difficult... or lead the artist to use brush
strokes that are really hard to replicate. Cool project anyway! Let us know
how it evolves :)

Also just a heads up on the Amanufactory website the font in the Plans section
is really hard to read, you might want to check that out (
[http://i.imgur.com/Klvmorc.png](http://i.imgur.com/Klvmorc.png) ).

~~~
chrischen
The artist looks at the robot. They were going extra slow to be cautious, but
it can mimic movement pretty fast as long as you're not going across the
canvas in a single stroke.

Edit: Thanks, fixed the amanufactory.com bug.

------
noahbradley
Now you're talking. I can't wait to print out nicely formed brushstrokes from
a digital painting.

Something along the lines of embedding a bump map into the 2d image to give
subtle depth information. Let me know if anyone needs an artist to do the art
side of this.

~~~
chrischen
Can you email chris@instapainting.com? If you're in SF would love to have you
come in to try the full colored version.

------
sandebert
Cool project. Also, I like how the sounds the robot makes would fit perfectly
in just about any horror movie.

Example: [https://youtu.be/ip5Bt_-Yxsc?t=7m](https://youtu.be/ip5Bt_-Yxsc?t=7m)

------
knicholes
This is cool. It reminds me of 3D printers, but just in two dimensions and
with a paintbrush and paint instead of a hot-end/nozzle/filament.

You could look at the firmware used to control the stepper motors/end stops
that much of the reprap community is using at
[https://github.com/MarlinFirmware/Marlin](https://github.com/MarlinFirmware/Marlin).
This uses G-code as well, and deals with acceleration around turns.
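
The "acceleration around turns" part boils down to trapezoidal velocity
planning: ramp up, cruise, ramp down, collapsing to a triangular profile when
the move is too short to reach cruise speed. A sketch of the per-segment math
(not Marlin's actual planner):

```python
def plan_move(distance, v_max, accel):
    """Split a straight move into (accel, cruise, decel) distances."""
    d_ramp = v_max ** 2 / (2 * accel)      # distance needed to reach v_max
    if 2 * d_ramp <= distance:             # full trapezoid: ramp, cruise, ramp
        return d_ramp, distance - 2 * d_ramp, d_ramp
    half = distance / 2                    # triangular: never reaches v_max
    return half, 0.0, half

print(plan_move(100, 10, 5))   # (10.0, 80.0, 10.0)
print(plan_move(10, 10, 5))    # (5.0, 0.0, 5.0)
```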

~~~
chrischen
Yep, I looked into Grbl; hadn't seen Marlin. I didn't use these for live mode
since they weren't really built for that, but they'll be used for the
photo-to-painting part.

------
Raed667
As a kid I dreamed of a project like this, but I was told "why would we need
this when we have printers?" That was fifteen years ago.

------
geon
Why not skip the tablet and draw directly with the plotter?

You could probably even use the generated electric pulses from the steppers as
your input.

------
schanq
Reminds me of a project by Patrick Tresset @ Goldsmiths, UoL...

[https://www.youtube.com/watch?v=bbdQbyff_Sk](https://www.youtube.com/watch?v=bbdQbyff_Sk)
[http://research.gold.ac.uk/9255/](http://research.gold.ac.uk/9255/)

------
fictivmade
Since the strokes are recorded, are they translated into positional
coordinates and movements? Is it possible to adjust the "code" post-recording
to perfect a replica or add changes? Cool project!

~~~
chrischen
Yes. You can fully manipulate the raw data afterwards to change it.

I actually truncated the recording to remove the test strokes at the
beginning.

------
zemotion
Super cool project! And damn unexpected to see Jean. Interested to see how the
colored versions will look. Good luck moving forward!

------
joshu
This looks familiar!

~~~
chrischen
Heh. This is just the beginning. Finishing up something tonight to reply to
one of your tweets with Cthulhu.

