

OK Go Makes Lip Dubs Back - Gozu
https://www.youtube.com/watch?v=u1ZB_rGFyeU&feature=youtu.be

======
geerlingguy
Some people are speculating the ending sequence ( _spoilers_ basically a large
grid of 'umbrella pixels') was done digitally, using some part of the drone's
ascent to switch to CGI-umbrella-mode... but I'm wondering, if it was filmed
live, what kind of coordination would they be using?

I would imagine they could run some wire out to a bunch of little LEDs (one at
each position), and have the LEDs lit from a simple Raspberry Pi + Arduino
setup (then it's just a matter of the umbrella people running to their mark,
then opening and closing their umbrellas in sync with the LED).
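As a rough illustration of that LED-cue idea, the pattern could be stored as pre-timed "frames" of open/closed states, with a controller deciding which frame each position's LED should show at a given moment. This is purely a hypothetical sketch; the frame data, timing, and function names are made up, and a real rig would drive GPIO pins instead of returning values.

```python
# Hypothetical LED-cue sketch: each umbrella position gets one LED,
# and the controller steps through pre-timed "frames" of the pattern.
# 1 = LED on (umbrella open), 0 = LED off (umbrella closed).

FPS = 2  # assumed cue rate: two pattern changes per second

# Each frame is a flat list of states, one entry per umbrella position.
FRAMES = [
    [0, 0, 0, 0],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
]

def frame_index(elapsed_seconds):
    """Which frame of the pattern should be showing at a given time."""
    return min(int(elapsed_seconds * FPS), len(FRAMES) - 1)

def led_state(position, elapsed_seconds):
    """State for one position's LED at a given time."""
    return FRAMES[frame_index(elapsed_seconds)][position]

if __name__ == "__main__":
    # Print the cue states for all four positions at a few times.
    for t in (0.0, 0.5, 1.0):
        print(t, [led_state(p, t) for p in range(4)])
```

On real hardware the `led_state` result would be written out to a GPIO pin per position (or shifted out over wire to each LED), but the timing logic is the interesting part: everyone's cue derives from one shared clock, so no per-person coordination is needed beyond "watch your LED."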

Any other ideas? Pretty good video.

It looks like the seat bot they used may have been Honda's U3-X Personal
Mobility[1] device.

[1] [http://asimo.honda.com/innovations/u3-x-personal-
mobility/](http://asimo.honda.com/innovations/u3-x-personal-mobility/)

