Microcontroller
Arduino Nano ESP32
The robot is driven by a total of 7 servo motors:
2 90g servos for eyebrow movement
1 90g servo for mouth movement
2 20kg servos for head movement and rotation
2 20kg servos for body movement and rotation
For our Build18 Hackathon Project, we made a Robotic Hand Puppet decorated to look like Oscar the Grouch from Sesame Street.
Build18 is an ECE hardware hackathon where teams are given a budget of $250-$300 to build anything they want. After winter break, teams receive the parts they ordered and have one week to complete their project.
The 90g servos we used were too large to mount directly on the mouth and eyelids. Even with fur covering the head, the blocky servo bodies would stand out and look unfaithful to the original character.
How do we transmit motion to the eyes and mouth while hiding the bulky servo bodies? We use a 4-bar linkage!
From left to right:
Hyeyoon Song:
Designed and manufactured the controller
Sewed the costume onto the robot
Claire Kim:
Programmed the Arduino Nano ESP32 to control the servo motors
Jieun Lim:
Programmed an AI model for users to interact with the puppet
Namky Eun:
Designed and manufactured the robot arm and head
During the Build18 Showcase, our robot puppet could not move his eyes or mouth. He could only do a backflip.
While this project did not work as we had intended, our team still learned a lot from this project.
Learnings:
How to use a 4-bar linkage to transmit motion.
In our setup, the ESP-NOW protocol could not control more than one sensor.
Use belts to drive higher-torque arm joints.
Add more plates around facial features to attach fur more easily.
Fur is heavy; account for the extra weight.
Snap-on hinges do not hold up under load. Use pins or bolt sleeves instead.
Servo horns are easily stripped. Use metal ones instead.
A quick compilation of videos we took while working on the robot together.