The first goal was creating the eye movement mechanism. Since eyes are a central part of facial expression, it felt crucial to start with this system. Prior to starting the design, I had a conversation with Michael O'Gara, a student at Drexel University. Michael had created a pair of animatronic eyes, and we talked about how to control the angles of the eyes. Since eyes are essentially balls, their positions are controlled via angles in two directions. With that in mind, the following design was created. The eyes move via a ball-and-joint system connected to the four servos: one servo is dedicated to each eyelid, and one to each of the x and y directions of eyeball movement.
Day 1: May 13th, 2021
On the first day of CAD, I started by determining how I was going to make the eyes move. I created preliminary sketches on a tablet to get a rough idea of how the mechanism could look (See the second image below). After creating these sketches, I researched average eye sizes and pupillary distances. This gave me a rough idea of how to dimension the eyes in Inventor. The eye models and the adapters on the back were not difficult to create and took around 30 minutes.
Day 2: May 14th, 2021
Initially, I set out with the goal of controlling the eye movements using only two servo motors, one for the x-axis and one for the y-axis. However, after modeling, the design was deemed nearly impossible mechanically (See the third image below). So, the design was updated to use two servo motors for each eye (See the fourth image below). This significantly improved the design, allowing for independent movement of each eye and more flexibility.
After completing this design, a major issue arose while testing the motion of the eyes in Autodesk Inventor. Because of the connections between the eyeballs and the servos, the eyeballs could not be positioned at compound angles, where both the x and y axes contribute to the location. So, another redesign had to be made.
Day 3: May 16th, 2021
After redesigning the eye mechanism a final time, the eyes can now move in all directions. The solution involved adding pivots inside the eyeball, as the ball joints and rods alone were too rigid to accommodate the compound angles (See the fifth image below). The next goal was to add eyelids to the eye mechanism. These were relatively easy, as they only needed to rotate about their x-axis centroid (See the sixth image). The eyelids rotate about a post and will be pushed by servos via the extra holes on their edges.
Day 4: May 24th, 2021
The final day of designing the eye mechanism involved finishing the eyelid movements and completing the supporting structure. The eyelid mechanisms were easy to create and only involved determining where the servos would be mounted onto the base. The first image below is the final render of the eye mechanism. The next step was 3D printing the eye mechanism to test whether it actually worked.
The second goal was creating the jaw mechanism so that the animatronic could produce mouth movements. This design was created in one day, as it was extremely simple. By researching average anatomical data of human heads, I was able to determine the correct proportions for the mouth of the animatronic. This data is similar to that used for the eyes: it gave me an average palate size and the average distance between the eye line and the mouth line of a human. Using this data, the following model was created (see image to the right). The mouth uses only one servo, as it has only one range of motion. Unlike an actual human, the mouth contains 36 teeth rather than the normal 32. This was done for simplicity (all teeth in the animatronic are the same size) rather than modeling realistic, varying tooth dimensions.
The third goal was to create a simple skeleton for all the head parts to sit in, and for a potential skin to go over. Using a 3D model from GrabCAD, lofts were used to get a basic head shape. The top of the head is hollowed out for the microcontroller, the "brain" of the animatronic, to sit in. Since the overall shape of the top of the head still needed to be preserved, two slender members were created as well. The head skeleton has a platform for the eye mechanism and a slot for the mouth mechanism to sit in. All subsystems were put into an assembly to ensure they were symmetrical and centered. This assembly can be seen below.
The fourth goal was to create a simple pan-and-tilt mechanism for neck movements. Unlike the previous models, this was created in SolidWorks, as I had recently acquired my CSWA and was more comfortable with it. The neck mechanism uses two servo motors, one for each degree of freedom. The arms of the servos are superglued into their respective slots, keeping the arms in place so the servos can rotate while the base stays fixed. The pan-and-tilt system was inspired by camera gimbals, and research was done into hobbyist pan-and-tilt mechanisms, such as those released by Adafruit. The neck mechanism attaches to the base of the skeleton with four screws. The most difficult part of the design was the engineering tolerances: since the 3D printer's nozzle width was 0.4 mm, that had to be accounted for in the boxes the servos sit in and in the arms.
Using an FDM 3D printer, all parts were created in PLA. The first system to be printed was the eye mechanism. The biggest challenge with the eye mechanism was twofold: getting the eyes to stay in their ball joints and creating enough separation between the eyes and the eyelids.
The eye was redesigned for a third time. The ball joint stayed in the center of the eye but was made larger, since the low resolution of the 3D printer meant the original tiny ball joints would not work. Taking inspiration from puppeteering, wires would be used to move the eye through its two degrees of freedom. The wires needed to be hooked at the ends; otherwise, the eye could not reach all of its DOFs without potentially breaking the servos or the structure. The post the ball joint connects to also needed a slight redesign: not pictured here, the portion that adheres to the main eye mechanism structure got a wider base to increase stability.
Once printed, the eye mechanism (minus posts and eyelids) was fully assembled. Pictured is the first fit test to ensure all servos could be mounted. The second image shows some servos mounted during assembly.
First Test of Eye Movement
Above is the first test of controlling the eyes, using the ArduinoBlue app for control and connecting to the Arduino through an HM-10 Bluetooth module. The Arduino computes the position of the joystick from the incoming serial data and then uses map functions to determine how the servos should move.
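As a rough illustration of this control loop, below is a minimal sketch of how joystick readings can be re-mapped onto servo angles with the Arduino map() function. The pin numbers, angle limits, and the assumption that the ArduinoBlue joystick accessors report values in roughly a 0-99 range are placeholders for illustration, not the values used on Tom.

```cpp
// Minimal sketch: map joystick data from the ArduinoBlue app onto the eye servos.
// Pins, angle limits, and the 0-99 joystick range are assumptions for illustration.
#include <Servo.h>
#include <SoftwareSerial.h>
#include <ArduinoBlue.h>

SoftwareSerial bluetooth(8, 9);   // RX, TX wired to the HM-10 module (placeholder pins)
ArduinoBlue phone(bluetooth);     // ArduinoBlue parses the app's incoming serial data

Servo eyeX;   // left/right eye rotation
Servo eyeY;   // up/down eye rotation

void setup() {
  bluetooth.begin(9600);          // HM-10 default baud rate
  eyeX.attach(5);                 // servo pins are placeholders
  eyeY.attach(6);
}

void loop() {
  // Joystick axes arrive as roughly 0-99; re-map them onto a safe angular range.
  int x = phone.getSteering();    // horizontal joystick axis
  int y = phone.getThrottle();    // vertical joystick axis

  int xAngle = map(x, 0, 99, 60, 120);   // limit travel so the eye
  int yAngle = map(y, 0, 99, 70, 110);   // never hits the socket

  eyeX.write(constrain(xAngle, 60, 120));
  eyeY.write(constrain(yAngle, 70, 110));
  delay(20);                      // small pause so the servos are not flooded
}
```

The key idea is the map() call: it rescales the joystick's range into each servo's mechanical limits, so the eyes can never be commanded past their sockets.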
Second Test of Eye Movement
The second test of eye movement showed the limitations of the ArduinoBlue app. The joystick did not function as desired and would not provide much data for the up-and-down movements of the eyes. As such, the controlling app was changed to Dabble, another iPad-based Bluetooth control system.
Assembly of the Jaw Mechanism
Above is the fully assembled jaw mechanism with its one degree of freedom. The servo is superglued to the bottom jaw, allowing the jaw to rotate about the servo, as seen in the CAD rendering.
Head Skeleton 3D Printing
This is a short video showing the printing process of the head skeleton. It was nearly complete when the video was taken.
Fully Printed Head Skeleton
The last part of the head assembly to be printed was the head skeleton. This print took over 36 hours and had no issues while printing. The big challenge was removing all of the support structure, as it had fused to the main part during printing.
Once all parts were printed, the head was fully assembled. The first step was to attach the mouth mechanism. This was simply superglued in place, as the head skeleton contains a slot for the servo to rest on at the bottom of the eye mechanism plate. The second step was to install the teeth into the mouth. As mentioned in goal #2, all teeth are the same size for simplicity. These teeth were later painted white rather than being left silver. The last part was installing the eye mechanism. This had to be installed before the eyelids were put in place, as the opening at the front (or at the top) was not big enough. Additionally, all wires connecting the eyelids and eyes to the servos would be installed after the mechanism was in place. The eyeballs were then removed, painted white, and given pupils; the eye prints had a natural circular bottom, so painting them was simple. As can be seen in the first image, the head skeleton did not have the slender members from the 3D CAD render. These were printed later, as the build volume of the 3D printer was not large enough for the whole head.
First Blink Test
One of the crucial movements of the animatronic is the ability to blink. As such, much time was spent getting a perfect blink animation. In the images to the left, there is a clear view of some "deforming" that had to be done to the base of the eye mechanism: due to the printing resolution, the bottom eyelid could not move and was colliding with the eye structure, so nonessential material was removed to allow the eyelids to function. The second image shows a fully assembled eye, with eyelids and an eyeball. The video above shows the first test of a blink with just the top eyelid; the same process was repeated for the left eye mechanism. The blink was triggered by pressing a button on the iPad controller.
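For reference, a blink of this kind can be boiled down to a short servo sequence: close, hold briefly, reopen. The sketch below is only an assumed version of that idea; the pin number, the open/closed angles, and the blinkRequested() helper (standing in for however the iPad button press arrives over Bluetooth) are all placeholders.

```cpp
// Sketch of a single top-eyelid blink triggered by a controller button.
// Pin, angles, and the blinkRequested() helper are assumptions, not the real build.
#include <Servo.h>

Servo topLid;

const int LID_OPEN   = 90;    // placeholder "eyelid open" angle
const int LID_CLOSED = 140;   // placeholder "eyelid closed" angle

bool blinkRequested() {
  // Hypothetical helper: in the real build this would read the button state
  // sent by the iPad app over the Bluetooth serial link.
  return false;
}

void blinkOnce() {
  topLid.write(LID_CLOSED);   // snap the lid shut
  delay(120);                 // hold briefly so the blink reads as natural
  topLid.write(LID_OPEN);     // reopen
}

void setup() {
  topLid.attach(10);          // placeholder servo pin
  topLid.write(LID_OPEN);
}

void loop() {
  if (blinkRequested()) {
    blinkOnce();
  }
}
```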
Second Blink Test
The second blink test shows both top eyelids moving in unison. Both eyelids can be controlled independently, but for most cases, they will move together.
Eye Movement and Blink Test
Once fully assembled, this test shows the movement of the eyes with intermittent blinking. At this point, the eye mechanism was complete; all of its functions were installed and operational.
Neck Movement Test
As mentioned previously, the neck mechanism has two degrees of freedom; the third degree of freedom, z-axis rotation, will be added later. The above test shows the head tilting left and right. The jitter is caused by the slow movement and the servos breaking in, as the base is not strong or heavy. Over time, this motion is expected to become smoother with better programming. The images show the neck mechanism from four angles.
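One common software approach to reducing that jitter, sketched here under assumed pin numbers, angles, and timing, is to step the neck servo toward its target one degree at a time instead of commanding the full move in a single write.

```cpp
// Sketch of incremental smoothing for the neck tilt servo.
// Pin number, angles, and step delay are illustrative assumptions.
#include <Servo.h>

Servo neckTilt;
int currentAngle = 90;          // assume the neck starts centered

void moveTiltSmoothly(int target, int stepDelayMs) {
  // Step one degree per iteration so the head eases into position.
  while (currentAngle != target) {
    currentAngle += (target > currentAngle) ? 1 : -1;
    neckTilt.write(currentAngle);
    delay(stepDelayMs);
  }
}

void setup() {
  neckTilt.attach(3);           // placeholder pin
  neckTilt.write(currentAngle);
}

void loop() {
  moveTiltSmoothly(60, 15);     // tilt one way slowly
  delay(500);
  moveTiltSmoothly(120, 15);    // tilt back the other way
  delay(500);
}
```

Larger step delays give slower, smoother motion at the cost of responsiveness, so the delay would be tuned on the real mechanism.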
Once fully assembled, the first test of all systems working together was conducted. The voice of Tom was created using a text-to-speech generator and placing the resulting .wav file onto a sound daughter board. By analyzing the waveform of the sound file, I could time the mouth movements to each syllable of the words. This is the animation that plays on powering up the Arduino, after all systems are checked to be functional.
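In code, this syllable timing can be expressed as a table of timestamps taken from the waveform, with the jaw servo opening briefly at each one. The sketch below is a hedged illustration of that idea: the pin numbers, the jaw angles, the trigger mechanism for the sound board, and the timestamp values are all made-up placeholders, not the real animation data.

```cpp
// Sketch of syllable-timed jaw movement for the startup animation.
// All pins, angles, and timestamps are illustrative assumptions.
#include <Servo.h>

Servo jaw;

const int JAW_CLOSED = 90;                 // placeholder angles
const int JAW_OPEN   = 60;

const int SOUND_TRIGGER_PIN = 7;           // hypothetical sound-board trigger pin
const unsigned long syllableMs[] = {200, 550, 900, 1400, 1750};   // made-up times read off a waveform
const int NUM_SYLLABLES = sizeof(syllableMs) / sizeof(syllableMs[0]);

void setup() {
  jaw.attach(11);                          // placeholder servo pin
  jaw.write(JAW_CLOSED);

  pinMode(SOUND_TRIGGER_PIN, OUTPUT);
  digitalWrite(SOUND_TRIGGER_PIN, LOW);    // start the .wav (trigger mechanism assumed)
  unsigned long startTime = millis();

  // Open the jaw briefly at each syllable timestamp.
  for (int i = 0; i < NUM_SYLLABLES; i++) {
    while (millis() - startTime < syllableMs[i]) { /* wait for the next syllable */ }
    jaw.write(JAW_OPEN);
    delay(120);                            // hold open for roughly one syllable
    jaw.write(JAW_CLOSED);
  }
}

void loop() {}
```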
At this point, the head system of Tom is fully complete and ready for further programming. The animatronic can also be fully controlled via the iPad, as originally intended.