Week 8: Phonemes

This week’s session explores importing sound into Maya and then animating the model’s mouth to match it. Note that the audio needs to be converted to WAV or AIFF format first. Once the audio has been imported successfully, a green audio track appears in Maya’s timeline.

If no audio track appears in Maya even after importing the correct audio format, you can [Right click] the timeline and select [Sound] to check the audio node’s status. Select [reference-sound], and you will be able to play the audio.
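
The same import can also be scripted. Here is a minimal MEL sketch, where the file path and node name are placeholders:

// Import a WAV file as an audio node (path and name are placeholders).
sound -file "C:/projects/week8/dialogue.wav" -name "dialogueAudio";

// Attach the audio node to the time slider so the green track appears.
global string $gPlayBackSlider;
timeControl -edit -sound "dialogueAudio" -displaySound true $gPlayBackSlider;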

A reference video about how to import sound in Maya:

Because my native language is not English, I needed extra theoretical knowledge and observation of correct articulation for this assignment’s phonemes. This research helped me give my model correct phoneme animation. I collected some diagrams of mouth shapes and observed from them what mouth movements and shapes look like for different phonemes. That way, I could watch the reference pictures while pronouncing the sounds myself, and then adjust my model’s mouth shapes while animating the phonemes.

This video also helped me learn how to pronounce the 44 phonemes of English:

I chose an English-speaking video as my reference. It allowed me to observe the presenter’s mouth while creating the animation and then match it to the audio.

First, pick a model. My goal for this step was to choose a model with a useful facial rig and controllers. In the end, I chose Family Man! The rigging on this model is good, which makes it easy for me to control the model’s face. (The rig can also change the character’s clothing; I went with the suit look.)

Next, I created the scene environment. Because the presenter’s hands and arms in the reference video were in contact with a table, I simply made a table and chair to see whether they would fit my character’s movements. Then I imported the reference video and audio into Maya.
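
Importing the footage can also be done with a small MEL sketch; the camera and file path here are placeholders:

// Create an image plane on the persp camera for the reference footage.
imagePlane -camera "persp" -fileName "C:/projects/week8/reference.mov";
// In the Attribute Editor, set the image plane's Type to Movie so the
// footage plays back along with the timeline.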

I decided to adjust the facial expressions and mouth shape first.

I had to make sure that I selected the full-body controllers before making the animation. After selecting what I thought were all of them, I found that many controllers were still not selected, so I wrote a MEL script to help me select all the controllers.
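
A minimal sketch of this kind of selection script, assuming the rig’s controls are NURBS curves, looks like this:

// Find every NURBS curve shape in the scene (rig controls are usually
// NURBS curves) and select their transform nodes.
string $shapes[] = `ls -type nurbsCurve`;
string $ctrls[] = `listRelatives -parent -fullPath $shapes`;
select -replace $ctrls;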

Because I had to animate the character’s facial expressions and mouth first, I also created a MEL script for the neck and face controllers. Now I have two MEL scripts: one for the full-body controllers and one for the face controllers.
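
One way such a script can work, sketched here with a placeholder set name, is to save the hand-picked face and neck controllers as a set and reselect its members later:

// With the face and neck controllers selected, store them in a set.
sets -name "faceCtrlSet";

// Later, reselect the stored controllers in one step.
select -replace `sets -query "faceCtrlSet"`;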

When making the character’s mouth shapes, I started by thinking about the transition between each one. Initially, I planned to create the mouth variations on their own, but I found that the chin bound to the character’s face also affected the weighting of the lower lip. In the end, I chose to animate the character’s mouth, eyes, and eyebrows at the same time as the phonemes.

But as I was working on it, I found a new problem: the eyes of the character model did not close completely. As a workaround, I kept the eyes open at keyframe 54, quickly moved the upper eyelid downwards at keyframe 56, and opened the eyes again at keyframe 58. Luckily, this worked to some extent.
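
In MEL, those blink keys look roughly like this; the controller and attribute names are placeholders, but the frame numbers match the ones above:

// Fake a blink: eyes open at frame 54, lid down at 56, open again at 58.
// "upperLid_L_ctrl" and translateY are placeholder names for this sketch.
setKeyframe -time 54 -attribute "translateY" -value 0 "upperLid_L_ctrl";
setKeyframe -time 56 -attribute "translateY" -value -0.8 "upperLid_L_ctrl";
setKeyframe -time 58 -attribute "translateY" -value 0 "upperLid_L_ctrl";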

When I watched the reference video, I noticed that the presenter spent part of the time with only his mouth moving. However, if only the shape of the model’s mouth changed in the animation, it would look very unnatural. Based on the reference video, I added small adjustments to the model’s eyebrows and eyes. This step makes the character’s animation look more vivid.

Here are all the current animation curves for the character’s facial expression controllers:

At this point, my character only had mouth animation, and I thought it looked unnatural, so I decided to create body movements for the character. I made another MEL script that selects only the character’s head and body controllers. (I found the full-body script inconvenient for animating the body and face separately.) When creating the body movements, I increased the swing of the character’s head; these movements helped the character look alive. I also animated the waist, arms, and hands.

Here are the animation curves for all the facial expression controllers:

This is a graph of the movement of the upper body controller:

Current character animation (this also helped me familiarize myself with the render farm once again):

I discovered that the character had problems with overlapping geometry and a stiff body during movement. I decided to tweak my character animation.

This time the optimizations were:

1. The presenter in the original reference video kept a smiling expression, so some mouth shapes were not very obvious. I modified the mouth shapes for the first and last words’ pronunciation by following my own mouth shapes while speaking. (Animation curves of the current mouth shapes)

2. In the previous version, the character blinked too infrequently, and the eyebrows were a bit stiff. I added eye and eyebrow animation details to the original keyframes.

3. Increased the range of the arms’ motion. (Current animation curves of the arms)

4. Added movement to the fingers’ joints so that the hand motion is smoother; see the MEL sketch after this list. (Current animation curves of the fingers)

5. Fixed the places where the model and the table overlapped.
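
For the finger smoothing in point 4, here is a minimal MEL sketch of one way to relax the keys (run with the finger controllers selected; using auto tangents is my assumption, not a record of my exact steps):

// Smooth the animation of whatever controllers are currently selected
// by switching their keys to auto tangents, which removes sudden jumps.
keyTangent -edit -inTangentType auto -outTangentType auto;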

Final rendered video:

I will continue to refine my phoneme animation in future study.

Phoneme study materials:
