Tag Archives: Maya

June 2, 2013

So, Final Major Project is over and handed in, which means my degree is also finished. There have been odds and ends requested from tutors and it's left me feeling like I'm in some sort of strange limbo. The degree is finished, but with all these extra hand-ins (that don't even affect my marks) it's hard to move on to the stage of tidying up my CV ready to apply for jobs.

Anyway, here is my final showreel, which has a variety of my best work from 2nd and 3rd year.
[Embedded video: final showreel]

This is the deformations and rig demo for the elephant rig.
[Embedded video: elephant rig and deformation demo]

This is the deformations and rig demo for the toony monkey rig.
[Embedded video: toony monkey rig and deformation demo]

This is the unfinished version of my elephant advert. As I have mentioned previously, this project had a bumpy journey to hand-in and sadly it never quite made it to completion. However, I thought I would upload it anyway for those curious about the project.
[Embedded video: unfinished elephant advert]

Now that the first hand-in has been and gone, I've been finishing off the rigging of the monkey. There were a few bones in the face that I had animated for the deform test that weren't actually rigged. Firstly, I created some curves to control the ears and the two tufts of hair on the monkey's head.

One of the most important things I still needed to create was some eye controllers. I wanted each eye to be controllable individually, but I also wanted the animator to be able to easily control them together. As such I decided to create two text curves (R and L) and aligned them with the eyes. I then used an aim constraint so that the eyes point at the controllers at all times. I created a third controller that would move both R and L together.
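
The aim setup boils down to something along these lines in MEL (placeholder names throughout; the exact aim and up settings depend on how the eye joints are oriented, and the third controller could just as easily be a group or constraint rather than a parent):

    // Placeholder names; the aim/up axes depend on how the eye joints are oriented.
    aimConstraint -aimVector 0 0 1 -upVector 0 1 0
                  -worldUpType "objectrotation" -worldUpObject "C_head_jnt"
                  "L_eye_ctrl" "L_eye_jnt";
    aimConstraint -aimVector 0 0 1 -upVector 0 1 0
                  -worldUpType "objectrotation" -worldUpObject "C_head_jnt"
                  "R_eye_ctrl" "R_eye_jnt";
    // One way to drive both eyes from a single control: parent L and R under it.
    parent "L_eye_ctrl" "R_eye_ctrl" "C_eyes_ctrl";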

[Image: Eyes01]

I also wanted the animator to be able to control the size of the pupil, as I have always found this to be a particularly effective tool for portraying emotion when animating a "toony" character. Unsure whether the animator would want to control the pupil size from the individual eye controllers or from the dual controller, I decided to give them the ability to do it from either. I added an attribute to R and L for pupil size, and two attributes to the dual controller. I then wired these attributes to two plus/minus nodes. I have set the nodes to addition, and as soon as I have the eye textures, I will wire the nodes to the scale of the texture.
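
In MEL terms the wiring looks roughly like this (placeholder names, left eye only; the output will eventually go to the pupil texture's scale):

    // Pupil size attribute on the individual controller and on the dual controller.
    addAttr -longName "pupilSize" -attributeType "double" -keyable true "L_eye_ctrl";
    addAttr -longName "pupilSizeL" -attributeType "double" -keyable true "C_eyes_ctrl";
    // Sum the two values with a plusMinusAverage node (operation 1 = sum).
    createNode plusMinusAverage -name "L_pupilSize_pma";
    setAttr "L_pupilSize_pma.operation" 1;
    connectAttr "L_eye_ctrl.pupilSize" "L_pupilSize_pma.input1D[0]";
    connectAttr "C_eyes_ctrl.pupilSizeL" "L_pupilSize_pma.input1D[1]";
    // L_pupilSize_pma.output1D is what will later drive the texture scale.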

[Image: Eyes02]

Rigging the tongue was slightly frustrating as I really struggled to create controllers that could be selected whatever position the tongue was in. Initially I created simple curves that sat above the tongue. However, when the tongue was rotated upwards, the top curve disappeared into the back of the mouth.

[Images: tongue01, tongue02]

As such, I decided to create some loops that went around the tongue and combine these with the curves above the tongue.

[Image: tongue03]

Having finally created working stretchy IK legs, I could get to work on making a set of FK legs. Like the IK, I duplicated the deform skeleton, but replaced the _jnt suffix with FK_jnt. Unlike creating the IK leg, I didn't need to create anything except controllers. I used the curve tool with snap to vertex turned on and drew the shapes around the mesh of the monkey. I used the MEL command "parent -r -s" to attach all the curve shapes to a single transform node so that the animator can click on any of the curves to select the entire controller.
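
For reference, the shape-merging trick boils down to something like this (placeholder names; the curves need their transforms frozen first so the shapes don't jump):

    // Merge the shapes of extra curves under one controller transform.
    string $extras[] = {"legCurveB", "legCurveC"};
    for ($c in $extras)
    {
        makeIdentity -apply true -translate true -rotate true -scale true $c;
        string $shapes[] = `listRelatives -shapes $c`;
        parent -relative -shape $shapes[0] "L_legFK_ctrl";  // the "parent -r -s" step
        delete $c;                                          // remove the leftover empty transform
    }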

I drew the thigh and ankle controllers with curves, but just used circles (which I edited slightly) for the knee and tibia joints. To keep the rig clear to use, I colour-coded my controllers. The most important controllers I changed to a brighter colour, and the less important ones (like the tibia joints) were darker.

[Images: Foot05, Foot06]

Like before, I nested all of these controllers in a double set of groups (_SDK and _0). I then parent constrained each FK joint to its corresponding controller.
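
The grouping itself is only a couple of commands per controller, roughly like this (placeholder names):

    string $ctrl = "L_kneeFK_ctrl";
    string $sdk  = `group -name ($ctrl + "_SDK") $ctrl`;   // free for batch controls / set driven keys
    string $zero = `group -name ($ctrl + "_0") $sdk`;      // null group giving the controller a zeroed pose
    parentConstraint -maintainOffset $ctrl "L_kneeFK_jnt"; // drive the FK joint from its controller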

As the toe controls were parented directly to the deform skeleton (there was no separate FK and IK version), I needed the main toe controller to follow both the IK and the FK skeletons. I created an FK/IK switch control for the leg and then parent constrained the main toe controller to both the end of the IK and FK controller hierarchies. At the same time, I also parent constrained the deform joints to both the IK and FK joint chains.

I then had to wire up the switch to control the parent constraints between FK and IK. I used the Hypershade to do this: I brought the switch and the parent constraints in and also created a reverse node.

[Image: Foot07]

I then also wired up the visibility of the IK and FK controllers so that the animator can only see FK controls when the switch is at FK, and IK controls when it is at IK.
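
Stripped down, the switch wiring is just a handful of connections. A sketch with placeholder names, assuming a 0 to 1 attribute where 0 is FK and 1 is IK (the W0/W1 weight names depend on the order the constraint targets were added):

    createNode reverse -name "L_leg_FKIK_rev";
    connectAttr "L_leg_switch_ctrl.FKIK" "L_leg_FKIK_rev.inputX";
    // IK weight follows the switch, FK weight gets the reversed value.
    connectAttr "L_leg_switch_ctrl.FKIK" "L_ankle_jnt_parentConstraint1.L_ankleIK_jntW0";
    connectAttr "L_leg_FKIK_rev.outputX" "L_ankle_jnt_parentConstraint1.L_ankleFK_jntW1";
    // Visibility: IK controls show at IK, FK controls at FK.
    connectAttr "L_leg_switch_ctrl.FKIK" "L_legIK_ctrls_grp.visibility";
    connectAttr "L_leg_FKIK_rev.outputX" "L_legFK_ctrls_grp.visibility";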

I decided to colour code the IK and FK so that it is clear at a glance whether the leg is set to FK or IK. For the IK I chose red (left) and blue (right), and for FK I chose pink (left) and green (right). However, since the toes are the same controllers whether the leg is in IK or FK, those controllers were just a single colour. I spent a bit of time trying to work out how to use the switch to drive the colour override for the control shapes. Eventually I found that I could use a condition node that was true when the switch was above 0.5 and false when below 0.5, and wire the condition node to the drawing override of the controller. As the drawing override is not in the Hypershade's short list of connectable attributes, I had to open the Connection Editor and wire them up in there.
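
The condition setup is short once you know where to look. A sketch with placeholder names; the two colour numbers are just examples from Maya's 32-colour index palette and need checking against the palette itself:

    createNode condition -name "L_toe_colour_cond";
    setAttr "L_toe_colour_cond.operation" 2;      // 2 = greater than
    setAttr "L_toe_colour_cond.secondTerm" 0.5;
    connectAttr "L_leg_switch_ctrl.FKIK" "L_toe_colour_cond.firstTerm";
    setAttr "L_toe_colour_cond.colorIfTrueR" 13;  // e.g. red when the switch is above 0.5
    setAttr "L_toe_colour_cond.colorIfFalseR" 20; // e.g. pink when below
    setAttr "L_toeMain_ctrlShape.overrideEnabled" 1;
    connectAttr "L_toe_colour_cond.outColorR" "L_toeMain_ctrlShape.overrideColor";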

[Images: Foot08, Foot09]

Finally, I had to work out what number represented each colour. With some trial and error I eventually found the values and wired up all of the toe controllers so that they change colour. Success!

[Image: Foot10]

April 8, 2013

With my scenes set up for my animators on the elephant project, I can finally get to work on my side project: the toony rig. The first step in rigging anything is to build the joints that will drive the deformation of the mesh. I call this the deform skeleton.

The first thing I create is always the spine. The pelvis is the start of the chain and the head the end. I generally name joints as I go to avoid confusion once the entire skeleton has been built. I used the same naming convention as my elephant rig: a prefix of C_, L_ or R_, the name of the joint, and a suffix of _jnt.

[Images: Joint Creation 01, Joint Creation 02]

The arm was the next thing I built. I initially place the bones using the orthographic front view. I line up the joint positions to the hand as well as possible. However, because my view is orthographic, the joints are all created on a flat plane. This does not match the shape of the hand, so I then use the perspective view to line everything up correctly. I then repeat the process with the foot, but this time I create the joints from the orthographic side view. To ensure my joints orient correctly later on, I make sure that the first finger or toe I create is the most central one. This simply means that the joint in the palm or sole of the foot will point to this finger when oriented, instead of out to one side at, for example, the thumb or little toe.

[Images: Joint Creation 03, Joint Creation 04, Joint Creation 05]

 

With all the joints created on one side of the body, I now need to correctly orient the joints. This is good practice because it ensures the joints will all rotate consistently around the same axes. These two images show the spine axes before and after the orientation. I use Comet Tools to do the joint orientation.
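
Comet Tools handles this with a nice UI, but for completeness, Maya's own joint command can do a basic orient along a chain; a sketch, assuming a hypothetical root joint name:

    // Orient the whole chain so X points down the bone and Y is the secondary (up) axis.
    joint -edit -orientJoint "xyz" -secondaryAxisOrient "yup" -children -zeroScaleOrient "C_pelvis_jnt";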

 

 

[Images: Joint Creation 06, Joint Creation 07, Comet Tools]

After orienting the joints on the hand and foot, I tweaked the angles of the thumb. The reason for this is that if all the finger joints are selected and rotated around one axis, they should close to form a fist. This makes it much easier for the animator to work and keeps the animation curves much cleaner.

[Images: Joint Creation 08, Joint Creation 09, Joint Creation 10]

With the orientations tidied I could finally mirror my arms and legs so that I had a complete skeleton. Unfortunately, once I had mirrored the joints, it was clear that the mesh had not been mirrored correctly and was not symmetrical. I have contacted the artist and asked if they can have a look and fix it. However, since it is the Easter holidays, I have no idea when they will see my email, let alone send me the fixed mesh. I don't want to continue rigging, just in case there is a problem and they are unable to get things symmetrical for me, as that would mean rebuilding the joints for the right arm/leg separately and I would have to redo any work I had already done.
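
When the fixed mesh does arrive, the mirroring step itself is a one-liner per limb; a sketch with placeholder joint names:

    // -mirrorBehavior flips the orientation so left and right limbs rotate the same way.
    mirrorJoint -mirrorYZ -mirrorBehavior -searchReplace "L_" "R_" "L_shoulder_jnt";
    mirrorJoint -mirrorYZ -mirrorBehavior -searchReplace "L_" "R_" "L_hip_jnt";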

[Image: Not quite symmetrical]

April 8, 2013

I think the camera tracking has been one of my biggest worries during this project. It is something I have only fleeting experience with, but it is essential for producing a polished end product. When filming any moving-camera shots, we decided to keep the camera on a tripod and just pan, to keep things simpler. The first shot I chose to track was probably one of the most important shots in the entire project: the very final shot where the elephant gives the monkey back to the mother.

I have been using Boujou, as this was the program we were introduced to during a compositing project earlier in the year. It had produced a great result for me in the past, but I was aware that others on my course were not so fortunate; Boujou does not always get it right first time. I imported the footage and set Boujou to track the movement in the image. To do this, Boujou latches onto distinctive areas in the image (i.e. colour changes which suggest the edges of objects). It then tracks how these points move throughout the footage. To get the best result, it is generally necessary to have points that occur on the x, y and z axes within the 3D space of the shot. However, the shot I was working with had a flat wall in the background (which meant no track points in the z axis), and I worried that Boujou would struggle to know how close or far away the camera was throughout the shot.

The next stage of the camera tracking was to ask Boujou to use these track points and their movement to create a camera in 3D space that matches the movement of the camera in the shot. Thankfully, I found an option at this point to tell Boujou that the camera was nodal. This means Boujou knew that the camera was on a tripod in a fixed position and could only rotate. Finally, I could export the information into Maya and check whether it worked.

To my delight, the tracking seemed great. However, I quickly noticed that objects in the scene seemed to suddenly move up and down, or side to side, out of time with the footage. Since I have no knowledge of how to correct this in Boujou, I decided to see if I could fix it without too much effort in Maya. I played the animation until I found a moment when objects in the scene moved out of sync with the footage, then checked the camera's curves in the Graph Editor for any odd kinks or jumps. Most of the problems were extremely easy to find and fix, but there was one that was extremely frustrating. About two thirds of the way through, the scene skipped sideways suddenly and then gently eased back into its original position. I could find nothing on any of the curves that would indicate the camera was rotating like this. I spent a long time trying to establish whether it was just one curve or all of them affecting it, but eventually, after some painstaking work tweaking each individual key, I managed to tidy it up so that the movement was barely noticeable.

I wanted to set up the scenes that my animators would be using before I did anything else, so I then moved on to some of the scenes with a fixed camera. I created a new Maya file for each one, referenced the elephant rig and created a new camera. I created an image plane for the camera with the .png sequence of the correct shot. I cannot believe how difficult it was to then actually position the camera in 3D space so that it lined up with the footage. I had foolishly assumed that it would be the moving camera shots that would cause me issues, when all along it was positioning cameras by hand that would be my downfall.

The situation was made more frustrating by the knowledge that I had assumed the Motion Graphics students were taught easy ways to work out the position of cameras in still shots. In 2nd year, we learnt to take a photo with the camera in the same position and an item in the shot whose dimensions you knew; you could then use this item to help line up the camera. Since I was relying on the knowledge of the Motion Graphics students, as I knew they had plenty more experience than me, I readily accepted their answer when they assured me nothing was needed for these shots. It was pretty galling when I asked them later on how I would be positioning the cameras and they answered "by eye". I have at least learnt one lesson from this: whenever possible, if filming for a VFX composite with a still camera, make sure there is something in shot that you know the exact dimensions of. It will make your life so much easier.
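
The per-shot setup itself is quick to script if it ever needs redoing; a rough sketch with placeholder paths and names (the camera position still has to be matched by eye afterwards):

    file -reference -namespace "elephant" "scenes/elephant_rig.ma";   // reference the rig
    string $cam[] = `camera`;                                         // returns {transform, shape}
    rename $cam[0] "shotCam";
    // Attach the first frame of the .png sequence, then turn on
    // "Use Image Sequence" in the image plane's Attribute Editor.
    imagePlane -camera "shotCam" -fileName "footage/shot05.0001.png";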

March 14, 2013

So yesterday, I finally got the elephant rig to a point where it could be referenced into the animators' files. One of the major things I worked on was updating all of the controllers so that they are clearer and fit the model nicely. I foolishly assumed this would be easy, but I hadn't reckoned on the awkwardness of the curve creation tools in Maya. It took me quite a while of just repeatedly trying to create shapes and deleting them as they failed to work. I think my determination to create things that were perfectly symmetrical possibly did not help the situation, but an asymmetrical controller just doesn't look as neat and clean in my opinion. Eventually I hit upon the idea of using the snap to vertex tool and the edges and vertices of the elephant to help me create controllers that fit nicely to the contours of the elephant's body.

Having drawn a selection of curves, I needed to join all the individual curves together into a single item. This involved reparenting the individual curve shapes under a single transform node and then deleting the remaining empty nodes. Frustratingly, I could find no way to tell Maya to actually combine all the shape nodes on each curve into one single curve, but each controller selects the entire item wherever you click it, so it still works; it's just not as clean as I would like it to be. I then scaled the controllers out from the body slightly and coloured them.

I had hoped I could then parent these new shapes to the controllers already in existence (as I had with each individual curve to make the new controller), but every time I tried, the new controllers were rotated strangely and moved away from the body. This was due to the difference in the positions of the pivots of the old and new controllers. Hoping I could avoid having to reposition each new controller, I decided instead to break all the constraints and set everything back up on the new controllers. It turns out I still had to reposition the pivots, and so rearrange the shapes, but at least I knew I didn't have to spend time trying to delete the shapes of the old controllers; I could just remove the entire item.

I did, however, forget to redirect the spine rotation to the new controllers, so I had a bit of a scare later in the evening when I created a global control and tried to check that everything moved as I wanted it to. When the elephant rotated 90 degrees, the spine flipped, presenting a problem I had first encountered in my 2nd year when rigging a quadruped in 3ds Max. I panicked for a while that my IK spline spine was in fact broken and I would have to come up with a completely new setup. However, after I checked the IK I realised that, in creating the new controllers, I had not told it to use them to decide the rotation of the spine. Thankfully, reconnecting them fixed the problem.
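
For anyone hitting the same flip, the relevant settings live on the spline IK handle's advanced twist controls; a sketch with placeholder names, which may not match my exact setup:

    setAttr "spineIK_handle.dTwistControlEnable" 1;
    setAttr "spineIK_handle.dWorldUpType" 4;   // 4 = Object Rotation Up (Start/End)
    connectAttr "C_hips_ctrl.worldMatrix[0]"  "spineIK_handle.dWorldUpMatrix";
    connectAttr "C_chest_ctrl.worldMatrix[0]" "spineIK_handle.dWorldUpMatrixEnd";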

[Image: IKspine03a]

I also needed to update the rig with the new low-poly model that my artist had altered for me. I brought the mesh in and, whilst trying to work out how to load the skinning from the old mesh onto the new mesh, I found an option that instead replaced an old mesh with a new mesh. I tried it out and it worked brilliantly: the old mesh changed to the new mesh. However, I now had two versions of the new mesh, one that was skinned and one that was not. Assuming that the unskinned mesh was no longer needed, I promptly deleted it. A couple of hours later, when testing some other part of the rig, I discovered my mesh no longer seemed to be moving with the bones. Confused, I saved the file under a new name, closed it and reopened it. To my horror, the mesh was now invisible. The outliner still showed all the various parts of the mesh, but I couldn't get them to appear.

[Image: MissingMesha]

I hastily opened my previous iteration, only to discover that that file suddenly had exactly the same problem. Desperately hoping I hadn't somehow broken every single version (and so lost all my skinning), I tried the next step back. To my relief, the old mesh was there, skinned and working absolutely fine. I had simply lost my day's rigging work, but nothing else.

Deciding that replacing the mesh clearly wasn't the best method to update my rig, I started working on saving off the skinning so that I could load it onto the new mesh. Frustratingly, it seemed Maya was only giving me the option to load each bone's skinning one at a time. It was doable, but pointlessly time-consuming. Fortunately, I knew one of my classmates had successfully, and easily, loaded skinning onto new meshes during his project. I asked him about it and he showed me a quick and easy method. It involved skinning the new mesh to the bones (but not editing it at all) and then telling the new mesh to look at the old mesh for the skinning values. Maya can load the skinning in a variety of ways: by volume, by UV map, etc. It was brilliant and loaded the skinning onto the new mesh perfectly. I didn't even need to tweak it, though Joe had warned me I might need to. This is great to know, as I can now quickly skin the high-poly elephant to the rig (and tidy it up afterwards) as soon as it is ready. I will not have to go through the time-consuming process of skinning from scratch again.
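
The method boils down to a bind followed by a weight copy; a sketch with placeholder mesh and skinCluster names (the surface association can be closest point, UV based, and so on):

    // Bind the new mesh to the same skeleton, without painting anything.
    skinCluster "C_pelvis_jnt" "elephant_newMesh";
    // Copy the weights across from the old mesh's skinCluster.
    copySkinWeights -noMirror
                    -sourceSkin "elephant_oldMesh_skinCluster"
                    -destinationSkin "elephant_newMesh_skinCluster"
                    -surfaceAssociation "closestPoint"
                    -influenceAssociation "oneToOne";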

The last thing I needed to build was the dynamic tail. Having already gone through the long process of working out how to do the trunk, it was simply a case of repeating the method on a much simpler chain. The dynamic output curve became a blend shape for the spline, whilst the controls affected the dynamic input curve. Again, unfortunately, the rig doesn't update its position until the animation is played, but, to my current understanding of dynamics, there is no way around this.
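
The key connection, as with the trunk, is just the dynamic output curve feeding the IK spline curve as a blend shape target; a sketch with placeholder names:

    blendShape -origin world -name "tail_dyn_bs" "tail_dynOutput_crv" "tail_ikSpline_crv";
    setAttr "tail_dyn_bs.w[0]" 1;   // the blend weight the dynamic/FK switch will later drive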

I also created a control for the tail that will rotate all three FK controllers at the same time. I actually created three of these for the trunk as well, so that an animator can control the entire tail (or a section of trunk) without having to select a whole bunch of controllers. Every controller I create is parented to a group (with the suffix _SDK), and that group is then parented to another group (with the suffix _0). The _0 group becomes the null group, which provides a zero point for position and rotation. The _SDK group allows me to create batch controllers whilst still letting the individual controllers tweak the bones' positions. I simply wired the rotation of the batch controller to the _SDK groups of the relevant individual controllers. When the batch controller is rotated, each _SDK group wired to it also rotates. The individual controllers parented to the _SDK groups rotate too (due to the parenting) and so rotate the relevant bones. However, because the individual controllers themselves are not wired to anything, the animator is still able to tweak the position of the bones individually at any time.
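
The wiring for the batch control is a straight rotation connection into each _SDK group; a sketch with placeholder names:

    connectAttr "tail_all_ctrl.rotate" "tail_01_ctrl_SDK.rotate";
    connectAttr "tail_all_ctrl.rotate" "tail_02_ctrl_SDK.rotate";
    connectAttr "tail_all_ctrl.rotate" "tail_03_ctrl_SDK.rotate";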

[Image: Tail01a]

I then set up a switch for the tail to allow the animator to blend between dynamic and FK. Like the trunk I also set up some attributes to allow the animator to change the stiffness and flexibility of the curve dynamics if they wish.

[Image: Tail02a]

Finally, I added some empty attributes to various controllers, ready to be wired up to blend shapes when I have the high-poly mesh. I created them in advance so that it is less likely there will be any problems with the referencing when I update the rig later on. I wanted to make sure that everything that might be animated was already in place, so that it is only skinning and wiring, and not controllers, that will change in future files.

January 29, 2013

We had another compositing project last term. Whilst the technical side of things was very successful, the animation suffered. It left me content with the idea that my interests, and skills, really do lie more with the technical side of things. Although it's really satisfying to produce an animation that works and looks great, I just can't seem to do it as quickly or easily as others on the course do. I can pose the model and create some really strong poses and silhouettes for the camera. However, when it comes to linking things together and making the motion smoother, that's where I struggle.

The camera tracking was really successful for this project, mainly, I think, because of the location and contents of the shot. There were lots of trees at multiple distances from the camera, which provided Boujou with plenty of strong tracking points to calculate the camera's position in every frame. The success of the camera tracking really helped to make it feel as though the raptor was there in the shot.

Unfortunately, the camera that we used just wasn't of good enough quality for the project. The final video had compression artefacts, and the low-quality lens meant the footage was grainy and blurred. This meant that when I came to composite the raptor into the scene, I really struggled to make the crisp CG render look as blurred, grainy and unclear as the original footage. I added several different blur types as well as grain. Sadly, what I couldn't see in the playback in After Effects was that the grain I added actually changed every frame, causing it to appear animated and making the raptor look slightly sparkly.

[Embedded video: raptor composite]

 

I used the Dino Rig for this project, which I got from Creative Crash: www.creativecrash.com/maya/downloads/character-rigs/c/dinorig

October 24, 2012

So, term started a few weeks back, and after a week of throwing around possible ideas for the major project, things actually got going. This project is all about learning new skills; specifically, Maya skills. Gotta say, it's a bit of a shock to the system. Whilst I can see the potential Maya clearly has, there is a part of me that loathes it every time I try to do something. Why must clicking on things be so difficult?! That, really, is my number one pet peeve. Trying to select items in Maya feels a bit like bashing your head against a brick wall. You click; Maya laughs to itself, knowing you were one tiny pixel from success, and promptly does the exact opposite of what you wanted.

Nonetheless, I do seem to be slowly getting to grips with all its little quirks, and slowly but surely some progress is appearing in the project. That being said, I'm still behind schedule, so it's not perfect yet.

For the first week we spent some time doing some reasonably simple ball animations, just to truly get to grips with animating in Maya and how the program works. This second week is when things really got going: rigging and animating. However, since I have very little interest in bipedal and character animation, my course leader has changed my brief a bit, allowing me to spend a larger period of time on rigging and then a short stint of animating. It took a lot of tutorials, and a huge amount of frustration as I hit brick walls where my knowledge of the program ran out, but I have finally managed to complete a nice little horse rig. Now I just have to animate a walk cycle with it! Before attempting the horse, I did quickly complete the "ball with legs" rig that the rest of the animators are doing, to ensure I had a reasonable understanding of the rigging basics in Maya.