Category Archives: Glamorgan

April 24, 2013

Whilst I was skinning I discovered yet another mistake with my leg (and arm) rigging. When constraining the three tibia joints so that they would always stay the right distance from knee and ankle, I used a parent constraint. I had thought this would mean they would twist nicely when the foot was rotated. What I had not thought about was that this would of course affect their rotation in all three axes, and not just the one twist axis I wanted. When I started working on the leg and looking at the movement during deform tests, I realised my mistake.

LegTwistError

The joints were rotating, and so rotating the leg mesh, around an axis that the leg shouldn’t be able to rotate on. I had to delete all the tibia and femur constraints (including the parenting to the deform skeletons) on the IK chains. I also had to delete all the skinning I had done so far, as the deform joints were also out of position.

I used the FK joints to reorient them correctly and then used a point constraint rather than a parent constraint to control their positions relative to the knee and ankle. I then wired the twist axis of the ankle and wrist controllers to drive the limb’s rotation. I simply wired each of the tibia/femur joints to receive 0.25 of the twist rotation of the ankle/wrist. Combined with the cumulative rotation of the parent chain, this causes the three joints to rotate by 0.25, 0.5 and 0.75 of the ankle/wrist rotation.
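For reference, the twist wiring is just a multiplyDivide node scaling the controller’s twist before it reaches each joint. A minimal MEL sketch, with made-up node and joint names (the actual names and twist axis in my rig differ slightly):

    // scale the ankle control's twist down to a quarter (multiply is the default operation)
    createNode multiplyDivide -n "L_legTwist_md";
    setAttr "L_legTwist_md.input2X" 0.25;
    connectAttr "L_foot_ctrl.rotateX" "L_legTwist_md.input1X";
    // each tibia joint receives 0.25; the parent chain accumulates 0.25, 0.5, 0.75
    connectAttr "L_legTwist_md.outputX" "L_tibia01_jnt.rotateX";
    connectAttr "L_legTwist_md.outputX" "L_tibia02_jnt.rotateX";
    connectAttr "L_legTwist_md.outputX" "L_tibia03_jnt.rotateX";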

Then I just had to parent constrain the deform skeleton once more, wire up the various switches and I could start skinning again.

Suffice it to say, lesson learnt. I will be thinking much more carefully about which constraints I use in the future.

April 23, 2013

When first starting the skinning, I like each vertex to either be 100% influenced by a joint, or not influenced at all.

Skinning01

I can define areas like the pelvis and the rib cage very clearly, and then separate out the vertices in between and give influence to the joints in the spine. This results in deformations that have very sharp edges. However, it provides a very quick way to establish which areas of the body should be moving with which joints. It also removes any unwanted stray influences from joints that are nowhere near an individual vertex.
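The same rigid first pass can be done from script with Maya’s skinPercent command as well as with the paint weights tool. A rough sketch, assuming a skinCluster called skinCluster1 and made-up joint, mesh and vertex names:

    // give these vertices 100% influence from the pelvis and nothing else
    skinPercent -transformValue C_pelvis_jnt 1.0 skinCluster1 monkey_geo.vtx[200:260];
    // repeat per region, e.g. the rib cage
    skinPercent -transformValue C_spine04_jnt 1.0 skinCluster1 monkey_geo.vtx[300:380];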

Skinning02

Once that is complete, I can start to smooth the influence so that multiple joints can be affecting any one vertex. The smooth tool is particularly effective for this part of the process, though it does occasionally cause an unwanted joint to gain some influence over an area of the mesh where it isn’t wanted. However, by swapping between joints and smoothing gradually, the deformations become much cleaner.

Skinning03

With the spine, arms and legs complete I needed to get the head and tail done to have the majority of the rigging finished so that I could start skinning. Creating the head controls was pretty quick and easy.

Head Neck

However, I wanted to give the animator the ability to control whether the head follows the orientation of the neck, or always stays facing the same direction regardless of the neck rotating (as though it is constrained to the world orientation). To do this I created another switch controller and set up an attribute that would control the orientation. I then created two locators, both of which I placed on the pivot point of the head. I parent constrained one of the locators to the neck joint below it (this would be the local orientation locator). The other locator I point constrained to the local locator, meaning it moves with that locator but doesn’t rotate. I then orient constrained the head controller to both locators and wired up the switch so that it would change the weighting of the constraint and therefore which locator the head follows.
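Scripted, the whole set-up boils down to two locators, an orient constraint and the switch driving the constraint weights through a reverse node. A sketch with hypothetical names (including which node gets constrained), assuming a headOrient attribute that runs from 0 (follow neck) to 1 (world):

    spaceLocator -n "head_local_loc";
    spaceLocator -n "head_world_loc";
    // both locators sit on the head pivot; the local one inherits the neck's rotation,
    // the world one only follows its position
    parentConstraint -mo C_neck_jnt head_local_loc;
    pointConstraint head_local_loc head_world_loc;
    orientConstraint head_local_loc head_world_loc C_head_ctrl_SDK;
    // the switch drives the weights: 0 = follow the neck, 1 = stay world oriented
    createNode reverse -n "headOrient_rev";
    connectAttr "C_headSwitch_ctrl.headOrient" "C_head_ctrl_SDK_orientConstraint1.head_world_locW1";
    connectAttr "C_headSwitch_ctrl.headOrient" "headOrient_rev.inputX";
    connectAttr "headOrient_rev.outputX" "C_head_ctrl_SDK_orientConstraint1.head_local_locW0";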

When I went to start rigging the tail, I realised I had never actually built a deform skeleton for it. As such I quickly built the joints and used the comet tools to tidy up the orientations.

Tail01 Tail02

I decided to build the FK control system first, as it would be easier and faster. I created a small circle control for each individual joint and parent constrained the joints to their respective controllers. As always, all my controllers were parented to an _SDK group which was then parented to an _0 group. I wanted to make it easier for the animator to animate large sections of the tail without having to select each controller individually, so I created four extra controllers which I spaced evenly along the length of the tail. I coloured these yellow and the individual circles brown, which makes the controllers that drive larger sections of the tail stand out to the animator. Colour coding the controllers is important so that an animator can see at a glance how a rig works without the rigger needing to explain everything they have done.

Tail03

I then wired the rotation of each of the four main tail controllers to the _SDK groups of the individual tail controllers I wanted them to control. This means that as a main control rotates, all of the joints in its section will rotate, bending that whole length of the tail.
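The wiring itself is nothing more than connecting the main control’s rotation to the rotation of each _SDK group in its section, e.g. (hypothetical names):

    // one main tail control drives the _SDK groups of the individual controls in its section
    connectAttr "C_tailMainA_ctrl.rotate" "C_tail01_ctrl_SDK.rotate";
    connectAttr "C_tailMainA_ctrl.rotate" "C_tail02_ctrl_SDK.rotate";
    connectAttr "C_tailMainA_ctrl.rotate" "C_tail03_ctrl_SDK.rotate";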

Tail04a Tail04b

I also parent constrained each of the main tail controllers to the individual tail joint controller immediately preceding its section. This means that it should always move with the tail as it is animated.

I then set to work on the IK tail. As a monkey’s tail is extremely flexible, I decided to create an IK chain with four spans. This means that there is a control point at either end, and three control points spaced evenly in the middle. I created clusters for each of the points so that I could parent constrain them to controllers. Unfortunately, as the monkey’s tail was modelled in quite a curved position, I discovered that whenever I created the IK, the bones moved from their original positions. Despite playing with settings, I eventually accepted that I wasn’t going to manage to get the bones to match up perfectly.

I point constrained the three middle controllers to the base and tip controllers, meaning that whenever one end moves, the controllers in the middle move as well. I changed the weightings so that the end closest to them had more effect.

I also tweaked the 0 positions of the controllers so that the IK joints lined up slightly better with the FK joints to try and minimise the movement of the deform skeleton when switching between FK and IK.

I set up a switch to allow the tail to swap between FK and IK, but I decided to add a couple of other attributes too. I set up an orientation switch for the tail that worked just as the head does, meaning the tail can either follow the position and rotation of the hips, or just the position.

Tail05

I also found that animating an IK chain with no stretch can be very difficult: if the curve is made longer than the length of the joint chain, the joints only shape themselves to the part of the curve they can reach. However, I didn’t want to force the animator to have a stretchy IK tail, so I decided to add an attribute that would turn the ability to stretch the tail on and off. To do this, I simply used the same arclength node for the curve and a multiply/divide node as I have done for every stretch, but I added a condition node that would be affected by the switch.

Tail06

So at last, the majority of the rig building is complete. I can set up a deform test and start skinning the rig!

BodyComplete02

Having finally created working stretchy IK legs, I could get to work on making a set of FK legs. Like the IK, I duplicated the deform skeleton, but replaced the _jnt with FK_jnt. Unlike creating the IK leg, I didn’t need to create anything except controllers. I used the curve tool with snap to vertex turned on and drew the shapes around the mesh of the monkey. I used the MEL command “parent -r -s” to attach all the curve shapes to a single curve node so that the animator can click on any of the curves to select the entire controller.

The thigh and ankle controllers I drew with curves, but I just used circles (which I edited slightly) for the knee and tibia joints. To try and keep the rig clear to use I colour coded my controllers. The most important controllers I changed to a brighter colour and the less important ones (like the tibia joints) were darker.

Foot05 Foot06

Like before, I nested all of these controllers in a double set of groups (_SDK and _0). I then parent constrained each FK joint to its relative controller.

As the toe controls were parented directly to the deform skeleton (there was no separate FK and IK version) I needed to get the main toe controller to follow both the IK and the FK skeletons. I created an FK/IK switch control for the leg and then proceeded to parent constrain the main toe controller to both the end of the IK and FK controller hierarchies. At the same time, I also parent constrained the deform joints to both the IK and FK joint chains.

I then had to wire up the switch to control the parent constraints between FK and IK. I used the hypershade to do this. I brought the switch and the parent constraints in and also created a reverse node.

Foot07

I then also wired up the visibility of the IK and FK controllers so that the animator can only see FK controls when the switch is at FK, and IK controls when it is at IK.
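For anyone trying the same thing, the hypershade wiring amounts to sending the switch value to one set of constraint weights and a reversed copy to the other, then reusing the same connections for controller visibility. A hedged sketch with hypothetical names, assuming an FKIK attribute that runs from 0 (FK) to 1 (IK); the constraint weight aliases are named after their targets, so check the actual names and indices on your own constraints:

    createNode reverse -n "L_legFKIK_rev";
    connectAttr "L_legSwitch_ctrl.FKIK" "L_legFKIK_rev.inputX";
    // IK target weight follows the switch, FK target weight follows the reversed value
    connectAttr "L_legSwitch_ctrl.FKIK" "L_ankle_jnt_parentConstraint1.L_ankleIK_jntW0";
    connectAttr "L_legFKIK_rev.outputX" "L_ankle_jnt_parentConstraint1.L_ankleFK_jntW1";
    // show only the relevant controllers
    connectAttr "L_legSwitch_ctrl.FKIK" "L_legIK_ctrlGrp.visibility";
    connectAttr "L_legFKIK_rev.outputX" "L_legFK_ctrlGrp.visibility";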

I decided to colour code the IK and FK so that it is clear at a glance whether the leg is set to FK or IK. For the IK I chose red (left) and blue (right), and for FK I chose pink (left) and green (right). However, since the toes use the same controllers whether the leg is in IK or FK, those controllers were just a single colour. I spent a bit of time trying to work out how to use the switch to drive the colour override for the control shapes. Eventually I found that I could use a condition node that was true when the switch was above 0.5 and false when below 0.5. I then wired the condition node to the drawing override of the controller. As the drawing override attributes are not in the hypershade’s short list of connections, I had to open the Connection Editor and wire them up in there.

Foot08 Foot09

Finally, I had to work out what number represented each colour. With some trial and error I eventually found the values and wired up all of the toe controllers so that they change colour. Success!
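The colour switching network, for reference, with hypothetical names. The override colour indices below are placeholders; as described above, the real values had to be found by trial and error:

    createNode condition -n "L_toeColour_cond";
    setAttr "L_toeColour_cond.operation" 2;           // greater than
    setAttr "L_toeColour_cond.secondTerm" 0.5;
    setAttr "L_toeColour_cond.colorIfTrueR" 13;       // e.g. the IK colour index
    setAttr "L_toeColour_cond.colorIfFalseR" 9;       // e.g. the FK colour index
    connectAttr "L_legSwitch_ctrl.FKIK" "L_toeColour_cond.firstTerm";
    // the override lives on the controller's shape node
    setAttr "L_toeMain_ctrlShape.overrideEnabled" 1;
    connectAttr "L_toeColour_cond.outColorR" "L_toeMain_ctrlShape.overrideColor";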

Foot10

April 16, 2013

I’m not sure I can truly put into words how frustrating creating the IK leg for this rig was. I do know, however, that creating a stretchy IK leg should not take almost 48 hours to get working.

Before I duplicated my bones, I decided I wanted to put some extra bones along the lower leg to enable smoother twisting. To make things work nicely later on, I wanted the three new bones I inserted to sit exactly a quarter, half and three-quarters of the way along the calf. To do this, I parent constrained each of the three bones to both the knee and the ankle, and made sure maintain offset was unticked. This placed all three bones exactly half way between the knee and ankle. I then changed the weighting of the parenting for two of them. Femur01 was weighted 0.75 to the knee and 0.25 to the ankle, whilst Femur03 was weighted 0.25 to the knee and 0.75 to the ankle. I then deleted the constraints and used Comet tools to orient the joints correctly.
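The placement trick in MEL, with hypothetical joint names: each new joint is constrained to both ends with the weights split, then the constraints are thrown away once the joints are in position (default constraint node names assumed):

    // quarter, half and three-quarters of the way from knee to ankle
    parentConstraint -w 0.75 L_knee_jnt L_femur01_jnt;
    parentConstraint -w 0.25 L_ankle_jnt L_femur01_jnt;
    parentConstraint -w 0.5  L_knee_jnt L_femur02_jnt;
    parentConstraint -w 0.5  L_ankle_jnt L_femur02_jnt;
    parentConstraint -w 0.25 L_knee_jnt L_femur03_jnt;
    parentConstraint -w 0.75 L_ankle_jnt L_femur03_jnt;
    // once positioned, delete the constraints and re-orient the joints
    delete L_femur01_jnt_parentConstraint1 L_femur02_jnt_parentConstraint1 L_femur03_jnt_parentConstraint1;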

Joint Creation 11

Once this was complete I created two duplicates (one for IK, one for FK). I then created a third duplicate in which I deleted the three femur joints and reparented the ankle straight to the knee. This was so that I had just two bones in the IK system. I called it the IKGuide. I created an IK on the joints and made a simple cube controller for the foot. I parented the IK handle to this control. I then orient constrained the ankle bone to this controller so that it would not rotate as the leg moved. Finally, I created a simple circle controller and parent constrained the hip bone at the top of the IK to the circle.

IKLeg01

With the IK built and working, I started to set up the IK stretch. Like the IK spine, I needed to know how long the leg was at any one point in time. However, I only needed the straight line distance from hip to ankle. I used the distance tool for this. I aligned one locator with the hip and one with the ankle and then parented them to the corresponding controllers. Now, as the leg moved, the distance tool would always give the distance from hip to ankle. At this point I also realised I had not created a knee controller. For this I used an arrow shape. I point constrained the arrow to both the hip and ankle (with maintain offset unticked). This placed the arrow on the plane between the two, directly in the centre. I then used an aim constraint to ensure the arrow was pointing directly at the knee. After deleting both constraints I moved the arrow in front of the knee and set up a pole vector constraint on the IK.
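The distance measurement is quick to script too. The distance tool does the same job through the UI; as a scripted equivalent, a distanceBetween node fed by two locators gives the same value. A sketch with made-up names, assuming the hip and foot controllers already exist:

    // locators snapped to hip and ankle, then parented to the controls
    spaceLocator -n "L_legDist_start_loc";
    spaceLocator -n "L_legDist_end_loc";
    delete `pointConstraint L_hip_IK_jnt L_legDist_start_loc`;
    delete `pointConstraint L_ankle_IK_jnt L_legDist_end_loc`;
    parent L_legDist_start_loc L_hip_ctrl;
    parent L_legDist_end_loc L_foot_ctrl;
    // a distanceBetween node reads the two world positions and outputs the distance
    createNode distanceBetween -n "L_legDist";
    connectAttr "L_legDist_start_locShape.worldPosition[0]" "L_legDist.point1";
    connectAttr "L_legDist_end_locShape.worldPosition[0]" "L_legDist.point2";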

IKLeg02

To wire up the stretchy leg I used another multiplyDivide node and, like the spine, the distance was wired to the first input. The second input needed to be the length of the thigh bone plus the length of the calf bone. As the joints had been oriented correctly, this could be found simply by adding the x translation of the knee and ankle together. I changed the node to divide and wired the output to the “if true” input of a condition node. The “if false” input was left as 1. The condition node also had the distance as its first term and the length of the two bones as its second. It then compared the two lengths, and if the distance was greater than the bone length, the condition was true. I then wired the condition node’s output to the x scale (length) of the thigh and calf. The two bones of the IK scaled nicely. Unfortunately, the ankle and foot bones were being scaled strangely when the leg stretched, despite not being wired to the condition node. I even checked the scale of both and x, y and z were all still showing as 1. This meant the bones shouldn’t have been scaling at all.
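Put together, the stretch network looks roughly like this (hypothetical node and joint names, with the rest length hard-coded purely for the sketch):

    // current hip-to-ankle distance divided by the rest length of thigh + calf
    createNode multiplyDivide -n "L_legStretch_md";
    setAttr "L_legStretch_md.operation" 2;            // divide
    connectAttr "L_legDist.distance" "L_legStretch_md.input1X";
    setAttr "L_legStretch_md.input2X" 9.6;            // knee tx + ankle tx, made-up value
    // only stretch when the distance exceeds the rest length
    createNode condition -n "L_legStretch_cond";
    setAttr "L_legStretch_cond.operation" 2;          // greater than
    setAttr "L_legStretch_cond.secondTerm" 9.6;
    setAttr "L_legStretch_cond.colorIfFalseR" 1;
    connectAttr "L_legDist.distance" "L_legStretch_cond.firstTerm";
    connectAttr "L_legStretch_md.outputX" "L_legStretch_cond.colorIfTrueR";
    // scale the two bones of the IK chain along their length
    connectAttr "L_legStretch_cond.outColorR" "L_hip_IK_jnt.scaleX";
    connectAttr "L_legStretch_cond.outColorR" "L_knee_IK_jnt.scaleX";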

FootScaleBug FootScaleBug02

I tried deleting the IK and remaking it, but the problem persisted. I then tried moving the controls with no IK present at all, and the ankle and foot continued stretching strangely. I could only assume there was something strange with the bones, so I deleted the IKGuide joints and re-created them. I set everything back up, re-wired the thigh and calf x-scale to the condition node and tested it again. I had exactly the same problem all over again. I tried re-creating the bones once more that evening but with no success. I finally decided the only option was to go to bed and look at it with a fresh mind in the morning.

As is often the way with revisiting a problem the next day, I tracked the issue down quite quickly. I brought all the joints into the hypershade to make sure my ankle and foot definitely hadn’t managed to end up wired to anything, and I realised there was no line showing the parenting of ankle to knee. I un-parented the ankle bone and re-parented it to the knee and the problem disappeared. I was delighted, until I found yet another problem. Whilst the ankle was no longer scaling strangely, it still was not doing what I expected when I moved the hip too far away. Despite being orient constrained to the foot controller (and as such theoretically unable to rotate by itself), my ankle would rotate whenever I moved the hip controller forwards or backwards so that the leg stretched.

FootBugRotate FootBugRotate02

I decided to fix the problem by simply creating a new version of the ankle and foot bone. I point constrained the new ankle joint to the one on the IK leg and orient constrained it to the controller again. Success! Problem solved, just not as tidily as I would have liked. It also left me feeling frustrated, because I wanted to know why the problem had occurred so I could avoid it in the future. Still, at least the problem was gone and I could get on with parent constraining the IK joints to the IKGuide joints. I unticked maintain offset and parent constrained all the joints to their respective guide joints. I constrained the three femur joints in the same way I created them: by parenting to both knee and ankle and then editing the weights. However, what I hadn’t thought of was that a parent constraint would cause the joints to rotate out of alignment due to the ankle’s orientation. I pondered the problem for a while and decided I would simply make sure to maintain offsets when constraining the deform joints to the IK joints.

IKLeg03

I then set to work creating a control system for the toes so that I could create a simple set of foot roll controllers. Initially I decided to place a circle controller around each joint of the toes, but it quickly became clear that some of them would be hard to select.

Foot01

Instead, I moved the curve shapes of the controllers above each toe joint and this made them much clearer and easier to see. Finally, I also made a main controller that would be used to curl all the joints of a toe. I then began creating a simple set of foot roll controls and with some re-parenting of the IK handle my IK leg was complete.

Foot02 Foot03

Unfortunately, I quickly checked things in my orthographic side view and realised that at some point during the creation process I had managed to cause the entire IK system to move out of alignment from its starting position, despite all the controls being at 0, 0, 0.

Foot04

The only option was to build the entire IK leg yet again. I deleted all my bones and mirrored the right-hand leg to the left-hand side once more. The good thing was that, at least this time, all the controllers were already built, so all I needed to do was wire everything up correctly and make sure my controllers were correctly aligned before constraining/parenting things to them. Fortunately, this time I got it right and my left IK leg was finally complete. Hooray!

April 14, 2013

I finally got the mesh back yesterday, and I’m happy to say it’s actually symmetrical this time.

Symmetrical

As such, I’ve been able to get cracking with building the control system for the toony rig. I started with the spine as I feel it’s such a central part of the control system, and almost everything else is parented to it in some way. My first step was to duplicate the bones that I want to apply the IK spine to. This means that if I make any mistakes or move anything by accident, I won’t ruin the position/orientation etc. of the deform skeleton, and I can easily delete the duplicate and start again. I tend to insert IK (or whatever is appropriate) into the name of the joint to differentiate it from the deform skeleton.

Once I had my duplicate, I hid the deform skeleton so that I couldn’t affect or move it whilst working on the controls. I applied an IK spline with two spans to the spine. This means that there are control points at each end, as well as a single control point in the centre to affect the curve. I created a cluster for each of the control points. These control the shape of the curve, and the shape of the curve drives the position of the bones.
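A minimal sketch of that set-up, using made-up joint names (the ikHandle command builds the curve for you, assumed here to come out as curve1; the clusters then grab its CVs):

    // spline IK with two spans along the duplicated spine joints
    ikHandle -sol ikSplineSolver -ns 2 -n "spine_ikHandle"
             -sj C_spine01_IK_jnt -ee C_spine06_IK_jnt;
    // a two-span cubic curve has five CVs: cluster them bottom / middle / top
    cluster -n "spineBase_cluster" curve1.cv[0:1];
    cluster -n "spineMid_cluster"  curve1.cv[2];
    cluster -n "spineTop_cluster"  curve1.cv[3:4];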

IKSpine01

I then needed to build the actual controllers for the spine. Comet tools provides a quick way to make a bunch of spline shapes, but they are all quite simple, sharp-edged splines. They never look particularly nice, and they don’t fit the shape of the body all that well. As such, I like to make a lot of my controllers by hand. To do this, I used the mesh itself to guide the shapes. I turned on snap to vertex and created a selection of curves that flowed around the area of the body that I wanted to control. I generally tweak them slightly afterwards to make sure the ends of the curves meet up and don’t leave gaps anywhere. With this complete, I had a bunch of individual curves that could be selected separately. What I actually wanted was to be able to click anywhere on any of the curves and have them all selected. To do this, I had to reparent the individual curve shapes to a single curve. This can be done easily by selecting the curve shape, then shift-selecting the curve I want to parent it to, and running a single line of MEL: “parent -r -s”. This leaves an empty curve node with no shape that can be deleted.
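For anyone unfamiliar with the command, the process per shape looks like this (hypothetical curve names, using Maya’s default shape naming):

    // move the shape node of curve2 under the curve1 transform, keeping its data as-is
    parent -r -s curveShape2 curve1;
    // curve2 is now an empty transform and can be deleted
    delete curve2;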

IKSpine02 IKSpine03

Once all my controllers were built I aligned them with the correct bones and parented the clusters to each controller. I also created two groups for each controller to be parented within. One I suffix with _SDK and one with _0. The _0 is my null group; it provides the zero point so that I do not need to use freeze transformations. The _SDK group allows me to set up parent constraints for a controller whilst still giving the animator the ability to animate it. For the spine, I parent constrained the middle controller’s _SDK group to both the top and bottom spine controllers. This means that the middle controller will always remain halfway between the top and bottom of the spine.
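The group set-up is quick to repeat per controller; a sketch with hypothetical names, assuming the controller was built at the origin:

    // controller -> _SDK -> _0, then snap the whole stack to its joint
    string $ctrl = "C_spineMid_ctrl";
    string $sdk  = `group -n ($ctrl + "_SDK") $ctrl`;
    string $null = `group -n ($ctrl + "_0") $sdk`;
    // position the _0 group on the joint; the controller itself stays at 0, 0, 0
    delete `parentConstraint C_spine03_IK_jnt $null`;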

Once this was complete I decided to test the spine to check it was working correctly. Unfortunately, it wasn’t. I hadn’t realised I had only given the IK spline four bones to move around. When the curve had extreme bends the bones just averaged out their positions and the shape of the curve was lost. This meant rebuilding the deform spine with more bones so that there were enough joints to follow the spline curve more accurately. Having added them, I made sure to tidy up their orientation with comet tools again.

IKSpine04 IKSpine05

I then repeated the process of applying an IK spline to the duplicate set of bones and creating clusters for the three control points of the curve. I re-positioned the controllers to ensure they were correctly aligned with the new bones and then parented the clusters to the controls. I also set the twist controls for the IK spline so that the hip and chest controllers control the spine rotation.

IKSpine07

Finally, I wanted my spine to be stretchy as this is meant to be a “cartoony” rig. I created a multiplyDivide node which I set to divide. I also created an arc length info node for the spline curve. This provides me with the length of the curve at any time. I wired the length into the first input of the divide node and put the length of the curve when all controls were at 0, 0, 0 into the second input. This means that the output is the current length of the spine divided by the original length. I then simply wired the output into the scale X (the length) of all the joints in the spine.
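The stretch network in script form, with made-up names (arclen -ch returns a curveInfo node whose arcLength updates as the curve changes; the rest length is hard-coded here purely for the sketch):

    string $info = `arclen -ch 1 spine_ikCurve`;       // curveInfo node
    createNode multiplyDivide -n "spineStretch_md";
    setAttr "spineStretch_md.operation" 2;             // divide
    connectAttr ($info + ".arcLength") "spineStretch_md.input1X";
    setAttr "spineStretch_md.input2X" 12.4;            // rest length of the curve, made-up value
    // every spine joint scales along its length by current / rest length
    connectAttr "spineStretch_md.outputX" "C_spine01_IK_jnt.scaleX";
    connectAttr "spineStretch_md.outputX" "C_spine02_IK_jnt.scaleX";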

IKSpine06

Success! An easy to use stretchy IK spine.

April 8, 2013

With my scenes set up for my animators on the elephant project, I can finally get to work on my side project; the toony rig. The first step to rigging anything is to build the joints that will drive the deformation of the mesh. I call this the deform skeleton.

The first thing I create is always the spine. The pelvis is the start of the chain and the head the end. I generally name joints as I go to avoid confusion when the entire skeleton has been built. I used the same naming convention as my elephant rig. A prefix of C_, L_ or R_, the name of the joint and a suffix of _jnt.

Joint Creation 01 Joint Creation 02

The arm was the next thing I built. I initially place the bones using the orthographic front view. I line up the joint positions to the hand as well as possible. However, because my view is orthographic, the joints are all created on a flat plane. This does not match the shape of the hand, so I then use the perspective view to line everything up correctly. I then repeat the process with the foot, but this time I create the joints from the orthographic side view. To ensure my joints orient correctly later on, I make sure that the first finger or toe I create is the most central one. This simply means that the joint in the palm or sole of the foot will point to this finger when oriented, instead of out to one side at, for example, the thumb or little toe.

Joint Creation 03 Joint Creation 04 Joint Creation 05

With all the joints created on one side of the body, I then needed to orient the joints correctly. This is good practice because it ensures the joints will all rotate nicely around the same axis. These two images show the spine axes before and after the orientation. I use Comet tools to do the joint orientation; a sketch of the underlying Maya command follows the images below.

Joint Creation 06 Joint Creation 07 Comet Tools
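Comet’s joint orient script is a nicer, more interactive way to do this, but the underlying operation is Maya’s own joint edit command applied down the hierarchy, roughly (hypothetical root joint name and axis choices):

    // orient the whole chain: X down the bone, with a consistent secondary axis
    select -r C_pelvis_jnt;
    joint -e -orientJoint xyz -secondaryAxisOrient yup -children -zeroScaleOrient;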

After orienting the joints on the hands and foot, I tweaked the angles of the thumb. The reason for this is so that if all the finger joints are selected and rotated in one axis, they should close to form a fist. This makes it much easier for the animator to work and keeps the animation curves much cleaner.

Joint Creation 08 Joint Creation 09 Joint Creation 10

With the orientations tidied I could finally mirror my arms and legs so that I had a complete skeleton. Unfortunately, once I had mirrored the joints, it was clear that the mesh had not been mirrored correctly and was not symmetrical. I have contacted the artist and asked if they can have a look and fix it. However, since it is the Easter holidays, I have no idea when they will see my email, let alone send me the fixed mesh. I don’t want to continue rigging just in case there is a problem and they are unable to get things symmetrical for me, as that would mean rebuilding the joints for the right arm/leg separately and I would have to redo any work I had already done.

Not quite symmetrical

April 8, 2013

I think the camera tracking has been one of my biggest worries during this project. It is something I have only fleeting experience with, but it is essential for producing a polished end product. When filming any moving camera shots, we decided to keep the camera on a tripod and just pan, to keep things simpler. The first shot I chose to track was probably one of the most important shots in the entire project: the very final shot where the elephant gives the monkey back to the mother.

I have been using Boujou as this was the program we were introduced to during a previous compositing project earlier in the year. It had produced a great result for me in the past, but I was aware that others on my course were not so fortunate; Boujou does not always get it right first time. I imported the footage and set Boujou to tracking the movement in the image. To do this, Boujou latches onto distinctive areas in the image (i.e. colour changes which suggest the edges of objects) and tracks how these points move throughout the footage. To get the best result, it is generally necessary to have points spread across the x, y and z axes within the 3d space of the shot. However, the shot I was working with had a flat wall in the background (which meant no track points in the z axis) and I worried that Boujou would struggle to know how close or far away the camera was throughout the shot.

The next stage of the camera tracking was to ask Boujou to use these track points and their movement to create a camera in 3d space that matches the movement of the camera in the shot. Thankfully, I found an option at this point to tell Boujou that the camera was nodal. This meant Boujou knew that the camera was on a tripod in a fixed position and could only rotate. Finally, I could export the information into Maya and check whether it worked. To my delight, the tracking seemed great. However, I quickly noticed that objects in the scene seemed to suddenly move up and down, or side to side, out of time with the footage.

Since I have no knowledge of how to correct this in Boujou, I decided to see if I could fix it without too much effort in Maya. I played the animation until I found a moment when objects in the scene moved out of sync with the footage, then checked the camera’s curves in the graph editor for any odd kinks or jumps. Most of the problems were extremely easy to find and fix, but there was one that was extremely frustrating. About two thirds of the way through the shot, the camera skipped sideways suddenly and then gently eased back into its original position. I could find nothing on any of the curves that would indicate the camera was rotating like this. I spent a long time trying to establish whether it was just one curve or all of them affecting it, but eventually, after some painstaking work tweaking each individual key, I managed to tidy it up so that the movement was barely noticeable.

I wanted to set up the scenes that my animators would be using before I did anything else, so I then moved on to some of the scenes with a fixed camera. I created a new Maya file for each one, referenced the elephant rig and created a new camera. I created an image plane for the camera with the .png sequence of the correct shot. I cannot believe how difficult it was to then actually position the camera in 3d space so that it lined up with the footage. I had foolishly assumed that it would be the moving camera shots that would cause me issues, when all along it was positioning cameras by hand that would be my downfall.

The situation was made more frustrating by the knowledge that I had assumed the Motion Graphics students were taught easy ways to work out the position of cameras in still shots. In 2nd year, we learnt to take a photo with the camera in the same position and an item in the shot that you knew the dimensions of. You could then use this item to help line up the camera. Since I was relying on the knowledge of the Motion Graphics students I was working with, as I knew they had plenty more experience than me, I readily accepted their answer when they assured me nothing was needed for these shots. It was pretty galling when I asked them later on how I would be positioning the cameras and they answered “by eye”. I have at least learnt one lesson from this: whenever possible, if filming for a VFX composition with a still camera, make sure there is something in shot that you know the exact dimensions of. It will make your life so much easier.

April 5, 2013

Now that I have all the footage from both shoots, I’ve finally been able to sit down and start sorting through which shots worked best. I created a new Premiere file and set to work putting it all together. It’s quite exciting to see it coming together, and I’m surprised at how long the video is. I sat down with a tutor to discuss timings as I was worried that some things felt a bit slow, though I wasn’t certain if this was just because the scenes were currently missing a CG elephant. He agreed with me, so I started being a bit harsher with the footage, cutting out unnecessary seconds at the beginnings and ends of shots.

This is the final result:

YouTube Preview Image

March 14, 2013

So yesterday, I finally got the elephant rig to a point where it could be referenced into the animators’ files. One of the major things I worked on was updating all of the controllers so that they are clearer and fit the model nicely. I foolishly assumed this would be easy, but I hadn’t reckoned on the awkwardness of the curve creation tools in Maya. It took me quite a while of just repeatedly trying to create shapes and deleting them as they failed to work. I think my determination to create things that were perfectly symmetrical possibly did not help the situation, but an asymmetrical controller just doesn’t look as neat and clean in my opinion. Eventually I hit upon the idea of using the snap to vertex tool and using the edges and vertices of the elephant to help me create controllers that fit nicely to the contours of the elephant’s body.

Having drawn a selection of curves, I needed to join all the individual curves together into a single item. This involved reparenting the individual curve shapes to a single curve node and then deleting the rest of the empty nodes. Frustratingly, I could find no way to tell Maya to actually combine all the shape nodes on each curve into one single curve, but each controller selects the entire item wherever you click it, so it still works; it’s just not as clean as I would like it to be. I then scaled the controllers out from the body slightly and coloured them.

I had hoped I could then parent these new shapes to the controllers already in existence (as I had done with each individual curve to make the new controller), but every time I tried, the new controllers were rotated strangely and moved away from the body. This was due to the difference in the positions of the pivots of the old and new controllers. Hoping I could avoid having to reposition each new controller, I decided to instead break all the constraints and set everything back up on the new controllers. It turns out I still had to reposition the pivots, and so rearrange the shapes, but at least I knew I didn’t have to spend time trying to delete the shapes of the old controllers; I could just remove the entire item.

I did, however, forget to redirect the spine rotation to the new controllers, so I had a bit of a scare later in the evening when I created a global control and tried to check that everything moved as I wanted it to. When the elephant rotated 90 degrees, the spine flipped, presenting a problem I had first encountered in my 2nd year when rigging a quadruped in 3ds Max. I panicked for a while that my IK spline spine was in fact broken and I would have to come up with a completely new set up. However, after I checked the IK, I realised that in creating the new controllers I had not told it to use them to drive the rotation of the spine. Thankfully, fixing this solved the problem.

IKspine03a

I also needed to update the rig with the new low poly model that my artist had altered for me. I brought the mesh in and, whilst trying to work out how to load the skinning from the old mesh onto the new mesh, I found an option that instead replaced an old mesh with a new mesh. I tried it out and it worked brilliantly: the old mesh changed to the new mesh. However, I now had two versions of the new mesh, one that was skinned and one that was not. Assuming that the unskinned mesh was no longer needed, I promptly deleted it. A couple of hours later, when testing some other part of the rig, I discovered my mesh no longer seemed to be moving with the bones. Confused, I saved the file under a new name, closed it and reopened it. To my horror, the mesh was now invisible. The outliner still showed all the various parts of the mesh, but I couldn’t get them to appear.

MissingMesha

I hastily opened my previous iteration only to discover that that file suddenly had exactly the same problem. Desperately hoping I hadn’t somehow broken every single version (and so lost all my skinning), I tried the next step back. To my relief the old mesh was there, skinned and working absolutely fine. I had simply lost my day’s rigging work, but nothing else. Deciding that replacing the mesh clearly wasn’t the best method to update my rig, I started working on saving off the skinning so that I could load it onto the new mesh. Frustratingly, it seemed Maya was only giving me the option to load each bone’s skinning one at a time. It was doable, but pointlessly time consuming. Fortunately, I knew one of my classmates had successfully, and easily, loaded skinning onto new meshes during his project. I asked him about it and he showed me a quick and easy method. It involved skinning the new mesh to the bones (but not editing the weights at all) and then telling the new mesh to look at the old mesh for the skinning values. Maya can load the skinning in a variety of ways: by volume, by UV map etc. It was brilliant and loaded the skinning onto the new mesh perfectly. I didn’t even need to tweak it, though Joe had warned me I might need to. This is great to know, as I can now quickly skin the high poly elephant to the rig (and tidy it up afterwards) as soon as it is ready. I will not have to go through the time consuming process of skinning from scratch again.
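The method itself relies on Maya’s copySkinWeights command: bind the new mesh to the same joints, then copy the weights from the old mesh by closest point. A sketch with hypothetical mesh and joint names:

    // bind the new mesh to the existing deform joints (no weight editing needed yet)
    select -hierarchy C_pelvis_jnt;
    select -add elephant_lowpoly_v2;
    skinCluster -toSelectedBones;
    // copy the weights across from the already-skinned old mesh
    select -r elephant_lowpoly_old elephant_lowpoly_v2;
    copySkinWeights -noMirror -surfaceAssociation closestPoint -influenceAssociation closestJoint;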

The last thing I needed to build was a dynamic tail. Having already gone through the long process of working out how to do the trunk, it was simply a case of repeating the method on a much simpler chain. The dynamic output curve became a blendshape for the spline whilst the controls affected the dynamic input curve. Again, unfortunately, the rig doesn’t update its position until the animation is played, but, to my current understanding of dynamics, there is no way around this.

I also created a control for the tail that will rotate all three FK controllers at the same time. I actually created three of these for the trunk as well, so that an animator can control the entire tail (or a section of trunk) without having to select a whole bunch of controllers. Every controller I create is parented to a group (with the suffix _SDK) and that group is then parented to another group (with the suffix _0). The _0 group becomes the null group, which provides a zero point for position and rotation. The _SDK group allows me to create batch controllers whilst still letting the individual controllers tweak the bones’ positions. I simply wired the rotation of the batch controller to the _SDK groups of the relevant individual controllers. When the batch controller is rotated, each _SDK group wired to it also rotates. The individual controllers parented to the _SDK groups also rotate (due to the parenting) and so rotate the relevant bones. However, because the individual controls are not wired to anything themselves, the animator is still able to tweak the position of the bones individually at any time.

 Tail01a

I then set up a switch for the tail to allow the animator to blend between dynamic and FK. Like the trunk I also set up some attributes to allow the animator to change the stiffness and flexibility of the curve dynamics if they wish.

Tail02a

Finally, I added some empty attributes to various controllers ready to be wired up to blendshapes when I have the high poly mesh. I created them in advance so that it is less likely there will be any problems with the referencing when I update the rig later on. I wanted to make sure that everything that might be animated was already in place, so that it is only skinning and wiring, and not controllers, that will change in future files.