Tag Archives: production

June 2, 2013

So, Final Major Project is over and handed in, which means my degree is also finished. There have been odds and ends requested from tutors and it's left me feeling like I'm in some sort of strange limbo. The degree is finished, but with all these extra hand-ins (that don't even affect my marks) it's hard to move on to the stage of tidying up my CV ready to apply for jobs.

Anyway, here is my final showreel, which has a variety of my best work from 2nd and 3rd year.
[Video: final showreel]

This is the deformations and rig demo for the elephant rig.
[Video: elephant rig and deformation demo]

This is the deformations and rig demo for the toony monkey rig.
[Video: toony monkey rig and deformation demo]

This is the unfinished version of my elephant advert. As I have mentioned previously, this project had a bumpy journey to get to hand in and sadly it never quite made it to completion. However, I thought I would upload it anyway for those curious about the project.
[Video: unfinished elephant advert]

There is no doubt in my mind that what I have learnt during this project, more than anything else, is to be confident about who you choose to work with. Any concerns or worries you may have about an individual's ability, enthusiasm or productivity probably exist for a reason. If you choose to work with them, make sure you have a contingency plan in case things fall through. A project is only as good as the sum of its parts; if one or more of the collaborators is unreliable, then the parts they are creating could well be unreliable too.

Working in a student environment differs from the workplace in one major respect, and that is your ability to replace team members. In the workplace, if an employee is not hitting their deadlines, or reaching their targets, the employer has several options. The employee must be warned and spoken to about the problem, but if their work does not improve, the employer can take action. They can shift the individual onto other, less important, projects and, where necessary, make more meaningful threats. At university, there is a limited pool of “employees” most of whom are already “employed” by someone else. You cannot remove someone from your project if things go wrong, all you can do is look for someone else to also do the work, and use the most successful version. If every other student is already working on other projects, then there is little you can do but try to encourage those you are working with to work harder and produce something better.

In the workplace, it is also very clear what level of authority an individual has. You know who your managers are, who you need to listen to and who you should respect. As a student director, you may be in charge of the project, but you have no more real authority than any of the students working with you. This makes it much harder to put any impact or strength behind the words of a verbal warning. There is very little that you can back it up with.

These lessons were, unfortunately, learnt through the unreliability of those I worked with this year. Trying to pull everything together has been extremely stressful. Last year I promised myself I would leave at least one project that wasn't a collaborative effort, so that if anything went wrong I had something I could drop in an emergency and devote the time to whatever needed it the most. For some reason, I forgot that promise and once again all four of my projects were collaborative. This meant that not only were other people relying on things from me before they could start working, but I was often waiting for work from them. When there is a personal project to work on, this eases the tension of waiting, because you still have something to devote your time to.

Some essential time was wasted on this project due to having to find new collaborators at such a late stage, and I wasn't able to start rendering until May. This meant that many of my planned scenes had to be dropped and I cut the advert down to a more manageable format. Unfortunately, in the end, so much time was lost that even this shortened version was not achieved. All three shots were rendered and passed on to the compositor, but with only a week to try and complete everything, she didn't stand a chance. This was made harder as the animation for the final shot did not match up with the movement of the bear. The new animator had been unable to get things any cleaner in the short amount of time I gave her, and my compositor was unable to match move the bear without a lot more time, which we didn't have.

I am disappointed that I was unable to pull the elephant project through to completion, but I believe that I did everything I could to get it as far as possible. If I were to do the project again, I would pick the team I worked with more carefully. I would respond more quickly to late work and give verbal warnings sooner. I would also be faster to look for alternative collaborators if my team ignored deadlines and feedback. However, this final option would be very dependent on other students having projects they could change or drop.

May 10, 2013

Theoretically, rendering should be quite simple. Changing all the settings and creating the lighting can be quite long-winded, but the process itself is done very similarly every time. However, when trying to create my depth passes I hit a real problem. It didn't seem to matter how I changed my settings or which tutorial I followed; I just repeatedly ended up with either a white silhouette or a completely black render.

depthRemapped

Even the staff couldn’t find a problem with my settings and they were left as stumped as I was. After about a day of experimenting I worked out that it was only my camera from Boujou that didn’t seem to be rendering the depth correctly. Rendering from perspective, or a new camera, or even in a new file with the scene imported worked fine. However, the moment I tried to render from that camera, I had a silhouette again.

Eventually I was able to track down the problem. Boujou had exported my scene set up extremely small. As my rig could not be scaled, my only option was to scale the scene instead. The scale of the camera meant that the values for the depth pass also needed to be scaled accordingly.
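To sketch why the scale mattered (plain Python, not Maya — the scale factor and near/far values below are made-up illustrations): a depth pass remaps each pixel's distance from the camera into a 0–1 range between a near and a far value, so if the whole scene is scaled up, those values must be scaled by the same factor or everything lands outside the range and renders pure white or pure black.

```python
# Hypothetical sketch: remapping a raw depth distance into 0-1.
def remap_depth(distance, near, far):
    """Normalise a distance-from-camera into the 0-1 depth range."""
    return min(max((distance - near) / (far - near), 0.0), 1.0)

scale = 100.0                   # assumed factor used to scale the tiny Boujou export
near, far = 0.1, 50.0           # values that suited the unscaled scene

# A point 25 units from the original camera sits 2500 units away after
# scaling; only the scaled near/far range recovers the same grey value.
original = remap_depth(25.0, near, far)
rescued  = remap_depth(25.0 * scale, near * scale, far * scale)
clipped  = remap_depth(25.0 * scale, near, far)   # unscaled range: clamps to pure white
```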

As I wanted this camera to remain scaled, so that the image plane containing the footage stayed in the right place (behind the elephant), I decided to make a new camera. I parent constrained it to the scaled camera with maintain offset unticked and then rendered from this new camera. This completely solved the problem and I finally had a depth pass that rendered correctly.

Shot18New_masterLayer_001

So, my artist promised me the high poly elephant would be with me on Sunday. It's the middle of the week now and I've heard nothing from him and not seen him in university. I was pretty frustrated to discover that he's actually gone to FMX and he's not even in the country. I don't have a problem with people going to such a major event in the animation world. I would love to have gone myself! What I do have a problem with is an individual knowing they have work to give to others and disappearing with no word on where they are or why the work has not been finished.

I am seriously unimpressed. It's looking very unlikely now that I stand any chance of getting every scene in my original edit finished. I'm going to have to pick the most important scenes, the ones that still tell the story, prioritise them, and see how much I can get rendered and composited for hand-in.

The final set of controls left for me to build were the face controllers. Unlike the rest of the controllers, these needed to be placed very close to the mesh. I felt that the thin lines of the curves I had used for everything else would not be very obvious or easy to select for the animator. Instead, I used the top of a NURBS cylinder, as this provided a small, filled-in circle which was much easier to select.

I wanted the mouth shape to be really flexible for the animator, so I placed a controller for each corner of the mouth as well as three along the top lip and three along the bottom lip.

Face01

I also wanted the animator to be able to easily move the position of the entire mouth, so I created a simple curve shape that I placed a small distance away from the mesh.

Face02

The nose was slightly simpler. I created two controllers that would allow the animator to flare the nostrils. I then created another simple curve that would allow the animator to move the nose about the face.

Face03

Finally, I created three controllers for each eyebrow, plus a simple position controller for each. I also decided to change the circles from yellow, as I felt it was too bright and distracting when looking at the face; I changed them to a dark blue instead.

Face04 Face05

Once I had built all the controllers, I needed to start parenting them to the correct things. The nose and eyebrows were extremely simple, as the hierarchy was fairly obvious. However, I had a lot of trouble finding something that would work for the mouth. I obviously wanted the controls on the bottom lip to move when the jaw was rotated, but I also wanted them to move dependent on where the mouth position controller was. Eventually I came up with the idea of creating two locators that were aligned with the jaw bone pivot point. I parent constrained one to the head and one to the jaw. I then point constrained the locators to the mouth controller. I could then parent constrain my individual mouth controllers to the relevant locators.

Face06

Unfortunately, the parent constraint on the locators clashed with the point constraint, so that when the jaw rotated the locators moved slightly and the mouth controllers disappeared into the mesh. It took me a little while to track down the problem, but once I had found it I decided to just replace the parent constraint to the jaw with an orient constraint and this fixed it.
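The behaviour after the fix can be sketched outside Maya: an orient-constrained locator inherits only the jaw's rotation, so a lip controller (positioned by the mouth controller) simply rotates about the jaw pivot while keeping its offset. A minimal 2D sketch in plain Python — the pivot, controller position and angle here are all made up for illustration:

```python
import math

# Mimics a bottom-lip controller following an orient-constrained locator:
# the controller's offset from the jaw pivot is rotated by the jaw angle.
def rotate_about_pivot(point, pivot, degrees):
    """Rotate a 2D point about a pivot by the given angle."""
    rad = math.radians(degrees)
    dx, dy = point[0] - pivot[0], point[1] - pivot[1]
    return (pivot[0] + dx * math.cos(rad) - dy * math.sin(rad),
            pivot[1] + dx * math.sin(rad) + dy * math.cos(rad))

jaw_pivot = (0.0, 5.0)   # assumed jaw bone pivot position
lip_ctrl  = (4.0, 5.0)   # bottom-lip controller, offset set by the mouth controller
open_jaw  = rotate_about_pivot(lip_ctrl, jaw_pivot, -30.0)
```

The point of the orient constraint is that only the rotation propagates; a parent constraint would also have fed the jaw's translation into the locator, fighting the point constraint.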

Face08

Now that the first hand in has been and gone, I’ve been finishing off the rigging of the monkey. There were a few bones in the face that I had animated for the deform test that weren’t actually rigged. Firstly, I created some curves to control the ears and the two tufts of hair on the monkey’s head.

One of the most important things I still needed to create was some eye controllers. I wanted each eye to be controllable individually, but I also wanted the animator to be able to easily control them together. As such I decided to create two text curves (R and L) and aligned them with the eyes. I then used an aim constraint so that the eyes point at the controllers at all times. I created a third controller that would move both R and L together.
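Roughly speaking (plain Python, not Maya's aimConstraint), what the aim constraint computes each frame is a unit vector from the eye pivot towards the controller, which the eye's forward axis is rotated to match. The positions below are invented for illustration:

```python
import math

# Sketch of the aim direction an aim constraint would resolve:
# a normalised vector from the eye pivot to the target controller.
def aim_direction(eye_pos, target_pos):
    vec = [t - e for e, t in zip(eye_pos, target_pos)]
    length = math.sqrt(sum(c * c for c in vec))
    return [c / length for c in vec]

# hypothetical positions: eye at the origin, "L" text curve in front and to the right
direction = aim_direction((0.0, 0.0, 0.0), (2.0, 0.0, 2.0))
```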

Eyes01

I also wanted the animator to be able to control the size of the pupil, as I have always found this to be a particularly effective tool for portraying emotion when animating a "toony" character. Unsure whether the animator would want to control the pupil size from the individual eye controllers or from the dual controller, I decided to give them the ability to do it from either. I added an attribute to R and L for pupil size, and two attributes to the dual controller. I then wired these attributes to two plus/minus nodes, set to addition, and as soon as I have the eye textures I will wire the nodes to the scale of the texture.
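The plus/minus wiring amounts to a simple sum, which can be sketched in plain Python (the attribute names and values here are assumptions, not the actual rig's):

```python
# Mimics a plusMinusAverage node set to 'Sum': the final pupil scale is
# the base scale plus the per-eye attribute plus the dual-controller attribute.
def pupil_scale(base_scale, per_eye_attr, dual_attr):
    return base_scale + per_eye_attr + dual_attr

# hypothetical keyframe: animator widens the left pupil slightly on its own
# controller, then shrinks both pupils via the dual controller
left = pupil_scale(1.0, 0.2, -0.5)
```

Because the node just adds, the animator can key either controller (or both) and the contributions combine without fighting each other.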

Eyes02

Rigging the tongue was slightly frustrating as I really struggled to create controllers that could be selected whatever position the tongue was in. Initially I created simple curves that sat above the tongue. However, when the tongue was rotated upwards, the top curve disappeared into the back of the mouth.

tongue01 tongue02

As such, I decided to create some loops that went around the tongue and combined these with the curves above it.

tongue03

April 25, 2013

Yesterday was the deadline for our first hand-in. It's a chance to collate everything we have done so far and hand it in to the tutors to get some feedback before the actual deadline. Obviously nothing is finished, because we still have three weeks of work left to do, but it's nice to be able to show the progress so far.

When I hand in the final versions of these I will be adding text to explain the rig demos and make them a bit clearer.

[Videos: three work-in-progress rig demos]
April 24, 2013

Whilst I was skinning I discovered yet another mistake with my leg (and arm) rigging. When constraining the three tibia joints so that they would always stay the right distance from the knee and ankle, I used a parent constraint. I had thought this would mean they would twist nicely when the foot was rotated. What I had not thought about was that this would of course affect their rotation in all three axes, and not just the one twist axis I wanted. When I started working on the leg and looking at the movement during deform tests, I realised my mistake.

LegTwistError

The joints were rotating, and so rotating the leg mesh, around an axis that the leg shouldn't be able to rotate on. I had to delete all the tibia and femur constraints (including the parenting to the deform skeletons) on the IK chains. I also had to delete all the skinning I had done so far, as the deform joints were also out of position.

I used the FK joints to reorient them correctly and then used a point constraint, rather than a parent constraint, to control their positions relative to the knee and ankle. I then wired up the ankle and wrist controllers on the twist axis to drive the limb's rotation. I simply wired up each of the tibia/femur joints to have 0.25 of the twist rotation of the ankle/wrist. Combined with the cumulative rotation of the parent chain, this caused the three joints to rotate 0.25, 0.5 and 0.75 of the ankle/wrist rotation.
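The accumulation described above can be sketched in plain Python (the 80-degree twist is an invented example value): each joint carries 0.25 of the controller's twist, and because each joint inherits its parent's rotation, the effective twist grows down the chain.

```python
# Sketch of the twist distribution: each tibia joint is wired to 0.25 of
# the ankle's twist, and the parent chain accumulates it, so joint n ends
# up with n * 0.25 of the full twist.
def effective_twist(controller_twist, joint_index, per_joint_fraction=0.25):
    """Cumulative twist on the nth joint in the chain (1-based)."""
    return controller_twist * per_joint_fraction * joint_index

# hypothetical 80-degree ankle twist spread across the three tibia joints
twists = [effective_twist(80.0, i) for i in (1, 2, 3)]
```

The result is a gradual roll along the limb instead of all the twist snapping in at the ankle.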

Then I just had to parent constrain the deform skeleton once more, wire up the various switches and I could start skinning again.

Suffice it to say, lesson learnt. I will be thinking much more carefully about which constraints I use in the future.

April 23, 2013

When first starting the skinning, I like each vertex to either be 100% influenced by a joint, or not influenced at all.

Skinning01

I can define areas like the pelvis and the rib cage very clearly, and then separate the vertices in between and give influence to the joints in the spine. This results in deformations that have very sharp edges. However, it provides a very quick way to establish which areas of the body should be moving with which joints. It also removes any unwanted stray influences from joints that are nowhere near an individual vertex.
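As a rough illustration (plain Python, 1D positions, nothing Maya-specific), this rigid first pass amounts to giving every vertex 100% weight on its nearest joint and zero everywhere else:

```python
# Toy version of the rigid pass: each vertex gets full weight on its
# nearest joint. Positions are 1D for brevity; real skinning uses 3D
# distances (and usually joint volumes rather than raw distance).
def rigid_weights(vertex_positions, joint_positions):
    weights = []
    for v in vertex_positions:
        nearest = min(range(len(joint_positions)),
                      key=lambda j: abs(v - joint_positions[j]))
        row = [0.0] * len(joint_positions)
        row[nearest] = 1.0
        weights.append(row)
    return weights

# hypothetical spine: three vertices between two joints at 0 and 3
w = rigid_weights([0.0, 1.2, 2.9], [0.0, 3.0])
```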

Skinning02

Once that is complete, I can start to smooth the influence so that multiple joints can be affecting any one vertex. The smooth tool is particularly effective for this part of the process, though it does occasionally cause an unwanted joint to gain some influence over an area of the mesh that you don't want it to. However, by swapping between joints and smoothing gradually, the deformations become much cleaner.
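The smoothing stage can be sketched the same way: each vertex's weights are averaged with its neighbours' and renormalised so every row still sums to one, which is roughly what repeated passes of a smooth tool converge towards (again a toy model, not Maya's actual algorithm):

```python
# Toy smoothing pass over a 1D strip of vertices: blend each vertex's
# weights with its immediate neighbours', then renormalise to sum to 1.
def smooth_weights(weights):
    smoothed = []
    for i in range(len(weights)):
        neighbours = [weights[j] for j in (i - 1, i, i + 1)
                      if 0 <= j < len(weights)]
        avg = [sum(col) / len(neighbours) for col in zip(*neighbours)]
        total = sum(avg)
        smoothed.append([w / total for w in avg])
    return smoothed

# starting from a rigid bind: sharp boundary between two joints
rigid = [[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]]
soft = smooth_weights(rigid)
```

After one pass the vertices at the boundary pick up a 2/3–1/3 split, softening the sharp crease the rigid bind left behind.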

Skinning03