For this final project, we were challenged to work with not only a moving camera, but an animated asset that we did not create. This was a lot more challenging than I thought it would be! I’m confident in my camera tracking abilities, but I struggled a lot with finding an asset that fit the plates that I shot.
I wanted to take more advantage of our location here in Savannah with my filming, so I went out to one of my favorite places: Bonaventure Cemetery. Don't worry, it's strictly historical now! I'm a lot of things, but disrespectful to the dead is not one of them. I hit it right at golden hour with some beautiful shadows, and I gotta say, I'm pretty happy with how these plates turned out! It bugged me pretty severely that my plates were so bland in the last projects.
Here’s a still image I shot of my location.
I also took my own 360-camera HDRs on location! There are a few tourists in it, but that's more funny than anything else. (This is also just one exposure of my HDR.)
The next step was to choose an animation! Well, technically we picked them out before shooting, but I ran into some issues I'll discuss in a second. Initially, I used motion capture data I obtained from my classmate, Austin Wright. His work can be found at https://acdafx.wixsite.com/austinwrightvfx. While there was nothing wrong with the data itself (nice, clean UVs that don't jitter, and good mocap data), the geometry it was designed for was just way too dissimilar to the object in my scene. The model needed a flat surface to lay his hand on, since he's intended to be sitting on stairs, but the plate just doesn't have one that won't induce major clipping.
Here’s a few screencaps to show you what I mean.
So yeah, that's a problem. Quite a few of my classmates used free animations from mixamo.com for this project, so that was where I figured I would have the most luck. I ran into similar problems with these animations as well: lots and lots of clipping, as well as action that simply didn't match the plate.
For instance, I had a draft in which the model crawls through the scene, but the motion was just far too quick and far too horizontal for this very diagonally shot plate. The animation I settled on was one of the model standing up, which fit the line of action of the plate perfectly. Granted, I did have to reverse the footage, but that's our secret! He still clips awfully in the hands and arms, but that's a problem for another day, and sometimes as compositors we just get bad assets. C'est la vie!
Another thing worth mentioning for this project, since I spent a day on it, is that I muddled around in Substance Painter for these textures! I am by no means a texture artist or look developer/lighter: I'm familiar with Mari, the process of UVing, and how to paint textures, but there's a reason I chose to focus on compositing. I'm not strong in texturing, and I find it uninteresting in the first place. Long story short, I cobbled together some aged copper textures, and I'm pretty alright with them for a first-timer in Substance! I did get feedback that it should have more blue tones in the oxidization, though, and I agree.
Here’s the diffuse map I painted and a little render of my guy.
So, let's dig into the Nuke aspects of this project! This was a pretty simple plate to track, although I ended up deleting way too many markers on the foreground, and that made the end result look a little floaty. That's definitely something that's bugging me a lot about this project, and the very first thing I want to fix about it.
It's got an error level of about 0.5, but the data loss to get there was just a really bad decision on my part. Here's what it looks like.
I also really wish I had more time to refine that shadow; it was challenging and, quite frankly, looks like garbage. There's a big hot spot right in the middle of the space between the long headstone shadow and the shadow of the leaves, which made creating a shadow plate incredibly difficult. The saturation would be fine in one area but off in another, or the shadow detail would be perfect in one area and far too heavy in another. A roto would fix this pretty easily, I think.
I did roto out the shadow pass in the shadowed areas to avoid the dreaded double shadow, and I think that worked pretty effectively! It’s just a garbage matte, so it’s making me itch that all the grass detail is lost, but time constraints definitely came into play with that.
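The math behind that double-shadow fix is simple enough to sketch in plain Python. This is just the per-pixel arithmetic of the idea, not my actual node graph, and the function name and values are made up for illustration:

```python
# Sketch of avoiding a "double shadow": the CG shadow pass is held out
# wherever the plate is already shadowed, so the two shadows don't stack.
# Hypothetical per-pixel math with illustrative values, not the real comp.

def apply_cg_shadow(plate, shadow_density, existing_shadow_matte):
    """plate: plate luminance (0..1)
    shadow_density: CG shadow pass alpha (0 = no shadow, 1 = full shadow)
    existing_shadow_matte: roto, 1 where the plate is already in shadow
    """
    # Hold the CG shadow out inside the existing plate shadow
    held_out = shadow_density * (1.0 - existing_shadow_matte)
    # Darken the plate by whatever shadow remains (a multiply-style merge)
    return plate * (1.0 - held_out)

# Sunlit grass: the CG shadow darkens it
print(apply_cg_shadow(0.8, 0.5, 0.0))  # 0.4
# Already-shadowed grass: the roto holds the CG shadow out, plate unchanged
print(apply_cg_shadow(0.3, 0.5, 1.0))  # 0.3
```

Even as a garbage matte, the roto only has to be accurate where the two shadows would overlap, which is why it worked well enough under the time crunch.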
I also used an edge noise gizmo to try and break up the edges of my perfectly symmetrical and flat shadow, but given time, I think I would have preferred to put a displacement on my ground plane proxy and rendered the shadow out with irregularities in the first place. Nuke is wonderful, but she just doesn’t have the z data to support doing something like that unless I rendered deep data. Even then, not sure a gizmo would support that.
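I can't speak for that gizmo's internals, but the general idea of edge break-up can be sketched like this: jitter the matte only in its soft edge band, leaving the solid core and the empty outside untouched. The function and numbers here are hypothetical:

```python
# Sketch of edge break-up: perturb a matte's alpha with noise, but only
# where the edge is soft (alpha near 0.5), never in the core or outside.
# Hypothetical math, not the actual gizmo.

def break_up_edge(alpha, noise, amount=0.3):
    """alpha: matte value (0..1); noise: noise sample (0..1);
    amount: how strongly to jitter the edge."""
    # Edge band weight: 0 at alpha=0 or 1, peaks at 1 when alpha=0.5
    edge = alpha * (1.0 - alpha) * 4.0
    # Center the noise around zero and apply it inside the edge band only
    jittered = alpha + (noise - 0.5) * amount * edge
    # Clamp back to a legal matte range
    return min(1.0, max(0.0, jittered))

print(break_up_edge(0.0, 0.9))  # 0.0 (outside the matte: untouched)
print(break_up_edge(1.0, 0.1))  # 1.0 (matte core: untouched)
```

A displacement on the ground proxy would still be the better fix, since the irregularity would then come from real geometry rather than a 2D cheat.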
Here's a couple screencaps of my node tree as well! Same as last project, the AOV splitter is a tool developed by my classmate, Sasha Ouellet, whose work can be found at http://www.sashaouellet.com/. (And who has also released a MUCH more refined version since then, but I haven't had the time to download it yet.) I also made the occlusion node setup into a toolset, because I always forget how the setup works. Don't get me wrong, I understand the merge math there, but I do want to save myself the time and just load it in.
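For anyone curious about that merge math: a common way to apply an occlusion pass (I can't vouch for this exact toolset) is a multiply merge with a mix control, where the AO is blended toward white before it darkens the beauty. A minimal sketch, with illustrative values:

```python
# Sketch of a typical occlusion merge: blend the AO pass toward 1.0 by
# (1 - mix), then multiply it into the beauty. Hypothetical function,
# not necessarily how this particular toolset is wired.

def merge_occlusion(beauty, ao, mix=1.0):
    """beauty: rendered beauty value (0..1)
    ao: ambient occlusion pass value (0 = fully occluded, 1 = open)
    mix: 0 disables the AO entirely, 1 applies it at full strength."""
    weighted_ao = ao * mix + (1.0 - mix)  # lerp from 1.0 toward the AO
    return beauty * weighted_ao

print(merge_occlusion(0.6, 0.5, 1.0))  # 0.3 (full-strength AO)
print(merge_occlusion(0.6, 0.5, 0.0))  # 0.6 (AO disabled)
```

Saving it as a toolset makes sense precisely because the setup is all in the wiring: the math is trivial, but rebuilding the node arrangement each time isn't.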