When I decided to make Banished, one of my main requirements was a definite ‘NO ANIMATION’! Well, at least as little animation as possible. The idea of modeling, rigging, keyframing, and adding all the things required to have animations didn’t seem very feasible for a graphics programmer turned indie developer. So for the first half of Banished, everything was buildings, terrain, and a box for characters that hopped around.
Eventually there came a day when I added chickens. I modeled one, textured it, rigged it, and spent probably more than a few days figuring out how to animate it. After that it wasn’t so bad. I built some simple humans and made it work. But rigging models, setting up IK, setting keyframes, and trying to make nice animation wasn’t my favorite thing. With time I’m sure I could have gotten better at it, but with so much to do building the game myself, it wasn’t a priority.
For my current game, I’m going to need a lot more animation. I still don’t want to keyframe everything. Potentially twice, too, since this time I have different male and female skeletons. There are a lot of animations I want. A walk cycle holding each and every tool. Different animations for every tool and thing the villagers can do. Running. Sleeping. Ah! Maybe hire an animator?
But instead, I decided to buy some toys (er, tools). I purchased them so I can build a bigger game than before without a large increase in workload, and they’re helping me make up for the skills I don’t have.
So. I’ve been wanting one for a few years, and I am now the happy owner of an inertial motion capture suit. I put it on, it records my movements, and then I have animations. Yay!
But not all is easy in motion capture land. I’ve spent the last week or so building a pipeline that gets my animations into the game fast. I believe that traditionally, an animator would bring motion capture into another tool, modify it slightly to make it work in game, export that, and then bring it into a game engine. I want to just record and get that data into the game without a whole lot of extra work. Luckily, the motion data from the suit is very good. I don’t have joints doing funny things very often. So there’s not a lot of cleanup, if any at all.
The first technical thing to consider is that the animation is being re-targeted from my body structure to whatever skeleton I’ve built for my characters.
Writing the re-targeting code wasn’t as bad as I feared. I export a version of my characters’ skeleton in a T-pose that matches the basic configuration of the motion capture skeleton. This doesn’t have to be the same pose the character is modeled in. The character could be in A-pose or N-pose or whatever; I just need to rotate the bones into a matching T-pose and export that. Any transform differences are accounted for, so I can map the rotation values from skeleton to skeleton. Then, per bone, I can pick which skeleton I want offset and rotation data from. I also had to write some code to move the hips’ forward movement onto the root node, which the engine needs, and to reorient the entire model depending on which way it was facing in the motion capture data.
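As a rough sketch of the core idea (the function names are hypothetical, and quaternions are plain (w, x, y, z) tuples rather than whatever the engine’s math library actually uses): once both skeletons share a matching T-pose, each source bone’s per-frame rotation can be expressed as a delta from its own T-pose, and that delta applied on top of the target bone’s T-pose.

```python
def q_mul(a, b):
    # Hamilton product of two (w, x, y, z) quaternions.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_conj(q):
    # Conjugate; for unit quaternions this is the inverse.
    w, x, y, z = q
    return (w, -x, -y, -z)

def retarget_bone(src_tpose, src_frame, dst_tpose):
    # How far has the source bone rotated away from its own T-pose?
    delta = q_mul(q_conj(src_tpose), src_frame)
    # Apply that same delta on top of the target bone's T-pose.
    return q_mul(dst_tpose, delta)
```

If the source bone hasn’t moved from its T-pose, the target bone stays in its T-pose, which is exactly the property that makes the matched T-pose export work.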
So once I mapped the motion data onto my skeletons, there are still a few issues. The motion is good and follows how I moved, but because of differences in arm length, leg length, hip width, and shoulder width, the hands and feet don’t end up exactly where I expect. In many cases this doesn’t matter, but when it does, an animator would traditionally tweak the captured motion. Instead, I’m going to use two-bone inverse kinematics (IK) to place hands where they need to be when it matters. This also helps when interacting with objects: the same animation can be used to pick up different-sized objects, and the IK will put the hands in the right place. Feet sliding is taken care of by scaling the speed of forward movement based on the ratio between the hip height in my character’s skeleton and the hip height in the motion data.
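A two-bone IK solve has a closed-form answer, which is why it’s cheap enough to run on every character. A minimal sketch, assuming a planar solve via the law of cosines (names are mine, not the engine’s), plus the foot-slide speed ratio as a one-liner:

```python
import math

def two_bone_ik(l1, l2, target_dist):
    """Given upper and lower bone lengths and the distance from the
    root joint to the IK target, return (root_angle, mid_angle):
    the upper bone's angle off the root-to-target line, and the
    interior angle at the middle joint (elbow/knee)."""
    # Clamp to what the chain can reach: fully extended (l1 + l2)
    # down to fully folded (|l1 - l2|).
    d = max(abs(l1 - l2), min(l1 + l2, target_dist))
    # Interior angle at the middle joint, law of cosines.
    cos_mid = (l1 * l1 + l2 * l2 - d * d) / (2.0 * l1 * l2)
    mid = math.acos(max(-1.0, min(1.0, cos_mid)))
    # Angle of the upper bone away from the root-to-target line.
    cos_root = (l1 * l1 + d * d - l2 * l2) / (2.0 * l1 * d)
    root = math.acos(max(-1.0, min(1.0, cos_root)))
    return root, mid

def forward_speed_scale(character_hip_height, capture_hip_height):
    # Foot-slide fix: scale the root's forward speed by how much
    # taller or shorter the character's hips are than the capture's.
    return character_hip_height / capture_hip_height
```

A full solution also needs a pole vector to pick which way the elbow or knee bends, but the reachability clamp above is what keeps joints from popping when a target drifts out of range.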
The last issue is looping animations. In a walk cycle, the end pose needs to match the start pose. But it’s really unlikely I’ll actually match 100% by recording data. So I blend the beginning and end of the animation with the data on the other side of the loop. It’s close anyway, and by blending a few frames together the loop looks perfect.
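Since the recording keeps going past the chosen loop point, the frames just after the loop end are the natural continuation of the last frame, so the seam can be hidden by crossfading them into the start. A sketch on a single scalar channel (hypothetical helper; the real data is per-bone rotations, which would be blended with slerp rather than lerp):

```python
def make_loop(samples, loop_len, blend):
    """Take the first loop_len frames of a longer recording as the loop,
    then crossfade the first `blend` frames with the recording's natural
    continuation past the loop end, so frame 0 flows smoothly out of
    frame loop_len - 1. Requires len(samples) >= loop_len + blend."""
    out = list(samples[:loop_len])
    for i in range(blend):
        w = i / blend  # 0 at the seam (all continuation), ramping back to the original data
        out[i] = (1.0 - w) * samples[loop_len + i] + w * samples[i]
    return out
```

At the seam itself the loop uses the continuation data exactly, so the wrap-around is perfectly smooth, and within a few frames it has blended back to the recorded motion.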
So after all that work was done, and lots of testing and tweaking, I can now build animations fast. I put on the suit, record the motion I want, then scrub through the recorded data to find what I need and export it. My engine can read that exported data, and a simple text file configuration tells the engine how to compile the animation for a character.
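Purely for illustration (this format is entirely made up, not the actual file my engine reads), such a configuration might look something like:

```
animation "walk_axe"
{
    source      "capture/walking.fbx"
    frames      120 180      // frame range to extract from the recording
    loop        true         // blend the seam so start and end match
    skeleton    "male"       // which character skeleton to retarget onto
}
```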
So what does my recorded looping walk cycle look like? Not bad!
I think this is only the second time I’ve modeled a human. I’m getting better at modeling, and rigging for motion capture was certainly an education. (Does anyone enjoy applying skinning weights to vertices?) The models are a work in progress; I built them as fast as possible to test the mocap, and they will likely be edited and change as time goes on. I hope to add hair options, different clothes, skin tones, etc. But for now these models will move the game development forward.
Another toy that deals with motion capture is a pair of motion capture gloves for recording hand movement. Currently my models have mitten hands, but I want the gloves for recording hand positions while holding tools, or having palms facing out while the thumb does other things. This way I don’t have to go into each animation and decide where the models will be gripping something versus having relaxed hands.
One cool thing about having the full hand motion capture data is that if in the future I wanted to add the index finger, I’d just have to modify the models and re-import the animation data. No other work needed.
Once I got it working (which took most of a week), I made a bunch of test animations and got them into the game engine in around 40 minutes. That includes putting on the suit and setting up, recording motion, exporting to FBX, and configuring import frame ranges.
(Why is my shadow resolution so low in this screenshot? I was looking at the characters and didn’t notice until after I made the gif. Hmm…)
I think I’m still going to need to traditionally animate animals, and I might hire someone for that. But I also now have an animation re-targeting pipeline, so similar animals could potentially share animations, reducing the work required. Or at least I could get new animals in and working while animal-specific animations get built.
The last toy (er, tool) I bought is a drawing pad with a built-in screen, so I can use a pen to paint textures and models directly on the display. Previously I did everything with the mouse, and while that’s fine and I have a workflow for it, being able to use a pen directly on a model is pretty nice. Plus maybe I’ll get better at drawing too. I don’t have any animated gifs to show for that though…