
The ideal workflow: motion capture as the solution for game character animation


The traditional workflow for character animation, from 3D graphics software to Unity, usually goes something like this:

1. Bake all keyframes locally first;

2. Export them together with the character's model as an FBX file;

3. Import the motion data in Unity's import settings by specifying the first and last frame of each motion (e.g., frames 1-100 for the idle pose, frames 101-500 for the walk cycle, etc.);

4. The programmer can then call/play these "durations" (clips) for each character whenever necessary in the game scene, as sketched right after this list.
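For context, step 4 on the Unity side might look something like the minimal sketch below. It assumes the frame ranges from step 3 were sliced into clips named "Idle" and "Walk" and wired into an Animator Controller with states of the same names; those names are placeholders for illustration only.

```csharp
using UnityEngine;

// Hypothetical driver for clips sliced from an imported FBX.
// "Idle" and "Walk" are assumed state names matching the frame
// ranges defined in the FBX import settings.
public class CharacterAnimationDriver : MonoBehaviour
{
    Animator animator;
    bool isWalking;

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Switch between the baked clips whenever the game logic needs to.
        bool walking = Input.GetAxis("Vertical") > 0.1f;
        if (walking != isWalking)
        {
            isWalking = walking;
            animator.Play(isWalking ? "Walk" : "Idle");
        }
    }
}
```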

A file produced this way is usually large (because so much data is embedded) and hard to check or modify, even for its author: all the frames have been "baked", and there is often no way back if no version control is in place or the 3D artist simply forgot to keep a local copy of the original animation file.

So I started to wonder whether it is possible to permanently separate the motion data from the 3D polygon data (the model). Unity 5 introduced the Animator and Animation windows (Mecanim and its state machine), which make the data flow much easier to read and manage.
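As a rough illustration of that programmer-side flow: with Mecanim, the gameplay script no longer plays clips directly, it only writes parameters, and the Animator Controller's state machine decides which motion to blend to. The parameter names "Speed" and "Jump" below are assumptions for the sketch, not part of any particular project.

```csharp
using UnityEngine;

// Sketch of driving a Mecanim state machine through parameters only.
// The Animator Controller (not this script) owns the transitions
// between idle, walk, and jump states.
public class MecanimLocomotion : MonoBehaviour
{
    Animator animator;

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Feed movement input into an assumed "Speed" float parameter.
        float forward = Input.GetAxis("Vertical");
        animator.SetFloat("Speed", Mathf.Abs(forward));

        // Fire an assumed "Jump" trigger; the state machine handles the rest.
        if (Input.GetButtonDown("Jump"))
            animator.SetTrigger("Jump");
    }
}
```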

But that is the programmers' side. What about the artists' side?

My ideal plan for the entire workflow looks something like this:

The most critical point in the chart above is how to edit mocap data in Unity. The whole plan is useless if we cannot modify those keys, because there is no such thing as perfect raw motion-capture data.

The release of Unity 5.5 finally solved this problem with major updates to its Animation window and keyframe editing tools, making it possible to retouch animation data right inside the game engine:

These editing features are definitely lifesavers, especially for Maya animators, because they work almost the same way as Maya's Graph Editor. :P
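For bulk retouching, the same keys can also be adjusted from an editor script through Unity's AnimationUtility API instead of by hand in the Animation window. The following is only a rough sketch of that idea; the asset path, menu item, and offset value are made up for illustration, and it only works on a writable .anim clip, not on a read-only clip inside an FBX.

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only sketch (must live in an Editor folder) that nudges every key
// of every curve in a clip, the kind of cleanup pass noisy mocap often needs.
public static class MocapCurveRetouch
{
    [MenuItem("Tools/Offset Clip Keys")]
    static void OffsetKeys()
    {
        // Hypothetical asset path for the duplicated, writable clip.
        var clip = AssetDatabase.LoadAssetAtPath<AnimationClip>(
            "Assets/Animations/WalkCycle.anim");
        if (clip == null) return;

        foreach (var binding in AnimationUtility.GetCurveBindings(clip))
        {
            AnimationCurve curve = AnimationUtility.GetEditorCurve(clip, binding);
            Keyframe[] keys = curve.keys;

            // Example retouch: shift every key's value by a small, arbitrary amount.
            for (int i = 0; i < keys.Length; i++)
                keys[i].value += 0.01f;

            curve.keys = keys;
            AnimationUtility.SetEditorCurve(clip, binding, curve);
        }

        EditorUtility.SetDirty(clip);
    }
}
```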

Thanks to this update, the ideal workflow chart above becomes possible.
