Sketching Articulation and Pose for Facial Animation
We present a methodology for articulating and posing meshes, in particular facial meshes, through a 2D sketching interface. Our method establishes an interface between 3D meshes and 2D sketching through the inference of reference and target curves. Reference curves allow the user to select features on a mesh and manipulate them to match a target curve. Our articulation system uses these curves to specify the deformations of a character rig, forming a coordinate space of mesh poses. Given such a coordinate space, our posing system uses reference and target curves to find the optimal pose of the mesh with respect to the sketch input. We present results demonstrating the efficacy of our method for mesh articulation, for mesh posing with articulations generated in both Maya and our sketch-based system, and for mesh animation using human features from video. Through our method, we aim to provide both articulation and posing interfaces accessible to novices and rapid prototyping of complex deformations for more experienced users.
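The posing step described above, fitting a pose in a coordinate space of mesh poses so that a reference curve matches a sketched target curve, can be illustrated with a simplified sketch. The code below is a hypothetical linear (blendshape-style) stand-in, not the paper's actual optimization: it assumes each articulation basis displaces the sampled reference-curve points, and solves for the blend weights that best match the target curve in the least-squares sense. The function name `fit_pose_weights` and the array shapes are illustrative assumptions.

```python
import numpy as np

def fit_pose_weights(ref_curve, basis_deltas, target_curve):
    """Solve for blend weights w minimizing ||ref + sum_i w_i * d_i - target||.

    ref_curve:    (n, 2) sampled reference curve in sketch space
    basis_deltas: (k, n, 2) displacement of the curve samples for each
                  of the k articulation bases (a linear simplification)
    target_curve: (n, 2) sampled target curve from the user's sketch
    Returns the (k,) least-squares blend weights.
    """
    # Flatten each basis displacement into a column of the system matrix.
    A = basis_deltas.reshape(len(basis_deltas), -1).T   # (2n, k)
    b = (target_curve - ref_curve).reshape(-1)          # (2n,)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w
```

In this simplification the pose space is linear, so the fit reduces to one least-squares solve; a nonlinear rig would instead require iterative optimization over the pose coordinates.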
E. Chang and O. C. Jenkins. Sketching articulation and pose for facial animation. In ACM SIGGRAPH/Eurographics Symposium on Computer Animation (SCA 2006), Vienna, Austria, September 2006.
[ bib | .pdf ]
E. Chang and O. C. Jenkins. Sketching articulation and pose for facial animation. In Z. Deng and U. Neumann, editors, Data-Driven 3D Facial Animation, chapter 8, pages 132-144. Springer, 2007.
[ bib | http ]
Sketching mesh poses from user sketches and video footage (8.5M)