A Data-Driven Approach to Quantifying Natural Human Motion
Most artists and animators evaluate the quality of motion by visual inspection, which is time consuming. We present a tool that evaluates animation quality (here, the naturalness of human motion) automatically. The tool uses a large motion capture database to develop a statistical definition of what constitutes natural human motion. Because there is no precise definition of naturalness for human motion, we assume it can be characterized by a large (4-hour) motion capture database. Given a motion, the tool can also automatically pinpoint the segments that appear unnatural. It may prove useful in verifying that a motion editing operation has not destroyed the naturalness of a motion capture clip, or that a synthetic motion transition lies within the space of those seen in natural human motion. The key algorithm is an ensemble of statistical models for individual joints, limbs, and the whole body. We build the models with existing machine learning techniques: mixture of Gaussians (MoG), hidden Markov models (HMM), and switching linear dynamic systems (SLDS). We also implement a Naive Bayes (NB) model as a baseline for comparison. We test these techniques on motion capture data held out from the database, keyframed motions, edited motions, motions with added noise, and synthetic motion transitions. We present the results as receiver operating characteristic (ROC) curves and compare them to the judgments made by subjects in a user study.
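The ensemble idea above can be illustrated with a deliberately simplified sketch: fit a probabilistic model to "natural" training poses for each joint, then score a new pose by summing per-joint log-likelihoods, so a low score flags unnatural motion. This toy uses a single Gaussian per joint (a one-component degenerate case of the paper's mixture-of-Gaussians models) and hypothetical joint-angle data; it is not the paper's implementation.

```python
import math

def fit_gaussian(samples):
    # Maximum-likelihood mean and variance for one joint's angle,
    # estimated from "natural" training poses.
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean, max(var, 1e-6)  # floor the variance for numerical stability

def log_likelihood(x, mean, var):
    # Log-density of a univariate Gaussian at x.
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def ensemble_score(pose, models):
    # Ensemble naturalness score: sum of per-joint log-likelihoods.
    # Higher means the pose looks more like the training data.
    return sum(log_likelihood(x, m, v) for x, (m, v) in zip(pose, models))

# Hypothetical "natural" training poses: two joint angles in radians.
training = [[0.10, 1.50], [0.12, 1.48], [0.09, 1.52], [0.11, 1.49]]
models = [fit_gaussian([p[j] for p in training]) for j in range(2)]

natural_score = ensemble_score([0.11, 1.50], models)
unnatural_score = ensemble_score([0.90, 0.20], models)
print(natural_score > unnatural_score)  # the in-distribution pose scores higher
```

Sweeping a threshold over such scores on labeled natural/unnatural test motions is what produces the ROC curves reported in the evaluation.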