Realtime style transfer for unlabeled heterogeneous human motion
Shihong Xia | Congyi Wang | Jinxiang Chai | Jessica Hodgins
ACM Transactions on Graphics (August 2015)
This paper presents a novel solution for realtime generation of stylistic human motion that automatically transforms unlabeled, heterogeneous motion data into new styles. The key idea of our approach is an online learning algorithm that automatically constructs a series of local mixtures of autoregressive models (MAR) to capture the complex relationships between styles of motion. We construct local MAR models on the fly by searching for the closest examples of each input pose in the database. Once the model parameters are estimated from the training data, the model adapts the current pose with simple linear transformations. In addition, we introduce an efficient local regression model to predict the timings of synthesized poses in the output style. We demonstrate the power of our approach by transferring stylistic human motion for a wide variety of actions, including walking, running, punching, kicking, jumping and transitions between those behaviors. Our method achieves superior performance in a comparison against alternative methods. We have also performed experiments to evaluate the generalization ability of our data-driven model as well as the key components of our system.
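As a rough illustration of the approach the abstract describes, the sketch below shows one step of on-the-fly local model fitting in Python: find the database examples closest to the current input pose, estimate a local autoregressive map by least squares, and apply it as a linear transformation. All names and arrays here are hypothetical, and the single linear AR model is a deliberate simplification of the paper's local mixtures of autoregressive models (it omits the mixture weights and the timing regression).

```python
import numpy as np

def stylize_pose(x_t, y_prev, db_x, db_y_prev, db_y, k=20):
    """One step of a hypothetical local autoregressive style transfer.

    x_t       : current input pose, shape (d,)
    y_prev    : previously synthesized output pose, shape (d,)
    db_x      : (N, d) database of input-style example poses
    db_y_prev : (N, d) previous-frame output poses for each example
    db_y      : (N, d) corresponding output-style poses

    This fits a single local AR model, y_t = [y_prev, x_t, 1] @ A;
    the paper instead uses a *mixture* of such models.
    """
    # 1. Search for the k database examples whose input pose is
    #    closest to the current input pose.
    nn = np.argsort(np.linalg.norm(db_x - x_t, axis=1))[:k]

    # 2. Estimate local model parameters online by least squares
    #    on the neighborhood (a stand-in for MAR estimation).
    X = np.hstack([db_y_prev[nn], db_x[nn], np.ones((k, 1))])
    A, *_ = np.linalg.lstsq(X, db_y[nn], rcond=None)

    # 3. Adapt the current pose with the learned linear transformation.
    return np.hstack([y_prev, x_t, 1.0]) @ A
```

In a realtime loop, the output of each frame would be fed back in as y_prev for the next frame, so the model stays consistent with the poses it has already synthesized.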
Shihong Xia, Congyi Wang, Jinxiang Chai, Jessica Hodgins (August 2015). Realtime style transfer for unlabeled heterogeneous human motion. ACM Transactions on Graphics, 34(4).
@article{Xia:2015:RST,
author={Xia, Shihong and Wang, Congyi and Chai, Jinxiang and Hodgins, Jessica},
title={Realtime style transfer for unlabeled heterogeneous human motion},
journal={ACM Transactions on Graphics},
volume={34},
number={4},
month=aug,
year={2015},
}