Motion retargetting refers to the process of adapting the motion of a source character to a target character. This paper presents a motion retargetting model based on temporal dilated convolutions that generates realistic motions for various humanoid characters in an unsupervised manner. The retargetted motions not only preserve the high-frequency details of the input motions but also yield natural and stable trajectories despite skeleton size differences between the source and target. Extensive experiments are conducted on a 3D character motion dataset and a motion capture dataset. Both qualitative and quantitative comparisons against prior methods demonstrate the effectiveness and robustness of our method.
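To illustrate the core building block named in the abstract, the sketch below shows a plain 1-D dilated convolution applied along the time axis of a per-joint feature sequence. This is a minimal pedagogical example, not the authors' network: the function name, kernel weights, and the pure-Python formulation are assumptions made here for clarity; the actual model stacks such layers (typically with increasing dilation) to enlarge the temporal receptive field without losing frame-rate resolution.

```python
def dilated_conv1d(seq, kernel, dilation):
    """Valid (no-padding) 1-D convolution over a temporal sequence.

    seq      : list of scalar features, one per animation frame
    kernel   : filter weights (illustrative; learned in a real model)
    dilation : gap between sampled frames; dilation=1 is a standard conv
    """
    k = len(kernel)
    span = (k - 1) * dilation  # temporal extent covered by one filter tap
    out = []
    for t in range(len(seq) - span):
        # Sample every `dilation`-th frame, so the receptive field grows
        # with dilation while the number of weights stays fixed at k.
        out.append(sum(kernel[i] * seq[t + i * dilation] for i in range(k)))
    return out


# A 3-tap filter with dilation 2 spans 5 frames instead of 3.
frames = [1.0, 2.0, 3.0, 4.0, 5.0]
print(dilated_conv1d(frames, [1.0, 1.0, 1.0], 1))  # [6.0, 9.0, 12.0]
print(dilated_conv1d(frames, [1.0, 1.0, 1.0], 2))  # [9.0]
```

Stacking layers with dilations 1, 2, 4, ... gives an exponentially growing receptive field, which is what lets a convolutional retargetting model capture both high-frequency pose detail and longer-range trajectory structure.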
Number of pages: 11
Journal: Computer Graphics Forum
Publication status: Published - 1 May 2020
CCS Concepts
- Computing methodologies → Neural networks
ASJC Scopus subject areas
- Computer Graphics and Computer-Aided Design