Fast garment simulation with aid of hybrid bones
Source journal: Journal of Central South University, 2015, Issue 6
Authors: WU Bo, CHEN Yin, XU Kai, CHENG Zhi-quan, XIONG Yue-shan
Pages: 2218-2226
WU Bo(吴博)1, CHEN Yin(陈寅)1, 2, XU Kai(徐凯)1, CHENG Zhi-quan(程志全)3, XIONG Yue-shan(熊岳山)1
(1. College of Computer, National University of Defense Technology, Changsha 410073, China;
2. Science and Technology on Parallel and Distributed Processing Laboratory
(National University of Defense Technology), Changsha 410073, China;
3. Avatar Science Company, Guangzhou 510001, China)
Abstract: A data-driven method was proposed to realistically animate garments on human poses in a reduced space. First, a gradient-based method was extended to generate motion sequences, and garments were simulated on these sequences to form the training data. Given these examples, the proposed method can quickly produce realistic garments on new poses. The framework is divided into an offline phase and an online phase. In the offline phase, based on linear blend skinning (LBS), rigid bones and flex bones were estimated for the human bodies and the garments, respectively. Then, rigid bone weight maps on the garment vertices were learned from the examples. In the online phase, new human poses were taken as input to estimate rigid bone transformations, and both rigid bones and flex bones were used to drive the garments to fit the new poses. Finally, a novel formulation was proposed to efficiently resolve garment-body penetration. Experiments show that the method is fast and accurate: intersection artifacts are removed quickly, and the final garment results are realistic.
Key words: data-driven; linear blend skinning; hybrid bones; interactive
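The garment-driving step described in the abstract builds on standard linear blend skinning (LBS), in which every vertex is deformed by a weighted combination of bone transformations. The sketch below is a minimal illustration of this generic LBS step only; the bone transforms, weights, and the lbs_deform helper are illustrative placeholders and do not reproduce the paper's rigid/flex bone estimation, learned weight maps, or penetration handling.

```python
# Minimal linear blend skinning (LBS) sketch: each vertex is deformed by a
# weighted sum of per-bone affine transforms. All data here are toy values.
import numpy as np

def lbs_deform(rest_vertices, bone_transforms, weights):
    """Deform rest-pose vertices with LBS.

    rest_vertices:   (V, 3) rest-pose positions
    bone_transforms: (B, 3, 4) per-bone affine transforms [R | t]
    weights:         (V, B) per-vertex bone weights, each row summing to 1
    """
    V = rest_vertices.shape[0]
    # Homogeneous rest-pose coordinates, shape (V, 4)
    rest_h = np.hstack([rest_vertices, np.ones((V, 1))])
    # Apply every bone transform to every vertex: result shape (B, V, 3)
    per_bone = np.einsum('bij,vj->bvi', bone_transforms, rest_h)
    # Blend the per-bone results with the per-vertex weights: shape (V, 3)
    return np.einsum('vb,bvi->vi', weights, per_bone)

# Toy usage: two bones (identity and a small translation), two vertices
rest = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
T = np.stack([np.hstack([np.eye(3), np.zeros((3, 1))]),
              np.hstack([np.eye(3), np.array([[0.0], [0.1], [0.0]])])])
W = np.array([[1.0, 0.0], [0.5, 0.5]])
print(lbs_deform(rest, T, W))
```

In the paper's setting, the rigid bone transforms would come from the input human pose and the weights from the learned weight maps, while flex bones add further garment-specific deformation on top of this blend.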