Moving Animation Forward, One Wardrobe at a Time
Georgia Tech @ SIGGRAPH 2015
Animated characters can mimic human behavior to such a lifelike degree that the real world and the computer-generated one are sometimes hard to tell apart. But there's one trick that digital denizens haven't quite mastered: getting dressed.
They can perform jaw-dropping feats, yes, but putting pants on one leg at a time is another matter.
Research from the Georgia Institute of Technology has produced a tool that allows animators to create realistic motion for virtual humans getting dressed. The new algorithm enables virtual characters to intelligently manipulate simulated cloth to accomplish the task of dressing.
The animation tool can create different dressing styles for different types of garments and fabrics. The research team, which includes Alexander Clegg, Jie Tan, Greg Turk and Karen Liu, shows that the tool can be extended to assistive dressing. It could one day allow robots to efficiently and safely dress people who need assistance.
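To give a flavor of what "intelligently manipulating simulated cloth" can look like in code, here is a minimal illustrative sketch of a sense-plan-act loop that alternates cloth physics steps with character control, guiding a hand toward a garment opening. All names (ClothState, HandState, plan_hand_target and so on) are hypothetical placeholders invented for this sketch; they are not the authors' code, and the paper's actual method is considerably more sophisticated.

```python
# Hypothetical sketch of a dressing control loop: advance the cloth
# simulation, sense where the garment opening is, then steer the
# character's hand toward it. Not the published algorithm.
from dataclasses import dataclass

@dataclass
class ClothState:
    armhole_position: tuple  # where the garment opening currently sits

@dataclass
class HandState:
    position: tuple

def step_cloth(cloth: ClothState, dt: float) -> ClothState:
    # Placeholder for a real cloth physics step (gravity, contact, friction).
    return cloth

def plan_hand_target(cloth: ClothState) -> tuple:
    # "Sense" the simulated garment and aim for its opening.
    return cloth.armhole_position

def move_hand_toward(hand: HandState, target: tuple, dt: float, speed: float = 0.5):
    # Move the hand a bounded step toward the target each frame.
    hx, hy, hz = hand.position
    tx, ty, tz = target
    clamp = lambda d: min(max(d, -speed * dt), speed * dt)
    hand.position = (hx + clamp(tx - hx), hy + clamp(ty - hy), hz + clamp(tz - hz))

def dressing_loop(steps: int = 100, dt: float = 0.01) -> tuple:
    cloth = ClothState(armhole_position=(0.3, 1.2, 0.1))
    hand = HandState(position=(0.0, 0.8, 0.4))
    for _ in range(steps):
        cloth = step_cloth(cloth, dt)      # advance the cloth physics
        target = plan_hand_target(cloth)   # sense: locate the opening
        move_hand_toward(hand, target, dt) # act: guide the limb
    return hand.position

if __name__ == "__main__":
    print("final hand position:", dressing_loop())
```

The key design idea the sketch illustrates is the tight coupling between simulation and control: because the cloth moves in response to the character, the controller must re-sense the garment every frame rather than follow a fixed trajectory.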
The paper, “Animating Human Dressing,” was accepted to the Association for Computing Machinery's Conference and Exhibition on Computer Graphics and Interactive Techniques (SIGGRAPH 2015), held Aug. 9-13 in Los Angeles.
The pilot study published at SIGGRAPH this year was funded by a Georgia Tech seed grant program sponsored by the GVU Center, the Institute for People and Technology and the Institute for Robotics and Intelligent Machines.
Georgia Tech's research presented at SIGGRAPH includes two other papers, also co-authored by Karen Liu, associate professor in the School of Interactive Computing:
Online Control of Simulated Humanoids Using Particle Belief Propagation
Iterative Training of Dynamic Skills Inspired by Human Coaching Techniques
The GVU Center at Georgia Tech will host a reception at SIGGRAPH 2015 for alumni and friends Aug. 12, 6-9 p.m. For more information contact gvu@cc.gatech.edu.
Writer: Joshua Preston