Viz researchers eye methods for automating generation of customized, animated avatars

 

A technique for automating the creation of animated animal avatars to represent individuals in the virtual world is being developed by two Texas A&M Department of Visualization professors with the help of a $500,000 grant from the National Science Foundation.

“Currently, computer users can design the shape, clothing, hairstyle, just about everything regarding the look of their avatars,” said Tim McLaughlin, head of the Department of Visualization and principal investigator on the three-year project. “What they cannot do is have a similar level of personal control over how their avatars move.”

McLaughlin is conducting the study with co-principal investigator Ann McNamara, assistant professor of visualization. “Our intention,” he said, “is to enable avatar users to represent themselves in ways that combine ‘look’ and ‘motion’ to communicate more effectively.”

The problem, McLaughlin said, is that relying on key-frame animation produced by computer animators, or on motion capture performed by actors, limits the expressive range of avatars: it is not economically feasible to have a professional animator or actor create the walk or run of every possible avatar that users can construct.

The project, he said, will provide a way for users to automatically animate novel digital creatures that are capable of movement portraying young or old, predator or prey, and heavy or light characteristics.

“To develop this new way of creating and managing expressive animation, we first require a better understanding of how we perceive motion itself,” said McLaughlin. “We will borrow from biological motion studies to determine the level of detail of motion information required for recognition, as well as use eye tracking technology to determine where that information is found in animal motion.”
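Biological motion studies of this kind commonly use point-light displays, which reduce a moving figure to dots placed at its joints so that only the motion itself carries information. As a rough, hypothetical illustration of that technique (not a tool from this project), the Python sketch below animates a set of synthetic joint trajectories as white dots on a black field; the joint count and the circular placeholder motion are invented for the example.

```python
# Point-light display sketch: render joint trajectories as moving dots,
# stripping away all shape cues except the motion itself.
# The oscillating "motion" below is a synthetic placeholder, not real data.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

n_joints, n_frames = 12, 120
t = np.linspace(0, 2 * np.pi, n_frames)

# Placeholder trajectories: each joint oscillates around a base position.
rng = np.random.default_rng(0)
base = rng.uniform(-1, 1, size=(n_joints, 2))        # resting (x, y) per joint
phase = rng.uniform(0, 2 * np.pi, size=n_joints)     # per-joint phase offset
x = base[:, 0, None] + 0.2 * np.cos(t[None, :] + phase[:, None])
y = base[:, 1, None] + 0.2 * np.sin(t[None, :] + phase[:, None])

fig, ax = plt.subplots()
ax.set_xlim(-1.5, 1.5)
ax.set_ylim(-1.5, 1.5)
ax.set_facecolor("black")
dots, = ax.plot([], [], "wo", markersize=6)

def update(frame):
    dots.set_data(x[:, frame], y[:, frame])
    return dots,

anim = FuncAnimation(fig, update, frames=n_frames, interval=33, blit=True)
plt.show()
```

Displays of this kind are a standard probe in perception research because viewers reliably recognize actions, and even traits like weight or age, from the dots alone, which speaks directly to the “level of detail” question McLaughlin describes.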

He added that project researchers will use linear analysis of animal motion videos to isolate changes to the spatial arrangement of key joints, such as knees, elbows and shoulders, over time.

“If we do this for those joint configurations that appear most important for identification of specific characteristics, such as the weight of an animal, then we should be able to apply those spatial and time-based changes to other animal models and achieve the same or similar perception of weight,” he said.
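One plausible reading of the pipeline McLaughlin outlines is a principal component analysis over stacked joint trajectories, with selected components then added to a new character's motion. The sketch below illustrates that idea; the function names, the data layout, and the assumption that a single component tracks perceived weight are illustrative guesses, not the researchers' published method.

```python
# Sketch of a linear analysis over joint trajectories, in the spirit of the
# approach described above. Assumed data layout: each clip is a time-aligned
# (n_frames, n_joints * 3) array of joint positions.
import numpy as np

def principal_motion_components(motions, n_components=3):
    """PCA over a set of motion clips.

    motions: array of shape (n_clips, n_frames, n_features), where
             n_features = n_joints * 3 (x, y, z per joint).
    Returns the mean motion and the top principal directions, each a
    full spatio-temporal pattern of shape (n_frames, n_features).
    """
    n_clips, n_frames, n_feat = motions.shape
    flat = motions.reshape(n_clips, -1)      # one row per clip
    mean = flat.mean(axis=0)
    # SVD of the centered data; rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(flat - mean, full_matrices=False)
    comps = vt[:n_components].reshape(n_components, n_frames, n_feat)
    return mean.reshape(n_frames, n_feat), comps

def transfer_characteristic(target_motion, component, amount):
    """Shift a target clip along an extracted component, e.g. a
    hypothetical 'heaviness' direction, by a chosen amount."""
    return target_motion + amount * component

# Synthetic placeholder data: 20 clips, 60 frames, 15 joints in 3D.
rng = np.random.default_rng(1)
clips = rng.normal(size=(20, 60, 45))
mean, comps = principal_motion_components(clips)
heavier = transfer_characteristic(clips[0], comps[0], amount=2.0)
```

In practice, transferring a characteristic across differently proportioned animals would require retargeting in joint-angle space rather than naive position arithmetic; the sketch sidesteps that for brevity.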

Success of this project, according to its research abstract, will contribute to the understanding of human perception of biological motion and expand the performance range and emotional impact of synthesized animation. It will also contribute to the evolution of animation tools by extending the use of perception-driven motion descriptions.

The NSF grant also funds yearlong research assistant positions for two graduate students.

 

- Posted: Sept. 24, 2010 -



— the end —

Contact: Phillip Rollfing, prollfing@archone.tamu.edu or 979.458.0442.

 



Tim McLaughlin


Ann McNamara



