Is "Multiplicity" your favorite movie? Have you ever wished you could be in two places at once or create an evil twin (and name him Skippy)? A research project called LifeLike is trying to bring that a little closer to reality.
Project LifeLike is a collaboration between the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago and the Intelligent Systems Laboratory (ISL) at the University of Central Florida that aims to create visualizations of people, or avatars, that are as realistic as possible. While the current results are far from perfect replications of a specific person, the work has advanced the field and opens up a host of possible applications in the not-too-distant future.
The EVL team, headed by Jason Leigh, an associate professor of computer science, is tasked with getting the visual aspects of the avatar just right. On the surface, this seems like a pretty straightforward task--anyone who has played a video game that features characters from movies or professional athletes is used to computer-generated images that look like real people.
A still image of a Project LifeLike avatar conversing with a person. Credit: University of Illinois at Chicago/University of Central Florida
But according to Leigh, it takes more than a good visual rendering to make an avatar truly seem like a human being. "Visual realism is tough," Leigh said in a recent interview. "Research shows that over 70% of communication is non-verbal," he said, carried by subtle gestures, variations in a person's voice and other cues.
To get these non-verbal aspects right, the EVL team has to take precise 3-D measurements of the person that Project LifeLike seeks to copy, capturing the way their face moves and other body language so the program can replicate those fine details later.
The ISL team, headed by electrical engineering professor Avelino Gonzalez, focuses on applying artificial intelligence capabilities to the avatars. This includes technology that lets computers recognize and correctly understand natural language as it is spoken, as well as automated knowledge update and refinement, a process by which the computer 'learns' from the information it receives and applies it independently. The end goal, Gonzalez says, is for a person conversing with the avatar to feel the same comfort and level of interaction they would have with an actual person. He sees the aims of Project LifeLike as fundamental to the field of artificial intelligence.
LifeLike - EVL Lab Sets Out To Simulate Your (Evil) Twin