Virtual reality avatars just got more personal. They can now imitate a computer user’s facial expressions, whether a laugh, a smile or a frown, and interact with other players’ avatars.

Researchers at the University of Southern California outfitted an Oculus Rift 3-D gaming headset with sensors that detect a person’s facial expressions and replicate them in virtual reality in near real time.

The system uses eight strain gauges embedded in the headset’s foam liner to detect muscle movements, along with a 3-D mapping camera mounted on the headset and pointed at the lower half of the user’s face.
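The article does not describe the software side, but in broad strokes the two sensor streams have to be merged into one set of expression parameters that drives the avatar. The sketch below is a minimal illustration of that idea, assuming a simple linear mapping from gauge signals to "blendshape" weights and a per-blendshape maximum to combine the streams; the names, dimensions and fusion rule are all assumptions, not the researchers' actual pipeline.

```python
import numpy as np

# Hypothetical sketch: fuse the two sensor streams described above into one
# set of facial expression parameters ("blendshape" weights) for the avatar.
# The constants, shapes and the linear-mapping approach are assumptions.

NUM_GAUGES = 8          # strain gauges in the foam liner
NUM_BLENDSHAPES = 48    # expression parameters driving the avatar

def fuse_expression(strain_readings, camera_weights, gauge_to_blendshape):
    """Combine strain-gauge signals and depth-camera estimates.

    strain_readings     : (NUM_GAUGES,) raw gauge values, roughly 0..1
    camera_weights      : (NUM_BLENDSHAPES,) weights estimated from the
                          3-D camera watching the lower half of the face
    gauge_to_blendshape : (NUM_BLENDSHAPES, NUM_GAUGES) calibration matrix
                          mapping gauge signals to blendshape weights
    """
    upper_face = gauge_to_blendshape @ strain_readings
    # Take the stronger signal per blendshape, so the camera can drive the
    # mouth and jaw while the gauges drive the region under the headset.
    combined = np.maximum(upper_face, camera_weights)
    return np.clip(combined, 0.0, 1.0)

# Example with random stand-in data
rng = np.random.default_rng(0)
weights = fuse_expression(rng.random(NUM_GAUGES),
                          rng.random(NUM_BLENDSHAPES) * 0.5,
                          rng.random((NUM_BLENDSHAPES, NUM_GAUGES)) / NUM_GAUGES)
print(weights.shape)  # (48,) -> sent to the avatar every frame
```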

The user must perform a short set of training expressions during a one-time calibration before first use. This improves tracking and yields more accurate expression readings, according to project lead Hao Li. Once the person’s face is fully mapped, the expressions the user makes are transferred onto his or her VR avatar.
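The calibration math isn’t spelled out here, but one plausible way to personalize such a system is to fit the gauge-to-blendshape mapping with a least-squares solve over the recorded training expressions. The snippet below sketches that assumed approach; the function name, the sample counts and the stand-in data are all hypothetical.

```python
import numpy as np

# Hypothetical calibration sketch: given gauge readings recorded while the
# user repeats a known set of training expressions, fit a linear map from
# gauge signals to blendshape weights via least squares. This is an assumed
# approach for illustration, not the project's published method.

def calibrate(gauge_samples, target_blendshapes):
    """gauge_samples      : (num_samples, 8) gauge readings per expression
       target_blendshapes : (num_samples, 48) known weights for each
                            training expression (e.g. neutral, smile, frown)
       returns            : (48, 8) gauge-to-blendshape mapping matrix"""
    # Solve gauge_samples @ M.T ~= target_blendshapes for M
    M_t, *_ = np.linalg.lstsq(gauge_samples, target_blendshapes, rcond=None)
    return M_t.T

# Example: 10 training expressions, random stand-in data
rng = np.random.default_rng(1)
samples = rng.random((10, 8))
targets = rng.random((10, 48))
mapping = calibrate(samples, targets)
print(mapping.shape)  # (48, 8), usable by fuse_expression above
```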

Watch the technology unfold in the video below. The headset’s clunky, but the technology opens the door to real-time social interactions in a virtual environment, which could be a real game changer.


“The resulting animations are visually on par with cutting-edge depth sensor-driven facial performance capture systems and hence, are suitable for social interactions in virtual worlds,” Li writes in the caption below his video.

The Oculus Rift headset is scheduled for release in early 2016, though the contraption enabling avatars to imitate users’ facial expressions is a work in progress.
