Thursday, May 21

Oculus Rift hack transfers your facial expressions onto your virtual avatar

When Facebook bought Oculus VR back in March of 2014, many wondered exactly what the social network was going to do with it—let's face it, many of us are still wondering. But there are some interesting bits of tech starting to emerge from the now Facebook-owned Oculus that hint at what the future might hold for the Rift outside gaming. One such piece of tech—a "facial performance" tracking system—adds a vital element of social interaction to VR usage: facial expressions.

Researchers at the University of Southern California (with help from Facebook) have devised a system that tracks a user's facial expressions and translates them onto an avatar in the VR world. It works by using an off-the-shelf Intel RealSense 3D Camera bolted to the front of an Oculus Rift DK2 to capture movements in the lower half of the face. The really clever part, though, is how it captures movements in the upper half of the face, which is hidden by the headset itself.
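The article doesn't include any of the capture code, but grabbing the depth stream that feeds the lower-face tracking is straightforward with Intel's own Python bindings. The sketch below is purely illustrative: it assumes a camera supported by the current librealsense SDK 2.0 (pyrealsense2), and the resolution and frame rate are arbitrary choices, not values from the research.

```python
# Hypothetical sketch: streaming depth frames from a RealSense camera with the
# official pyrealsense2 bindings (SDK 2.0). The research rig used an earlier
# RealSense model; stream settings here are assumptions.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    while True:
        frames = pipeline.wait_for_frames()
        depth_frame = frames.get_depth_frame()
        if not depth_frame:
            continue
        # Each frame is a 480x640 array of raw 16-bit depth values; multiply by
        # the device's depth scale to get metres. Tracking software would fit
        # the lower-face region of this data to a facial model.
        depth = np.asanyarray(depth_frame.get_data())
        print(depth.shape, depth.dtype)
finally:
    pipeline.stop()
```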

The researchers mounted eight strain gauges inside the Rift's foam liner and developed software, based on the Facial Action Coding System (FACS) often used by animators, to integrate the data from the depth-sensing camera, the strain gauges, and the Rift itself. The result is an eerily accurate representation of the user's facial expressions, down to the smallest of movements. Even better, latency was generally low, with the researchers measuring 3ms for facial feature detection, 5ms for blend shape optimisation, and 3ms for the mapping in software.
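Conceptually, both sensor streams end up as weights on a shared set of FACS-style blendshapes, and the combined weight vector drives the avatar. Here is a rough, hypothetical sketch of that idea only; the blendshape names, the plain linear strain-to-weight mapping, and the calibration matrix are all illustrative assumptions, not the researchers' actual optimisation.

```python
import numpy as np

# Illustrative FACS-style blendshapes; the names and counts are assumptions.
LOWER_FACE_SHAPES = ["jaw_open", "mouth_smile", "mouth_pucker", "lip_press"]
UPPER_FACE_SHAPES = ["brow_raise", "brow_furrow", "eye_squint", "eye_wide"]

def lower_face_weights(depth_fit: np.ndarray) -> np.ndarray:
    """Blendshape weights for the visible lower face.

    In the real system these come from fitting the RealSense depth data to a
    facial model; here we simply clamp a precomputed fit into [0, 1].
    """
    return np.clip(depth_fit, 0.0, 1.0)

def upper_face_weights(strain: np.ndarray, calib: np.ndarray) -> np.ndarray:
    """Blendshape weights for the occluded upper face.

    `strain` holds the eight strain-gauge readings from the foam liner and
    `calib` is a per-user (8 x n_shapes) calibration matrix learned while the
    wearer mimics reference expressions. A linear map is an assumption; the
    actual blend shape optimisation is more involved.
    """
    return np.clip(strain @ calib, 0.0, 1.0)

def avatar_weights(depth_fit, strain, calib):
    """Concatenate lower- and upper-face weights into one avatar pose vector."""
    weights = np.concatenate([lower_face_weights(depth_fit),
                              upper_face_weights(strain, calib)])
    return dict(zip(LOWER_FACE_SHAPES + UPPER_FACE_SHAPES, weights))

# Example frame: 4 lower-face fit values, 8 gauge readings, random calibration.
rng = np.random.default_rng(0)
pose = avatar_weights(rng.random(4), rng.random(8), rng.random((8, 4)))
print(pose)
```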
