Meta’s VR headset collects personal data directly from your face

    In November 2021, Facebook announced it would delete the facial recognition data it had extracted from images of more than 1 billion people and stop offering to automatically tag people in photos and videos. Luke Stark, an assistant professor at Western University, in Canada, told WIRED at the time that he viewed the policy change as a PR tactic, because the company’s VR push would likely lead to a vast new collection of physiological data and fresh privacy problems.

    This week, Stark’s prediction proved correct. Meta, as the company behind Facebook is now called, introduced its latest VR headset, the Quest Pro. The new model adds a set of five inward-facing cameras that watch a person’s face to track eye movements and facial expressions, allowing an avatar to mirror the wearer’s smile, wink, or raised eyebrow in real time. The headset also has five external cameras that, in the future, will give avatars legs that copy a person’s movements in the real world.

    After Meta’s presentation, Stark said the outcome was predictable. And he suspects the default “off” setting for face tracking won’t last long. “It’s been clear for several years that animated avatars act as privacy loss leaders,” he said. “This data is much more granular and much more personal than an image of a face in a photo.”

    At the event announcing the new headset, Meta CEO Mark Zuckerberg described the intimate new data collection as a necessary part of his vision for virtual reality. “When we communicate, all of our non-verbal expressions and gestures are often more important than what we say, and the way we connect virtually should reflect that,” he said.

    Zuckerberg also said the Quest Pro’s internal cameras, combined with cameras in the controllers, would power photo-realistic avatars that look more like a real person and less like a cartoon. No timeline was offered for that feature’s release. A VR selfie of Zuckerberg’s cartoonish avatar, which he later conceded was “basic,” became a meme this summer, prompting Meta to announce upgrades to its avatars.

    Companies including Amazon, along with several research projects, have previously used conventional photos of faces to try to predict a person’s emotional state, despite a lack of evidence that the technology works. Data from Meta’s new headset could offer a new way to infer a person’s interests or reactions to content. The company is experimenting with virtual reality shopping and has filed patents covering personalized advertising in the metaverse, as well as media content that adapts to a person’s facial expressions.

    In a briefing with journalists last week, Nick Ontiveros, a Meta product manager, said the company does not use that information to predict emotions. Raw images used to power these features are stored on the headset, processed locally on the device, and deleted after processing, Meta says. But the eye-tracking and facial-expression privacy notices the company published this week state that while raw images are deleted, insights derived from those images may be processed and stored on Meta’s servers.