On Monday morning, Meta — the company formerly known as Facebook — announced that it would be shutting down "the Face Recognition system on Facebook," a technology that has been raising privacy alarms since it debuted. In a blog post, the company described the move as "one of the biggest shifts in facial recognition usage in the technology's history." On Twitter, outgoing CTO Mike Schroepfer and incoming CTO Andrew Bosworth, who previously oversaw Facebook's Oculus virtual reality division, called the announcement a "big deal" and a "very important decision." The Electronic Frontier Foundation deemed it "a testament to all the hard work activists have done to push back against this invasive technology."

But a review of Meta and Facebook's VR privacy policies, and the company's answers to a detailed list of questions about them, suggests the company's face identification technology isn't going anywhere. And it is just one of many invasive data collection methods that may be coming to a metaverse near you. (Disclosure: In a previous life, I held policy positions at Facebook and Spotify.)

Facebook's recent announcement that it's shutting off its controversial facial recognition system comes at a difficult time for the company, which is facing significant regulatory scrutiny after years of bad press recently inflamed by a high-profile whistleblower.

But the moment is also an opportune one. The company is shifting its focus to virtual reality, a face-worn technology that, by necessity, collects an enormous amount of data about its users. From this data, Meta will have the capacity to create identification and surveillance systems that are at least as powerful as the system it's putting out to pasture. Just because it can create these systems doesn't mean it will. For the moment, though, the company is leaving its options open.

The fact is: Meta intends to collect unique, identifying information about its users' faces. Last week, Facebook founder Mark Zuckerberg told Stratechery's Ben Thompson that "one of the big new features" of Meta's new Cambria headset "is around eye-tracking and face-tracking." And while the platform has "turned off the system" that previously created facial profiles of Facebook users, the New York Times reported that the company is keeping the algorithm on which that system relied. A Meta spokesperson declined to answer questions from BuzzFeed News about how that algorithm remains in use today.

Meta may have shut down the facial recognition system on Facebook that raised so many concerns, but given that it intends to keep the algorithm that powered that system, there is no reason the company couldn't "simply turn it on again later," according to David Brody, senior counsel at the Lawyers' Committee for Civil Rights Under Law.

Meanwhile, Meta's current privacy policies for VR devices leave plenty of room for the collection of personal, biological data that reaches beyond a user's face. As Katitza Rodriguez, policy director for global privacy at the Electronic Frontier Foundation, noted, the language is "broad enough to encompass a wide range of potential data streams — which, even if not being collected today, could start being collected tomorrow without necessarily notifying users, securing additional consent, or amending the policy."

By necessity, virtual reality hardware collects fundamentally different data about its users than social media platforms do. VR headsets can be taught to recognize a user's voice, their veins, or the shading of their iris, or to capture metrics like heart rate, breath rate, and what causes their pupils to dilate. Facebook has filed patents relating to many of these kinds of data collection, including one that would use things like your face, voice, and even your DNA to lock and unlock devices. Another would consider a user's "weight, force, pressure, heart rate, pressure rate, or EEG data" to create a VR avatar. Patents are often aspirational — covering potential use cases that never materialize — but they can sometimes offer insight into a company's future plans.

Meta's current VR privacy policies don't specify all the types of data it collects about its users. The Oculus Privacy Settings, Oculus Privacy Policy, and Supplemental Oculus Data Policy, which govern Meta's current virtual reality offerings, provide some information about the broad categories of data that Oculus devices collect. But all of them specify that their data fields (things like "the position of your headset, the speed of your controller and changes in your orientation like when you move your head") are just examples within those categories, rather than a full enumeration of their contents.

The examples given also don't convey the breadth of the categories they're meant to represent. For example, the Oculus Privacy Policy states that Meta collects "information about your environment, physical movements, and dimensions when you use an XR device." It then provides two examples of such collection: information about your VR play area and "technical information like your estimated hand size and hand movement."

But "information about your environment, physical movements, and dimensions" could describe data points far beyond estimated hand size and game boundary — it could also include involuntary response metrics, like a flinch, or uniquely identifying movements, like a smile.

Meta twice declined to detail the types of data its devices collect today and the types of data it plans to collect in the future. It also declined to say whether it is currently collecting, or plans to collect, biometric information such as heart rate, breath rate, pupil dilation, iris recognition, voice identification, vein recognition, facial movements, or facial recognition. Instead, it pointed to the policies linked above, adding that "Oculus VR headsets currently do not process biometric data as defined under applicable law." A company spokesperson declined to specify which laws Meta considers applicable.

Meta did, however, offer additional information about how it uses personal data in advertising. The Supplemental Oculus Terms of Service say that Meta may use information about "actions [users] have taken in Oculus products" to serve them ads and sponsored content. Depending on how Oculus defines "action," this language could allow it to target ads based on what makes us jump with fear, or makes our hearts flutter, or our palms sweat.

But at least for the moment, Meta won't be targeting ads that way. Instead, a spokesperson told BuzzFeed News that the company is using a narrower definition of "actions" — one that doesn't include the movement data collected by a user's VR device.

In a 2020 document called "Responsible Innovation Principles," Facebook Reality Labs describes its approach to the metaverse. The first of these principles, "Never Surprise People," begins: "We are transparent about how our products work and the data they collect." Responding to questions from BuzzFeed News, Meta said it will be upfront about any future changes, should they arise, to how it will collect and use our data.

Without greater clarity about the data that Meta is collecting today, "consumers can't make an informed choice about when and how to use their products," Brody told BuzzFeed News. More to the point, it's hard for the public to assess any future changes Meta might make to how it collects and uses our data if it has never explained exactly what it's doing now.

Brittan Heller, counsel at the law firm Foley Hoag and an expert in human rights and virtual reality, put it differently: "The VR industry is kind of in a 'magic eight ball' phase right now. On questions of privacy and safety, the answer that floats up says, 'Outlook uncertain: ask again later.'"