Welcome to Virtual Reality: Valid Identification Required
While you’re exploring virtual reality, Facebook is exploring you.
Last month, Oculus VR founder Palmer Luckey’s financial support of an alt-right hate group was exposed; in response, some customers and developers vowed to stop supporting the company. But what Oculus’ recent critics — and much of the tech industry — fail to acknowledge is that Facebook’s ownership of Oculus poses far more of a threat to users and the future of VR than one executive’s bigotry.
Since its public launch in 2006, Facebook has committed many ethical violations. From harmful real name policies that discriminate against its own staff, to invasive data collection and active collusion with law enforcement to target activists of colour, the company not only subjects its users to pervasive surveillance, but consistently uses the data it collects to further endanger its marginalized user base. Facebook’s unethical policies on identity verification and data collection make for a dangerous collaboration with Oculus, expanding its reach to a new horizon of vulnerability in VR technology.
The beauty of games is that they allow you to confront challenges and fears within the safety of simulated danger. But the danger posed by Facebook’s acquisition of Oculus is very real: while you’re exploring virtual reality, Facebook is exploring you. The Oculus headset constantly streams personal user data, even when the headset is not in use. A user’s specific location, physical dimensions and private correspondence with online contacts are among the many forms of data the ominous OVRServer_x64 records, as outlined in the Oculus privacy policy. Integrating Facebook into users’ Oculus accounts multiplies the inherent dangers of this data collection; Oculus users will inevitably be subject to the same overarching ID and data collection policies Facebook already employs. Importantly, these are measures that disproportionately target, alienate and endanger marginalized people, particularly trans people, sex workers and abuse survivors. Introducing these harmful practices to games, especially an immersive medium like VR, threatens to alienate the very people who turn to the medium in search of the escapism, empowerment and agency they do not possess outside of virtual space.
The All-Seeing Social Media Platform
Following Facebook’s acquisition of Oculus in March 2014, Mark Zuckerberg released a statement revealing his plans for the company:
“…this is just the start. After games, we’re going to make Oculus a platform for many other experiences. Imagine enjoying a court side seat at a game, studying in a classroom of students and teachers all over the world or consulting with a doctor face-to-face — just by putting on goggles in your home.”
Using the Oculus as an omnipresent interface for everyday life would eradicate personal privacy as we know it; the casual way Zuckerberg suggests we conduct doctor’s appointments via virtual reality frames that erosion as something we would willingly consent to. Facebook has already made significant moves in this direction, beginning by requiring users to divulge an unconscionable amount of personal information simply to participate in the popular social media platform. As I detailed in my article on the harmful nature of real name policies in Twitter’s exclusive verification process, Facebook’s discriminatory real name policy puts marginalized users in a precarious position, threatening to out users who, for myriad reasons, do not use their full legal names on the website. Controversy first erupted when Facebook was repeatedly caught selling user data for profit to corporations that could then legally use your name and likeness to sell any number of offensive and inappropriate products. Facebook’s obsession with identity not only compromises its users’ safety and dignity by selling their data to the highest bidder, it risks outing vulnerable people and putting their lives in danger. The ability to safely use your full legal name on Facebook comes from the privilege of being able to be transparent about your identity without fear of violence or discrimination, a privilege that many trans people have been unfairly denied by Facebook’s draconian identity policies.
Extending real name requirements to the virtual space of play threatens the multitude of positive aspects that gaming offers marginalized players. In one of the few realms where it’s acceptable to create names, craft avatars and build identities around our fantasies and desires, incorporating a harmful, difficult-to-bypass practice into the process will necessarily render virtual space inaccessible on countless axes. If Oculus headsets become a second, ever-present skin — as Zuckerberg hopes — there won’t be a virtual or physical space left free from Facebook’s invasive reach.
Now imagine the horrifying consequences of merging Zuckerberg’s fantasy of an all-encompassing virtual interface with Facebook’s harmful real name policy, especially given Oculus’ aspirations to facilitate essential services like medical care that require government ID. For one, this would expose any Facebook users who have so far managed to safeguard their identity without having their account targeted, suspended or removed pending ‘real name verification’. Then there is the fact that, given how vulnerable, disempowering and just plain scary a trip to the doctor can be, it’s the last thing many of us would want recorded and archived as part of our digital profile. Further, it could cost people their safety and even their lives if their medical records and private visits to the doctor end up in the wrong hands. Databases of medical information that expose patients who are transgender, intersex and/or living with disabilities and chronic illness could enable as yet unprecedented forms of discrimination, potentially outing people to their families, friends or prospective employers on a mass scale.
Removing barriers between large-scale systems in favour of one seamless, all-seeing interface dramatically increases the ease with which predators (both the ones Facebook invites to share in its data and the ones it doesn’t) can target, harass and harm people based on their educational merit or medical records; this is information that is already routinely used to systematically oppress marginalized people. Compiling records of sensitive information that neatly sort people into oppressed groups could produce the most thoroughly categorized list of humans yet – mass records of marginalized identities, all available to the highest bidder.
Exploiting Intimacy
As a developer, I think often of the people on the other end of my games: is the interface welcoming? Is the environment well constructed? Is my message delivered effectively? I take great care to create experiences that both welcome and challenge players, and think far less about whether the platform I use to distribute my games will exploit the vulnerability I’m inviting people to express. The last thing I want is for players to feel that their vulnerability will be met with undue consequences, or that artistic expression is yet another human experience Facebook has leveraged and corrupted for corporate gain.
Facebook is where an entire generation has gone to conduct their messiest breakups, celebrate their biggest achievements, and in all of that, even if unaware, exhibit profound emotional vulnerability. It’s where we go to share and sometimes vent, and in our seemingly well-curated universe of friends, family and acquaintances, we feel safe. Not because the space itself promises safety, but because it is buffered by the presence of the people we trust; surrounded by loved ones, we’re far less likely to question the safety of our surroundings.
Facebook’s foundation is built on exploiting the connections we seek with others, preying on the human desire to understand, share and connect with our fellow humans. Where Facebook serves as a virtual space to connect with friends and family, Oculus’ integration relies on environmental immersion to collect information previously unattainable through any of Facebook’s other acquisitions. Documenting physical movements and mannerisms along with in-game choices and socialization promises to leave players — tethered in place, attention keenly focused on the reality presented to them — more vulnerable and at risk than ever before. Facebook may be where we go to connect, but it’s still unmistakably an interface, an obvious proxy for sharing physical space. Immersion in virtual space catches us off guard, and Facebook collects valuable data as we move and speak and play freely.
Drawing a Line in the Virtual Sand
Photo CC-BY Nadia Prigoda-Lee.
It’s important to keep in mind that Oculus is just one part of Facebook’s broader strategy of mass surveillance and control: for years, Facebook has been steadily acquiring a roster of companies to create a system of cross-compatible platforms that all channel their user data to a single source.
In addition to the company’s many acquisitions (Oculus VR, Instagram, WhatsApp), gaming platforms like the PS4 and Xbox One integrate with Facebook in the form of the ubiquitous ‘log in with Facebook’ option, an appealing choice for users who want to skip the inconvenience of setting up a new account. When Facebook is chosen as the sign-in option, players’ usernames are irreversibly changed to their full Facebook names, along with their profile pictures. If a player wishes to use Facebook to log in but keep their real name and photo from appearing on their account, they first need to change their profile settings, a process that is not clearly explained when selecting a sign-in option. And since 2014, Facebook has pursued an aggressive strategy to promote Messenger as a one-stop communication interface, beginning by making all other forms of mobile messaging through Facebook obsolete. This strategy seeks to confine user interactions to a single continuous interface, giving Facebook exclusive access to the widest possible breadth of personal information. As with the sign-in integration on gaming platforms, Messenger syncs with your mobile device by default, requiring users to dig through the app settings to toggle automatic synchronization off. Intentionally obscuring the ability to opt out of these invasive integrations underlines how deliberate Facebook’s deception is.
Since Palmer Luckey’s outing as an alt-right political supporter, many have publicly voiced their discontent, refusing to support Oculus until he is removed from the company. But the way I see it, there will always be powerful individuals at the top who put their resources behind propping up the status quo. If we chose to focus on every individual engaged in bigotry and corruption, we would break under the weight of endless boycott and outrage. Challenging a system – or in this case a vastly networked company – offers the capacity for large-scale change in both industry policy and practice. We live in an era where a substantial share of our financial, personal and social interactions take place in virtual spaces designed by corporations with no governing regulatory commission, leaving us without industry standards or even a conversation about the ethical collection and dissemination of users’ private information. As long as we continue to go without the basic standards of public safety that exist in other data-collecting industries like advertising and sales, Facebook will be free to expand its invasive scope and use our data in numerous harmful ways.
Facebook’s enthusiastic cooperation with entities invested in harming marginalized people places its actions on par with the blatant bigotry that Luckey’s surreptitious donations revealed, but on a far wider scale. Using mined data to help unjustly apprehend activists of colour tips the scales of power and impunity back towards an oppressive state, and it is just a taste of the dangerous influence Facebook wields over its marginalized users – an influence that will only grow with its continued acquisition of social platforms.