Reckoning with a Decade of Breaking Things
The Zuckerberg Files and Facebook’s Enduring Contempt for the World
At this year’s F8 developer conference, Facebook CEO Mark Zuckerberg stepped back from his company’s infamous approach to innovation, “move fast and break things.” For Facebook, it used to be that “unless you are breaking some stuff you are not moving fast enough.” Now, it’s “move fast with stable infra.” As Zuckerberg explained, the shift reflects a renewed commitment to efficiency since, over time, breaking things “wasn’t helping us to move faster because we had to slow down to fix these bugs and it wasn’t improving our speed.”
CC-BY Jason McELweenie, filtered.
Of course, technical infrastructure isn’t the only thing Zuckerberg and Company have approached with some abandon over the company’s 10-year history. Facebook also has a habit of moving quickly and smashing the expectations of its users in the process. Revisions to the site’s policies and privacy controls have regularly been met with resistance, while the introduction of new features—like the ill-fated Beacon or the constantly shifting News Feed—has left users’ understandings of how information flows through Facebook upended or rendered obsolete. With regard to implementing and communicating changes, the company has made taking three giant steps forward and one (often highly publicized and begrudging) step back its modus operandi.
As Zuckerberg has put it, “we make mistakes loudly” (12:52).
While the shift towards “move fast with stable infra” says something about Facebook’s maturing approach to development in a technical sense, it tells us little about how the company plans to reckon with its history of breaking things. It’s a legacy that still presses on the everyday activities of Facebook’s users and, in doing so, demonstrates an enduring contempt for the world the company claims to make more open and connected.
An Enduring Contempt
On the surface, contempt may seem like too strong a word. After all, Facebook can occupy an innocuous enough place in many of our lives. We might only log in to the service now and again; we might find some value in having easy access to a large number of friends, family, and colleagues; we might find some features genuinely useful for messaging or promoting events. By offering us an opportunity to connect and share with others, it’s hard to say that Facebook is somehow actively scornful or disdainful of us as users.
But showing contempt doesn’t have to be a deliberate or intentional act—one can also show contempt simply by failing to take into account people or things that should, in fact, count. For example, when we fail to design technologies that are accessible to people with disabilities, we show contempt for those people. Or, when we design urban spaces that are hostile to homeless populations, we show contempt for the homeless. Further, just as inanimate objects can be sexist, non-human things can show contempt, too, like when a social networking site’s sign-up page requires you to identify your gender but fails to provide any options outside of “male” or “female.” In this instance, an information system exhibits bias towards a gender binary and shows contempt for non-binary identities.
CC-BY Mixy Lorenzo, filtered.
This sort of contempt is particularly troublesome because it tends to endure invisibly over time: the biases built into our technologies gain an inertia that makes them difficult to perceive. The more we use a given social networking site—the more navigating its quirks and commands becomes a routine part of our lives—the less aware we become of the ways in which it guides us. Over time, we grow accustomed to doing things that once seemed strange. Clicking a “like” button, for example, once felt like a weird way of indicating approval or acknowledgement (or, counter-intuitively, of expressing sympathy); now, it seems NBD.
Moreover, as Facebook continues to spread its tendrils across the internet, it becomes increasingly unavoidable. Not only must we confront Facebook when we are logged in, but also while we roam the broader Web: your Facebook login can be used to gain access to numerous other social and commercial services; promotional and personal Facebook pages routinely show up on the first page of search results; “like” buttons are strewn about the Web, collecting clicks and beaming data back to Facebook. Given the sheer dominance of the site, identifying and understanding its biases has become a critical part of being an informed digital citizen.
Listening carefully to what purveyors of a given technology say can tell you a lot about how that technology fits into, works with, or challenges the world around us. This is the premise of ongoing research Michael Zimmer and I are doing in The Zuckerberg Files, a digital archive of all public utterances by Facebook’s founder and CEO. As we work, we treat Zuckerberg’s use of language not as arbitrary but as purposeful—it actively shapes available conceptions of sharing, privacy, control, and identity in ways that serve Facebook’s interests.
The Iron String of Sharing
Upon wading into The Zuckerberg Files, one is struck by how often Zuckerberg calls on the idea of sharing to communicate his company’s message. By his own admission, Zuckerberg talks about sharing “a lot of the time.” He constantly reiterates the company’s goal “to give people the power to share and to make the world more open and connected.” His company strives to achieve this goal “by building services that help people share any type of content with any group of people they want.”
The language employed by Zuckerberg reinforces what José van Dijck calls Facebook’s “imperative of sharing.” But this imperative isn’t some altruistic maxim; it doesn’t drive us towards some utopic community. Rather, sharing, for Facebook, refers to an exceedingly narrow set of activities. As van Dijck describes it, sharing “relates to users distributing personal information to each other, but also implies the spreading of that personal information to third-parties.” Indeed, I am often struck by just how staggeringly unimaginative Zuckerberg’s view of sharing seems to be (1:50 here):
“What I’m talking about is you know, people will, um, push the status update—sometimes I’ll do that and make it available to everyone at Facebook, everyone at the company. You know, sometimes you’ll share photos or you might go on vacation and take photos with your family and only want to share that with your family, right? You have the control to be able to do that. So that’s kind of on the sharing side.”
For Facebook, this view of sharing makes a great deal of sense. As a business, the site depends on users sharing massive amounts of information, so Facebook wants to build tools that both enable and encourage the widespread sharing of things like status updates, photos, and “likes.”
To rephrase an old Emerson line, Facebook’s design revolves around the dictum, “share thy info: every feature vibrates to that iron string.”
Outside of the confines of Facebook, however, we know that sharing connotes a much more diverse range of social and cultural practices. As a concept, we use “sharing” to refer to a lot of things: we can share traits or interests, experiences or emotions, information or ideas. There are also countless ways we can share with others, from the most basic exchange of a look to the most complex workflows employed in organizational settings. Sharing is at once an act of caring and a contested political-economic practice. It is both a coping mechanism (Gaiman: “Pain shared, my brother, is pain not doubled, but halved.”) and an expressive act (Angelou: “…sharing food is a form of expression.”). Further, sharing means different things in different contexts—individuals, institutions, and groups can exhibit distinct and even conflicting norms of sharing.
But this diverse landscape of social and cultural meaning that accompanies sharing gets bulldozed in the context of Facebook. For Zuckerberg, sharing is reduced to little more than the flow of information between users and third-party services.
Meeting Sharing’s Imperatives
Once we bring sharing—and the narrow meanings Facebook attributes to the term—to the fore, we can see how other important cultural concepts are transformed in the shadow of sharing’s imperatives. Zuckerberg, in his open letter to investors in advance of Facebook’s IPO, claimed that “Facebook was not originally created to be a company. It was built to accomplish a social mission—to make the world more open and connected.” But, as Zuckerberg explains elsewhere, a more open and connected world is one caught in the gravitational pull of sharing (0:40):
“Our mission is to give people the power to share and to make the world more open and connected. So what we mean by this is that more open world is there’s more information available, people can have access to more information…and more connected means that people can stay connected better with their friends and family, people immediately around them, but also people all across the world.”
For Zuckerberg, openness is reduced to little more than some quantity of information to be shared (“more information”) while connectedness denotes simply those kinds of connections that Facebook facilitates.
Within Facebook’s open and connected world, users are conceived of as little more than information distributors. We are described by Zuckerberg as possessing an “innate desire to connect with other people and share information.” Here, Zuckerberg naturalizes Facebook’s construction of sharing by connecting it to innate human sociality, casting a human desire to share as harmonious with Facebook’s imperative to share. (For what it is worth, Zuckerberg has always sold Facebook’s version of social interaction as natural. Earlier in the site’s development, he noted that “the profile is really the core of the site because it represents who people really are” (6:07 here). As if my About Me were everything about me!)
While Zuckerberg has worked to reshape notions of openness, connectedness, and users’ sociality to serve Facebook’s interests, perhaps no concept has been more distorted in the name of sharing’s imperatives than privacy.
Privacy—itself a concept with a rich legal and philosophical history—has long been a thorny issue for the company. Over the past decade, Zuckerberg and other executives have weighed in on the subject with varying degrees of contempt—usually in response to some privacy-related fiasco generated by changes made to the site’s features or policies. In the spring of 2010, when Facebook decided to flip the switch and make a bunch of previously private information public, Facebook VP of Communications and Public Policy Elliot Schrage exhibited arguably the most contemptuous attitude, shifting the onus of responsibility for privacy away from Facebook and onto users. He called Facebook entirely “opt-in” and patronized users by saying “if you’re not comfortable sharing, don’t.” (Of course, Schrage’s points completely overlooked the fact that users were comfortable sharing certain types of information privately; it was Facebook that suddenly forced information previously subject to privacy controls out into the open.)
CC-BY Maria Elena, filtered.
For Zuckerberg’s part, he’s exhibited a contempt for privacy whenever it has conflicted with sharing and the “radical transparency” of his more open and connected world. In David Kirkpatrick’s 2010 book The Facebook Effect, for example, Zuckerberg claimed that people who use a fake name or alter ego (a common tactic used to preserve some level of privacy online) “lacked integrity.” Early on in the site’s development, he believed that limiting users to one account tied to one identity was integral to building the sort of trust people needed to share information in the first place. In fact, Zuckerberg cited, as one of Facebook’s early successes, helping people get over the “hurdle” of sharing personal information online (such as a full name, real photograph, and mobile phone number)—a hurdle the CEO insists most users were unwilling to jump when the site first launched (see 15:22 in this interview).
Recently, however, Zuckerberg has stepped back from this position. “If you’re always under the pressure of real identity,” he’s said, “I think that is somewhat of a burden.” No doubt in response to the popularity of apps like Snapchat, the company now recognizes that having multiple identities or preserving some level of anonymity might actually help users share more information.
Facebook’s contempt for any conception of privacy besides its own goes beyond remarks made by Zuckerberg and others. This contempt is also enshrined in the site’s governing principles: “People should have the freedom to decide with whom they will share their information, and to set privacy controls to protect those choices.” Here, privacy is reduced to little more than an issue of control, important only insofar as it helps people feel comfortable sharing more and more information. Unless privacy works exclusively in the service of sharing, it—as Zimmer has put it—“must be overcome.”
Ultimately, the relationship of Zuckerberg and company to privacy is the very definition of contempt: there’s plenty of room for privacy, just so long as it doesn’t run counter to Facebook’s imperative of sharing.
Keeping Track of Facebook’s Contempt for the World
CC-BY Forgemind ArchiMedia, filtered.
Facebook’s manipulation of rich social and cultural concepts exposes, as van Dijck argues, “a cultural battle to establish a new normative order for online socializing and communication.” Though that battle is not always immediately obvious, sustained attention to the language and understandings employed by Zuckerberg can reveal how Facebook resists any interpretation of openness, connectedness, sociality, and privacy that might challenge the company’s dogmatic commitment to sharing. The sum total of this resistance is a site that exhibits a subtle but enduring contempt for those parts of the world that undermine Facebook’s ideology.
Finally, understanding Facebook’s contempt for the world matters for more than becoming more critically engaged users. Facebook’s co-opting of rich cultural concepts for narrow, ideologically driven purposes also presents deep practical and epistemological challenges to research conducted using data obtained from the site. The information gathered on Facebook is, in fundamental ways, produced not by users but by Facebook itself. Users are constrained by the categories and options Facebook offers; their activities are filtered through the site’s biases and framed by Facebook’s myopic view of sharing.
Researchers in both academia and the corporate sphere double down on Facebook’s contempt when they engage in research on the site without regard to its built-in biases. It is blatantly irresponsible, for example, to refer (as one sociologist recently did) to data produced by Facebook as “natural.” Facebook data—like “Big Data” generally—is not raw or somehow immune to the site’s influence. Consequently, we must remain aware of the ways Facebook is decidedly not natural: it exhibits values, politics, and biases that shape the ways in which users interact and exchange information both within the site and on the broader Web. In this light, the shift from “move fast and break things” to “move fast with stable infra” should remind us just how much control Facebook wields over the possibilities for conceiving of sharing and privacy — both culturally and practically — online today.