The Argument for Free-Form Input
The software community fetishizes “disruption”: breaking down barriers by creating more accessible tech and new kinds of human interaction. Yet it remains careless in how it represents individuality online. Despite the intention of opening new worlds and reaching millions of users, we select our identities from a drop-down menu. We enter one value each for our names, gender, sexuality, relationships and ethnicity, constraining our digital personhood to a database schema. These designs limit expression while excluding and erasing marginalized identities. They reflect the restricted imagination of their creators: software written and funded predominantly by a privileged majority who have never had components of their identity denied, or felt a frustrating lack of control over their representation. But human identity, relationships, and behaviour are all endlessly complex and diverse: our software needs to start expecting and valuing marginalized identities instead of perpetuating their erasure.
Registering for Facebook requires you to select between “female” and “male”. Google gives you “Female”, “Male” and “Other”. Both require you to enter a first and last name, separately. They ask your name and gender before anything else, presenting them as core to human identity. But these forms are wrong. Requiring a separate first and last name is problematic for anyone whose name isn’t structured that way; the W3C’s Personal Names Around the World shows just how many naming conventions such forms break. Most forms require users to identify within a gender binary that doesn’t apply to many. To make matters worse, many sites have policies (e.g. Facebook’s real name policy) under which a user must enter data abiding by the terms or forfeit use of the product. It may be impossible for a user to both follow the policy and enter data that correctly represents them.
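As a sketch of the alternative (the field and function names below are mine, not any product’s schema), a registration form can ask for one free-form name instead of imposing a first/last split:

```typescript
// Sketch: a registration payload with a single free-form name field.
// All names here are illustrative, not from Facebook's or Google's schemas.
interface RegistrationInput {
  email: string;
  // One field, no first/last split: equally at home with a mononym,
  // "María-Jose Carreño Quiñones", or a name in a non-Latin script.
  name: string;
}

// Trim whitespace and require only that *something* was entered;
// no character whitelist, no "must contain a space" rule.
function validateName(name: string): string | null {
  const trimmed = name.trim();
  return trimmed.length > 0 ? trimmed : null;
}
```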
It is unacceptable that the same industry writing thinkpieces and case studies on every minutia of UX doesn’t question how its interfaces might invalidate the personhood of those different from them. We acknowledge that frustrated users leave quickly. We track what percentage of people will leave a site if it doesn’t load in under two seconds. We acknowledge that overly strict validation and requiring a specific syntax (e.g. “MM-DD-YYYY” for dates when “DD-MM-YYYY” or “DD/MM/YYYY” is easily parsable) will make users leave or enter the wrong data. We talk about intelligent defaults, and about which ordering of exclusive options will yield the highest conversion rate. Yet despite all this research, quantification and analysis, we continue to arbitrarily trust the judgements of white, able-bodied, neurotypical cis dudes to define personhood in the digital world.
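To make the date-validation aside concrete, here is an illustrative sketch, not any particular library’s API, of accepting the common separators and orderings rather than rejecting everything that isn’t exactly “MM-DD-YYYY”:

```typescript
// Sketch: accept "-", "/", or "." as separators and several orderings,
// rather than failing on anything that isn't one exact syntax.
// Assumes the form labels which order it expects for genuinely
// ambiguous inputs such as 02-03-2020.
function parseDate(input: string, dayFirst = true): Date | null {
  const parts = input.trim().split(/[-/.]/).map(Number);
  if (parts.length !== 3 || parts.some(Number.isNaN)) return null;
  const [a, b, c] = parts;
  let year: number, month: number, day: number;
  if (a > 31) {            // YYYY-MM-DD
    [year, month, day] = [a, b, c];
  } else if (dayFirst) {   // DD-MM-YYYY
    [day, month, year] = [a, b, c];
  } else {                 // MM-DD-YYYY
    [month, day, year] = [a, b, c];
  }
  const date = new Date(year, month - 1, day);
  // Reject silent rollovers like 31-02-2020 becoming March 2nd.
  return date.getMonth() === month - 1 && date.getDate() === day
    ? date
    : null;
}
```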
Interfaces made without diversity are destined to be wrong
The state of identity online results from the lack of diversity in tech. A diverse product team recognizes that intelligent defaults go beyond populating a drop-down with the options you think will be selected most. Instead, defaults should give equal opportunity to all users, even those you can’t imagine. They shouldn’t force users to lie or ignore parts of their personhood, or force users to side with ambiguity over expression. Well-executed user interfaces are designed knowing that the assumptions we try to make about users will not be right—they allow for input past what we can imagine at first.
The many sloppy attempts at including identities outside the norm (e.g. polyamorous, non-cis-normative gender, sexualities including asexuality and fluidity) highlight the lack of diversity in tech. Gender input is a particularly disheartening example, making clear that nobody these options affect is being consulted or listened to. We can start with Google and its drop-down of “Female”, “Male”, or “Other”. What is strikingly obvious to me as a queer individual is that the value “Other” is an incredibly unfortunate choice. Othering is the construction of a dichotomy between the “norm” and the rest. Those within the norm are the welcomed inner circle; the rest are subjected, inadvertently or not, to frequent microaggressions and to choices that ignore or run counter to their interests. Being on the receiving end of Othering can make one feel like the world would prefer they did not exist, that their personhood is not valid, and that they will never be catered to. Using an “Other” category isn’t being inclusive by design; it’s a lazy reaction to being informed that the options “female” and “male” didn’t end up serving people you didn’t think of. Selecting “Other” doesn’t serve my expression. It tells me that your product only cares about normalized data and advertisers, and that this isn’t where I should be. This isn’t disrupting anything. This is not innovation.
Another embarrassing lack of consultation, with the trans community in particular, is typified by drop-down lists containing only “female”, “male”, and “trans”. Just about any trans person can tell you why this one is wrong: a trans person can be trans and male or trans and female; they can even be neither and still consider “trans” a modifier. These items are not mutually exclusive (neither are male and female, for that matter), and yet I’ve seen this many times, including on the registration form for GHC, causing a debacle that prompted Julie Pagano to write the excellent piece On Forms and Personal Information.
Also found on many sites are your standard “female” and “male” radio buttons with obscured options for queer, trans, intersex, and other individuals who don’t fit into cis norms or the gender binary. These options are put in fine print or out of the way on another page, requiring users to search them out and often not find them. Facebook and Google both do this: their registration forms require a selection from the typical gender radio buttons, then let you use a text input in your profile later. But this still tells me that my personhood is not on a level playing field, that those outside the options you imagined are second-class citizens, that we may be unwelcome, that your product has not considered us, and that when things go wrong (such as online abuse), you will side with the users you consider “normal” over us.
Tucking away more inclusive options can be functionally equivalent to not having them at all. A user may never bother to get past this first interaction with your product, and is thus excluded from the slices of our lives that are increasingly facilitated through digital mediums. It also requires that one be reasonably tech-savvy in order to truly represent themselves, similar to a requirement Betsy Haibel wrote about for MVC: “You must be this technical to withhold consent”.
These tactics set up your data to be wrong from the start. Such user interfaces coerce users into entering data that does not describe them in a way that makes them feel safe and expressive, or simply does not describe them at all.
We can do better, easily.
We should strive to ask for as little data as possible, to make data policies opt-in first, and to allow users to opt out everywhere. Not all data is unnecessary, though; sometimes the user wants to give data for the purpose of sharing or communicating their needs when interacting with others (for instance, displaying the correct pronouns to use). In these cases, there are mandatory best practices to avoid alienating, excluding, and Othering users. This is not a panacea, though: hiring, and funding software written by, diverse teams is a cornerstone of designing inclusive software from the beginning.
Avoiding exclusive user input starts with asking the right questions. Chances are, you are asking too much. Do you need the user’s gender and full name, or can you simply ask how they wish to be addressed (pronouns)? “Sex” is often used as a cop-out in place of “gender” so that a female-or-male binary input can be retained. This is still not the right question. A sexual dichotomy is a constructed model just like the gender binary. Given that no model is perfect, by definition, many people will not fit the model and will have needs outside its dichotomy. A user interface that extrapolates information from a model avoids asking the user what they really need, relying on generalization instead. It is better for software to ask for what is really needed and be transparent about why it’s needed; for instance, it might be interacting with a third party that expects data in a certain form outside our control. Your users should be aware of this.
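As a minimal sketch of asking for what is really needed (all names here are hypothetical), compare extrapolating pronouns from a stored gender against simply asking for them, free-form, with opting out allowed:

```typescript
// Instead of inferring pronouns from a stored gender value,
//   gender: "female"  -->  assume "she/her"   (extrapolation, often wrong)
// ask directly, free-form, with an explicit way to decline.
interface AddressingPreferences {
  // e.g. "she/her", "they/them", "ze/hir"; null means not provided.
  pronouns: string | null;
}

function joinAnnouncement(name: string, prefs: AddressingPreferences): string {
  // Fall back to pronoun-free phrasing instead of guessing.
  return prefs.pronouns
    ? `${name} (${prefs.pronouns}) has joined.`
    : `${name} has joined.`;
}
```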
The best way to ask for information about someone’s personhood is a free-form text field where empty strings are allowed and an obvious option to opt out is given. If suitable, you can go a step further by allowing the input to be symbolized, akin to a tagging system. This allows symbols to be minimally auto-suggested and translated. For instance, with gender, one could enter their value and include a modifier tag such as “cis” or “trans”. Tags like this can be based on past submissions and weakly auto-suggested. Facebook’s gender input implementation, while not perfect, mostly works like this, and allows users to select the pronouns they should be addressed by (unfortunately, via a drop-down with just three options). Importantly, implementations on top of the free-form input should take care to store the original input, and make it easy for users to correct or remove structured additions.
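One plausible shape for such an implementation, sketched below with hypothetical names rather than anything from Facebook’s actual system, stores the raw string verbatim and keeps derived tags separate, weakly suggested, and user-editable:

```typescript
// Sketch: free-form value kept verbatim, with optional, removable tags.
interface IdentityField {
  raw: string;        // exactly what the user typed; never rewritten
  tags: string[];     // optional modifiers, e.g. ["trans"]; user-editable
  optedOut: boolean;  // explicit opt-out, distinct from an empty string
}

// Weak auto-suggestion built from past submissions: suggest, never force.
function suggestTags(input: string, pastTags: Map<string, number>): string[] {
  const needle = input.trim().toLowerCase();
  if (needle.length === 0) return [];
  return [...pastTags.entries()]
    .filter(([tag]) => tag.startsWith(needle))
    .sort(([, a], [, b]) => b - a) // most frequent first
    .map(([tag]) => tag)
    .slice(0, 5);
}
```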
Programmers tend to recoil in horror at the idea of letting users enter whatever they want. Free-form input is advantageous in nearly every way, though. Your users don’t have to lie to use your software. You exclude nobody. You misrepresent nobody. You avoid accidentally creating a barrier-to-entry you didn’t think of (and remember, you will never think of everything). It implicitly welcomes users more than exclusive user input ever could. There is less room for sloppily-designed sets of choices that result in bad data. Users are less susceptible to cognitive biases such as anchoring. Advertisers can even be happy as this is more targeted data than a gender binary or false set of options that a user had to enter but doesn’t actually care about. Free-form data can likely be normalized if desired anyway. We should consider that if data from a free-form input ends up too varied to normalize, then perhaps it should never have been normalized in the first place.
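And if aggregate reporting is genuinely needed, free-form values can be grouped after the fact, as in this sketch, without ever rewriting what the user entered:

```typescript
// Sketch: normalize for reporting only; raw values stay untouched.
function tally(rawValues: string[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const raw of rawValues) {
    // Case and whitespace folding is the only normalization applied here;
    // anything that still doesn't group cleanly arguably shouldn't be grouped.
    const key = raw.trim().toLowerCase();
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return counts;
}
```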
Marginalized individuals experience many barriers in digital spaces. Our identities are commonly regarded as edge cases, as pathological behaviours (and thus not part of our personhood, “not valid”), or as merely “preferred”. Digital spaces have the potential to be safe spaces that do indeed break down barriers found IRL, as the tech industry insists. But we’re not there yet. The schemas and input fields supporting digital interactions are only one of the areas where the software community needs to do better: Riley H. wrote a piece for MVC titled So You’ve Noticed Trans People Exist… Now What? which contains important considerations on safety after going “beyond mere inclusion”. We can certainly do better: for all this talk about code being neutral, it’s time to see marginalized individuals considered equally.
“People aren’t edge cases.”
– Carina C. Zona, Schemas for the Real World