Written by DLINQ staff
In Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech, author and web consultant Sara Wachter-Boettcher takes up the classic engineering term “edge case,” traditionally used to describe scenarios considered extreme or beyond the norm. In the design of products and services, including technological ones, these cases are typically not addressed, acknowledged, or incorporated into new features, because designers aim for what they consider the average or “normal” case. In her book, Wachter-Boettcher outlines how tech companies use “edge-case thinking” to “shrug their shoulders at all the people whose identities don’t fit into what their narrow definition of ‘normal’ is” (137) and, in so doing, fail to meet the needs of those not deemed “normal” by mainstream standards.
Who, exactly, constitutes an “edge case”? First, let us be clear that “edge cases” are in fact people, whose experiences and needs deserve to be seen, heard, and incorporated into design processes and products. In the United States, “edge cases” often refer to marginalized people, such as people with disabilities, women, racial and ethnic minorities, and transgender and gay individuals. For example, when a web form asks users to identify their gender but offers only binary male/female options, it excludes people who identify as transgender or gender non-conforming. In the context of algorithmic systems, Wachter-Boettcher argues that dismissing edge cases as impractical or not worth designing for can sustain algorithmic bias by failing to make the realities of edge-case scenarios visible and valid. As Wachter-Boettcher writes:
“We’ll only be successful in ridding tech of its excesses and oversights if we first embrace a new way of seeing the digital tools we rely on — not as a wonder, or even as a villain, but rather as a series of choices that designers and technologists have made… [E]ach of these choices reinforces beliefs about the world, and the people in it” (200).
Following this line of thinking, when tech companies and designers fail to design for edge cases, those already marginalized in society are only further marginalized.
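The binary-only web form mentioned earlier can be made concrete with a brief sketch. The types, option list, and function names below are our own illustration, not drawn from the book: the idea is simply that a gender field can pair a set of listed options with a free-text self-description, and treat declining to answer as a valid response rather than an error.

```typescript
// A minimal sketch of a gender form field that avoids the binary-only trap.
// All names here are hypothetical illustrations, not a real form library's API.

type GenderResponse =
  | { kind: "option"; value: string }          // a listed option the user picked
  | { kind: "self-described"; value: string }  // the user's own words, free text
  | { kind: "undisclosed" };                   // declining to answer is valid

// Listed options are a starting point, not an exhaustive taxonomy.
const listedOptions: readonly string[] = ["woman", "man", "non-binary"];

function parseGenderField(
  optionPicked: string | null,
  freeText: string
): GenderResponse {
  const selfDescription = freeText.trim();
  // Free text takes priority: the user's own words beat any checkbox.
  if (selfDescription.length > 0) {
    return { kind: "self-described", value: selfDescription };
  }
  if (optionPicked !== null && listedOptions.includes(optionPicked)) {
    return { kind: "option", value: optionPicked };
  }
  // No option, no text: treat "prefer not to say" as a first-class answer.
  return { kind: "undisclosed" };
}
```

The design choice doing the work is the third variant: because `undisclosed` is a legal value of the type, downstream code cannot quietly assume every user has supplied a binary gender.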
Many of us have become complacent about the algorithms that shape the details of our daily lives: the groceries we buy, the health we track, the people we date, and more. What steps can we take to change the course of edge-case avoidance in the design of the digital products and services we use to manage our lives?
The first step is to realize that we have a choice, and to take a critical stance in examining the options before us. What does a critical stance imply? It asks us to look at situations openly, without our own biases or preferences interfering; to question what we see from a neutral mindset; and to weigh clear evidence in deciding how to proceed. In today’s world, being a “user” of digital products and services is a serious task, one that requires scrutiny and care. Below, we offer a few small moves that can help bring marginalized voices and experiences into the design process.
- Advocate for including edge cases in digital services and products:
  - Send customer service complaints when edge cases aren’t addressed by a digital product or service that you use;
  - Inform the media about a digital product or service’s failure to address edge cases;
  - Write your elected officials regarding digital tools and services that avoid edge cases;
  - Support an alternative product which addresses edge cases in a way that a mainstream product does not.
- Sign on to the Design Justice Network Principles and incorporate them into your design decisions.
- Creative Reaction Lab’s Equity-Centered Community Design framework is also a great tool for incorporating marginalized voices into the design process.
Further reading:
- Personal Histories by Sara Wachter-Boettcher
- Design for Real Life: An Interview with Sara Wachter-Boettcher by Mica McPheeters
- Inadvertent Algorithmic Cruelty by Eric Meyer
- Methods of Crisis by Vasilis van Gemert
Did you miss our previous Detox articles? View them on the DLINQ blog.