A common misconception among STEM practitioners is that technology, and the development thereof, is neutral: we as humans simply apply it for good or for evil. This point of view fails to recognize that humans create technology, and our values are embedded in what we build (and what we don’t).
My research involves the development and translation of medical technologies to improve patient lives. Over 70% of students in my field choose to study it because they want to help people [ref]. Ironically, because we’re in medicine and care about improving the world, we assume that what we are doing is good. We don’t always think critically about how our technology is actually used and who benefits from its development. Unless they consider otherwise carefully, designers create devices to help “default” people with “default” problems. This disproportionately benefits white, male, middle- or upper-class, able-bodied, straight, etc. folks compared with everyone else. These technological advances then tend to reinforce existing power structures and are not, in fact, neutral.
Values are Embedded in What We Build
Unless they consider otherwise, designers will build for who they expect will use their devices. Part of my master’s thesis project was to redesign a flexible ECG measurement system previously developed in my research group. The system was integrated into a single, flexible PCB worn on the chest; when worn by someone with significant breast tissue and/or large pectoralis muscles, the board could bend and break solder connections. The engineer who developed the system did not consider body types other than those with flat chests (i.e., relatively thin males). Despite good intentions, the device could not be worn by about half of the world’s population. The values of the designer were embedded in the device itself. Fortunately, this mistake was caught early in the design process and corrected in a subsequent revision; however, inequities in designs aren’t always addressed promptly, if at all.
…and What We Don’t
For several years I ran a meetup group for women and nonbinary people interested in self-tracking. One of the most common topics to come up was hormones: managing the menstrual cycle, birth control, (peri)menopause, transitioning, and trying to get to the bottom of chronic illnesses. What continued to surprise me was how little we understand about the female reproductive system and hormones, and how much people tend to put up with because their healthcare providers tell them to just deal. This recently inspired two women I met to create a website dedicated to understanding and helping women through perimenopause. Folks are taking it on themselves to figure things out because the healthcare system fails them.
When I attended the Make the Breast Pump Not Suck Hackathon last month, a common theme was that as a society we know more about milk production in cows than in humans. At the event, I heard parents, even those from historically privileged backgrounds, discuss how difficult it is to provide breast milk for their child while navigating uncomfortable pumps and unaccommodating workplaces.
What should we do?
Once we appreciate that technology is not neutral, a next step is to consider what biases we, our colleagues, our institutions, and society at large bring into our designs. It’s also important to consider the history of technology and how it’s helped (and hurt) different groups of people in different ways. I explored this as part of an assignment in grad school, which can be found here. We can use this information to work toward more inclusive designs.
There are many other examples of designers not considering the breadth of their users, such as the assumptions made in period tracking apps. There is also extensive discussion of algorithmic bias and data structures (see, for example, Safiya Noble’s recent book, Algorithms of Oppression). Seeing Like a State discusses, among other things, how the production of knowledge and data is shaped by the state’s interpretation of the world around it.
The absence of technology is somewhat harder to write about, but consider, for example, how women’s pain is not taken seriously; the same is true for people who are overweight and people of color. These are just a few of the ways in which the needs of certain people are prioritized over others in visceral ways.