Assumptions Embedded in Design(s) – Part 3: Societal Values

This is Part 3 of a series about assumptions embedded in design. This work was inspired by the recent Sketch Model workshop at Olin College and will be the basis for a future course at Swarthmore College. For additional context and the start of the series, check out Part 1 (Bodies) and Part 2 (Identities).

Societal Values

The last category of assumptions embedded in design is societal values. What we create says a lot about what we consider important. This category can be the most challenging to appreciate because societal values are so deeply embedded in our culture and our minds.

An example of societal values embedded in technology is ankle monitoring devices. The creation of such technologies is predicated on a society that treats incarceration and surveillance as important and acceptable. If our society didn’t believe in imprisonment, there would be little need for these devices. Assumptions based on societal values intersect with and extend beyond both bodies (e.g. device sizing) and identities (e.g. who is wearing the device, more likely black and brown folks).

Technological interventions in the social realm are harder to address because the creation of one technology cannot, by itself, change all of society around it; because the assumptions are systemic, it can be hard for any one designer or technology to move the needle. In this case, critical design (especially through art), science and technology studies, and anti-oppressive design are some of the interventions available. These methods can challenge and critique existing societal structures, seek to invert hierarchies, or explore what the world might be like if our values were different.

Art in particular can be a powerful way of questioning “the way things are” (see e.g. Sara Hendren’s blog Abler). An example of critical design and art is this video of Doreen Garner’s work (CW: simulated genital mutilation and surgery; thanks to Sara Hendren for the link). Garner goes beyond describing the historical devaluation of black women’s bodies — she demonstrates it. Work like this can both provide catharsis and encourage others to think critically about the past, present, and future.

Assumptions Embedded in Design(s) – Part 2: Identities

This is Part 2 of a series about assumptions embedded in design. This work was inspired by the recent Sketch Model workshop at Olin College and will be the basis for a future course at Swarthmore College. For additional context and the start of the series, check out Part 1.

Identities

Assumptions about identity encompass not only the body and its abilities but also elements of the social realm. For example, many toys are gendered “for boys” or “for girls” based on the presumed interests of those groups, rather than on those groups’ ability to use the toys. As with bodies, many designers assume that only certain people will be interested in their products.

One example of unintentional exclusion based on identity is the assumptions embedded in period tracking apps. The app Glow, for example, has an onboarding screen with the following options:

[Screenshot: Glow’s onboarding screen and its list of options]

Note how all the options fall into the “baby / no baby” binary – you’re either tracking to get pregnant or to avoid it. What seems like a benign onboarding question actually demonstrates the assumptions the designers made about who tracks their period and why – it assumes that the user is fertile, is sexually active in a relationship that could lead to pregnancy, and is tracking for reasons specifically to do with pregnancy itself. The app isn’t designed with other users in mind.
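To make that assumption concrete, here is a hypothetical sketch (not Glow’s actual code; the names and categories below are invented for illustration) of how a binary onboarding question gets baked into an app’s data model, and how a broader set of tracking goals avoids it:

```python
from enum import Enum, auto

# Hypothetical sketch; not Glow's actual data model.
# A binary onboarding question forces every user into one of two buckets.
class BinaryIntent(Enum):
    TRYING_TO_CONCEIVE = auto()
    AVOIDING_PREGNANCY = auto()

# A broader model lets people track for their own reasons, pregnancy-related or not.
class TrackingGoal(Enum):
    TRYING_TO_CONCEIVE = auto()
    AVOIDING_PREGNANCY = auto()
    MANAGING_SYMPTOMS = auto()   # e.g. pain, PMS, or a chronic condition
    MONITORING_HEALTH = auto()   # general awareness of one's own cycle
    HORMONE_CHANGES = auto()     # e.g. perimenopause or hormone therapy
    OTHER = auto()

# A user can select any combination of goals, including none at all.
user_goals = {TrackingGoal.MANAGING_SYMPTOMS, TrackingGoal.MONITORING_HEALTH}
```

The point is not the specific categories, which would need to come from research with actual users, but that the data model itself either permits or forecloses those users’ reasons for tracking.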

Interventions regarding assumptions about identity include a focus on “human centered” or “user centered” design: rather than creating a single tool that works for everyone, as universal design aims to (see the bodies section), human/user centered design focuses on understanding how identities, needs, and preferences affect what users want from technology and how they interact with it, enabling designers to create better products.

An example of human/user centered design in the period tracking space is Planned Parenthood’s period tracker, Spot On. The app has some of the best onboarding I’ve ever seen, including the option to choose from a wide array of birth control methods (including none!). The app also automatically adjusts different aspects of your predicted cycle depending on which (if any) birth control method you are using, which most apps don’t do. Spot On won Fast Company’s Innovation by Design Award.
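As a rough illustration of that kind of adjustment, here is a minimal sketch under simplified assumptions; it is not Spot On’s actual algorithm, and the method names, rules, and cycle lengths are hypothetical:

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical sketch of method-aware cycle prediction; not Spot On's actual logic.
def predict_next_period(last_start: date, avg_cycle_days: int, method: str) -> Optional[date]:
    """Roughly predict the next period start, adjusting for the reported birth control method."""
    method = method.lower()
    if method in {"hormonal iud", "implant", "injection"}:
        # Many users of these methods have irregular or absent periods, so skip the prediction.
        return None
    if method in {"pill", "patch", "ring"}:
        # Withdrawal bleeds tend to follow the pack/usage schedule rather than a natural cycle.
        return last_start + timedelta(days=28)
    # "none", "condoms", "copper iud", etc.: fall back to the user's own average cycle length.
    return last_start + timedelta(days=avg_cycle_days)

print(predict_next_period(date(2019, 7, 1), 31, "none"))  # 2019-08-01
```

Even this toy version shows how a single onboarding answer (which birth control method, if any) changes what the app should predict and display, which is exactly the kind of context a design for the “default” user throws away.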

Assumptions Embedded in Design(s) – Part 1: Bodies

This past June I had the privilege of attending the Sketch Model Workshop at Olin College. The workshop brought together practitioners across engineering, art, design, and the humanities to discuss our “dissatisfactions” with engineering education. We worked in groups and individually to brainstorm ideas to improve upon existing practice through, among other things, integration with the arts and humanities. My dissatisfaction, which I plan to channel into a course at Swarthmore College in Spring 2020, was that engineers design for the “default” in an uncritical, unintentional way. Assumptions about a user and society more broadly are embedded into the technology we build (and that we don’t). I grouped the assumptions into three major categories: those related to bodies, those related to identities, and those related to societal values. In the next few blog posts I will discuss exactly what I mean by each of these categories and talk about design interventions that challenge the status quo.

Bodies

Assumptions about bodies refer to 1) the physical traits of someone’s body (e.g. calf circumference, weight, height) and 2) that individual’s abilities (e.g. walking certain distances, lifting certain weights, seeing certain objects/distances). For example, if we consider a wrist watch, the different sizes of the watch band are driven by the expected size of individuals, and the designers assume that the user is capable of interpreting the watch face. Sometimes these design decisions are made intentionally (and that can be a good or bad thing), but often they are not. Many designers unintentionally design for a “default” user, and this lack of critical thinking can prevent certain people (often those with less privilege in some way) from using their products.

Universal Design has emerged as an intervention to move technology away from only working for “default” bodies. From the Centre for Excellence in Universal Design website:

An environment (or any building, product, or service in that environment) should be designed to meet the needs of all people who wish to use it. This is not a special requirement, for the benefit of only a minority of the population. It is a fundamental condition of good design.

Universal design situates the body and that body’s abilities as key considerations for good design. Everyone should be able to use a given technology or tool.

A classic example of universal design is the origin story of the company OXO. Sam Farber’s wife Betsey had arthritis and had difficulty holding a vegetable peeler. The two worked together to design kitchen tools that wouldn’t hurt her hands. Since then, the vegetable peeler and other OXO kitchen tools have won numerous design awards. It turns out that the new peeler didn’t just improve access for those with arthritis; it was better for everyone.

Technology is not neutral

A common misconception among STEM practitioners is that technology and its development are neutral: the idea is that tools simply exist, and we as humans apply them for good or evil. However, this point of view doesn’t recognize that humans create technology, and our values are embedded in what we build (and what we don’t).

My research involves the development and translation of medical technologies to improve patient lives. Over 70% of students in my field choose to study it because they want to help people [ref]. Ironically, because we’re in medicine and care about improving the world, we assume that what we are doing is good. We don’t always think critically about how our technology is actually used and who benefits from its development. Unless they consider it carefully, designers create devices to help “default” people with “default” problems. This disproportionately benefits white, male, middle- or upper-class, able-bodied, straight (etc.) folks compared with others. These technological advances then tend to reinforce existing power structures and are not, in fact, neutral.

Values are Embedded in What We Build

Unless they consider otherwise, designers will build for the people they expect to use their devices. Part of my master’s thesis project was to redesign a flexible ECG measurement system previously developed in my research group. The system was integrated into a single flexible PCB worn on the chest; on someone with significant breast tissue and/or large pectoralis muscles, the board could bend enough to break its solder connections. The engineer who developed the system did not consider body types other than those with flat chests (i.e. relatively thin males). Despite good intentions, this device could not be worn by about half of the world’s population. The values of the designer were embedded into the device itself. Fortunately, this mistake was caught early in the design process and corrected in a subsequent revision; however, inequities in designs aren’t always addressed promptly, if at all.

…and What We Don’t

For several years I ran a meetup group for women and nonbinary people interested in self-tracking. One of the most common topics to come up was hormones: managing the menstrual cycle, birth control, (peri)menopause, transitioning, and trying to get to the bottom of chronic illnesses. What continued to surprise me was how little we understand about the female reproductive system and hormones, and how much people put up with because their healthcare providers tell them to just deal with it. This recently inspired two women I met to create a website dedicated to understanding perimenopause and helping women through it. Folks are taking it on themselves to figure things out because the healthcare system fails them.

When I attended the Make the Breast Pump Not Suck Hackathon last month, a common theme was that as a society we know more about milk production in cows than in humans. At the event, I heard parents, even those from historically privileged backgrounds, discuss how difficult it is to provide breast milk for their child while navigating uncomfortable pumps and unaccommodating workplaces.

What should we do?

Once we appreciate that technology is not neutral, a next step is to consider what biases we, our colleagues, our institutions, and society at large bring into our designs. It’s also important to consider the history of technology and how it’s helped (and hurt) different groups of people in different ways. I explored this as part of an assignment in grad school, which can be found here. We can use this information to work toward more inclusive designs.

Further Reading

There are many other examples of designers not considering the breadth of their users, such as the assumptions made in period tracking apps. There is also extensive discussion about algorithmic bias and data structures (see, for example, Safiya Noble’s recent book). James C. Scott’s Seeing Like a State discusses, among other things, how the production of knowledge and data is based on the state’s interpretation of the world around it.

The absence of technology is somewhat harder to write about, but one example to consider is how women’s pain is not taken seriously. The same is true for people who are overweight and for people of color. This is just a snippet of the ways in which the needs of certain individuals are prioritized over others in visceral ways.

Why Engineering Inclusively

Over the course of my research career as a medical device designer, I’ve learned that inclusive design is a challenge even for those with the best of intentions. Forward progress will require a dedicated effort on the part of those in the field (in collaboration with other stakeholders) to understand the challenges and pioneer best practices and alternatives. While conversations around ethical design practices and the impact of technology on society are not new, most engineers are not taught how to think critically about the work they engage in. I decided to become a professor to help the next generation of engineering students think critically about the impact their work has on society. I intend to use this blog to share what I’m learning in this process.

I will start things off by posting about my takes on inclusive design, including topics such as:

  • Technology is not neutral
  • Lack of inclusion in technology and the consequences
  • The flawed “standard” engineering design process
  • Where to go from here

I intend to write these posts from my personal perspective, but I will also make an effort to provide links to further reading elsewhere, particularly the work of folks in science and technology studies; they do not receive enough credit for their important work. I welcome any suggestions for additional readings or resources to include.