
Do Siri and Alexa need a feminist reboot?

December 17, 2020

Use a voice assistant? Is "she" a slave to white male prejudice? Researchers Yolande Strengers and Jenny Kennedy say AI-driven voice devices in the home reinforce sexist stereotypes. But they've got a plan.

https://p.dw.com/p/3mqPJ
Woman speaking to a voice assistant device in a kitchen
Intelligent home assistants are practical, but are they reinforcing negative gender stereotypes? Image: Imago/AFLO

DW: There's been a fair bit of writing and discussion about gender stereotyping and other bias with voice assistants and similar artificial intelligence systems. And your book The Smart Wife — Why Siri, Alexa and other Smart Home Devices need a Feminist Reboot addresses that. Before we get into your recommendations, tell us why voice devices seem so prone to bias? 

Yolande Strengers: It's not so much bias as it is a deliberate strategy to help people learn to like these devices, to accept them and to welcome them into our homes and into our lives. It just so happens that when feminine stereotypes are attached to objects that are designed to perform traditional feminine tasks, and that take the form of an AI in our homes, we're more likely to be comfortable with them if they have that female form. So, it's not completely unintentional, which is what unconscious bias often assumes. It's more of a [commercial] strategy, which makes a lot of sense really, as to why this is happening.

So, the idea of using male voices or genderless, mid-frequency range voices, how does that affect the situation from your perspective?


Jenny Kennedy: It doesn't really address one of the main problems, which is the type of work that these devices are brought into the home to do and the most appropriate ways of doing that kind of labor. It's very calculated and algorithmically managed. But it's also about how that kind of "wife work" is valued in the home, and the way in which we have previously valued or undervalued the people that are doing that work. 

In the book, we mention voice assistants used in other contexts. But we're looking specifically at voice assistants in the home. And there's still this very rigid and limited ideal of what the home is, and the roles people play. We still operate on the basis of 2.4 people and assume heteronormative relations between the adult couple. It's that context that makes all these devices really problematic, because they are reenergizing that outdated ideal.

The Smart Wife — Why Siri, Alexa and other Smart Home Devices need a Feminist Reboot by Yolande Strengers and Jenny Kennedy
Image: MIT Press

And you're calling for a "feminist reboot." What is a feminist reboot in practical terms and how would it help here?

YS: The feminist reboot is a set of proposals that Jenny and I make for how we can improve the situation. We look at this across the spectrum of industries and ideas, from how the devices are designed and the personalities they have, right through to how they are represented in the media. We talk about the way we often blame the feminine device rather than the companies that make them, and how that reinforces negative stereotypes towards women.

But we also look at the sci-fi industry and how the representation of smart wives on screen provides the inspiration for what we end up with in our homes. We also need to change and challenge the film industry to come up with new imaginations of what these kinds of creations, these helpers, could be, to help inspire the roboticists, AI developers and the whole computing industry.


The European Commission is introducing new default requirements under its Horizon Europe funding program from 2021, requiring all grant recipients to include what it calls the "gender dimension." The responsible commissioner, Mariya Gabriel, says that "integrating sex and gender-based analysis into research and innovation, and [considering] intersecting social categories such as ethnicity, age or disability, is a matter of producing excellent research to the benefit of all European citizens." That covers health, urban planning, climate change, AI and machine learning, facial recognition and "virtual assistants and chatbots: analysing gender and intersectionality in social robots." What do you think about that? Will it help?

YS: It's a fantastic move. We know that so often gender just isn't considered, as well as a range of other important intersectional issues. It reminds me of the book Invisible Women by Caroline Criado-Perez, which documents at length all the various ways in which women are invisible in research and data analysis.

Kids dress a teaching virtual assistant at a school in India
Kids dress a teaching virtual assistant at a school in India — What are these devices teaching kids about their identities in society? Image: Getty Images/AFP/I. Mukherjee

But it can't just be a case of having a diverse range of people on a team, can it? Take the European Space Agency, for example: among the top 11 management positions, there's just one woman. Is that necessarily a problem?

JK: You can't really claim to have explored all perspectives if only a limited number of people are able to put forward their perspective. It's about having a diversity of people at all levels. So yes, I do think it's a problem when a group of 11 includes only one woman. And it's not just that group, it's everything that feeds into only one woman having been able to place herself in that position.

It's about paying attention to matters of diversity and gender from the very beginning, rather than building a product and then having the gender and diversity team come in to do a usability patch up on it.

The other issue is that of trust. And European policymakers want stronger standards in that area. How do standards help to improve trust in technology?

JK: One of the proposals we have in the reboot section is a form of verification to provide users with some confidence in the design process behind the type of device they're using, extending to the kinds of representations a device is going to perpetuate in their household.

A voice assistant toilet on show at CES 2020 in Las Vegas
Impossible to escape? Voice assistants may soon be everywhere in your home, even in the privacy of your bathroom. Image: Reuters/S. Marcus

For instance, is this device capable of assisting all persons in the household, or is it only really suitable for a particular user, namely a white man?

And this is where the female persona and the voice comes in, because often what people are talking about is whether they trust the device in the home. But the bigger question is whether they trust the corporation behind the woman, the corporation that is harvesting all their data — that is, a larger commercial machine that the female voice helps to obscure.

The Smart Wife — Why Siri, Alexa and other Smart Home Devices need a Feminist Reboot by Yolande Strengers, associate professor in the Department of Human Centred Computing at Monash University, and Dr. Jenny Kennedy of RMIT, Melbourne, is published by MIT Press (2020)

Zulfikar Abbany conducted the interview.   
