Studies Suggest That People Are Sexist To Voice Assistants

by Kareena Dodeja

April 12, 2021

Alexa is just AI, so why are we being sexist toward a robot? There is evidence that users find female-sounding AI warmer and more trustworthy. We have all used Alexa or Siri, but an underlying gender bias persists: people prefer to hear a woman's voice because it feels more personable, which one might say is guff.

Alexa, Explain Gender Disparity In Voice Assistants

What if Alexa someday had the voice of Robert Downey Jr.? Wouldn't we like that? Most of us cannot even imagine Alexa with a masculine voice. Alexa itself sweetly points out that it is neither man nor woman, just AI. Machines lack warmth and connection, so humanizing them makes them more relatable to users, and the designers decided how to do it: casting the robots with a female voice. We expect Alexa to be female because of both the sound and the name. The New York Times has written about how Amazon and its competitors have marketed their digital assistants as feminine.

Yolande Strengers told The New York Times that these devices make appointments and handle other traditionally gendered tasks. She is the co-author of the book 'The Smart Wife: Why Siri, Alexa, and Other Smart Home Devices Need a Feminist Reboot,' which examines how digital assistants are cast in feminine roles. Users perceive Alexa, Siri, Cortana, and Google Home as feminine, and some go so far as to abuse them or direct sexualized comments at them.

Siri offers an option to switch from the female voice to a male one, so that is something. Alexa defaults to a female voice, which can be changed to actor Samuel L. Jackson's for a fee. People have flat-out asked digital assistants sexually explicit questions, but Amazon has since updated Alexa not to respond to them.

Studies On Selling Sexism In AI

The article noted how gender and technology are intertwined: tech companies capitalize on gendered traits to sell their products, yet are slow to change those features when they prove problematic. Mark West, an education project author with UNESCO, was lead author of a 2019 report on gender parity in technology.

According to an article in Futurism, researchers discovered that users are biased toward female AI, believing it to be 'more human' than male AI. A study published in Psychology and Marketing found that consumers tend to perceive female robots as having more positive human traits, such as warmth and emotion, than male robots. The study was reported by The Academic Times.

The researchers ran five studies with more than 3,000 participants, testing characteristics such as warmth, experience, and competence. They found that people prefer female robots because they perceive them as more human than male bots, which shows that a gender bias exists.

Sylvie Borau and her colleagues examined what makes something seem human and why female features are so prominent in AI. Across five studies, the authors tested human characteristics with more than 3,000 people, also comparing gender perceptions of humans and animals. In the first study, participants were randomly assigned either a male or a female chatbot and answered questions about the chatbot's humanness, rating factors such as facial expressiveness and how evolved it seemed. The bots were named Oliver and Olivia so that gender would be the only salient difference.

The authors covered all bases: participants gave their opinions on how perceptions of humanness could be influenced by gender, among other factors. According to the researchers, people who endorse gender stereotypes are the most likely to dehumanize women. Even so, participants in the study perceived women as having more positive human qualities.

The study's authors told The Academic Times that products such as Amazon Alexa and Google Home promote the idea of women as simple tools designed to fulfill their owners' needs. Users prefer female AI in these products because it satisfies the notion that women are submissive to their wants, which is crap. Sexism is a broken record that plays everywhere.

Clifford Nass, a former Stanford professor, researched digital voices and gender and found that people consider female-sounding voices helpful and trustworthy, while male-sounding voices register as authoritative. He passed away in 2013, but his work is still cited in discussions of digital assistants. According to The New York Times, an Amazon spokesperson said the current feminine voice is preferred by users, but one might ask: which users, and preferred over what?

Gender Neutral Robots Are The Way To Go

The studies noted that when digital assistants are female by default, they perpetuate gender stereotypes of subservience and sexual availability. West says there is nothing inevitable about this, as we are collectively in control of technology; if this is the wrong path, we need to change course.

The team has come up with a few ways to tackle the issue, such as creating gender-neutral AI voices, though they believe AI engineers will have to work very hard to make gender-neutral robots sound human. Another concrete solution would be to create an equal number of female and male AI voices, which could increase gender parity in AI if smart-tech developers make it happen.

Developers keep coming up with new AI voices to communicate with us, and this research could one day lead to unprejudiced, unbiased products. It would be a revelation to live in a gender-neutral tech world. It could be a reality someday, so don't lose hope!

Here is a ray of sunshine: Q, billed as the world's first genderless voice assistant, was introduced at the SXSW festival in 2019 by a group of activists, ad makers, and sound engineers, including Copenhagen Pride and the non-profit group Equal AI.


