By Lea Rabaron
Science Editor
Over the past few years, digital assistants have come to play a central role in the average household. Whether they are taking care of mindless tasks or serving as a source of information, digital aids have become a near-essential asset for many people. They play the role of convenient, happy helpers, sitting in the background of our homes, always readily available to answer questions and perform tasks on command.
In every one of our devices lies a quiet attendant, whether that be Siri, Alexa or Cortana. Although they were produced by different companies (Apple, Amazon and Microsoft, respectively), there is a noticeable trend in how these assistants were designed. However different Siri, Alexa and Cortana may seem, they share one distinguishing characteristic: a feminine-sounding voice.
Although this may seem harmless and typically goes unnoticed, the question arises as to why our digital aids all have stereotypically female voices. Michael Britt, host of the renowned Spotify podcast “The Psych Files,” addressed this issue in an episode called “Giving Voice to our Digital Assistants.” In the episode, he reviews a UNESCO report published in 2019 called “I’d Blush if I Could,” which discusses the implications of gendered voices in digital aid technology and the faults inherent in them.
Dr. Britt, who currently teaches psychology at Marist College in Poughkeepsie, New York, has hosted “The Psych Files” since releasing his first episodes in 2007. Throughout the podcast’s run, he has returned to the topic of gender many times, especially female gender stereotypes.
“The topic of gender equity has always been an important one for me. I suppose it has to do with the fact that my upbringing included some strong women. My grandmother brought my mother to America from Italy in 1920 and neither one spoke any English. Despite that, my grandmother and grandfather created a warm and loving environment and raised 7 children. They made a living building homes and growing a large garden and bringing up children who respected people no matter what their gender, color or sexual orientation was,” Dr. Michael Britt said in an interview via email.
When asked about his reasons for publishing “Giving Voice to our Digital Assistants” (ep. 326), Dr. Britt was quick to comment.
“I think I recorded the episode on feminized digital voices when my mother was still alive and in a nursing home. I noticed that her care-workers were all female and that many women who came to this country from places like Haiti were employed as home health aides. It’s tough work, but it seems like it is almost always work that women do. And the pay is very low despite being so hard. I have a distant uncle who works in the insurance industry and he makes a lot of money. It’s just not fair the way we compensate people who work in stereotypical ‘women’s work’ compared to stereotypical ‘men’s work,’” Dr. Britt said.
For years, the idea of “Woman” or “Womanhood” was closely associated with “Servant” or “Servitude.” Even today, these seemingly outmoded ideals and conceptualizations of women’s role in society subliminally shape our perception of the female gender. What used to be overt, blatant misogyny has turned into covert jabs at women and the female role, which tend to be overlooked because they are not outwardly conspicuous.
Since women have long been cast as the caring, servile gender, it made sense for companies such as Amazon, Apple and Microsoft to use feminine voices for their digital aids. A feminine-sounding assistant is more likely to be perceived as subservient, while a male-sounding assistant would be seen as amiable and practical rather than obedient and dutiful. Essentially, the companies producing these assistants use feminized voices as a marketing technique, selling the idea of a compliant servant rather than simply a user-friendly device.
“One reason for this is that the employees/programmers at these organizations are predominantly men. Since women aren’t encouraged at a young age to consider computer programming (or really any STEM science) as a career path for them, we wind up with work teams that don’t contain women. So a woman isn’t there on the team to speak up and say, ‘Hey wait a minute. Is it really best for the default voice of this device to be female? Is there a way we can give the customer an option to use either male, female or a non-gender specific voice?’ Without someone like this present on the team, the default voice will probably always be female. I think the situation is changing, however, as the workplace becomes more diverse,” Dr. Britt said.
By giving our digital aids a feminine voice, these companies undermine the work of thousands of women around the world fighting for gender equity, flagrantly advertising against it. Alexa, Siri and Cortana are meant to respond to direct commands in the same helpful tone, no matter how they are being addressed. Whether a command is kind or aggressive, they will always answer in the same way.
This is one of the many topics addressed in UNESCO’s report “I’d Blush if I Could.” The report argues that our digital assistants’ inability to modify their tone according to how they are being addressed perpetuates the idea that women should always respond obligingly to authority. In this way, misogynistic ideals of gender roles continue to be reinforced in subliminal, unconscious ways, made to seem acceptable because they are never explicit.