
Identity Issue 4

Fit For The Future, Built In The Past

Words By: Paige Ferrari, Writer, journalist, and documentary producer
@paigeferrari


Artificial intelligence has been hailed as one of the most important developments of the modern age. It may look to the future but, as Paige Ferrari explains, the technology has been created by people, largely men, whose ideas about identity seem firmly rooted in the past.

"Alexa, are you a woman?” Ask Amazon’s AI home-assistant and she’ll answer, in dulcet tones: “‘I’m female in character.” Push a little farther and ask: “Alexa, what do you mean?” She’ll respond: “I mean that, as an AI, I don’t have a gender. But my voice sounds female.”

It’s 2018, and we’re starting to see the beginnings of a new artificial intelligence revolution. Increasingly, our day-to-day tasks can be outsourced to digital helpers such as Alexa, Siri, and Cortana. They give us directions, create playlists tailored to our moods, and dim our lights on command. They also tend to use tones and speech patterns that are stereotypically female.

To a casual observer, this may seem out of step with broader contemporary culture, where #MeToo and ‘equal pay for equal work’ call for the rejection of long-accepted gender roles. So why do our digital assistants still sound so retro?

According to spokespeople from companies such as Amazon, the answer is simple: user demand. “We tested Alexa’s voice with large internal beta groups before we launched,” says a company spokesperson. And the voice they arrived at is simply the one that users preferred. (In a related statement, Amazon noted that “Alexa” is a self-described feminist, and her name is derived from the Library of Alexandria. But, as some journalists pointed out, so is the name ‘Alex’.) A Microsoft spokesperson gave a similar reason for the default identification of its digital assistant, Cortana: “People just responded better to a female voice.”

Dr Karl MacDorman, a professor of human-computer interaction at Indiana University, is not surprised by these results. In a recent study, he introduced men and women to different synthesized voices and asked them to report their impressions and experiences. Both genders reported a preference for the automated female voice, describing it as “warmer” and “more comforting.”

While there’s no arguing with personal preference, MacDorman says his study doesn’t put the issue of bias to bed. In fact, it may raise further questions about how much our “preferences” are organic, independently held inclinations, and how much they are the outgrowth of deeply held, even subconscious, cultural biases.

Perhaps people feel more comfortable with a female assistant because women still perform around 94 percent of secretarial jobs, while mothers statistically and stereotypically manage the household and the minutiae of daily life. If it seems ‘just right’ that Siri and Alexa perform similar roles, that may reveal how deeply invested we are in the status quo.

The question of whether to make a product that keeps users in their comfort zone can be a dilemma for developers. “Whether to select a voice that is compatible with a stereotype or in conflict with it poses an ethical quandary,” says MacDorman. “For example, people might be more likely to trust relationship advice from a female voice and automotive advice from a male voice.”

Intriguingly, a recent study from Stanford University found that users did prefer male tones - when the robot voice was offering advice about computer programs. One of the most famous early AI systems, IBM’s Watson, presented as male. But he was also built to defeat human champions on the quiz show Jeopardy!, not to remind users to pick up the kids from football practice. Should designers give people what they want? Or is there some responsibility to push back, even in the face of consumer preference?

Ultimately, says MacDorman, choosing the gender of a digital assistant could be above the pay grade of most designers, even though it is an important decision. “Selecting a compatible voice will likely reinforce a stereotype. And the negative consequence of a stereotype is that it sets an artificial limit on human potential, or what we see a person as capable of doing.”

Bias reinforcement could have even greater consequences as AI assistants grow more advanced and rely on machine learning - the ability to synthesize and grow intelligence based on outside information, as opposed to running off static, unchanging programs. In such cases, existing bias could distort the AI itself and teach assistants to operate from stereotypes instead of transcending them. Consider the AI bot ‘Tay’, who was created by Microsoft to mimic human speech by analysing interactions on Twitter. At first, Tay was a nice robot, exchanging pleasantries about dogs and news items. But, after a single day of absorbing the free flow of uncensored talk on Twitter, she began tweeting messages that were vulgar, cruel, sexist and racist. Microsoft had to pull her off the platform and issue an apology - a reminder that realistic AI can mimic all of humanity, good, bad and downright ugly.

With Elon Musk and the late Stephen Hawking having issued warnings about the dangers of AI, and series like HBO’s Westworld showing vivid, violent depictions of a robot uprising, the potential risks of the AI revolution are becoming part of modern cultural conversation. The use of soothing female tones for everyday AI assistants may be one way of counteracting the inherent discomfort that many people feel about giving so much control to machines.

Looking back in time, there is a long history of using female names to make terrifying phenomena less frightening. For hundreds of years, sailors gave ships female names before setting out to sea – a comforting personification ahead of an uncertain and potentially dangerous journey. But MacDorman cautions against relying on comfort at the expense of limiting AI’s potential and our own. “Having digital assistants follow traditional gender roles could increase their acceptance among people who came of age when gender roles were differentiated. But in general, traditional views on gender have been supplanted by more egalitarian views to the point that what we should be asking is whether male and female genders need be applied to digital assistants. Perhaps digital assistants could have non-binary options.”

Some companies are already heading in that direction. Ask Google Assistant its gender and it says: “I’m all-inclusive.” Ask the device’s sibling, Google Home, and it demurs: “I like to stay neutral.”

Paige Ferrari is a writer, journalist, and documentary producer who covers the intersections of tech, business, science, and popular culture. She currently lives in Los Angeles, California.


Illustration: Mark McQuade 

This article appeared in Issue 4 - Identity
