Daily Dispatch

It’s time to voice a gender problem over smart speakers


“I’d blush if I could.” Until recently, that’s the way Apple’s digital voice assistant Siri would respond to being called a “b––.”

After coming under pressure, Tim Cook’s team finally changed the reply to “I don’t know how to respond to that.” But by then, the damage had been done. We’ve grown so used to abusing and insulting female-gendered smart speakers, such as Siri and Amazon’s Alexa, that it’s almost become a sport.

When Microsoft’s Cortana launched in 2014, many of the questions she received were about her sex life. When Siri was released on smartphones in 2011, it became a game to get her to call users her “master”.

But, you might ask, who cares?

They are simply machines, without feelings or a conscience. In any case, they sometimes deserve the abuse. Who hasn’t wanted to throw their smart speaker out of the window when it played K-pop instead of Taylor Swift?

The problem is the real-world impact this can have on women and how they are viewed.

A 2019 UN report found AI smart speakers with female voices send a signal that women are “obliging, docile and eager-to-please helpers, available at the touch of a button or blunt voice command like ‘hey’ or ‘OK’.”

Particularly worrying, it said, is how they often give “deflecting, lacklustre or apologetic responses” to insults. And that’s what we’re teaching our children. An entire generation has grown up barking orders and insults at female-gendered smart speakers that are designed to be subservient.

Tech giants have done little to address the issue. The way smart speakers have been designed means there’s no place for even basic niceties, like please and thank you — in fact, that would probably confuse Alexa.

Now, children expect to get what they want if they are demanding. Venture capitalist Hunter Walk, for instance, has written about how his Amazon Echo caused his four-year-old to become bossy.

“Cognitively I’m not sure a child gets why you can boss Alexa around but not a person,” he wrote in his blog. “At the very least, it creates patterns and reinforcement that so long as your diction is good, you can get what you want without niceties.”

Part of the problem is these devices have been created largely without female input.

More than 75 percent of computer programmers in the US are male, and 83 percent in the UK. They have around 80 percent of the technical positions at Apple, Facebook, Microsoft and Google, and just over 63 percent at Amazon.

To these male-heavy tech teams, female voices are warmer and more pleasant. Daniel Rausch, the head of Amazon’s Smart Home division, said that his team “carried out research and found that a woman’s voice is more sympathetic”.

Had they asked more women, however, they might have thought twice about the gender of their smart speaker. They could have better anticipated the abuse, and they might even have considered how children would be affected.

Things are starting to change, albeit slowly. In 2017, Amazon installed a “disengage mode for Alexa” so that she would reply to sexually explicit questions with either “I’m not sure what outcome you expected” or “I’m not going to respond to that”.

Other IT teams are also addressing the social issues that smart speakers create, but it may just be too little, too late.

